Research

Working Papers

Abstract
Decision-makers who choose information strategies instead of concrete actions adopt stochastic choice rules that leave room for errors, which can obscure the strategic interactions of players. This article establishes that dynamic, stochastic games with rationally inattentive agents have Nash equilibria in which players coordinate their choice rules. When these choice rules are consistent with the players' predispositions towards particular actions, the Nash equilibria can be expressed in terms of dynamic logit rules. This result reduces finding Nash equilibria of this type to establishing a joint distribution of actions and states that accommodates the optimal behavior of all players. Logit rule Nash equilibria are used to study an example of strategic coordination and an example of a zero-sum, sequential-move game of strategic conflict. The equilibria of the second example involve strategies with conditioning characteristics that can be relevant for applications to principal-agent problems.
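For intuition, the static, single-agent counterpart of such a logit rule is known from Matějka & McKay (2015); the notation below is illustrative and not taken from the paper. With actions $a$, states $x$, payoffs $u(a,x)$, information cost $\lambda$, and predisposition (unconditional choice) probabilities $p(a)$, the optimal conditional choice rule takes the weighted logit form

$$P(a \mid x) \;=\; \frac{p(a)\,\exp\{u(a,x)/\lambda\}}{\sum_{b} p(b)\,\exp\{u(b,x)/\lambda\}}.$$

The dynamic, multi-player rules discussed in the abstract generalize this form, with the joint distribution of actions and states playing the role of the fixed-point condition on $p$.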
Abstract
Market models constitute a major cornerstone of empirical research in industrial organization and macroeconomics. Previous literature in these fields has proposed a variety of estimation methods both for markets in equilibrium, which typically entail a market-clearing condition, and for markets in disequilibrium, in which the primary identification condition comes from the short-side rule. Although methodologically attractive, estimating such models, in particular the disequilibrium models, is computationally demanding, and software providing simple, out-of-the-box methods for estimating them is scarce. Econometricians, therefore, mostly rely on their own implementations for estimating these models. This article presents the R package diseq, which provides functionality that simplifies the estimation of models for markets in equilibrium and disequilibrium using full information maximum likelihood methods. The basic functionality of the package is presented based on the data and the classic analysis originally performed by Fair & Jaffee (1972). The article also gives an overview of the design of the package, presents the post-estimation analysis capabilities that accompany it, and provides statistical evidence of the computational performance of its functionality gathered via large-scale benchmarking simulations. The diseq package is free software distributed under the MIT license as part of the R software project. It comprises a set of estimation tools that are to a large extent not available from either alternative R packages or other statistical software projects.
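For readers unfamiliar with the short-side rule, the sketch below illustrates the kind of likelihood that full information maximum likelihood estimation maximizes in the basic disequilibrium model. It is a minimal, self-contained illustration under simplifying assumptions (independent normal shocks, exogenous price); it is not the diseq API, and all names in it are invented.

```r
# Minimal sketch of the FIML objective for the basic disequilibrium model
# Q = min(D, S); illustrative only, not the diseq interface.
set.seed(42)
n <- 500
price  <- runif(n, 1, 2)
demand <- 4.0 - 1.5 * price + rnorm(n, sd = 0.3)  # latent demand
supply <- 1.0 + 1.0 * price + rnorm(n, sd = 0.3)  # latent supply
q <- pmin(demand, supply)                         # short-side rule

neg_loglik <- function(theta) {
  mu_d <- theta[1] + theta[2] * price
  mu_s <- theta[3] + theta[4] * price
  s_d  <- exp(theta[5]); s_s <- exp(theta[6])
  # f(Q) = f_D(Q) P(S > Q) + f_S(Q) P(D > Q) under independent shocks
  dens <- dnorm(q, mu_d, s_d) * pnorm(q, mu_s, s_s, lower.tail = FALSE) +
          dnorm(q, mu_s, s_s) * pnorm(q, mu_d, s_d, lower.tail = FALSE)
  -sum(log(dens + 1e-300))
}

fit <- optim(c(1, -1, 1, 1, log(0.5), log(0.5)), neg_loglik, method = "BFGS")
fit$par[1:4]  # estimates of the demand and supply coefficients
```

The package wraps this kind of likelihood construction, its optimization, and the accompanying post-estimation analysis behind its model-fitting interface, so that hand-rolled objectives like the one above are unnecessary in practice.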
Abstract
Broad, long-term financial and economic datasets are a scarce resource, particularly in the European context. In this paper, we present an approach for an extensible data model that is adaptable to future changes in technologies and sources. This model may constitute a basis for digitised and structured long-term historical datasets. The data model covers the specific peculiarities of historical financial and economic data and is flexible enough to accommodate data of different types (quantitative as well as qualitative) from different historical sources, thereby achieving extensibility. Furthermore, based on historical German firm and stock market data, we discuss a relational implementation of this approach.
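As one hypothetical illustration of such extensibility, an entity-attribute-value layout can absorb quantitative and qualitative observations from heterogeneous historical sources without schema changes. The sketch below is invented here for illustration (all table and column names are assumptions) and is not the implementation discussed in the paper.

```r
# Hypothetical sketch of an extensible relational layout for heterogeneous
# historical data (an entity-attribute-value pattern); not the paper's schema.
library(DBI)
con <- dbConnect(RSQLite::SQLite(), ":memory:")

dbExecute(con, "CREATE TABLE entity (
  entity_id   INTEGER PRIMARY KEY,
  entity_type TEXT NOT NULL            -- e.g. 'firm', 'security'
)")
dbExecute(con, "CREATE TABLE attribute (
  attribute_id INTEGER PRIMARY KEY,
  name         TEXT NOT NULL,
  value_type   TEXT NOT NULL           -- 'numeric' or 'text' (qualitative)
)")
dbExecute(con, "CREATE TABLE observation (
  entity_id    INTEGER REFERENCES entity,
  attribute_id INTEGER REFERENCES attribute,
  valid_from   TEXT,                   -- historical validity interval
  valid_to     TEXT,
  source       TEXT,                   -- provenance of the archival record
  value_num    REAL,
  value_text   TEXT
)")

# A firm's share price as recorded in an archival exchange list:
dbExecute(con, "INSERT INTO entity VALUES (1, 'firm')")
dbExecute(con, "INSERT INTO attribute VALUES (1, 'share_price', 'numeric')")
dbExecute(con, "INSERT INTO observation
  VALUES (1, 1, '1913-01-01', '1913-01-31', 'exchange list', 98.5, NULL)")
```

New attribute types from newly digitised sources become rows rather than schema migrations, which is the sense in which such a layout remains extensible.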
Abstract
Shortages and surpluses appear in many markets under both exceptional and typical circumstances. This article proposes a methodology for assessing the appropriateness of the market-clearing assumption in econometric modeling. The methodology allows the comparison of equilibrium and disequilibrium models with known likelihoods. Its performance is examined in a controlled environment using large-scale simulations of five market models. An application of the methodology using US retail and scanner deodorant data shows that, during times of distress, exogenous shocks can improve the effectiveness of the price mechanism. The results of this article may serve as empirical justifications for deviations from market-clearing.
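In standard notation (assumed here, not taken from the paper), the two hypotheses being compared differ in how the observed quantity $Q_t$ relates to latent demand $D_t$ and supply $S_t$:

$$\text{market-clearing: } Q_t = D_t = S_t, \qquad \text{short-side rule: } Q_t = \min(D_t, S_t).$$

Because both specifications have known likelihoods, they can be confronted with the data through standard likelihood-based comparisons.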
Abstract
This study investigates how financing conditions were affected during the 2009 financial crisis using German firms' financial statement data gathered for regulatory purposes. Policies addressing financing difficulties often presume a higher vulnerability of small firms based on various proxies of financial constraints that use size as one key constituent. In contrast, our analysis is based on structural estimations of financial constraints that disentangle them from size, which allows us to study the relative effects between small and large firms. We show that the worsening of financing conditions during the crisis did not depend on a firm's size. Instead, it was the unavailability of financing alternatives, the limited capacity to provide collateral, and the operational risk that intensified financial constraints. Our results suggest that policies addressing these underlying causes of vulnerability during financial contractions, instead of generally enhancing access to funding for SMEs, are potentially more efficient.
Abstract
Previous research on the allocation of attention of cognitively constrained individuals has focused on the filtering aspects of their decision processes, according to which the intensity of attention is consciously chosen. Human attention, however, is not merely of voluntary intensity; its placement is also volitional. This article proposes a framework that takes such volitional elements of attention into account and is consistent with the dynamic behavior of individuals who prefer to maintain the status quo of their consideration context when making decisions. A consumer search application using US store and scanner data indicates that decision framing costs in retail markets are state-dependent and non-linear. Costs are found to depend not exclusively on the cardinality of consideration sets, but also on experience and on attentional associations between considered items. Neglecting these aspects therefore potentially leads to misidentification of product preferences. The results of this article give an endogenous explanation of narrow framing and home bias phenomena in consumption and investment choices.

Technical Reports

Abstract
The second report of Work Package 5 completes the discussion of the preliminary back-end design concepts of the common data model. The approach of the report is characterized by the principle of least intrusiveness: the proposed solutions respect national idiosyncrasies and allow national centers to advance in a collaborative but independent manner. The report starts by reviewing the data formats of the countries of the consortium. It draws on identification theory and proposes appropriate principles and requirements for the common model's identification design. It examines the functional and informational requirements for identifying various data items and for linking historical data from within the consortium to external databases with contemporary data. It argues that the common model's implementation benefits from employing both relational and non-relational technologies, each addressing different issues. It highlights appropriate subsequent steps for cross-country harmonization, firm-linking, data transformation processes, and data governance.
Abstract
The report reviews a selection of existing micro-level data-model implementations, both from within and outside the consortium's countries, and identifies best design practices. It proposes preliminary model concepts for EURHISFIRM's metadata scheme and evaluation criteria for assessing the effectiveness of historical, cross-country, company-level data models. Since there is no precedent for designing such models, the report introduces a conceptual two-dimensional partition of the information space that EURHISFIRM's model aims to cover and reviews representative implementations from each resulting subspace. The first dimension concerns the time domain; here, the reviewed models are classified as either contemporary or historical. The second dimension concerns the cross-country domain; models here are classified as either national or international. The analysis constitutes a fundamental building block for synthesizing national models into a unified European common model.