Research

Working Papers

Abstract
The limited availability of methodologies that evaluate departures from market-clearing discourages the application of estimation methods that allow for them. Nevertheless, shortages and surpluses appear in a plethora of markets, not only under exceptional but also under normal circumstances. In this article, I propose a statistical assessment of the market-clearing condition, which allows the comparison of equilibrium and disequilibrium models for which the likelihoods are known. An application of the methodology using US retail and scanner deodorant data provides evidence that, during times of distress, exogenous shocks can strengthen the price mechanism of markets that are not characterized by an innate failure. In prosperous times, market participants may become more complacent and, as a result, the price mechanism can be weakened. Analytic expressions, simulations, and computational benchmarks for five market models are also presented. The results of this article may serve as empirical justification for deviations from market-clearing.
Abstract
This study investigates how financing conditions were affected during the 2009 financial crisis using German firms' financial statement data gathered for regulatory purposes. Policies addressing financing difficulties often presume a higher vulnerability of small firms on the basis of various proxies of financial constraints that use size as one key constituent. In contrast, our analysis is based on structural estimations of financial constraints that disentangle them from size, which allows us to study the relative effects between small and large firms. We show that the worsening of financing conditions during the crisis did not depend on a firm's size. Instead, it was the unavailability of alternatives, the limited capacity to provide collateral, and the operational risk that intensified financial constraints. Our results suggest that it is more efficient for policies to address these underlying causes of vulnerability during financial contractions than to enhance access to funding for SMEs in general.
Abstract
Dynamic choices of cognitively constrained individuals have been described, by rational inattention theory, as a filtering problem in which attentional intensity is consciously chosen. Human attention, however, is not merely a matter of voluntary intensity; it is also a matter of volitional placement. In this article, I study the implications of a dynamic theory of volitional attention describing individuals who choose among the alternatives that are induced by their attentional choices. Firstly, a consumer search application using US scanner data indicates that attentional choices are characterized by complementarities. Costs do not depend exclusively on the cardinality of consideration sets, but also on experience and on similarities between considered items. Neglecting attentional placement aspects thereby potentially leads to misidentification of preference characteristics. Secondly, participation inertia in financial markets is revisited under the scope of volitional attention, and a culture-based argument for why it can be optimal not to participate is provided.

Technical Reports

Abstract
The second report of Work Package 5 completes the discussion of the preliminary back-end design concepts of the common data model. The approach of the report is characterized by the principle of least intrusiveness. The proposed solutions respect national idiosyncrasies and allow national centers to advance in a collaborative but independent manner. The report starts by reviewing the data formats of the countries of the consortium. It draws from identification theory and proposes appropriate principles and requirements for the common model's identification design. It examines the functional and informational requirements for identifying various data items and for linking historical data from within the consortium to external databases with contemporary data. It outlines that the common model's implementation mostly benefits from employing both relational and non-relational technologies to address different issues. It highlights appropriate, subsequent steps for cross-country harmonization, firm-linking, data transformation processes, and data governance.
Abstract
The report reviews a selection of existing micro-level data-model implementations, both from within and outside the consortium's countries, and identifies best design practices. It proposes preliminary model concepts for EURHISFIRM's metadata scheme and evaluation criteria for assessing the effectiveness of historical, cross-country, company-level data models. Since there is no precedent in designing such models, the report methodologically introduces a conceptual two-dimensional separation of the information space that EURHISFIRM's model aims to cover and reviews representative implementations from each subpart. The first dimension concerns the time domain: the reviewed models are classified either as contemporary or as historical. The second dimension concerns the cross-country domain: models here are classified either as national or as international. The analysis constitutes one fundamental block upon which the process of synthesizing national models into a unified European common model builds.