Even though gender wage inequality is present in countries across all levels of economic development, the gap in years of schooling between females and males diminishes with a country's income level. Thus, wage differences cannot explain the gender differentials in formal education across countries with different incomes. We calibrate a general equilibrium model with multiple sectors, genders, and production technologies to show that gender-specific sectoral comparative advantages explain a large portion of the decline in the schooling gap as income increases. Due to these comparative advantages, relative female hours in paid work are greater in high-income countries, incentivizing female education. We additionally show that non-homothetic preferences are essential to correctly explain the diminishing schooling differences. Ignoring the income channel stemming from non-homotheticities leads to overestimating the decline of schooling in developed countries. Our results suggest that the de-invisibilization of female work observed in developed countries can be attributed to the rise of female labor specialization via education to meet the expansion of modern service sectors.
Decision-makers who choose information strategies instead of concrete actions elect stochastic choice rules that leave open the potential for errors, which can obfuscate the strategic interactions of players. This article establishes that dynamic, stochastic games with rationally inattentive agents have Nash equilibria in which players coordinate their choice rules. When these choice rules are compliant with the predisposition of players towards particular actions, the Nash equilibria can be expressed in terms of dynamic logit rules. This result reduces finding Nash equilibria of this type to establishing a joint distribution of actions and states that accommodates the optimal behavior of all players. Logit rule Nash equilibria are used to study an example of strategic coordination and an example of a zero-sum, strategic conflict game with sequential moves. The resulting equilibria of the second example involve strategies that exhibit conditioning characteristics and can be relevant for applications in principal-agent problems.
Market models constitute a major cornerstone of empirical research in industrial organization and macroeconomics. Previous literature in these fields has proposed a variety of estimation methods both for markets in equilibrium, which typically entail a market-clearing condition, and in disequilibrium, in which the primary identification condition comes from the short-side rule. Although methodologically attractive, the estimation of such models, in particular of the disequilibrium models, is computationally demanding, and software providing simple, out-of-the-box methods for estimating them is scarce. Econometricians, therefore, mostly rely on their own implementations for estimating these models. This article presents the R package markets, which provides functionality to simplify the estimation of models for markets in equilibrium and disequilibrium using full information maximum likelihood methods. The basic functionality of the package is presented based on the data and the classic analysis originally performed by Fair & Jaffee (1972). The article also gives an overview of the design of the package, presents the post-estimation analysis capabilities that accompany it, and provides statistical evidence of the computational performance of its functionality gathered via large-scale benchmarking simulations. The markets package is free software that is distributed under the MIT license as part of the R software project. It comprises a set of estimation tools that are, to a large extent, not available from either alternative R packages or other statistical software projects.
Shortages and surpluses appear in many markets under both exceptional and typical circumstances. This article proposes an assessment of the appropriateness of market-clearing in econometric modeling. The methodology allows the comparison of equilibrium and disequilibrium models with known likelihoods. Its performance is examined in a controlled environment using large-scale simulations of five market models. An application of the methodology using US retail and scanner deodorant data shows that, during times of distress, exogenous shocks can improve the effectiveness of the price mechanism. The results of this article may serve as empirical justifications of deviations from market-clearing.
This study investigates how financing conditions were affected during the 2009 financial crisis using German firms' financial statement data gathered for regulatory purposes. Policies addressing financing difficulties often presume a higher vulnerability of small firms based on various proxies of financial constraints that use size as one key constituent. In contrast, our analysis is based on structural estimations of financial constraints that disentangle them from size, which allows us to study the relative effects between small and large firms. We show that the worsening of financing conditions during the crisis did not depend on a firm's size. Instead, it was the unavailability of financing alternatives, the limited capacity for providing collateral, and the operational risk that intensified financial constraints. Our results suggest that policies addressing these underlying causes of vulnerability during financial contractions, instead of generally enhancing access to funding for SMEs, are potentially more efficient.
Previous research on the allocation of attention of cognitively constrained individuals has focused on the filtering aspects of their decision processes, according to which the attentional intensity is consciously elected. Human attention, however, is not merely of voluntary intensity; its placement is also volitional. This article proposes a framework that takes into account such volitional elements of attention and is consistent with the dynamic behavior of individuals who prefer to maintain the status quo of their consideration context when making decisions. A consumer search application using US store and scanner data indicates that decision framing costs in retail markets are state-dependent and non-linear. Costs are found to depend not exclusively on the cardinality of consideration sets, but also on experience and attentional associations between considered items. Neglecting these aspects, therefore, potentially leads to the misidentification of product preferences. The results of this article give an endogenous explanation of narrow framing and home bias phenomena in consumption and investment choices.
Broad, long-term financial and economic datasets are scarce resources, particularly in the European context. In this article, we present an approach for an extensible data model that is adaptable to future changes in technologies and sources. This model may constitute a basis for digitized and structured long-term historical datasets for different jurisdictions and periods. The data model covers the specific peculiarities of historical financial and economic data and is flexible enough to accommodate data of different types (quantitative as well as qualitative) from different historical sources, hence achieving extensibility. Furthermore, we outline a relational implementation of this approach based on historical German firm and stock market data from 1920 to 1932.
This chapter analyzes a central part of an EU-funded, seven-nation development project for the comprehensive interdisciplinary design of a European system to collect and collate historical financial and firm data (named EurHisFirm); the responsibility of the authors was the design of a Common Data Model (CDM). Against the background that successful information systems are "sociotechnical systems" between human actors and information technology, mutually driving each other but likewise depending on the input of the respective opposite side, we have strong indications that in complex decision situations human cooperation deficiencies substantially outweigh the expectable exponential advancements of information technology. The reason is presumably that among diverse and self-confident nations, and in fact persons (likewise in important sub-national groups of responsibility, e.g., communal authorities or firms), reaching an agreement on data and other standards is an overly lengthy process that often ends with unsatisfactory compromises. We understand our contribution as bundling substantial indications toward a possible enhancement of the state of the art; fellow researchers should, however, thoroughly investigate the approach.
Conference Papers and Proceedings
This paper reports results from the design phase of EurHisFirm. Its goal is to integrate isolated and poorly accessible financial data sets on 19th- and 20th-century European companies so that users can query the data as if they resided in one large database. In addition, the project aims to stimulate database construction by providing not only methodology and tools to connect to and collaborate with existing databases, but also a collaborative platform, based on machine learning and artificial intelligence, that allows harvesting data in a semi-automatic way. We present the proof-of-concept results of this platform in addition to the performance of matching algorithms, which are necessary to connect and collate the different constituent databases as well as to link them to contemporary commercial databases.
Broad, long-term financial and economic datasets are a scarce resource, particularly in the European context. In this paper, we present an approach for an extensible data model that is adaptable to future changes in technologies and sources. This model may constitute a basis for digitised and structured long-term historical datasets. The data model covers the specific peculiarities of historical financial and economic data and is flexible enough to accommodate data of different types (quantitative as well as qualitative) from different historical sources, hence achieving extensibility. Furthermore, based on historical German firm and stock market data, we discuss a relational implementation of this approach.
The fourth report of Work Package 5 provides the latest revisions of the Common Data Model standard specifications. The different foundational elements of the Common Data Model are presented and explained. The report also summarises the results of stakeholder feedback and describes their implications for the Common Data Model. Finally, we give an outlook on the further development of the Common Data Model and its components.
The second report of Work Package 5 completes the discussion of the preliminary back-end design concepts of the common data model. The approach of the report is characterized by the principle of least intrusiveness. The proposed solutions respect national idiosyncrasies and allow national centers to advance in a collaborative but independent manner. The report starts by reviewing the data formats of the countries of the consortium. It draws from identification theory and proposes appropriate principles and requirements for the common model's identification design. It examines the functional and informational requirements for identifying various data items and for linking historical data from within the consortium to external databases with contemporary data. It outlines that the common model's implementation mostly benefits from employing both relational and non-relational technologies to address different issues. It highlights appropriate, subsequent steps for cross-country harmonization, firm-linking, data transformation processes, and data governance.
The report reviews a selection of existing micro-level data-model implementations both from within and outside the consortium's countries and identifies best design practices. It proposes preliminary model concepts for EURHISFIRM's metadata scheme and evaluation criteria for assessing the effectiveness of historical, cross-country, company-level data models. Since there is no precedent for designing such models, the report methodologically introduces a conceptual two-dimensional separation of the information space that EURHISFIRM's model aims to cover and reviews representative implementations from each subpart. The first dimension concerns the time domain. In this dimension, the reviewed models are classified as either contemporary or historical. The second dimension concerns the cross-country domain. Models here are classified as either national or international. The analysis constitutes one fundamental block upon which the process of synthesizing national models into a unified European common model builds.
My thesis discusses topics that range from the behavioral characteristics of dynamic decision-making to the efficiency of the collective forces in free markets. The purpose of the thesis is not to provide a universal link between these topics, and the contributions of each chapter focus on issues that are of particular interest on their own. Nevertheless, the empirical example of the second chapter demonstrates the importance of combining the economic thinking of the introspective and the detached. I hope that the reader will appreciate both the comprehensive approach of the thesis and the detail-oriented analysis of the topics of each chapter.