Unbundling Macroeconomics via Heterogeneous Agents and Input-Output Networks -- by David Rezza Baqaee, Emmanuel Farhi
The goal of this paper is to simultaneously unbundle two interacting reduced-form building blocks of traditional macroeconomic models: the representative agent and the aggregate production function. We introduce a broad class of disaggregated general equilibrium models with Heterogeneous Agents and Input-Output networks (HA-IO). We elucidate their properties through two sets of results describing the propagation and aggregation of shocks. First, we characterize how shocks affect prices and quantities of goods and factors. Even with purely microeconomic shocks, the mapping from structural primitives to observed effects is complicated by "local" general equilibrium forces. Our framework shows how to account for these forces, and helps interpret IV-based cross-sectional regression results. We also uncover a surprising property of a large class of efficient representative agent models: they feature symmetric propagation in that a shock to producer i affects the sales of producer j in exactly the same way that a shock to j affects the sales of i. This improbable symmetry breaks in the presence of heterogeneous agents or distortions. Second, we provide aggregation results characterizing the responses of industry-level variables such as markups and productivity. The behavior of these aggregates is particularly delicate in inefficient economies: they respond to microeconomic shocks outside of the industry; and they can give rise to fallacies of composition whereby aggregates move in the opposite direction of their microeconomic counterparts. Our results shed light on many seemingly disparate applied questions, such as: sectoral co-movement in business cycles; factor-biased technical change in task-based models; structural transformation; the effects of corporate taxation; and the dependence of fiscal multipliers on the composition of government spending.
The conduct of adjudication is often influenced by motions--requests made by litigants to modify the course of adjudication. The question studied in this article is why adjudication is designed so as to permit the use of motions. The answer developed is that litigants will naturally know a great deal about their specific matter, whereas a court will ordinarily know little except to the degree that the court has already invested effort to appreciate it. By giving litigants the right to bring motions, the judicial system leads litigants to efficiently provide information to courts that is relevant to the adjudicative process.
Forced Migration and Human Capital: Evidence from Post-WWII Population Transfers -- by Sascha O. Becker, Irena Grosfeld, Pauline Grosjean, Nico Voigtlaender, Ekaterina Zhuravskaya
We exploit a unique historical setting to study the long-run effects of forced migration on investment in education. After World War II, the Polish borders were redrawn, resulting in large-scale migration. Poles were forced to move from the Kresy territories in the East (taken over by the USSR) and were resettled mostly to the newly acquired Western Territories, from which Germans were expelled. We combine historical censuses with newly collected survey data to show that, while there were no pre-WWII differences in education, Poles with a family history of forced migration are significantly more educated today. Descendants of forced migrants have on average one extra year of schooling, driven by a higher propensity to finish secondary or higher education. This result holds when we restrict ancestral locations to a subsample around the former Kresy border and include fixed effects for the destination of migrants. As Kresy migrants were of the same ethnicity and religion as other Poles, we bypass confounding factors of other cases of forced migration. We show that labor market competition with natives and selection of migrants are also unlikely to drive our results. Survey evidence suggests that forced migration led to a shift in preferences, away from material possessions and towards investment in a mobile asset - human capital. The effects persist over three generations.
The federal government owns and administers 472,892,659 acres, or 21% of the land area of the lower 48 US states, making it both the country's largest landowner and among the largest central-government landholders among western democracies. This condition is surprising, given that the US generally is viewed as more oriented toward private property rights and markets. The land largely is managed by the US Forest Service and the Bureau of Land Management, staffed by unelected, career civil servants who hold tenure to their positions. Access and use regulations are administered by agency officials who have wide latitude under all-purpose legislation passed by Congress. Their actions are influenced by bureaucratic incentives and by lobby groups seeking to influence federal land policy. General citizens have little information about how policies are determined and only costly recourse to challenge them. Other than the comparatively small 27,400,000 acres in National Parks, most of the land has no important amenity values and no apparent major externalities associated with use. These lands were to be transferred to private claimants under 19th-century land laws. This paper examines how this vast area came to be withheld by the federal government and the role of the environmental movement in the process. Market failure and externalities were asserted justifications, but there is no strong supportive evidence. Although externalities were possible, the most obvious solution was to define property rights more completely. This option was and remains rejected by politicians, agency officials, and those lobby groups that sought permanent management and control over federal lands. Sustained yield was an overarching objective, but it is a biological and not an economic concept, and the human welfare costs of bureaucratic management may be large.
Government Guarantees and the Valuation of American Banks -- by Andrew G. Atkeson, Adrien d'Avernas, Andrea L. Eisfeldt, Pierre-Olivier Weill
Banks' ratio of the market value to book value of their equity was close to 1 until the 1990s, then more than doubled during the 1996-2007 period, and fell again to values close to 1 after the 2008 financial crisis. Sarin and Summers (2016) and Chousakos and Gorton (2017) argue that the drop in banks' market-to-book ratio since the crisis is due to a loss in bank franchise value or profitability. In this paper we argue that banks' market-to-book ratio is the sum of two components: franchise value and the value of government guarantees. We empirically decompose the ratio into these two components and find that a large portion of the variation in this ratio over time is due to changes in the value of government guarantees.
The IT Revolution and the Globalization of R&D -- by Lee G. Branstetter, Britta M. Glennon, J. Bradford Jensen
Since the 1990s, R&D has become less geographically concentrated, and has seen especially fast growth in emerging markets. One of the distinguishing features of the R&D globalization phenomenon is its concentration within the software/IT domain; the increase in foreign R&D has been largely concentrated within software- and IT-intensive multinationals, and new R&D destinations are also more software- and IT-intensive than traditional R&D destinations. In this paper we document three important phenomena: (1) the globalization of R&D, (2) the growing importance of software and IT to firm innovation, and (3) the rise of new R&D hubs. We argue that the shortage in software/IT-related human capital resulting from the large IT- and software-biased shift in innovation drove US MNCs abroad, and particularly drove them to "new hubs" with large quantities of STEM workers who possessed IT and software skills. Our findings support the view that the globalization of US multinational R&D has reinforced the technological leadership of US-based firms in the information technology domain and that multinationals' ability to access a global talent base could support a high rate of innovation even in the presence of the rising (human) resource cost of frontier R&D.
Termination Risk and Agency Problems: Evidence from the NBA -- by Alma Cohen, Nadav Levy, Roy Sasson
When organizational structures and contractual arrangements face agents with a significant risk of termination in the short term, such agents may under-invest in projects whose results would be realized only in the long term. We use NBA data to study how risk of termination in the short term affects the decision of coaches. Because letting a rookie play produces long-term benefits on which coaches with a shorter investment horizon might place lower weight, we hypothesize that higher termination risk might lead to lower rookie participation. Consistent with this hypothesis, we find that, during the period of the NBA's 1999 collective bargaining agreement (CBA) and controlling for the characteristics of rookies and their teams, higher termination risk was associated with lower rookie participation and that this association was driven by important games. We also find that the association does not exist for second-year players and that the identified association disappeared when the 2005 CBA gave team owners stronger incentives to monitor the performance of rookies and preclude their underuse.
In a multiperiod investment framework, firms with high expected growth earn higher expected returns than firms with low expected growth, holding investment and expected profitability constant. This paper forms cross-sectional growth forecasts, and constructs an expected growth factor that yields an average premium of 0.82% per month (t = 9.81). The q5-model, which augments the Hou-Xue-Zhang (2015) q-factor model with the new factor, shows strong explanatory power in the cross section, and outperforms other recently proposed factor models such as the Fama-French (2018) six-factor model.
Inequality, Business Cycles, and Monetary-Fiscal Policy -- by Anmol Bhandari, David Evans, Mikhail Golosov, Thomas J. Sargent
We study optimal monetary and fiscal policy in a model with heterogeneous agents, incomplete markets, and nominal rigidities. We develop numerical techniques to approximate Ramsey plans and apply them to a calibrated economy to compute optimal responses of nominal interest rates and labor tax rates to aggregate shocks. Responses differ qualitatively from those in a representative agent economy and are an order of magnitude larger. Taylor rules poorly approximate the Ramsey optimal nominal interest rate. Conventional price stabilization motives are swamped by an across-person insurance motive that arises from heterogeneity and incomplete markets.
Characterization, Existence, and Pareto Optimality in Insurance Markets with Asymmetric Information with Endogenous and Asymmetric Disclosures: Revisiting Rothschild-Stiglitz -- by Joseph E. Stiglitz, Jungyoll Yun, Andrew Kosenko
We study the Rothschild-Stiglitz model of competitive insurance markets with endogenous information disclosure by both firms and consumers. We show that an equilibrium always exists (even without the single-crossing property), and characterize the unique equilibrium allocation. With two types of consumers the outcome is particularly simple, consisting of a pooling allocation which maximizes the well-being of the low-risk individual (along the zero-profit pooling line) plus a supplemental (undisclosed and nonexclusive) contract that brings the high-risk individual to full insurance (at his own odds). We show that this outcome is extremely robust and Pareto efficient.
On a levelized-cost basis, solar and wind power generation are now competitive with fossil fuels. But supply of these renewable resources is variable and intermittent, unlike traditional power plants. As a result, the cost of using flat retail pricing instead of dynamic, marginal-cost pricing--long advocated by economists--will grow. We evaluate the potential gains from dynamic pricing in high-renewable systems using a novel model of power supply and demand in Hawai'i. The model breaks new ground in integrating investment in generation and storage capacity with chronological operation of the system, including an account of reserves, a demand system with different interhour elasticities for different uses, and substitution between power and other goods and services. The model is open source and fully adaptable to other settings. Consistent with earlier studies, we find that dynamic pricing provides little social benefit in fossil-fuel-dominated power systems, only 2.6 to 4.6 percent of baseline annual expenditure. But dynamic pricing leads to a much greater social benefit of 8.5 to 23.4 percent in a 100 percent renewable power system with otherwise similar assumptions. High-renewable systems, including 100 percent renewable ones, are remarkably affordable. The welfare-maximizing (unconstrained) generation portfolio under the utility's projected 2045 technology and pessimistic interhour demand flexibility uses 79 percent renewable energy, without even accounting for pollution externalities. If overall demand for electricity is more elastic than our baseline (0.1), renewable energy is even cheaper and variable pricing can improve welfare by as much as 47 percent of baseline expenditure.
Platforms, Promotion, and Product Discovery: Evidence from Spotify Playlists -- by Luis Aguiar, Joel Waldfogel
Digitization has vastly increased the amount of new music produced and available directly to consumers. While this has levelled the playing field between already-prominent and new artists, creators may now be dependent on platform decisions about which songs and artists to promote. With Spotify emerging as a major interactive streaming platform, this paper explores the effect of Spotify's playlists on both the promotion of songs and the discovery of music by new artists, using four approaches. First, we examine songs' streaming volumes before and after their addition to, and removal from, major global playlists. Second, we compare streaming volumes for songs just on, and just off, algorithmic top 50 playlists. Third, we make use of cross-country differences in inclusion on New Music Friday lists, using song fixed effects to explain differences in streaming. Fourth, we develop an instrumental variables approach to explaining cross-country New Music Friday rank differentials based on home bias. Being added to Today's Top Hits, a list with 18.5 million followers during the sample period, raises streams by almost 20 million and is worth between $116,000 and $163,000. Inclusion on New Music Friday lists substantially raises the probability of song success, including for new artists.