How Well Do Automated Methods Perform in Historical Samples? Evidence from New Ground Truth -- by Martha Bailey, Connor Cole, Morgan Henderson, Catherine Massey
New large-scale data linking projects are revolutionizing empirical social science. Outside of selected samples and tightly restricted data enclaves, little is known about the quality of these "big data" or how the methods used to create them shape inferences. This paper evaluates the performance of commonly used automated record-linking algorithms in three high-quality historical U.S. samples. Our findings show that (1) no method (including hand linking) consistently produces samples representative of the linkable population; (2) automated linking tends to produce very high rates of false matches, averaging around one third of links across datasets and methods; and (3) false links are systematically (though differently) related to baseline sample characteristics. A final exercise demonstrates the importance of these findings for inferences using linked data. For a common set of records, we show that algorithm assumptions can attenuate estimates of intergenerational income elasticities by almost 50 percent. Although differences in these findings across samples and methods caution against the generalizability of specific error rates, common patterns across multiple datasets offer broad lessons for improving current linking practice.
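The automated linking rules evaluated in this literature typically combine a name-similarity score with agreement on birth year, accepting only a unique best match. A minimal sketch of such a rule, using Python's standard-library string similarity in place of the Jaro-Winkler scores common in this work (the thresholds, field names, and uniqueness margin below are illustrative, not the paper's):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Crude name-similarity score in [0, 1] (stand-in for Jaro-Winkler)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link(record, candidates, name_thresh=0.85, year_band=2, margin=0.05):
    """Link `record` to at most one candidate: keep candidates within the
    birth-year band whose name score clears the threshold, then accept the
    best match only if no runner-up is within `margin` (to limit false links)."""
    matches = []
    for c in candidates:
        if abs(record["birth_year"] - c["birth_year"]) <= year_band:
            score = similarity(record["name"], c["name"])
            if score >= name_thresh:
                matches.append((score, c))
    if not matches:
        return None
    matches.sort(key=lambda t: t[0], reverse=True)
    if len(matches) > 1 and matches[0][0] - matches[1][0] < margin:
        return None  # ambiguous: two near-identical candidates
    return matches[0][1]
```

The uniqueness requirement is the kind of algorithm assumption the paper shows can matter: loosening it raises match rates but also the false-link rate.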
The Information Pharms Race and Competitive Dynamics of Precision Medicine: Insights from Game Theory -- by Ernst R. Berndt, Mark R. Trusheim
Precision medicines inherently fragment treatment populations into small-population markets, creating high-priced "niche busters" rather than broadly prescribed "blockbusters". Small markets will plausibly attract limited entry, leaving a small number of interdependent, differentiated-product oligopolists to compete, each possessing market power. Multiple precision medicine market situations now resemble game theory constructs such as the prisoners' dilemma and Bertrand competition. The examples often involve drug developer choices created by setting the cut-off value for the companion diagnostics that define the precision medicine market niches and their payoffs. Precision medicine game situations may also involve payers and patients who attempt to change the game to their advantage or whose induced behaviors alter the payoffs for the developers. The variety of games may predictably array themselves across the lifecycle of each precision medicine indication niche and so may become linked into a sequentially evolving meta-game. We hypothesize that certain precision medicine areas such as inflammatory diseases are becoming complex simultaneous multi-games in which distinct precision medicine niches compete. Those players that learn the most rapidly and apply those learnings the most asymmetrically will be advantaged in this ongoing information pharms race.
Orphan Drug Designations as Valuable Intangible Assets for IPO Investors in Pharma-Biotech Companies -- by Philippe Gorry, Diego Useche
Orphan Drug (OD) legislation has been implemented with regulatory and financial incentives to encourage drug innovation in order to treat rare diseases. This study aims to test whether OD Designations (ODDs) granted by the Food and Drug Administration (FDA) to pharmaceutical and biotechnology start-up companies may be considered relevant signals that attract entrepreneurial finance and increase the amount invested at the time of the Initial Public Offering (IPO) in the US stock markets. We find that the signaling power of ODDs is positive and statistically significant for IPO investors in stock markets. Regression results also suggest that ODDs are stronger than patent applications in attracting IPO investors. Scholarly and policy implications are discussed in light of signaling theory and drug development policies.
The 'China Shock', Exports and U.S. Employment: A Global Input-Output Analysis -- by Robert C. Feenstra, Akira Sasahara
We quantify the impact on U.S. employment from imports and exports during 1995-2011, using the World Input-Output Database. We find that the growth in U.S. exports led to increased demand for 2 million jobs in manufacturing, 0.5 million in resource industries, and a remarkable 4.1 million jobs in services, totaling 6.6 million. One-third of those service sector jobs are due to the intermediate demand from merchandise (manufacturing and resource) exports, so the total labor demand gain due to merchandise exports was 3.7 million jobs. In comparison, U.S. merchandise imports from China led to reduced demand of 1.4 million jobs in manufacturing and 0.6 million in services (with small losses in resource industries), for total job losses of 2.0 million. It follows that the expansion in U.S. merchandise exports to the world relative to imports from China over 1995-2011 created net demand for about 1.7 million jobs. Comparing the growth of U.S. merchandise exports to merchandise imports from all countries, we find a fall in net labor demand due to trade; but when we compare the growth of total U.S. exports to total imports from all countries, there is a rise in net labor demand because of the growth in service exports.
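The finding that merchandise exports create service-sector jobs reflects standard input-output accounting: labor requirements per unit of gross output are multiplied by the Leontief inverse and the vector of final (export) demand, so exports of manufactures pull in service inputs along the way. A toy two-sector sketch (all coefficients and demand figures below are invented for illustration, not from the World Input-Output Database):

```python
# A[i][j]: input from sector i required per dollar of sector j's gross output
# sector 0 = manufacturing, sector 1 = services (toy numbers)
A = [[0.20, 0.10],
     [0.30, 0.25]]
labor_per_output = [5.0, 8.0]    # jobs per $m of gross output, by sector
export_demand   = [100.0, 20.0]  # final export demand, $m

# Leontief inverse (I - A)^{-1}, computed directly for the 2x2 case
a, b = 1 - A[0][0], -A[0][1]
c, d = -A[1][0], 1 - A[1][1]
det = a * d - b * c
L = [[ d / det, -b / det],
     [-c / det,  a / det]]

# Gross output needed to meet export demand, then jobs by sector
output = [L[0][0] * export_demand[0] + L[0][1] * export_demand[1],
          L[1][0] * export_demand[0] + L[1][1] * export_demand[1]]
jobs = [labor_per_output[i] * output[i] for i in range(2)]
```

Even with modest direct service exports, service-sector jobs far exceed the direct labor content of service export demand, because manufacturing exports require service intermediates.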
Missing Growth from Creative Destruction -- by Philippe Aghion, Antonin Bergeaud, Timo Boppart, Peter J. Klenow, Huiyu Li
Statistical agencies typically impute inflation for disappearing products based on surviving products, which may result in overstated inflation and understated growth. Using U.S. Census data, we assess the magnitude of "missing growth" for private nonfarm businesses over 1983-2013 in two ways. The first approach exploits information on the market share of surviving plants. The second approach applies indirect inference to firm-level data. We find: (i) missing growth from imputation is substantial - at least 0.6 percentage points per year; and (ii) most of the missing growth is due to creative destruction (as opposed to new varieties).
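The market-share approach rests on a simple CES logic: if entering products were no better than incumbents, surviving plants would not lose market share, so the rate at which survivors' share erodes reveals quality-adjusted growth that imputation misses, scaled by 1/(sigma - 1). A stylized calculation with invented numbers (the elasticity and share decline below are placeholders, not the paper's estimates):

```python
import math

sigma = 4.0                 # illustrative elasticity of substitution (assumption)
share_t, share_t1 = 1.00, 0.97   # hypothetical surviving plants' market share, year t vs t+1

# Growth missed by imputation, per year, under the stylized CES formula
missing_growth = (1.0 / (sigma - 1.0)) * math.log(share_t / share_t1)
```

With these toy inputs the formula implies roughly a one-percentage-point understatement of annual growth; the paper's actual estimate for U.S. private nonfarm businesses is at least 0.6 percentage points per year.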
Bid Shading and Bidder Surplus in the U.S. Treasury Auction System -- by Ali Hortacsu, Jakub Kastl, Allen Zhang
We analyze bidding data from uniform price auctions of U.S. Treasury bills and notes between July 2009 and October 2013. Primary dealers consistently bid higher yields than direct and indirect bidders. We estimate a structural model of bidding that takes into account informational asymmetries introduced by the bidding system employed by the U.S. Treasury. While primary dealers' estimated willingness-to-pay is higher than direct and indirect bidders', their ability to bid-shade is even greater, leading to higher-yield/lower-price bids. Total bidder surplus averaged about 3 basis points across the sample period, with efficiency losses of around 2 basis points.
Pass-Through of Input Cost Shocks Under Imperfect Competition: Evidence from the U.S. Fracking Boom -- by Erich Muehlegger, Richard L. Sweeney
The advent of hydraulic fracturing led to a dramatic increase in US oil production. Due to regulatory, shipping, and processing constraints, this sudden surge in domestic drilling caused an unprecedented divergence in crude acquisition costs across US refineries. We take advantage of this exogenous shock to input costs to study the nature of competition and the incidence of cost changes in this important industry. We begin by estimating the extent to which US refining's divergence from global crude markets was passed on to consumers. Using rich microdata, we decompose the effects of firm-specific, market-specific, and industry-wide cost shocks on refined product prices. We show that this distinction is economically and econometrically important, and discuss the implications for prospective policy that would put a price on carbon emissions. We also discuss the implications of these results for perennial questions about competition in the refining industry.
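The basic pass-through object here is a slope: how much refined-product prices move per dollar of input cost change, with a coefficient below one consistent with imperfect competition absorbing part of the shock. A toy OLS slope with invented numbers (the data points and the 0.6 rate are illustrative, not estimates from the paper):

```python
# Hypothetical cost shocks and matching price responses (invented data)
cost_changes  = [1.0, 2.0, 3.0, 4.0]   # $/unit changes in crude acquisition cost
price_changes = [0.6, 1.2, 1.8, 2.4]   # $/unit changes in refined product price

# OLS slope = cov(cost, price) / var(cost)
n = len(cost_changes)
mean_c = sum(cost_changes) / n
mean_p = sum(price_changes) / n
cov = sum((c - mean_c) * (p - mean_p) for c, p in zip(cost_changes, price_changes))
var = sum((c - mean_c) ** 2 for c in cost_changes)
pass_through = cov / var   # 1.0 = full pass-through; < 1 suggests partial absorption
```

The paper's decomposition amounts to running this kind of regression separately for firm-specific, market-specific, and industry-wide components of the cost shock, which can yield very different pass-through rates.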
Characterizing the Drug Development Pipeline for Precision Medicines -- by Amitabh Chandra, Craig Garthwaite, Ariel Dora Stern
Precision medicines - therapies that rely on genetic, epigenetic, and protein biomarkers - create a better match between individuals with specific disease subtypes and medications that are more effective for those patients. These treatments are expected to be both more effective and more expensive than conventional therapies, implying that their introduction is likely to have a meaningful effect on health care spending patterns. In addition, precision medicines can change the expected profitability of therapies, both by allowing more sophisticated pricing systems and by potentially decreasing the costs of drug development through shorter and more focused trials. As a result, they could change the types of products that can be profitably brought to market. To better understand the landscape of precision medicines, we use a comprehensive database of over 130,000 global clinical trials conducted over the past two decades. We identify clinical trials for likely precision medicines (LPMs) as those that use one or more relevant biomarkers. We then further segment trials based on the nature of the biomarker(s) used and other trial features with economic implications. Given potential changes in the incentives for bringing products to market, we also examine the relative importance of public agencies such as the National Institutes of Health (NIH) and different types of private firms in developing precision medicines.
Recent decades have seen the emergence of global value chains (GVCs), in which production stages for individual goods are broken apart and scattered across countries. Stimulated by these developments, there has been rapid progress in data and methods for measuring GVC linkages. The macro-approach to measuring GVCs connects national input-output tables across borders using bilateral trade data to construct global input-output tables. These tables have been applied to measure trade in value added, the length of and location of producers in GVCs, and price linkages across countries. The micro-approach uses firm-level data to document firms' input sourcing decisions, how import and export participation are linked, and how multinational firms organize their production networks. In this review, I evaluate progress on these two tracks, highlighting points of contact between them and areas that demand further work. I argue that further convergence between them can strengthen both, yielding a more complete empirical portrait of GVCs.
Management Quality in Public Education: Superintendent Value-Added, Student Outcomes and Mechanisms -- by Victor Lavy, Adi Boiko
We present evidence about the ways that school superintendents add value in Israel's primary and middle schools. Superintendents are the CEOs of a cluster of schools, with powers to affect the quality of schooling, and we extend the approach used in the recent literature to measure teachers' value added to the assessment of school superintendents. We exploit a quasi-random matching of superintendents and schools, and estimate that superintendent value added has positive and significant effects on primary and middle school students' test scores in math, Hebrew, and English. A one standard deviation improvement in superintendent value added increases test scores by about 0.04 of a standard deviation in the test score distribution. The effect does not vary with students' socio-economic background, is highly non-linear, rising sharply for superintendents in the highest quartile of the value-added distribution, and is larger for female superintendents. We explore several mechanisms for these effects and find that superintendents with higher value added are associated with more focused school priorities and more clearly defined working procedures, but with no effect on school resources and no effect on teachers' total on-the-job and external training, although there is a significant effect on the composition of the former. Another important effect is that schools with higher quality superintendents are more likely to address school climate, violence, and bullying, and to implement related interventions, which lead to lower violence in school. A new superintendent is also associated with a higher likelihood that the school principal is replaced.
Firm-Level Political Risk: Measurement and Effects -- by Tarek A. Hassan, Stephan Hollander, Laurence van Lent, Ahmed Tahoun
We adapt simple tools from computational linguistics to construct a new measure of political risk faced by individual US firms: the share of their quarterly earnings conference calls that they devote to political risks. We validate our measure by showing that it correctly identifies calls containing extensive conversations on risks that are political in nature, that it varies intuitively over time and across sectors, and that it correlates with the firm's actions and stock market volatility in a manner that is highly indicative of political risk. Firms exposed to political risk retrench hiring and investment and actively lobby and donate to politicians. Interestingly, we find that the incidence of political risk across firms is far more heterogeneous and volatile than previously thought. The vast majority of the variation in our measure is at the firm level rather than at the aggregate or sector level: it is captured neither by time fixed effects and the interaction of sector and time fixed effects, nor by heterogeneous exposure of individual firms to aggregate political risk. The dispersion of this firm-level political risk increases significantly at times of high aggregate political risk. Decomposing our measure of political risk by topic, we find that firms that devote more time to discussing risks associated with a given political topic tend to increase lobbying on that topic, but not on other topics, in the following quarter.
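The measure is, at heart, a count: the share of a transcript's word pairs that belong to a political vocabulary and appear near a synonym for risk. A toy sketch of that computation (the word lists and window size below are placeholders; the paper builds its political bigrams from large training libraries rather than a hand-picked set):

```python
# Hand-picked stand-ins for the paper's trained vocabularies (illustrative only)
RISK_WORDS = {"risk", "risks", "uncertainty", "uncertain", "exposure"}
POLITICAL_BIGRAMS = {("trade", "policy"), ("government", "regulation"),
                     ("the", "administration")}

def political_risk_share(transcript, window=10):
    """Share of a transcript's bigrams that are 'political' and fall within
    `window` words of a risk synonym."""
    words = transcript.lower().split()
    bigrams = list(zip(words, words[1:]))
    risk_positions = [i for i, w in enumerate(words) if w in RISK_WORDS]
    hits = 0
    for i, bigram in enumerate(bigrams):
        if bigram in POLITICAL_BIGRAMS and any(
                abs(i - r) <= window for r in risk_positions):
            hits += 1
    return hits / len(bigrams) if bigrams else 0.0
```

Normalizing by total bigrams is what makes the measure comparable across calls of different lengths, and decomposing the political vocabulary by topic yields the topic-level results described above.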
Probabilistic States versus Multiple Certainties: The Obstacle of Uncertainty in Contingent Reasoning -- by Alejandro Martinez-Marquina, Muriel Niederle, Emanuel Vespa
We propose a new hypothesis, the Power of Certainty, to help explain agents' difficulties in making choices when there are multiple possible payoff-relevant states. In the probabilistic 'Acquiring-a-Company' problem, an agent submits a price to a firm before knowing whether the firm is of low or high value. We construct a matched deterministic problem with one low-value and one high-value firm, where the agent submits a price that is sent to each firm separately. Subjects are much more likely to use dominant strategies in deterministic than in probabilistic problems, even though the computations for profit maximization are identical for risk-neutral agents.
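The risk-neutral computation the abstract refers to is a short expected-value calculation: an offer is accepted only when it at least matches the firm's value, and profit is the buyer's markup over that value minus the price paid. A stylized two-state version (the values, equal probabilities, and 1.5x buyer markup below are illustrative parameters, not those of the experiment):

```python
def expected_profit(price, low=20.0, high=80.0, markup=1.5):
    """Two-state Acquiring-a-Company sketch: the firm is worth `low` or `high`
    to its owner with equal probability, and markup * value to the buyer; an
    offer is accepted iff price >= the firm's value."""
    profit = 0.0
    for value in (low, high):
        if price >= value:
            profit += 0.5 * (markup * value - price)
    return profit
```

With these numbers, bidding the low value earns a positive expected profit, while bidding high enough to win the high-value firm loses money on average because of the overpayment in the low-value state - the adverse-selection logic that subjects find much easier to see in the deterministic version.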