Hard to believe it's been a year.
"Roberts: But I think as economists we should be careful about what the causal mechanism is. It matters a lot.
Piketty: Oh, yes, yes, yes. But this is why my book is long, because I talk a lot about this mechanism."
"It is difficult to overstate how uncontroversial it is in the field of labor market policy evaluation to assert the superiority of matching methods to the nonmatching approaches described above.9 The seminal evaluations of the effects of job training programs, work-sharing arrangements, employment tax credits, educational interventions, and housing vouchers all use at least some sort of matching method, if not an actual randomized experiment. In their widely cited survey article on non-experimental evaluation, Blundell and Costa Dias (2000) do not even mention state-level fixed-effects models when they list the five major categories of evaluation methods. In a similar article, Imbens and Wooldridge (2009) do mention fixed-effects models as a tool for policy evaluation, but clarify that these were used before more advanced methods were developed, noting that the modern use of fixed-effects models is typically in combination with other more sophisticated techniques. For example, Dube, Lester, and Reich (2010) also use a fixed-effects model, but more importantly it is a fixed-effects model that utilizes a rigorous matching strategy to identify the effect of the minimum wage. Sometimes fixed-effects models are the best available option if no natural experiment or other matching opportunity emerges to provide a more rigorous approach. Well-specified fixed-effects models can still be informative. But faced with the choice between a well-matched comparison group and a fixed-effects model, the former is unambiguously the stronger study design."
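The intuition behind the quoted passage can be illustrated with a small synthetic simulation (this is my own sketch, not anything from the paper): when a confounder drives both treatment assignment and the outcome, a naive comparison of treated and untreated units is badly biased, while matching each treated unit to an observably similar control recovers something close to the true effect. The variable names and data-generating process below are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
true_effect = 2.0  # the treatment effect we hope to recover

# A confounder that drives both treatment assignment and the outcome
x = rng.normal(0, 1, n)
treated = rng.random(n) < 1 / (1 + np.exp(-2 * x))  # higher x -> more likely treated
y = 3 * x + true_effect * treated + rng.normal(0, 1, n)

# Naive difference in means: contaminated by the confounder
naive = y[treated].mean() - y[~treated].mean()

# Nearest-neighbor matching on x: compare each treated unit
# only to its closest untreated counterpart
xc, yc = x[~treated], y[~treated]
matched = np.mean([y_t - yc[np.argmin(np.abs(xc - x_t))]
                   for x_t, y_t in zip(x[treated], y[treated])])

print(f"naive estimate:   {naive:.2f}")    # far from 2.0
print(f"matched estimate: {matched:.2f}")  # close to 2.0
```

Real applications use more careful designs (propensity scores, calipers, common-support checks), but even this toy version shows why a well-matched comparison group beats an unadjusted contrast when assignment is not random.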
Daniel Kuehn is a doctoral candidate and adjunct professor in the Economics Department at American University. He has a master's degree in public policy from George Washington University.