He was recently interviewed by the Minneapolis Fed. Here he is on the proper use of modeling:
"I’m about to teach a course in which I will, in the introduction, talk briefly about this methodological issue. But I still need to teach the basic models. That won’t change. In fact, I think it is very important to clarify that I am not antimodel. On the contrary, the economy is so complex that there is little hope of understanding much without models. I just don’t want these models to acquire a life that is independent from the purpose they are ultimately designed to serve, which is to understand the functioning of real economies.
The critique part of the paper you refer to argued that the current core of macroeconomics has become so mesmerized with its own internal logic that it begins to confuse the precision it has achieved about its own world with the precision it has about the real one.
There is absolutely nothing wrong with building stylized structures as just one more tool to understand a piece of the complex problem. My problems with this start when these structures take on a life on their own, and researchers choose to “take the model seriously”—a statement that signals the time to leave a seminar, for it is always followed by a sequence of naïve and surreal claims."
A lot of people criticize economic models for being an imperfect representation of the economy - for not being "true" - and they act condescendingly toward the simpletons who would employ models. This is a fundamental misunderstanding of why we model, which I think Caballero expresses brilliantly here. We don't model to reproduce reality - that is impossible. Reality is too complex. We model to get a sense of the dynamics of specific processes or mechanisms that we think are important. You can't just talk your way through the implications of most of these processes - they're too complex. Math helps you coordinate a lot of moving parts that prose could never coordinate. But you always have to keep in mind that you're just trying to understand the processes at work among multiple interacting variables, and the model is a tool for doing that. You're not reproducing reality. Science isn't about uncovering "truth" with a capital T - it's about uncovering useful knowledge.
He goes through uncertainty issues too (he calls it "Knightian uncertainty"). He provides an interesting discussion of complexity and presents a good case for why complexity makes inference from microfoundations to macrodynamics hard:
"The economy is an incredibly complex object—and I mean “complex” in the sense of very hard to understand. This complexity is not something we can just get rid of in the process of writing simple models, for it is central to economic behavior during crises. My work with Alp is an attempt to capture a small part of this complexity problem and its role during financial crises.
The basic idea is that the economy is a very complicated network of connections, but most of the time economic agents can go about their daily activities without worrying about those complications. To succeed, you—or financial institutions in our model—just need to be good at understanding your local environment.
However, as crises cross a certain threshold, all of a sudden it is no longer enough to understand the local environment. You begin to worry about indirect hits through the network. So what was a relatively simple optimization problem quickly becomes an immensely complex one. At that point, we have moved from a world of more or less well-defined risks to one of (Knightian) uncertainty and, as we discussed earlier, decision makers then become ultraconservative. And the most attractive individual decision is simply to withdraw."
He makes the skepticism about microfoundations more explicit here:
"The quantitative implications of this core approach, which are built on supposedly “micro-founded” calibrations of key parameters, are definitely on the surreal side. Take, for example, the preferred “micro-foundation” of the supply of capital in the workhorse models of the core approach. A key parameter to calibrate in these models is the intertemporal substitution elasticity of a representative agent, which is to be estimated from micro-data. A whole literature develops around this estimation, which narrows the parameter to certain values, which are then to be used and honored by anyone wanting to say something about “modern” macroeconomics.
This parameter may be a reasonable estimate for an individual agent facing a specific micro decision, but what does it have to do with the aggregate?"
Here I think he's being a little unfair to the discipline (although making an excellent point). If you know the literature on intertemporal substitution elasticities, then you know that this question about the difference between micro and macro elasticities is absolutely central to the discussion. The (modern) literature turns on comparing micro and macro estimates and explaining why macro elasticities are so much larger than micro elasticities. The same goes for the wage cyclicality literature (actually, the two literatures are related). Macroeconomists don't just blindly plug in micro-elasticities. They understand this issue. It would be nice if as a result they paid less attention to microfoundations; instead they've generally tried to bridge the gap. That's a good effort too, I suppose. They're not naive - and Caballero seems to have a healthy skepticism of "microfoundation" approaches, which I like.
The point is this - if you read some self-styled Hayekians (I have Russ Roberts in mind) you get a few takeaways from them:
- Microfoundations are essential - in fact, macroeconomics is nothing but microeconomics
- Modeling is highly misleading - prose is better
- Look at Caballero! Look! He's quoting Hayek and agrees with us!
I think it should be clear that (1.) appreciating Hayek does not require you to devolve into this Luddite approach to macroeconomic modeling, (2.) Caballero has a very pro-model view, but he views models much the way I've presented them on F&OST - as a way of understanding the market better, not reproducing "reality" - and (3.) he's skeptical of microfoundations, which is an excellent prejudice for a macroeconomist.
UPDATE: On Facebook, a former public policy professor of mine writes of this interview, "When I received my graduate training, macroeconomics was still part of the required content of the field of public finance. I like to quip that I know (or don't know) enough macro to be dangerous. This part of the interview with the current chair of the econ department at MIT makes some interesting points generally about the proper use of models in economic analysis and thinking. My own experience is that students -- esp. public policy students -- actually become more receptive to the use of models as analytic tools when models are presented as useful ways of tackling complex questions that provide insights, if not literal answers." [emphasis mine]