Noah Smith has an article on this transition. Big data followers have long suggested that soon all your micro/macro theories will be history: all people will look at is evidence. As more and more data become available, we will see trends in real time. If the data analysis matches the theory, the theory is lucky; if not, the theory does not matter. The broad idea is that earlier the onus was on data to match the theory, as the former was scarce. Now it will be the reverse: the theory has to match the data.
“It works in practice, but does it work in theory?” This joke is so commonly applied to economists that no one even knows who said it originally. The idea fits with the stereotype of economists as out-of-touch theory-obsessed philosophers, wasting time arguing about what would happen in a made-up world of their own devising, while all around them the real world unfolds in ways that are totally different.
Whether this stereotype was ever true, it certainly isn’t true today. Empirical papers are taking over from theoretical papers in the economic literature, and the methods used for untangling cause and effect are getting more and more scientific. The big driver is information technology, which has made the entire world into economists’ laboratory. So econ, in general, is becoming more like a science and less like philosophy.
Now, in macroeconomics — the study of business cycles and long-term growth — this shift has been a lot harder and slower. The reason is that in macro, there’s just much less data. History only happens once, and it’s hard to know to what degree the future will be like the past. To make matters worse, there’s just not that much data out there — we’ve been keeping good records for less than a century.
So where there’s less data, we would expect theory to expand to fill the gap. And in fact, macro is the main battleground for a perpetual argument about whether theory or evidence should be in the driver’s seat. Ed Prescott, a Nobel-winning economist, famously declared in 1986 that theory was “ahead” of measurement in business cycle theory. But since the financial crisis, there has been a clamor to elevate data to its rightful place.
Prof Smith also sums up some interesting research along the way. This one in particular:
Another example is the work of John Haltiwanger of the University of Maryland. Haltiwanger is a pioneer of the use of microeconomic data to yield macroeconomic insights. Haltiwanger is incredibly prolific. In just the past few years, he and a small army of co-authors have been digging through the data to find out which kinds of companies hire workers, why the U.S. business sector is less dynamic than in the past, whether private equity improves productivity and a number of other questions that tell us a huge amount about what is actually happening to our economy. Haltiwanger’s work blurs the traditional lines between “macroeconomics” and “microeconomics,” and thus brings data to bear that would have been ignored by traditional macroeconomic theorists focused on broad aggregates such as gross domestic product and unemployment.
Hmm. Macro post the 1970s has anyway focused on microfoundations, and this has led to lots of fights among economists. With more and more micro data helping draw macro trends, this debate could be buried as well.
In the end:
When you compare the data work of economists like these to the broad brushstrokes of macroeconomic theorists, it isn’t hard to tell where the real progress is being made. Like the rest of economics, macroeconomics is transitioning from a philosophical, theory-first discipline to a hard-nosed, practical data-first profession. This is a good thing.
Let’s see how this transition unfolds. Does it just become another hype/fad, or is it a permanent change?