Noah Smith comments on the issue of mathiness in economics raised recently by Prof Paul Romer. He says the problem is not that economics uses math, but the way the subject uses it, which often goes beyond what the underlying logic can support:
A lot of people complain about the math in economics. Economists tend to quietly dismiss such complaints as the sour-grapes protests of literary types who lack the talent or training to hack their way through systems of equations. But it isn’t just the mathematically illiterate who grouse. New York University economist Paul Romer — hardly a lightweight when it comes to equations — recently complained about how economists use math as a tool of rhetoric instead of a tool to understand the world.
Personally, I think that what’s odd about econ isn’t that it uses lots of math — it’s the way it uses math. In most applied math disciplines — computational biology, fluid dynamics, quantitative finance — mathematical theories are always tied to the evidence. If a theory hasn’t been tested, it’s treated as pure conjecture.
Not so in econ. Traditionally, economists have put the facts in a subordinate role and theory in the driver’s seat. Plausible-sounding theories are believed to be true unless proven false, while empirical facts are often dismissed if they don’t make sense in the context of leading theories. This isn’t a problem with math — it was just as true back when economics theories were written out in long literary volumes. Econ developed as a form of philosophy and then added math later, becoming basically a form of mathematical philosophy.
In other words, econ is now a rogue branch of applied math. Developed without access to good data, it evolved different scientific values and conventions. But this is changing fast, as information technology and the computer revolution have furnished economists with mountains of data. As a result, empirical analysis is coming to dominate econ.
He is kind of right. To give an analogy, economics has become like really bulky Microsoft software. Microsoft was the firm least prepared for the internet and the other technological developments that suddenly swept the world. It had two choices: start afresh, which meant losing ground to the competition, or keep adding to and readjusting the existing software so that the pieces appeared to fit together, buying time in the hope of eventually coming up with something better. Microsoft chose the second option but never produced anything substantial, and it is now losing out to the competition.
Similarly, economics, in order to keep its imperialism intact, has borrowed extensively from physics and math. The original structure of economics was built on philosophy and history, where people argued via rhetoric. Now it is mostly about fancy math. Some amount of math could have been absorbed, but math has grown much larger than the original core, so much so that the historical and philosophical foundations are hardly even taught anymore. Were it not for libraries, it would be difficult even to find the original texts; the classics are rarely recommended, and one is just lucky to stumble upon them. The recent crisis has drawn some attention back to these origins, but it is still just a drop in the ocean.
So, what next? Just as it happened to Microsoft, the same could happen to economics. It could be overtaken by competition from data scientists and modelers, becoming a victim of its own perceived success and imperialism. Smith discusses the work of Susan Athey and Hal Varian, who are challenging the way we look at economics.
One sign of this is the sudden burst of interest in machine learning in the economics field. Machine learning is a broad term for a collection of statistical data analysis techniques that identify key features of the data without committing to a theory. To use an old adage, machine learning “lets the data speak.” In the age of Big Data, machine learning is a hot field in the technology business, and is a key tool of the rapidly expanding field of data science. Now, econ is catching the bug.
Two economists who have been pushing for the adoption of machine learning techniques in economics are Susan Athey and Guido Imbens of Stanford University. The two economists explained machine learning techniques to an interested crowd at a recent meeting of the National Bureau of Economic Research. Their overview stated that machine learning techniques emphasized causality less than traditional economic statistical techniques, or what’s usually known as econometrics. In other words, machine learning is more about forecasting than about understanding the effects of policy.
That would make the techniques less interesting to many economists, who are usually more concerned about giving policy recommendations than in making forecasts. But Athey and Imbens have also studied how machine learning techniques can be used to isolate causal effects, which would allow economists to draw policy implications.
Basically, Athey and Imbens look at the problem of how to identify treatment effects. A treatment effect is the difference between what would happen if you administer some “treatment” — say, raising the minimum wage — and what would happen without the treatment. This can be very complicated, because there are lots of other factors that affect the outcome, besides just the treatment. It is also complicated by the fact that the treatment may work differently on different people at different times and places. A final problem is that the data economists have to answer the question is usually very limited — a big impediment for traditional econometrics, which generally assumes that the amount of data is comfortably large. Athey and Imbens deal with these issues by importing a method from data science, called a regression tree. Statistically literate readers can peruse their slides here.
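To make the idea concrete, here is a minimal sketch of a tree-style treatment-effect estimate, assuming a randomized binary treatment and a single covariate that moderates the effect. The data, the covariate, the split point, and the one-split "tree" are all invented for illustration; this is not Athey and Imbens' actual causal-tree estimator, only the intuition behind it: partition the sample on covariates, then compare treated and control outcomes within each leaf.

```python
# Toy sketch: a one-split "regression tree" for heterogeneous
# treatment effects. Simulated data only; the split point 0.5 and
# the covariate x are assumptions for illustration.
import random

random.seed(0)

def simulate(n=2000):
    rows = []
    for _ in range(n):
        x = random.random()        # covariate in [0, 1]
        t = random.randint(0, 1)   # randomized binary treatment
        # true effect: +2 when x < 0.5, zero otherwise, plus noise
        effect = 2.0 if x < 0.5 else 0.0
        y = 1.0 + effect * t + random.gauss(0, 0.5)
        rows.append((x, t, y))
    return rows

def leaf_effect(rows):
    """Difference in mean outcome between treated and control in a leaf."""
    treated = [y for _, t, y in rows if t == 1]
    control = [y for _, t, y in rows if t == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)

data = simulate()
left = [r for r in data if r[0] < 0.5]    # split the sample on x
right = [r for r in data if r[0] >= 0.5]
print(round(leaf_effect(left), 2), round(leaf_effect(right), 2))
```

Within each leaf the estimate recovers the leaf's true effect (roughly 2 on the left, roughly 0 on the right); a real causal tree would also choose the split data-adaptively and use honest sample splitting, which this sketch omits.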
Another economist who has looked at the potential of machine learning is Hal Varian, a highly successful former professor who now serves as the chief economist at Google. In a 2013 paper, Varian discussed how new machine learning techniques developed by data scientists can help economists improve their understanding of reality. For example, he discusses how machine learning can help choose between different models (something economists often ignore), cope with uncertainty about which model is correct and avoid overfitting (overly complex explanations that can’t predict anything). In a set of slides released in early 2014, Varian tied machine learning techniques to the recent rise of quasi-experimental methods in economics. This represents a fusion between traditional econometrics and new data science techniques.
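The overfitting point can be shown with a small sketch. Here a 1-nearest-neighbour predictor memorizes simulated training data perfectly (zero training error) yet predicts held-out data worse than a smoother 25-neighbour average, which is why data scientists select models on validation error rather than fit. The data and the choice of k are assumptions for illustration, not examples from Varian's paper.

```python
# Toy sketch of overfitting: compare training vs held-out error
# for k-nearest-neighbour predictors on simulated linear data.
import random

random.seed(1)

def make_data(n):
    """Points with a true linear signal y = 3x plus Gaussian noise."""
    return [(x, 3 * x + random.gauss(0, 0.5))
            for x in (random.uniform(0, 1) for _ in range(n))]

def knn_predict(train, x, k):
    """Average y over the k training points nearest to x."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return sum(y for _, y in nearest) / k

def mse(train, test, k):
    return sum((y - knn_predict(train, x, k)) ** 2
               for x, y in test) / len(test)

train, valid = make_data(200), make_data(200)
print("k=1  train MSE:", round(mse(train, train, 1), 3))  # memorized
print("k=1  valid MSE:", round(mse(train, valid, 1), 3))
print("k=25 valid MSE:", round(mse(train, valid, 25), 3))
```

The k=1 model fits the training sample exactly but its held-out error is roughly double that of the smoother model, the classic signature of an overly complex explanation that can't predict anything new.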
Varian, Athey and Imbens are not the only examples of this mini-trend. Data science blogger Kenneth Sanford has a few more.
So is economics going to become another branch of applied math? Will econometrics and data science merge? Berkeley economist Brad DeLong thinks so. “The work [of economics] will be done,” he writes, “by data scientists, computer modelers, and historians of various stripes.” That is almost certainly too extreme a prediction. But the interest in machine learning is just one more sign that economics may be starting to shed its peculiar fixation on theory and join its cousins in the data-driven future.
And guess what? Historians might still remain in the game. It could be back to basics.