Monday, September 19, 2016

Of phlogiston and frameworks

It's always easier to criticize than to praise, so I first posted about the problems I had with Paul Romer's critique of macroeconomics. My criticism so far has been twofold:

  1. The string theory analogy isn't appropriate. String theory is a framework that is a natural extension of the wildly successful quantum field theory framework. The DSGE framework is not based on empirical success, but rather on the wildly inaccurate rational utility maximizing agent framework. This means even the sociological implications of the analogy aren't appropriate. Specifically, citing Smolin's argument that string theorists should abandon their (single-parameter) thus-far-fruitless approach with no experimental validation and divert resources to Smolin's own (single-parameter) thus-far-fruitless approach with no experimental validation (loop quantum gravity) as a parable of DSGE macro makes little sense. DSGE hasn't just been fruitless; it is either empirically wrong or suffers from identification problems (too many parameters to be wrong), and it is built of pieces that themselves aren't empirically accurate. Or more simply: the lack of empirical grounding of string theory is different from that of DSGE macro. Neither is empirically successful, but string theory is built from empirically successful pieces and just applies to regimes in physics (the Planck scale) where we can't measure data. [1], [2], [3]
  2. Romer claims macro practitioners Lucas, Sargent, and Prescott aren't being scientific because of what Romer sees as an obvious natural experiment. However, the obviousness of this natural experiment is itself model-dependent (I show two additional, but different, interpretations of the data); the scientific way to have dealt with this is to say that Lucas, Sargent, and Prescott should have been much more skeptical of their own claims because the data isn't terribly informative. [4]

This could be summarized as simply saying there is no framework for macroeconomics. Frameworks, like string theory and quantum field theory, do two things: they tell you how to start looking at a problem, and they represent a shorthand for capturing the empirical successes of the field. My first criticism above says that string theory is a framework, so you can't make an analogy with macro, which doesn't have a framework. My second criticism above says that until you have a working framework, you should be skeptical of any "natural experiments" because the interpretation of those natural experiments changes with the framework.

In spite of this, for the most part I liked Romer's take on macroeconomics and I think he delivered some powerful arguments.

The first one is that Romer says macroeconomists (or at least Lucas, Sargent, and Prescott) keep reinventing phlogiston -- imaginary fields or forces that produce effects. I am a bit less harsh on this particular point. Since macroeconomics is in a nascent state, it's going to invent lots of phlogiston before it hits on the right concepts. Energy and momentum in physics started off as phlogiston; they gradually became useful concepts over time.

However, as Romer points out, these concepts have become impervious to measurement, whether through theoretical evasion or simply ignoring the data. This is how phlogiston hypotheses turn into derpy phlogiston priors. Or as Romer puts it: post-real models.

Second, after illustrating that the number of parameters in macroeconomic models scales as m², where m is the number of equations, he goes on to tell us that expectations make the number of parameters scale as 2m²:
Adding Expectations Makes the Identification Problem Twice as Bad ... So allowing for the possibility that expectations influence behavior makes the identification problem at least twice as bad. This may be part of what Sims (1980) had in mind when he wrote, "It is my view, however, that rational expectations is more deeply subversive of identification than has yet been recognized."
My own view is that expectations create a far more serious problem than too many parameters. However, Romer is illustrating a general principle that also shows up in physics. Those parameters take the form of an m × m matrix, and because there are no established theoretical economic principles to reduce this number, you have to deal with all of the parameters. In physics, you can have principles like rotational symmetry or Lorentz invariance that cut the number down. For example, $G_{\mu\nu} = 8 \pi \kappa T_{\mu\nu}$ without general covariance could have had 16 parameters (a 4 × 4 matrix) for even small perturbations around equilibrium; instead it has one.

Because you have so many parameters (2m²) and so few observables (m), many different sets of parameter values are consistent with a given set of observations. This is the identification problem in a nutshell. It's basically a dimensional argument: a mapping from a 2m²-dimensional space to an m-dimensional space is going to have large subsets of that 2m²-dimensional space mapping to the same point in the m-dimensional space.
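To make the counting concrete, here is a deliberately schematic linear model (my illustration, not Romer's exact specification) for an m-vector of observables $x_t$ that depends on both lagged values and expectations:

$$ x_t = A \, x_{t-1} + B \, E_t[x_{t+1}] + \epsilon_t $$

The matrices $A$ and $B$ are each m × m, so there are 2m² free coefficients, while each period of data delivers only m numbers. Without some principle forcing most of those entries to be zero (or related to each other), many different choices of $(A, B)$ will reproduce the same observations.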

In physics, as noted above, this problem is solved by saying those subsets are actually equivalence classes (established by e.g. a symmetry principle, gauge invariance, or general covariance -- theoretical frameworks). Economics has no such theoretical principles (yet), so per Romer it ends up relying on FWUTVs (which make me think of ROUSes): facts with unknown truth value. These take several forms in Romer's paper: assumption, deduction (from assumptions), and obfuscation.

*  *  *

I think Romer's criticisms are serious, but as Cameron Murray said on Twitter: "It's not a valid argument in economics [until] a high priest says it." That is to say, these criticisms have existed for a long time. The real question is: how will economics deal with them?

My own approach is the information equilibrium framework.

In the same way string theory is based on the successful framework of quantum field theory, information equilibrium is based on the successful framework of information theory. It encodes the "empirical success" of supply and demand, promotes the idea that relative prices are all that matters (the scale invariance of economics) to a general symmetry principle, and is just a minor generalization (based on that symmetry principle) of an equation that appears in Irving Fisher's doctoral thesis.
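For readers who haven't seen it before, here is a quick sketch of that central condition in the notation I usually use: two process variables A and B are in information equilibrium when

$$ \frac{dA}{dB} = k \, \frac{A}{B} $$

where k is the information transfer index. In general equilibrium this integrates to $A \sim B^{k}$, and rescaling A and B by constants leaves the condition unchanged, which is the scale invariance (only relative prices matter) mentioned above. The k = 1 case is, roughly, the equation that appears in Fisher's thesis; the generalization is allowing k ≠ 1.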

Information equilibrium is a kind of gauge invariance relating several different parameterizations of the same model to each other. For example, any system of information equilibrium relationships that expresses a relationship between a set of observables {X, Y, Z, ...} and some phlogiston variable U can be rewritten without the U (see the sketch below). Originally, that U represented utility, but it can represent anything in Romer's menagerie of phlogiston. (One way to think about this is that information equilibrium builds up models out of pairwise relationships between observables, or Cobb-Douglas functions of observables, limiting the possible relationships in that m × m matrix.)
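Here is the mechanism in the simplest two-observable case (a minimal sketch; real models chain more relationships together). If X and Y are each in information equilibrium with U, with indices $k_X$ and $k_Y$, the general equilibrium solutions are

$$ X \sim U^{k_X}, \qquad Y \sim U^{k_Y} $$

so eliminating U gives

$$ X \sim Y^{k_X/k_Y} $$

which is just an information equilibrium relationship between X and Y with index $k_X/k_Y$. The phlogiston variable has dropped out, and nothing observable depends on it.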

It's not necessarily completely at odds with existing economic theory either. For example, I was able to build a simple New Keynesian DSGE model out of information equilibrium relationships. Interestingly it has fewer parameters and a couple of its components are not empirically accurate. Lots of other pieces of existing economic theory also fit in the information equilibrium framework.

There is still a kind of phlogiston that exists in the information transfer framework, but it's good phlogiston. Let me explain ...

When information equilibrium relationships fail to describe the data, it could be that information is failing to be transferred completely from source to destination (non-ideal information transfer) -- i.e. there is information loss. This lost information is very much like phlogiston. It is unknown and it explains deviations from observations. However, three principles make it a good kind of phlogiston:

  • Deviations are always in one direction. That is to say, an information equilibrium model is a bound on the related information transfer model (see the sketch after this list).
  • Systems that are frequently far from information equilibrium will not in general be good fits to information equilibrium relationships. That is to say, unless the system is close to information equilibrium, you probably wouldn't posit the relationship in the first place. A "mostly phlogiston" (mostly lost information) relationship would be contentless.
  • Information loss is always due to correlations among agents. This makes the study of information loss a subject of sociology, not economics.
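To make the first bullet concrete, the bound takes its simplest form in the same notation as the sketch above: with non-ideal information transfer, the price (detector) $p$ falls short of its information equilibrium value,

$$ p \equiv \frac{dA}{dB} \leq k \, \frac{A}{B} $$

so observed deviations from the information equilibrium solution are always on one side of it.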

Phlogiston can be fine -- as long as there's a framework.

2 comments:

  1. I'd love to see a very creative / clever person put together an analogy of the state of economics. I'm imagining something like that story about the blind men trying to determine the nature of an elephant, each coming to a very different conclusion because each is near a different part of the animal. But it'd be fun to match up the different blind men with different "schools."

    1. That was literally the story with string theory; Witten figured out the various string theories (I, IIA, IIB, etc.) were perturbative expansions of what he called M-theory.

      However I think there are multiple forces that shape economic theory -- science, (lack of) data, physics methods, morality, and lots of politics.

      I'm not sure economics as we understand it is really a single animal. Some people are feeling something that really is a different thing -- they just think it's connected to economics. I think financial crises have more to do with sociology and neuroscience than the "economics" of asset prices, supply, and demand.

