Evan Soltas
Mar 5, 2012

Potential Insights

Towards a new theory of recessions?

It's been almost a month since I posted about the "paradox of productivity growth," which theorizes that coordinated shifts in the desire of employers to increase productivity can actually reduce it in the short run and even its growth rate in the longer run. It's not that I haven't been thinking about it; I didn't know quite where to go from there.

Now I think the picture of where this fits into a broader theoretical structure is becoming clearer for me. I was never particularly satisfied with my explanation of the longer-run consequences, and what I'm about to discuss develops that in greater depth.

In short, I had argued that the standard measure of productivity could be misleading in that it only measures the productivity of the employed, not productivity at full employment. This could introduce spurious procyclical patterns into productivity growth rates, while hysteresis driven by skill loss leads to a secular decline in productivity growth as the duration of unemployment rises.
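The composition effect behind "productivity of the employed" can be seen with a toy calculation. The numbers below are entirely hypothetical, chosen only to show the mechanism: if the less productive group bears the layoffs, measured output per employed worker rises even though no individual's productivity changed.

```python
# Hypothetical two-group workforce. Measured productivity is output per
# *employed* worker, so it depends on who happens to be employed, not on
# productivity at full employment.
high = {"workers": 60, "output_per_worker": 100.0}
low = {"workers": 40, "output_per_worker": 50.0}

def measured_productivity(groups):
    """Average output per currently employed worker across groups."""
    total_output = sum(g["workers"] * g["output_per_worker"] for g in groups)
    total_workers = sum(g["workers"] for g in groups)
    return total_output / total_workers

# Before the recession: everyone employed.
before = measured_productivity([high, low])

# Recession lays off half of the low-productivity group; nobody's
# individual productivity changes.
low_after = {"workers": 20, "output_per_worker": 50.0}
after = measured_productivity([high, low_after])

print(before, after)
```

Measured productivity jumps purely because the denominator lost its least productive members — the procyclical pattern in the statistic, with nothing real behind it.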

Since then, I've noticed that business cycles with the largest increases in the mean duration of unemployment also tend to see the largest decreases in real potential output growth. A brief survey: the 1950s see rising unemployment duration with each business cycle, and the productivity growth rate falls; the 1960s see an unprecedented reduction in unemployment duration, and the productivity growth rate soars; the 1970s see a reversal of this trend, culminating in the 1980 recession, with the productivity growth rate dipping lower each time; the 1980s, 1990s, and early 2000s see consistent cyclical behavior in unemployment duration and productivity growth (except for the late-'90s tech gains); and then we have the latest recession.

Yet the fact that we're expecting the lowest real potential output growth in the postwar era continued to gnaw at me. One is taught in basic macroeconomics that if fiscal or monetary policy does not close the recessionary gap, then in the long run, the economy will return to full employment of its resources -- i.e. to potential -- by a self-correcting mechanism in which aggregate supply increases as input prices, particularly wages, fall. Of course, this story has always been slow, given downward nominal wage rigidity, and wrenching in terms of its social costs. Nevertheless, I suppose that consensus (i.e. neoclassical synthesis) economics trusted it would happen eventually -- otherwise I don't see how the "returning to full employment" story works.

In the synthesis, though, I think economics lost a particularly important element of Keynesian thought in the General Theory, one which cast doubt not just on the strength and speed of this self-correcting mechanism, but on its existence per se -- and by extension, on the existence of a full-employment equilibrium and of a level of potential output which was "stable" in that only supply, and never demand, determined it:

[W]e oscillate, avoiding the gravest extremes of fluctuation in employment and in prices in both directions, round an intermediate position appreciably below full employment...But we must not conclude that the mean position thus determined by 'natural' tendencies, namely, by those tendencies which are likely to persist, failing measures expressly designed to correct them, is, therefore, established by laws of necessity. The unimpeded rule of the above conditions is a fact of observation concerning the world as it is or has been, and not a necessary principle which cannot be changed.
Thus mainstream economics chugged along, until we got to trying to determine what potential GDP was, so that we could calculate the output gaps needed for Keynesian policy. As I see it, there are two ways to do this: (1) assume a roughly constant percentage growth rate in real potential output, derived by modeling changes in component productivities, usually within a limited time frame [note: this is what the BEA and CBO do to derive the output gap]; or (2) use a more complicated statistical fit, like the Hodrick-Prescott filter, to remove cyclical changes in output and recover a secular output trend.

The problem with the first one, of course, is that it's a pretty tough judgment call to decide what is cyclical and what is structural change -- consider the fact that the Fed's range for what qualifies as "full employment" spans 0.8 percentage points from minimum to maximum. The second one, though, actually explains the dataset's lower real potential output growth: when an output gap is sustained over time, the HP filter eventually "writes off" some of the gap as a secular change to lower productivity growth.

In effect, we've let our models do the theorizing, and what they want to generate is a hysteresis story. The self-correcting mechanism might be a little more grim than falling wages and increasing aggregate supply: after long enough, potential output might merely fall. (See graph above.) I see this as most likely to occur where aggregate supply cannot correct, or where the required drop in wages is so large that a high duration of unemployment causes potential output to fall.

Why can we say potential would fall? Because one can imagine a recession where there is little downward pressure on wages: a recession where the relevant pool of labor has not increased -- a productivity-shock recession, where unemployed workers do not have, or are losing, employable skills. In this sort of recession, there ought to be sharp sectoral differences in the slack supply of labor.

It is sort of as if the "real business cycle" theorists had it backwards. A productivity-shock recession isn't caused by a sharp decrease in aggregate supply; instead, a sharp increase in supply, or rather, a coordinated attempt to increase supply, can cause a severe recession with lasting consequences.

Now I am getting the feeling that the theory is coming into its own: a productivity-shock recession is particularly dangerous in that it is especially likely to cause hysteresis and diminish potential growth. The paradox is making more sense to me now.

Update (3/17): Greg Ip of The Economist says that he's beginning to worry about the possibility that the level of potential output fell during the recession -- as a result of the recession, due to hysteresis effects. Welcome to the club.