The President of the Minneapolis Fed, Narayana Kocherlakota, has a widely discussed paper on the evolution of modern macroeconomics and the major issues facing it. He concedes that macroeconomists let policymakers down twice in the last few months - first by not providing them with rules to avoid the circumstances that led to the global financial meltdown, and then by not having a systematic plan of attack to deal with the fast-evolving circumstances of a financial market meltdown and economic recession. His suggestion is therefore to rehabilitate the dominant DSGE models by adding more complexity and more realistic exogenous shocks.
The Lucas critique turned attention away from traditional models based on historical macroeconomic data and towards dynamic, micro-founded models (built from individual and firm preferences and objectives, technology, and resource constraints) that aggregate individual decisions (and dynamic expectations) to calculate the macroeconomic effects of a policy change. The stagflation of the seventies, and the failure of the historical-data-based Phillips curve tradeoff between unemployment and inflation, only strengthened the case against traditional macroeconomic models.
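For concreteness, that breakdown is easy to see in the textbook (Friedman-Phelps) rendering of the Phillips curve - standard notation, not the author's:

```latex
% The estimated historical tradeoff: lower unemployment buys higher inflation
\[ \pi_t = -\beta\,(u_t - u^{n}), \qquad \beta > 0 \]
% The expectations-augmented version: once expected inflation \pi^{e}_t
% rises, high inflation and high unemployment can coexist (stagflation)
\[ \pi_t = \pi^{e}_t - \beta\,(u_t - u^{n}) \]
```

Here \pi_t is inflation, u_t unemployment, and u^n the natural rate. The seventies played out as the second equation predicts once agents' expectations adjust - exactly the instability of historical correlations that the Lucas critique emphasized.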
This led to the displacement of the classical Keynesian model by the New Classical model. And when the New Classical model could explain neither the duration nor the magnitude of actual cycles, and its implication that only unanticipated money matters appeared to be contradicted by the data, it was in turn replaced by the New Keynesian model.
He argues that all the modern macroeconomic models developed since the eighties in response to the Lucas critique share five common features - they specify budget constraints for households, technologies for firms, and resource constraints for the overall economy; they specify household preferences and firm objectives; they assume forward-looking behavior for firms and households; they include the shocks that firms and households face; and they are models of the entire macroeconomy. Of the most popular DSGE models, which are micro-founded on the interaction of rational agents and emerge from a combination of these five features, he writes,
"Dynamic refers to the forward-looking behavior of households and firms. Stochastic refers to the inclusion of shocks. General refers to the inclusion of the entire economy. Finally, equilibrium refers to the inclusion of explicit constraints and objectives for the households and firms."
On the freshwater-saltwater divide that emerged as soon as the modern macroeconomic models started being used, he writes,
"(Freshwater economists)) brought a new methodology. But they also had a surprising substantive finding to offer. They argued that a large fraction of aggregate fluctuations could be understood as an efficient response to shocks that affected the entire economy. As such, most, if not all, government stabilization policy was inefficient... In the models of the freshwater camp, the benefits of the stimulus are outweighed by the costs of the taxes. The recession... is efficient...
Scholars in the opposing (“saltwater”) camp argued that in a large economy like the United States, it is implausible for the fluctuations in the efficient level of aggregate output to be as large as the fluctuations in the observed level of output. They pointed especially to downturns like the Great Depression as being obvious counter examples (to the 1970s oil crisis)."
And his assessment of the divide,
"The division was a consequence of the limited computing technologies and techniques that were available in the 1980s. To solve a generic macro model, a vast array of time- and state-dependent quantities and prices must be computed. These quantities and prices interact in potentially complex ways, and so the problem can be quite daunting. However, this complicated interaction simplifies greatly if the model is such that its implied quantities maximize a measure of social welfare.
Given the primitive state of computational tools, most researchers could only solve models of this kind. But—almost coincidentally—in these models, all government interventions (including all forms of stabilization policy) are undesirable. With the advent of better computers, better theory, and better programming, it is possible to solve a much wider class of modern macro models. As a result, the freshwater-saltwater divide has disappeared.
... On the one hand, the freshwater camp won in terms of its modeling methodology. Substantively, too, there is a general recognition that some nontrivial fraction of aggregate fluctuations is actually efficient in nature. On the other hand, the saltwater camp has also won, because it is generally agreed that some forms of stabilization policy are useful."
Kocherlakota attributes much of the progress in macro over the last quarter century to the adoption of "models that incorporate more realistic versions of the exchange process". Older macroeconomic models assumed that all mutually beneficial trades occur without delay (frictionless exchange), whereas the real world is characterized by infrequent price and wage adjustment. The newer models therefore incorporate the assumption that firms can adjust their prices and wages only infrequently (sticky prices).
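The workhorse formalization of sticky prices in this literature is Calvo pricing - sketched here in its textbook form, not necessarily the formulation Kocherlakota has in mind:

```latex
% Each period a firm may reset its price only with probability 1-\theta;
% aggregating the staggered price-setting gives inflation as a function
% of expected future inflation and real marginal cost \widehat{mc}_t:
\[ \pi_t = \beta\,\mathbb{E}_t \pi_{t+1}
   + \frac{(1-\theta)(1-\beta\theta)}{\theta}\,\widehat{mc}_t \]
% As \theta \to 0 (fully flexible prices) the slope explodes; as
% \theta \to 1 inflation barely responds to current conditions.
```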
Further, older macro models also assumed that firms and households can fully capitalize all future income through loan or bond markets, face no restrictions on the amounts they can borrow, and can buy insurance against all possible forms of risk. Real-world financial markets contain all of these frictions. The real world also has labor market frictions that require people to spend time finding jobs. Modern macroeconomic models that accommodate pricing, financial market, and labor market frictions, and the interactions between them, are fiendishly complicated to compute.
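In textbook form (again my notation), these last two frictions enter as constraints like the following:

```latex
% Financial friction: instead of borrowing freely against future income,
% households face a debt limit \bar{b} on their asset position a_t
\[ a_{t+1} \geq -\bar{b} \]
% Labor market friction: hires come from a matching function of
% unemployed workers u_t and vacancies v_t (Diamond-Mortensen-Pissarides),
% so finding a job takes time
\[ m_t = \mu\, u_t^{\alpha}\, v_t^{1-\alpha}, \qquad 0 < \alpha < 1 \]
```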
Despite all these changes, he still sees three specific weaknesses in modern macro models - weaknesses that were largely ignored during the "Great Moderation" of 1982–2007, and whose consequences were felt in the aftermath of the sub-prime meltdown,
"First, few, if any, models treat financial, pricing, and labor market frictions jointly. Second, even in macro models that contain financial market frictions, the treatment of banks and other financial institutions is quite crude. Finally, and most troubling, macro models are driven by patently unrealistic shocks."
And addressing these concerns would make modern macro models extremely complex,
"Within these models, the distribution of financial wealth evolves over time. Suppose,for example, that a worker loses his or her job. If the worker were fully insured against this outcome, the worker’s wealth would not be affected by this loss. However, in a model with only partial insurance, the worker will run down his or her savings to get through this unemployment spell.
The worker’s financial wealth will be lower as a result of being unemployed. In this fashion, workers with different histories of unemployment will have different financial wealth. Aggregate shocks (booms or busts) will influence the distribution of financial wealth. In turn, as the wealth distribution changes over time, it feeds back in complex ways into aggregate economic outcomes. From a policy perspective, these models lead to a new and better understanding of the costs of economic downturns."
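The mechanism in this passage can be illustrated with a toy simulation - a sketch of my own construction, not any model from the paper, with the consumption rule and all parameters (wages, benefits, separation and job-finding rates) chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 10_000, 300        # workers, periods
wage, benefit = 1.0, 0.4  # income when employed vs. unemployed (partial insurance)
c_target = 0.9            # consumption level workers try to smooth at
r = 0.005                 # interest earned on savings
bust = range(200, 220)    # an aggregate downturn: worse labor market for 20 periods

employed = np.ones(N, dtype=bool)
assets = np.zeros(N)

for t in range(T):
    # separation and job-finding probabilities worsen during the aggregate bust
    sep, find = (0.10, 0.20) if t in bust else (0.04, 0.45)
    u = rng.random(N)
    employed = np.where(employed, u > sep, u < find)
    cash = (1 + r) * assets + np.where(employed, wage, benefit)
    # workers smooth consumption at c_target when they can; the unemployed
    # run down savings until they hit the borrowing constraint (assets >= 0)
    consumption = np.minimum(cash, c_target)
    assets = cash - consumption

# different unemployment histories generate a whole distribution of wealth,
# and the aggregate shock reshapes it
print("wealth percentiles (10th/50th/90th):",
      np.percentile(assets, [10, 50, 90]).round(2))
```

Even this crude exercise produces the heterogeneity the passage describes: workers who draw longer unemployment spells end up with lower wealth, and the bust spreads the distribution out. The full models then let that distribution feed back into aggregate outcomes, which is a large part of what makes them hard to compute.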
He writes that it becomes very difficult to construct and compute macroeconomic models that combine financial, pricing, and labor market frictions and are exposed to unpredictable exogenous shocks,
"The models do not capture an intermediate messy reality in which market participants can trade multiple assets in a wide array of somewhat segmented markets. As a consequence, the models do not reveal much about the benefits of the massive amount of daily or quarterly re-allocations of wealth within financial markets...
The difficulty in macroeconomics is that virtually every variable is endogenous, but the macroeconomy has to be hit by some kind of exogenously specified shocks if the endogenous variables are to move...
Macroeconomists... are handicapping themselves by only looking at shocks to fundamentals like preferences and technology. Phenomena like credit market crunches or asset market bubbles rely on self-fulfilling beliefs about what others will do... Macroeconomists need to do more to explore models that allow for the possibility of aggregate shocks to these kinds of self-fulfilling beliefs."
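A standard illustration (not from the paper) of what a self-fulfilling belief looks like formally is the rational-bubble solution to a simple asset-pricing equation:

```latex
% If an asset paying dividend d is priced by p_t = \beta\,\mathbb{E}_t[p_{t+1} + d],
% the fundamental solution is the discounted dividend stream:
\[ p^{*} = \frac{\beta d}{1-\beta} \]
% But any path with a "bubble" component b_t growing at rate 1/\beta in
% expectation satisfies the same equation:
\[ p_t = p^{*} + b_t, \qquad \mathbb{E}_t b_{t+1} = \beta^{-1} b_t \]
% The price exceeds fundamentals purely because everyone expects it to
% keep rising - and a shock to those beliefs is a crash with no change
% in fundamentals.
```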
James Morley disagrees with Kocherlakota's emphasis on theoretically constructed complex models based on the Lucas critique and the existing DSGE framework, and feels that more attention should be paid to historical data,
"Macroeconomists need to do more than simply add complexity to their models. They should also remember that it is empirically testable whether models that put most of their weight on 'deep structural parameters' produce more accurate predictions than models that put more weight on historical correlations. In doing so, it may be found that some macroeconomic relationships are useful even if they cannot be easily motivated as the literal outcome of a micro-founded model. This is not to deny the important role that economic theory plays in the formulation of models used for explanation, prediction, and policy analysis. However, there is no reason for models to take theory quite so literally as is typically done in modern macroeconomics. Instead, the data should be taken more seriously."
He feels that large-scale macroeconometric models - based on large numbers of interlocking demand and supply relationships estimated using various kinds of data - have many advantages over DSGE models,
"First, there are many more variables in the models. This allows for more useful details that are typically missing from DSGE models, from 'small' things like the consideration of different types of consumption (e.g., durables vs. non-durables and services) and different forms of fiscal policy (i.e., more than just lump-sum transfers), to larger things like foreign trade. Second, the models consider levels data rather than deviations from steady state. This is helpful for statistical and economic identification, as well as for forecasting. Third, the models are grounded in macroeconomic theory, but they are not intended to be a literal description of reality.
An example might help illustrate the nature of large-scale macroeconometric models. The model developed by Macroeconomic Advisers is based in part on the life-cycle hypothesis in which households are forward-looking and smooth their consumption across their lifetime income profiles. This theoretical setting implies a consumption function in which the marginal propensities to consume can be thought of as complicated functions of 'deep structural parameters'. A key point is that the estimates for the marginal propensities to consume are remarkably stable over the postwar period, implying that the deep structural parameters for this model are also fairly stable."
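In its textbook (Ando-Modigliani) form - a rendering of the life-cycle hypothesis generally, not of Macroeconomic Advisers' actual specification - the consumption function Morley describes looks like:

```latex
% Consumption responds to financial wealth W_t and to expected
% ("permanent") labor income Y^{p}_t:
\[ C_t = \alpha\, W_t + \gamma\, Y^{p}_t \]
% The marginal propensities \alpha and \gamma are, in a micro-founded
% derivation, functions of deep parameters (the discount rate, the
% interest rate, the planning horizon); their empirical stability is
% thus indirect evidence that those deep parameters are stable too.
```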
Such large-scale models, which have all five features of modern macro models outlined by Kocherlakota, use directly observable data rather than deviations from steady state; provide direct forecasts of the data rather than of deviations from steady state; allow for relatively flexible short-run dynamics; and allow for residuals that can explain the severity of outcomes (say, the depth of recessions or job losses) and act as a kind of 'safety pressure relief valve' to address the fact that models are not reality.
He also feels that, fundamentally, there is little to differentiate the various macroeconomic models,
"VARs, large-scale macroeconometric models, and DSGE models all imply systems of equations. Policy forecasts for these different approaches are all based on some assumptions from macroeconomic theory and some consideration of how economic agents perceive a given change in policy — i.e., was it anticipated or unanticipated and will it be permanent or transitory? The main differences across approaches are in terms of how estimation is carried out and how the theoretical assumptions are imposed. The VAR places the least (but not zero) weight on theory, while the DSGE models place the most, even to the extent of imposing strong restrictions on some parameters across equations. It is ultimately an empirical question as to whether the imposition of these cross-equation restrictions really helps with predicting the effects of policy and forecasting more generally."
Pointing to the success of theories of endogenous financial crises inspired by the ideas of Hyman Minsky in explaining the Great Recession, he argues that macroeconomists need to be pluralistic, drawing from "different types of analysis, be it time-series models, large-scale macroeconometric models, DSGE models, and more narrative approaches".
See also this from Nick Rowe on model-based economic forecasting, and Paul Krugman here.
Update 1 (23/7/2010)
Robert Solow gave superb testimony to the House Committee on Science and Technology explaining why modern macroeconomic theories have little of use to say about the way out of the Great Recession.