
Persisting Problems on the Link between Macroeconomics and Microeconomics

James Matthew B. Miraflor, MA Economics
School of Economics, University of the Philippines Diliman
October 2011

Abstract

The history of economics has, for the most part, been bifurcated between the study of individual economic decisions (microeconomics) and of aggregate economic phenomena (macroeconomics). The attempt to marry the two, by incorporating microeconomic foundations, or microfoundations, into explanations of macroeconomic observations and predictions, has so far won over a majority of mainstream economists, following the failure of Keynesian models to accurately predict aggregate behavior in the presence of government policy. Robert Lucas Jr. posited that people form rational expectations of government policy and act so as to render forecasts unstable. However, there are persisting theoretical and empirical challenges to this research direction: the empirical instability of macro-models that incorporate microfoundations; the Sonnenschein-Mantel-Debreu result, which may spell a theoretical dead end for economic aggregation; the still unresolved Cambridge capital controversies, started by the reswitching argument of Italian economist Piero Sraffa and British economist Joan Robinson in the 1960s; and the missing representative consumer or firm that can capture the behavior of the aggregate. These challenges suggest that aggregate economic behavior is almost impossible to deduce from the microeconomic behavior of agents. Post-Keynesianism, which asserts that long-term expectations are largely determined by non-economic, psychological processes exogenous to the model, is posited as a possible way forward.


The history of economics has, for the most part, been bifurcated between the study of individual economic decisions (microeconomics) and of aggregate economic phenomena (macroeconomics). And for good reason: at least for the last century, they have dealt with different sets of questions requiring different tools of analysis. For macroeconomics, the question is how the national economy behaves as a whole vis-à-vis a government that intervenes through fiscal and monetary policy. Empirical and econometric models are thus applied to analyze trends in aggregate data [1] and to forecast the future behavior of the economy. For microeconomics, the behavior of individual firms and consumers and how they allocate resources is central; one attempts to capture how rational economic agents behave [2] and from there deduce the probable behavior of a system in which these agents interact [3].

The recent [4] attempt to marry the two, by incorporating microeconomic foundations, or microfoundations, into explanations of macroeconomic observations and predictions, has so far won over a majority of mainstream economists, from the adherents of Chicago price theory to the New Keynesians who accepted the long-run assumptions of the neoclassicals. The impetus has been very clear: the failure of Keynesian models to accurately predict aggregate behavior in the presence of government policy. The cause? Economist Robert Lucas Jr. forwarded the thesis that people form rational expectations of government policy and act so as to render forecasts unstable. Thus, it is naive to try to predict the impact of economic policy based solely on historical aggregate economic data. A generation of economists after him would pursue a research program to uncover microfoundations that capture deep parameters on preferences, technology, and resource constraints at the individual level. The effort has been so successful that it was able to penetrate policy-making itself, formerly out of reach to micro and long-run economists, having been the domain of Keynesians after the US Great Depression and the mainstreaming of strong government intervention in the economy.
[1] Macroeconomists usually employ large-scale macroeconometric models to estimate relations between macroeconomic variables, using time series analysis to establish empirical correlations. These models frequently use hundreds of equations describing as many economic variables and vectors. Robert Lucas Jr. criticized the effort for being too sensitive to policy changes, and insisted that economic behaviour should first be established theoretically to produce policy-invariant models.

[2] This also entails verifying behaviour empirically via a myriad of statistical tools to establish, for instance, the Weak Axiom of Cost Minimization (WACM) or the Weak Axiom of Revealed Preferences (WARP): necessary (but not sufficient) conditions for rational behavior among firms and consumers.

[3] Game-theoretic tools and the general equilibrium approach remain the most popular for extrapolating the aggregate behaviour of interacting economic agents.

[4] The effort started after the Lucas critique (1976).

This paper, however, attempts to survey some persisting theoretical and empirical challenges to this research direction: the empirical instability of macro-models that incorporate microfoundations; the Sonnenschein-Mantel-Debreu result, which may spell a theoretical dead end for economic aggregation; the still unresolved Cambridge capital controversies, started by the reswitching argument of Italian economist Piero Sraffa and British economist Joan Robinson in the 1960s; and the missing representative consumer or firm that can capture the behavior of the aggregate. These challenges suggest that aggregate economic behavior is almost impossible to deduce from the microeconomic behavior of agents. Because of the heterogeneity of economic agents, modern macroeconomics based on microfoundations may in fact be lumping together apples and oranges in its hope of deducing macroeconomic phenomena from microeconomic hypotheses. These challenges, among others, can either inform or even affirm the direction of microfoundations research, or usher in new ways to describe aggregate economic behavior, among which Post-Keynesianism, which asserts that long-term expectations are largely determined by non-economic, psychological processes exogenous to the model, is posited as a possible way forward.

But before such alternatives are explored, we first have to rediscover the logic of those who championed the incorporation of microfoundations into macroeconomic theory. They lived in a time when complicated large-scale macroeconometric models were failing to deliver correct forecasts and policy recommendations for contemporary problems. Only by looking at the original problems microfoundations tried to solve can we take a more grounded approach to exploring other research directions.

1. Macroeconometric Models and the Divorce of Micro and Macroeconomics


Since the time of the Great Depression, macroeconomics has been the primary pursuit of the vast majority of those who have made a career in economics. John Maynard Keynes' The General Theory of Employment, Interest and Money revolutionized both economic research and pedagogy, shifting attention away from microeconomics and long-run growth. Macroeconomist N. Gregory Mankiw (2006:3), for instance, observed that the microeconomic concept of supply and demand, now at the heart of economic teaching, did not appear until page 447 of the 608-page Economics textbook by Paul Samuelson, the best-selling classic on economics. With the ascendance of Keynes' thought, the work of the next generation of economists focused on creating simple and concrete macro-models based on the General Theory: from John Hicks' IS-LM model, to Robert Mundell and Marcus Fleming's IS-LM-BP model, to Franco Modigliani's extension and recasting of Hicks' model.

Meanwhile, pioneers such as Lawrence Klein and Jan Tinbergen had been working on applied econometric models that could be used for policy analysis (Mankiw 2006:4). This led to the rise of large-scale macroeconometric models that attempt to model aggregate economic behavior to inform governments' fiscal and monetary interventions. These models extrapolate and forecast using historical aggregate economic data, emphasizing past correlations of variables and parameters instead of theoretical relations. The 1960s saw the rise to popularity of macroeconometric models based on the work of prominent Keynesians, including but not limited to Klein's Wharton Econometric Forecasting Associates (WEFA) and the LINK project associated with Klein, Otto Eckstein's DRI (Data Resources, Inc.) model, and Albert Ando and Modigliani's MPS (MIT-Penn-Social Science Research Council) model used by the US Federal Reserve [5] (Mankiw 2006:4).

[5] These large-scale models, while complicated, have been very popular among central banks and state banks around the world, especially with the advances in computing technology.

Meanwhile, by the 1950s, several advances were already being made in microeconomic theory. In 1954, Kenneth Arrow and Gérard Debreu proved the existence of a general equilibrium point (Janssen 2006:3). Hungarian mathematician John von Neumann and Austrian economist Oskar Morgenstern published the Theory of Games and Economic Behavior in 1944, the precursor of game theory, which has been a core element of modern microeconomics. These advances proceeded more or less independently of the work done by Keynesian macroeconomists, which delved into questions of government economic policy.

A large part of why the divorce of macro and microeconomics gradually ended can be explained by the growing dissatisfaction with the work on macroeconometric models. The weakest link in Keynesian macroeconometric models, the Phillips curve (the hypothesized link between inflation and unemployment), was heavily exploited by the opponents of Keynesian economics. Monetarist Milton Friedman (1968:11), for one, insisted that the negative correlation between inflation and unemployment turned out not to be stable in the long term and can only hold in the short run. Stagflation, the coincidence of high unemployment and high inflation, seemed to have discredited the Phillips curve and, with it, the school of pure macroeconomics Keynes started.

There are important differences in how a macroeconomist and a microeconomist would approach the issue of unemployment: sticky wages and involuntary unemployment form the core of Keynesian macroeconomics' interest, while general equilibrium theorists would insist on flexible prices and market-clearing. The usual resolution of this conflict, that involuntary unemployment and sticky wages are short-run phenomena while labor markets eventually clear in the long run consistent with the general equilibrium approach, is thought to be theoretically unsatisfactory, as one supposedly cannot attribute unemployment to sticky wages while leaving general equilibrium theory intact. The notion of fixed prices likewise deeply affects the microeconomic concepts of supply and demand (Janssen 2006:3).

Clearly, the divorce of micro and macroeconomics resulted in inconsistencies in explaining important economic phenomena, such as unemployment. How can the short run of fixed prices proceed to the long run of market-clearing? If one follows the generally accepted view that large-scale phenomena are a direct result of the market interaction between individual agents, then one comes to the position that general equilibrium, rather than fixed prices, is more fundamental (Janssen 2006:3). This thought catalyzed another revolution in economics, one that displaced the Keynesian school of pure macroeconomics in favor of microeconomic foundations.

2. The Lucas Critique, Emergence of Microfoundations, and Empirical Challenges


American economist Robert Lucas Jr. initiated the turn toward microeconomic foundations with his classic "Econometric Policy Evaluation: A Critique" (1976), where he, along with other new classical economists, criticized large-scale Keynesian macroeconometric models as useless for evaluating policy impacts because their empirical correlations are sensitive to policy changes. The reason is that these models fail to take agents' rational expectations seriously (Mankiw 2006:6). Lucas insisted that while preferences and technology are invariant to policy rules, the decision rules describing the behavior of individual agents are not (Chari & Kehoe 2006:4). Thus, empirical relationships based on correlations will likely break down under different sets of policies.

Lucas asserted that only a model based on theoretical relations, instead of empirical ones, can account for shifting policies (Mankiw 2006:6) [6]. This critique of macroeconometric models, now known as the Lucas critique, had profound implications for the way economists think about monetary policy. It implies, for one, that the monetary authority cannot predict how people will respond to a policy decision today if it cannot predict how people's expectations of future monetary policy will change as a result of current decisions. For that matter, the monetary authority also needs to predict how its own behavior will change in the future due to its current actions (Chari & Kehoe 2006:4). Following earlier models (1973) by Lucas himself, based on assumptions of imperfect information, rational expectations, and market clearing, monetary policy is now seen as effective only insofar as it surprises people and confuses them about relative prices [7]. But since it is impossible to surprise rational people systematically (Sargent & Wallace 1975, as cited by Mankiw 2006:6), systematic monetary policy will fail.

This started a wave of changes in practical monetary policy, with academic and policy-oriented macroeconomists basing their analysis on quantitative general equilibrium models with policy-invariant parameters of preferences and technology. Models written in response to the Lucas critique became increasingly sophisticated, incorporating financial market imperfections, monetary non-neutralities, and other frictions (Cooley 1995, as cited by Chari & Kehoe 2006:4). The so-called time inconsistency [8] critique has been a major influence on central banking and fiscal policy in the last three decades (Chari & Kehoe 2006:4). Complicated models with deep parameters were set up, which new classical economists hoped would produce policy-invariant artificial data.

Empirical Challenges to Microfoundations

Unfortunately for the new classical economists, the empirical evidence has not been on their side. Arturo Estrella and Jeffrey Fuhrer (2003), Senior Vice Presidents of the Federal Reserve Banks of New York and Boston respectively, tested the robustness of forward-looking models, those which incorporate rational expectations, and found that for a popular class of monetary policy models, rational expectations do not guarantee resolution of the instability problem identified by the Lucas critique. In fact, there is little evidence that backward-looking econometric models are unstable. Agents' reaction to structural changes in policy, known as the Lucas effect, has not been significant.

[6] Consider, for instance, an economy where investment and labor demand derive from the present value of firms' net revenues (assuming a perfectly competitive environment), discounted by 1/(1 + r), r being the real interest rate, for every time period. To alter the value of firms' net revenues, a government can conduct monetary policy that affects the real interest rate defined by the Fisher equation i = r + π, or r = i − π, where π is the inflation rate. However, as far as firms are concerned, they expect a change in inflation as the central bank changes i. Instead of π, we thus have the expectation π^e(t+1) = E[π(t+1) | Ω(t)], where Ω(t) is the information set at time t, so that r = i − π^e(t+1). It is unlikely that r remains constant (Da Silva 2009:2).

[7] The Surprise Aggregate Supply (SAS) function, or Lucas aggregate supply function, presents output as a function of monetary surprise: unanticipated price or money shocks in the economy.

[8] Time (or dynamic) inconsistency occurs when an economic agent's preferences change over time, i.e. what is preferred at one point in time is inconsistent with what is preferred at another. An important example is when government policymakers promise today to lower inflation tomorrow, only to backtrack when tomorrow comes because the negative effects become more apparent. Time inconsistency problems supposedly emerge when people's current decisions depend on expectations of future policies.
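To make the mechanism in footnotes [6] and [7] concrete, here is a minimal numerical sketch, with hypothetical parameter values not taken from the cited studies, of a Lucas-style surprise supply function: only the unanticipated component of inflation moves output, so a systematic policy rule that rational agents already anticipate leaves output at its natural level.

```python
# Hypothetical illustration of a Lucas-type surprise aggregate supply function:
#   y = y_n + b * (pi - pi_expected)
# Only the inflation surprise (pi - pi_expected) moves output away from y_n.

Y_NATURAL = 100.0   # natural level of output (assumed for illustration)
B = 2.0             # sensitivity of output to inflation surprises (assumed)

def output(pi, pi_expected):
    return Y_NATURAL + B * (pi - pi_expected)

# Case 1: a systematic, announced expansion. Rational agents expect it fully.
pi_policy = 0.05
print(output(pi_policy, pi_expected=0.05))   # 100.0 -> no real effect

# Case 2: the same expansion, but unanticipated (expectations still at 2%).
print(output(pi_policy, pi_expected=0.02))   # 100.06 -> output moves only via the surprise
```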

Surprisingly, it is the forward-looking models incorporating rational expectations and deep parameters that exhibit clear evidence of instability, rendering them seemingly inferior to the backward-looking models (Estrella & Fuhrer 2003). Apparently, deep parameters resulted in less predictive power. Glenn Rudebusch (2005) of the Federal Reserve Bank of San Francisco reached the same result, questioning the empirical relevance of the Lucas critique in his study of the stability of the usual vector autoregressive (VAR) [9] empirical representations used in statistically correlating macroeconomic variables.

[9] Vector autoregression (VAR) is a statistical model used to capture the linear interdependencies among several time series. Its application to economics was championed by Christopher Sims, earning him the 2011 Nobel Prize in Economics (shared with Thomas Sargent) for his efforts and contributions.
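As a concrete, purely illustrative picture of what such a VAR representation is, the sketch below simulates a two-variable VAR(1) on synthetic data and recovers its coefficient matrix by least squares; the Lucas critique is precisely the claim that coefficients estimated this way from historical data need not survive a change in the policy regime.

```python
import numpy as np

# Synthetic data only (not from the studies cited): each variable is regressed on
# lagged values of all variables, and the estimated coefficients summarize the
# historical correlations that the Lucas critique says may shift with policy.

rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.4]])          # "true" VAR(1) coefficient matrix
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# OLS estimate of the VAR(1): regress y_t on y_{t-1}
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(A_hat)  # close to A_true; under a policy shift, such coefficients may not stay put
```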

3. Comparing Apples and Oranges: Microfoundations' Fallacy of Composition?


The apparent instability of models with deep parameters coincided with questions about the theoretical viability of having microfoundations explain aggregate economic phenomena. Is it, as new classical economists believe, really possible to account for the behavior of the economy as a whole through microeconomic assertions about the behavior of individual economic agents? This is important to ask because, if the answer is negative, then the whole microfoundations effort may be put in jeopardy, especially since the empirical results have not been too positive. The next three subsections explore three facets of the aggregation problem.

Aggregating Demand: The Sonnenschein-Mantel-Debreu Result

Let us take aggregate consumer demand. Is it possible to characterize aggregate demand provided that individual demands are well-behaved? The Sonnenschein-Mantel-Debreu theorem (named after economists Gérard Debreu, Rolf Ricardo Mantel, and Hugo Freund Sonnenschein) answers this in the negative. The theorem states that the excess demand function of an aggregate economy is not restricted by the rationality of individual demands. Consequently, it is not sufficient that individual excess demands satisfy the Weak Axiom of Revealed Preferences (WARP) for the aggregate excess demand to satisfy WARP as well (see Figure 1). Thus, microeconomic rationality assumptions have macroeconomic implications only under limited conditions (Sonnenschein 1973, as cited by Da Silva 2009:6). Furthermore, economic equilibrium may be neither unique nor stable in a pure exchange economy with many interdependent markets.

[Figure 1: Individual demand may satisfy WARP but not aggregate demand (Mas-Colell, Whinston, & Green 1995:110)]
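To make the aggregation failure concrete, here is a stylized numerical sketch, with hypothetical numbers constructed for this illustration rather than taken from the sources above: two consumers whose individual choices each satisfy WARP, yet whose summed demand, expressed against aggregate wealth, violates it once wealth is distributed differently across the two price situations.

```python
# Hypothetical illustration of the aggregation problem behind the
# Sonnenschein-Mantel-Debreu result: individual WARP does not imply aggregate WARP.

def violates_warp(obs):
    """obs: list of (price, wealth, bundle) tuples for one decision-maker.
    Returns True if some pair of observations violates the Weak Axiom."""
    def cost(p, x):
        return sum(pi * xi for pi, xi in zip(p, x))
    for i, (p1, w1, x1) in enumerate(obs):
        for p2, w2, x2 in obs[i + 1:]:
            if x1 != x2 and cost(p1, x2) <= w1 and cost(p2, x1) <= w2:
                return True  # each bundle was affordable when the other was chosen
    return False

p_a, p_b = (2, 1), (1, 2)  # two price situations

consumer_1 = [(p_a, 4, (2, 0)), (p_b, 1, (1, 0))]
consumer_2 = [(p_a, 1, (0, 1)), (p_b, 4, (0, 2))]

# Aggregate demand at aggregate wealth 5 in both situations (distribution differs)
aggregate = [(p_a, 5, (2, 1)), (p_b, 5, (1, 2))]

print(violates_warp(consumer_1))  # False: individual choices are consistent
print(violates_warp(consumer_2))  # False
print(violates_warp(aggregate))   # True: wealth redistribution breaks WARP in the aggregate
```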

Mas-Colell, Whinston, and Green's (1995:599) discussion of what they dub the "Anything Goes Theorem" (in their popular graduate-level book Microeconomic Theory) points to wealth effects as the culprit.

Consider an economy in equilibrium. We know that price changes, which affect the demand for a good relative to others as it becomes cheaper or more expensive compared to other goods (the substitution effect), also tend to change the wealth of individual consumers, in turn compelling them to change their demand vectors, i.e. to increase their demand for some goods and/or decrease their demand for others according to their preferences. Since substitution and wealth effects can either reinforce or neutralize each other, we are not assured of a single price vector that can clear the market [10]. Mas-Colell et al. (1995:199) thus conclude that aside from the excess demand properties of homogeneity of degree zero and satisfaction of Walras' law (as well as other basic properties such as continuity), we cannot derive any other property of the aggregate excess demand function from the individual excess demand functions.

The consequence of the Sonnenschein-Mantel-Debreu theorem for microeconomic theory is staggering, and deeply negative. Some economists even argue that general equilibrium, which underpins microfoundations, does not apply to large economies and may only be applied to small groups such as committees, clubs, villages and other local organizations (Chiappori et al. 2004, as cited by Rizvi 2006:238). Moreover, it had a real effect on the aggregation problem and on microfoundations. Kenneth Arrow (1986, as cited by Rizvi 2006:233) said that "in the aggregate, the hypothesis of rational behavior has in general no implications." Properties of aggregate demand are deducible from individual demand only under restrictive conditions; thus macroeconomic phenomena are almost impossible to derive from microeconomic assumptions.

Aggregating Supply: Reswitching and the Cambridge Capital Controversies

Let us now take aggregate production. Is it possible to deduce the aggregate production function provided one knows the properties of individual firms' production functions? Mas-Colell, Whinston, & Green (1995:147) assert that, unlike aggregating demand, aggregating supply is supposed to be easy, because the absence of a budget constraint implies that individual supply is not subject to wealth effects (there are only substitution effects along the production frontier as prices change). But is there reason to be that optimistic?

Unfortunately for aggregation theory, there are also pending questions on the matter of aggregate supply. The Cambridge capital controversy, so called because it was largely a battle between economists from Cambridge, England (such as Piero Sraffa, Joan Robinson, Luigi Pasinetti, and Pierangelo Garegnani) and Cambridge, Massachusetts, US (neoclassical economists such as Paul Samuelson, Robert Solow, Frank Hahn, and Christopher Bliss), which raged from the 1950s to the mid-1970s (Cohen & Harcourt 2003:201), dealt with the question of whether it is possible to jump from the microeconomic conception of production to social production. The English economists insisted that aggregate production functions are largely a result of sloppy habits of thought (Robinson 1953, as cited by Cohen & Harcourt 2003:199) because they suffer from a fallacy of composition. The American neoclassical economists, on the other hand, insisted that the model is good enough, and that the English economists failed to provide an alternative model anyway.
[10] In neoclassical microeconomics, equilibrium is assured because the number of excess demand functions equals the number of prices, which means we have a determinate system (the number of unknowns equals the number of equations). However, a unique solution is guaranteed only if the equations are linear, which is assured if consumers exhibit parallel, straight wealth expansion paths at any price vector p. A necessary and sufficient condition for this is that preferences admit indirect utility functions of the Gorman form with the coefficient b(p) on wealth w_i the same for every consumer i, i.e. v_i(p, w_i) = a_i(p) + b(p)·w_i (Mas-Colell, Whinston, and Green 1995:119). If the indirect utility functions are non-linear, then the system of individual excess demand functions may not have a unique root.

The problem begins with the Solow-Swan/neoclassical aggregate one-commodity production function Y = F(K, L), which assumes positive and diminishing marginal returns to the factor inputs (aggregate capital K and aggregate labor L):

∂F(K, L)/∂K > 0,  ∂F(K, L)/∂L > 0,  ∂²F(K, L)/∂K² < 0,  ∂²F(K, L)/∂L² < 0.

This posits a one-way relationship between the rate of return on an additional unit of aggregate input and the amount of that aggregate input: for instance, the rate of return on aggregate capital depends on how much aggregate capital there is, and it diminishes with each additional unit of aggregate capital.
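As a quick numerical check of these assumed properties, the sketch below uses a Cobb-Douglas form F(K, L) = K^α L^(1−α), a standard textbook special case chosen here purely for illustration, and confirms that the marginal product of capital is positive but falls as aggregate capital grows.

```python
# Illustrative only: a Cobb-Douglas aggregate production function and its
# (finite-difference) marginal product of capital, which is positive and declining.

ALPHA = 0.3  # capital share (assumed)

def F(K, L):
    return K ** ALPHA * L ** (1 - ALPHA)

def marginal_product_of_capital(K, L, dK=1e-6):
    return (F(K + dK, L) - F(K, L)) / dK

L = 100.0
for K in [50.0, 100.0, 200.0, 400.0]:
    print(K, round(marginal_product_of_capital(K, L), 4))
# The MPK stays positive but shrinks as K rises -- the "one-way" relationship the
# aggregate function assumes, which the reswitching argument below calls into question.
```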

But how does one compute the amount of aggregate input? The labor input, in this case, is usually measured in man-hours and can thus be treated as homogeneous; we can then say that the real rate of return for each additional man-hour is diminishing. The problem stems from the computation of the value of capital. For a homogeneous capital supply, the problem is trivial. But assuming heterogeneous capital, whose physical units (like chairs and laptops) cannot be added together, how does one measure, valuate, and aggregate capital?

For a neoclassical economist, one can simply add up the money value of heterogeneous capital items. On this, it is plausible to price the capital stock either at its cost of production or at the projected revenue from its future output stream. But in either case, since time is involved, the rate of return becomes part of the equation (Cohen & Harcourt 2003:201). For instance, a classical economist would compute the price as p = (labor costs) + (capital costs)(1 + r), where r is the rate of return on capital. Thus, what we have is no longer a one-way relation between aggregate capital and the rate of return. The former no longer simply determines the latter; the amount of aggregate capital (which is valuated through the total price of capital) is also determined by the rate of return. A circular relationship is established.

This creates a problem for the Solow-Swan assertion of diminishing marginal returns to aggregate capital, since the choice of capital is itself partly determined by the rate of return on the capital used. Samuelson (1966, as cited by Cohen & Harcourt 2003:202) provides intuition through a champagne-making example. Assume two techniques for making champagne using capital and labor only. In technique A, 7 units of labor make 1 unit of brandy in one period, which ferments into champagne in the next period; in technique B, 2 units of labor make one unit of grape juice in one period, which ripens into wine in the next period, and 6 units of labor then shake the wine into champagne in the third period. Samuelson used Piero Sraffa's concept of dead or dated labor to represent capital goods, which then carry a compounding rate of return r. The cost equations of techniques A and B then become 7(1 + r)² and 2(1 + r)³ + 6(1 + r), respectively.

Table 1: Samuelson's (1966) demonstration of Wicksell effects.

                        technique A              technique B
  time period    labour input   output    labour input   output
      3               0            0           2            0
      2               7            0           0            0
      1               0            0           6            0
      0               0            1           0            1

Cost equations: technique A: 7(1 + r)²; technique B: 2(1 + r)³ + 6(1 + r).
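The switch points described in the next paragraph can be checked directly from these cost equations; the short sketch below (a hypothetical check, with the wage taken as numéraire so costs are in wage units) evaluates both techniques over a grid of rates of return.

```python
# Reproducing the cost comparison in Samuelson's (1966) champagne example (Table 1):
# dated labor costs compounded at the rate of return r, measured in wage units.

def cost_A(r):
    return 7 * (1 + r) ** 2                  # 7 units of labor, two periods before output

def cost_B(r):
    return 2 * (1 + r) ** 3 + 6 * (1 + r)    # 2 units three periods back, 6 units one period back

for r in [0.25, 0.50, 0.75, 1.00, 1.25]:
    a, b = cost_A(r), cost_B(r)
    cheaper = "A" if a < b else ("B" if b < a else "tie")
    print(f"r = {r:.2f}: cost A = {a:6.2f}, cost B = {b:6.2f} -> {cheaper}")

# Technique A is cheapest for r < 50%, B for 50% < r < 100%, and A again for r > 100%:
# the technique "reswitches" at r = 50% and r = 100%.
```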

This exposes capital to so-called Wicksell effects: instability in the value of aggregate capital at different rates of return. At r < 50% and at r > 100%, technique A is cheaper than technique B. However, at 50% < r < 100%, technique B costs less than technique A. If we plot aggregate capital per unit of labor against the rate of return r, what we get is not simply a downward-sloping curve, but one that switches to B at r = 100% and reswitches back to A at r = 50% (see Figure 2). The circular relationship between capital and the rate of return renders the neoclassical concept of diminishing returns inadequate.

[Figure 2: Demand for Capital per Labor in Samuelson's (1966) Example (Cohen & Harcourt 2003:203)]

The Cambridge capital controversy thus cast doubt on efforts to construct aggregate production functions from knowledge of individual production. The supply side of microfoundations therefore rests on foundations as weak as those of its demand side under the Sonnenschein-Mantel-Debreu theorem. Even Frank Hahn (1972:8, as cited by Cohen & Harcourt 2003:206), who was part of the neoclassical consensus, admitted that aggregate production functions are open to severe logical objections. Thus, it is not surprising that by the 1970s to early 1980s, aggregate production functions fell into disuse, only to be revived, without the reswitching problem being resolved or even addressed, by the proponents of endogenous growth and by real business cycle theorists (Cohen & Harcourt 2003:206).

Representing the Aggregate: Will the real Representative Agent please stand up?
Economists have often attempted to sidestep the aggregation problem through the use of a representative agent whose preferences, constraints, and technology mimic those of the aggregate economy. In some cases, the whole is represented as if it were the outcome of a single individual's decision problem; the possible differences between individual and aggregate economic behavior are thereby assumed away (Janssen 2006:3). This approach conveniently bypasses the aggregation problem on both the supply and the demand side, as well as in many economic modeling problems. The problem then becomes proving the existence of such a representative agent, and showing that the representative agent's preferences and technology are stable across redistributions of wealth.

Consider Mas-Colell et al.'s normative representative consumer [11]. By itself, proving the existence of such a consumer suffers from the same predicament as the aggregate demand function under the Sonnenschein-Mantel-Debreu result: it exists only under very restrictive conditions.
[11] To define this, it is useful to note first the positive representative consumer: a fictional individual whose utility maximization problem, when facing society's budget set, would generate the economy's aggregate demand function; see Definition 4.D.1 of Mas-Colell et al. (1995:116). From here, we have to ask whether the positive representative consumer captures the welfare of society as a whole, i.e. whether the positive representative consumer is normative as well. We can then define a normative representative consumer with respect to a social welfare function as a positive representative consumer whose indirect utility function is generated by the optimal wealth distribution, i.e. one that solves the given social welfare function; see Definition 4.D.3 of Mas-Colell et al. (1995:118). The normative representative consumer is thus always an outcome of an optimal distributional solution.

Mas-Colell et al. (1995:118-121) explain that the existence of a normative representative consumer requires individual consumers to have indirect utility functions of the Gorman form, with the coefficient b(p) on wealth w_i the same for every consumer i, i.e. v_i(p, w_i) = a_i(p) + b(p)·w_i. But what kind of preferences can the Gorman form represent? It turns out that if all consumers in an economy have homothetic preferences [12], then their preferences can be represented by Gorman form utility functions. However, leading microeconomist Angus Deaton (1992:8-9) observed that homothetic utility functions are rejected by empirical evidence, primarily because they require all goods to have unit wealth elasticities (Engel curves are straight lines through the origin). Deaton wrote that the supposition that there are neither luxuries nor necessities contradicts both common sense and more than a hundred years of empirical research.

But beyond the difficulty of establishing the existence of a normative representative consumer, the economist Alan Kirman (1992:123) attacked the representative-individual concept itself as misleading for policy analysis. Kirman notes that in models with representative consumers, one makes a policy change and then examines the new equilibrium for the representative, with the implicit assumption that the choice of the representative will match the aggregate choice [13]. However, changes frequently affect individuals differently (many policy changes are, in fact, differential), so that the representative constructed before the change will no longer represent the economy after the change. This runs counter to the efforts of those who hoped the Lucas critique would be addressed by microfoundations that ensure a model invariant to policy changes.

But even assuming that the representative individual matches the aggregate under any change, we run into another difficulty: even though the representative individual's choices match the aggregate choices, the preferences of that agent may be completely different from the preferences of the individuals themselves (Kirman 1992:124). Kirman cites an example by M. Jerison (1984, as cited by Kirman 1992:124), which uses a numerical example based on a Cobb-Douglas function. Kirman (1992:134) argued that given that (1) even if all individuals in the economy are well-behaved, they may not produce a well-behaved representative agent, (2) the reaction of individuals to a change might not be reflected by the representative agent, and (3) the preferences of the representative agent may be diametrically opposed to those of the individuals composing the economy, the representative agent concept has no future and actually deserves a decent burial as an approach to economic analysis that is not only primitive, but fundamentally erroneous (Kirman 1992:119).
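To illustrate why the Gorman/homotheticity condition is at once so convenient and so restrictive, the sketch below assumes identical Cobb-Douglas preferences (a homothetic special case; all numbers hypothetical): aggregate demand then depends only on aggregate wealth, not on its distribution, so a representative consumer exists, but only because every consumer is forced onto the fixed budget shares that Deaton notes the data reject.

```python
# Hypothetical illustration: with identical Cobb-Douglas (hence homothetic)
# preferences, aggregate demand is invariant to the distribution of wealth.

ALPHA = 0.3  # expenditure share on good 1 (assumed, identical across consumers)

def demand(p, w):
    # Cobb-Douglas demand: spend share ALPHA on good 1, 1 - ALPHA on good 2
    return (ALPHA * w / p[0], (1 - ALPHA) * w / p[1])

def aggregate_demand(p, wealths):
    bundles = [demand(p, w) for w in wealths]
    return tuple(sum(g) for g in zip(*bundles))

p = (2.0, 1.0)
print(aggregate_demand(p, [10.0, 90.0]))  # same aggregate wealth of 100 ...
print(aggregate_demand(p, [50.0, 50.0]))  # ... same aggregate demand: distribution is irrelevant
```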

4. Reintroducing the Divide: the Post-Keynesian Outlook


Despite all the criticism and challenges to the philosophical and empirical basis of microfoundations arising from the aggregation problem and the problem of representation, the Lucas dream of uniting microeconomics and macroeconomics has continued to dominate mainstream economic research (Janssen 2006:6). Economist Roger Garrison captured this conception of a united economics when, describing what he took to be the outlook of economist Friedrich Hayek (of the Austrian school), he said that there are macroeconomic phenomena but only microeconomic solutions.

[12] The same is true for preferences that are quasilinear with respect to the same good.

[13] Kirman (1992:123) notes that this assumption usually comes with the caveat "we will ignore distributional considerations." This may be so because the proof of the existence of a normative representative consumer is a distributional concern vis-à-vis a given social welfare function.


Nonetheless, some economists have already moved away from the idea that we should start at the level of the isolated individual, toward theorizing in terms of groups who have collectively coherent behavior, together with assumptions about the organization of society (Kirman 1989:138, as cited by Rizvi 2006:231), emphasizing that aggregate regularities are not solely generated by market interaction. This trend and school of thought, into which the insights of Cambridge economist Piero Sraffa as well as Joan Robinson have often been categorized, has often been called the Post-Keynesian view.

Economist Maarten Janssen (2006:8) of the Tinbergen Institute wrote a good description of the Post-Keynesian view on expectations, with some Post-Keynesians arguing for the irreducibility of macroeconomic issues to purely microeconomic considerations in which individuals' actions are based on expected utility calculations. Post-Keynesians hold that long-term expectations are largely determined by non-economic processes such as mass psychology. These expectations should therefore be regarded as exogenous to the economic model, rather than as endogenously determined as in the case of rational expectations. In this sense, important investment decisions are, by their nature, long-term decisions that are largely determined by the state of these long-term expectations (Janssen 2006:7). This outlook supposedly comes close to the results of the overlapping generations general equilibrium model (Geanakoplos & Polemarchakis 1986, as cited by Janssen 2006:7), which shows that the indeterminacy of equilibria means that expectations about market outcomes can be exogenous.

If this direction is to be pursued, behavioral economics must shift its emphasis from observing and analyzing individual behavior (regardless of whether it is rational or not) to characterizing the behavior of individuals in large groups, incorporating advances in crowd psychology and the herd behavior suspected to exist in the stock market, especially during booms or busts. In the meantime, while Post-Keynesian mass psychology develops into a framework that can be used to scientifically explain macroeconomic phenomena and behavior, large-scale macroeconometrics in the tradition of Klein, without the deep parameters owing to the Lucas critique, can remain a descriptive tool for analyzing economic patterns, especially since it is seen to be empirically adequate anyway (Estrella & Fuhrer 2003).

The future of economics may then be seen as reintroducing the divide between macroeconomics and microeconomics. This makes sense because, in reality, people's decisions, preferences, tastes and even productivity are contextual features which depend on who they are with, when, and for how long. Attempting to characterize the entire economy by making our models of individual behavior ever more sophisticated recalls a famous quote from Friedrich Hayek: "The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design."

References
Chari, V. V., and Patrick J. Kehoe. "Modern Macroeconomics in Practice: How Theory Is Shaping Policy." Journal of Economic Perspectives (American Economic Association) 20, no. 4 (2006): 3-28.

Cohen, Avi J., and G. C. Harcourt. "Retrospectives: Whatever Happened to the Cambridge Capital Theory Controversies?" Journal of Economic Perspectives (American Economic Association) 17, no. 1 (2003): 199-214.

Deaton, Angus. Understanding Consumption (Clarendon Lectures in Economics). New York: Oxford University Press, 1992.

Estrella, Arturo, and Jeffrey C. Fuhrer. "Monetary Policy Shifts and the Stability of Monetary Policy Models." Review of Economics and Statistics (MIT Press) 85 (2003): 94-104.

Friedman, Milton. "The Role of Monetary Policy." American Economic Review (American Economic Association) 58 (1968): 1-17.

Janssen, Maarten. "Microfoundations." Tinbergen Institute Discussion Paper, May 2006.

Kirman, Alan P. "Whom or What Does the Representative Individual Represent?" Journal of Economic Perspectives (American Economic Association) 6, no. 2 (1992): 117-136.

Lucas Jr., Robert E. "Econometric Policy Evaluation: A Critique." Carnegie-Rochester Conference Series on Public Policy (1976): 19-46.

Mankiw, N. Gregory. "The Macroeconomist as Scientist and Engineer." NBER Working Paper (National Bureau of Economic Research), no. 12349 (June 2006).

Mas-Colell, Andreu, Michael D. Whinston, and Jerry R. Green. Microeconomic Theory. New York: Oxford University Press, 1995.

Rizvi, S. Abu Turab. "The Sonnenschein-Mantel-Debreu Results after Thirty Years." History of Political Economy (Duke University) 38 (2006).

Rudebusch, Glenn D. "Assessing the Lucas Critique in Monetary Policy Models." Journal of Money, Credit, and Banking (Wiley-Blackwell, Ohio State University) 37 (2005): 245-272.

