that evolves procedurally in a series of events; it becomes algorithmic.
There is a danger that seeing the economy this way is merely bowing to
a current fashion in science, but the idea allows me to make an important
point. Suppose for a moment that we—or better, Laplace or “God”—know
the algorithm12 behind the computation (the large but finite set of detailed
mechanisms by which the economy, or the part of it that interests us, makes
its next move). A fundamental theorem in computation (Turing, 1936) tells
us that in general (if we choose an algorithm randomly) there is no way—no
systematic analytical method—to tell in advance whether that algorithm or
computer program will halt (as opposed to going on forever, or cycling). Since we could arrange for an algorithm to halt if its output fulfilled some particular set of mathematical conditions or reached a given “solution,” in general we cannot tell whether that will be the case either. In other words, there is no analytical method to decide in advance what a given algorithm will do.13 All we can
do is follow the computation and see what it brings. Of course, with simple
algorithms we can often see they will settle down to a given outcome. But
algorithms don’t have to be particularly complicated before we cannot decide
their outcomes (Wolfram, 2002).
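To make this concrete, here is a minimal illustration of my own (not from the works cited above). The Collatz map is a three-line program whose halting behavior for arbitrary starting values remains a famous open question; no analytical method is known to decide it, so all we can do is run the computation and watch.

```python
def collatz_halts(n, max_steps=100_000):
    """Iterate the Collatz map from n. Whether this reaches 1 for every
    starting n is an open problem: the only general recourse is to
    follow the computation and see what it brings."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
        if steps >= max_steps:
            return None  # undecided within our computational budget
    return steps

print(collatz_halts(27))  # this one happens to halt, after 111 steps
```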
So we need to be cautious. For highly interconnected systems, equilibrium and closed-form solutions are not the default outcomes; if they exist they require justification. And computation for such systems should not be regarded as the avoidance of analytical thinking; rigorously speaking, it may be completely necessary. We can often do much useful pre-analysis of the qualitative properties of nonequilibrium systems, and understand the mechanisms behind these; still, in general the only precise way to study their outcomes is by computation itself.

10. Current circumstances would of course include relevant past history or memory of past history.
11. Modern computational thinking sees computation as ongoing, concurrent (parallel), distributed, and often probabilistic. See the 2010 ACM Ubiquity Symposium What Is Computation? See also Beinhocker (2011).
12. Earlier I argued that the economy’s future is indeterminate, so strictly speaking the economy is not perfectly algorithmic. Hence for this thought experiment I posit a “God” who can determine how each agent would react in all circumstances.
13. Including whether it converges (or stays within a given neighborhood of some limit forever).
Of course the algorithm behind the actual economy is not randomly chosen; it is highly structured, so it may be that the actual economy’s “computations” always have simple outcomes. Or it may equally be that the economy’s
computations are always unordered and amorphous. Usually in the parts of
the economy we study, neither is the case. Often, especially when there are
strong countervailing forces at work, we see large structures—regions of
attraction that correspond loosely to equilibria. And within these (or in their absence) we also see mechanisms that cause phenomena or sub-patterns or
sub-structures to appear and disappear randomly from time to time. To give a
physical analogy, consider the sun. From afar it appears to be a large gaseous ball in uniform spherical equilibrium. But within this “equilibrium,” powerful mechanisms cause dynamic phenomena such as gigantic magnetic loops and
arches, coronal holes, X-ray bright spots, and mass plasma ejections moving
at up to 2,000 kilometers per second. The gaseous ball indeed displays a loose spherical shape, but it is never at equilibrium. Rather it is seething with activity that disrupts the possibility of equilibrium and builds from earlier disruptions. These phenomena are localized and can act at many scales. And they
are transitory or temporal—they appear, disappear, and interact, seemingly
randomly in time.
We will find a similar situation frequently in the economy. Theorizing
in nonequilibrium, then, would mean uncovering large attractors at work (if
indeed there are any), but also studying other sub-structures or phenomena
that might be present for their properties and behavior. We can use care-
fully designed computer experiments to do this, often using statistics on
the results to isolate phenomena and the mechanisms that cause these. And
in many cases we can construct simpler toy models of a phenomenon that
capture its essential features and allow us to use mathematics or stochastic
theory to study it. The objective, we should remember, is not necessarily to
formulate equations or to arrive at necessary conditions. The objective, as it is with all theory, is to obtain general insights.
Let us put some of these ideas together by looking at an actual nonequilib-
rium study performed computationally. Here is a classic example.
In 1991 Kristian Lindgren constructed a computerized tournament where
strategies competed in randomly chosen pairs to play a repeated prisoner’s
dilemma game. (The details of the prisoner’s dilemma needn’t concern us;
think of this as simply a game played by a specified current set of strategies.) The strategies consisted of instructions for how to move given the opponent’s
immediate past moves. If strategies did well they replicated and mutated; if they did badly they were removed. Lindgren allowed that strategies could
“deepen” by using deeper memory of their opponent’s immediate past moves
and their own. So in our language we can think of such strategies as “explor-
ing” strategy space: they change and adapt if they are not successful. Lindgren found that at the start of his tournament, simple strategies such as Tit-for-Tat dominated, but over time, deeper strategies appeared that exploited the simple ones. In time, still deeper strategies emerged to take advantage of these, with periods of relative stasis alternating with dynamic instability (Figure 1).
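In outline, the algorithmic core of such a tournament can be sketched in a few dozen lines. The following is my own minimal sketch, not Lindgren’s program: the payoffs, population size, and mutation rate are illustrative, and his crucial memory-deepening mutation, which drives the arms race, is omitted for brevity.

```python
import random

# Payoff to the row player in a one-shot prisoner's dilemma.
# 'C' = cooperate, 'D' = defect; the values are standard illustrative ones.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def play(strat_a, strat_b, rounds=50):
    """Repeated game between two memory-one strategies: dicts mapping the
    opponent's last move (None on the first round) to a response."""
    last_a = last_b = None
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strat_a[last_b], strat_b[last_a]
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        last_a, last_b = move_a, move_b
    return score_a, score_b

def mutate(strat):
    """Flip one response at random. (Lindgren also let memory deepen by
    gene duplication; that step is omitted here.)"""
    s = dict(strat)
    key = random.choice(list(s))
    s[key] = 'C' if s[key] == 'D' else 'D'
    return s

def generation(pop, mutation_rate=0.1):
    """Pair strategies at random; the better-scoring half replicates
    (with occasional mutation), the worse half is removed."""
    random.shuffle(pop)
    scored = []
    for a, b in zip(pop[::2], pop[1::2]):
        sa, sb = play(a, b)
        scored += [(sa, a), (sb, b)]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    survivors = [s for _, s in scored[: len(scored) // 2]]
    children = [mutate(s) if random.random() < mutation_rate else dict(s)
                for s in survivors]
    return survivors + children

tit_for_tat = {None: 'C', 'C': 'C', 'D': 'D'}
population = [dict(tit_for_tat) for _ in range(100)]
for _ in range(200):
    population = generation(population)
```

Even this stripped-down version produces an ever-shifting population of strategies; Lindgren’s memory-deepening mutations are what turn it into the open-ended arms race described above.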
The dynamics are simple enough that Lindgren could write them as stochas-
tic equations, yet these give far from a full picture; we really need computa-
tion to see what is going on. What emerges computationally is an ecology—an ecology of strategies, each attempting to exploit and survive within an environment created by itself and other strategies attempting to exploit and sur-
vive. This ecology is a miniature biosphere where novel species (strategies)
continually appear, exploit the environment created by existing species, and
do not survive if they fail. Notice that evolution has entered, but it hasn’t been brought in from outside; it has arisen from the natural tendency of strategies
to compete for survival. The point is general in this type of economics. What
constitutes a “solution” is typically an ecology where strategies, or actions, or beliefs compete; an ecology that may not settle down, that has its own characteristic properties and can be studied qualitatively and statistically.14
In Lindgren’s study, the outcome differs from one run of the computation
to another. In many runs an evolutionarily stable strategy appears, a complicated one that relies on four periods of memory of past actions. In other runs the system continues to evolve. In some runs we see the quick emergence of
complicated strategies; in others these appear later on. And yet there are constants: phenomena such as coexistence among strategies, exploitation, the spontaneous emergence of mutualism, sudden collapses, periods of stasis and unstable change. The picture resembles paleozoology more than anything else.
I have put forward Lindgren’s study as an example of doing nonequilib-
rium economics and the reader may be wondering how the study of such
computer-based worlds can qualify as economics, or what relationship this
might have to doing theory—it certainly doesn’t look very mathematical. My
answer is that theory does not consist of mathematics. Mathematics is a tech-
nique, a tool, albeit a sophisticated one. Theory is something different. Theory lies in the discovery, understanding, and explaining of phenomena present
in the world. Mathematics facilitates this—enormously—but then so does computation.
14. In the well-known El Farol problem (Arthur, 1994a) an ecology of ever-changing individual forecasts emerges, along with an overall equilibrium attractor state.
Metaphorically the individual trees change, but the shape of the forest persists.
Figure 1: Strategies in Lindgren’s computerized tournament. The horizontal axis denotes time; the vertical axis, the numbers using a particular strategy; the labels (01, 1001, 10010001, . . .) code for the memory-depth of strategies.
Naturally, there is a difference. Working with equations allows
us to follow an argument step by step and reveals conditions a solution must
adhere to, whereas computation does not.15 But computation—and this more
than compensates—allows us to see phenomena that equilibrium mathemat-
ics does not. It allows us to rerun results under different conditions, exploring when structures appear and don’t appear, isolating underlying mechanisms,
and simplifying again and again to extract the bones of a phenomenon.
Computation in other words is an aid to thought, and it joins earlier aids in
economics—algebra, calculus, statistics, topology, stochastic processes—each
of which was resisted in its time. The computer is an exploratory lab for eco-
nomics, and used skillfully, a powerful generator for theory.16
All this suggests a way forward for our nonequilibrium way of looking at
the economy. We can see the economy, or the parts of it that interest us, as the ever-changing outcome of agents’ strategies, forecasts, and behaviors. And
we can investigate these parts, and also classic problems within economics—
intergenerational transfers, asset pricing, international trade, financial
transactions, banking—by constructing models where responses are speci-
fied not just at equilibrium but in all circumstances. Sometimes our models
will be amenable to mathematical analysis, sometimes only to computation,
sometimes to both. What we can seek is not just equilibrium conditions, but
15. Note that we can always rewrite any algorithmic model in equation form (any computation by a Turing machine can be represented in equation form) so that, rigorously speaking, computation-based analysis is as mathematical as standard analysis.
See Epstein (2006).
16. For computation’s role in theorizing in mathematics, physics, biology, and earth-sciences, see Robertson (2003). See also Bailey (2010) and Chaitin (2006).
understandings of the formation of outcomes and their further unfolding, and of any dynamic phenomena that appear.
PHENOMENA AND THE MESO LEVEL
What dynamic phenomena then appear under nonequilibrium? And how do
these, and nonequilibrium, connect with complexity? I will take these two
questions in succession. To look at what patterns or structures might appear
in the economy under nonequilibrium, we can begin by looking at the differ-
ence the filter of equilibrium makes to the patterns we see. To fix ideas, consider a simple model of something slightly outside the economy: traffic flow.
A typical model would acknowledge that at close separation from cars in
front, cars lower their speed, and at wide separation they raise it. A given high density of traffic of N cars per mile would imply a certain average separation, and cars would slow or accelerate to a speed that corresponds. Trivially, an
equilibrium speed emerges, and if we were restricting solutions to equilibrium that is all we would see. But in practice at high density, a nonequilibrium phenomenon occurs. Some car may slow down—its driver may lose concentration
or get distracted—and this might cause cars behind to slow down. This imme-
diately compresses the flow, which causes further slowing of the cars behind.
The compression propagates backwards, traffic backs up, and a jam emerges.
In due course the jam clears. But notice three things. First, the phenomenon’s onset is spontaneous; each instance of it is unique in time of appearance, length
of propagation, and time of clearing. It is therefore not easily captured by
closed-form solutions, but best studied by probabilistic or statistical meth-
ods. Second, the phenomenon is temporal: it emerges or happens within time, and cannot appear if we insist on equilibrium.17 And third, the phenomenon
occurs neither at the micro-level (individual car level) nor at the macro-level (overall flow on the road) but at a level in between—the meso-level.
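A minimal car-following sketch makes this kind of model concrete and shows how such jams can be studied computationally. It is illustrative only: the speeds, gaps, and lapse probability below are invented for this example, not taken from any calibrated traffic model.

```python
import random

def simulate_traffic(n_cars=50, road=1000.0, steps=5000, dt=0.1, seed=None):
    """Cars on a circular road accelerate toward a target speed when the
    gap ahead is wide and brake hard when it is narrow; a rare random
    lapse of attention seeds the jams. All parameters are illustrative."""
    rng = random.Random(seed)
    pos = [road * i / n_cars for i in range(n_cars)]
    vel = [25.0] * n_cars                  # start near the equilibrium speed
    target, safe_gap = 30.0, 12.0
    jam_steps = 0
    for _ in range(steps):
        gap = [(pos[(i + 1) % n_cars] - pos[i]) % road for i in range(n_cars)]
        for i in range(n_cars):
            if gap[i] < safe_gap:          # too close: brake hard
                vel[i] = max(0.0, vel[i] - 5.0)
            else:                          # room ahead: speed up gently
                vel[i] = min(target, vel[i] + 1.0)
            if rng.random() < 0.001:       # momentary loss of concentration
                vel[i] *= 0.5
        pos = [(pos[i] + vel[i] * dt) % road for i in range(n_cars)]
        jam_steps += any(v == 0.0 for v in vel)
    # Crude jam indicator: time steps during which some car was fully stopped.
    return jam_steps
```

Rerunning this with different seeds and densities is exactly the kind of computational experiment described above: the equilibrium speed is trivial, but the jams, their onset times, and their durations vary from run to run and are best characterized statistically.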
What about the economy more generally? If we are willing to take away the
equilibrium filter, what phenomena might we see there and how will these
operate? I will mention three.
17. We could of course model this as a stationary stochastic process that includes jams, and artificially call this an “equilibrium” process. Some neoclassical models do this (e.g. Angeletos and La’O, 2011), which would seem to negate my claim that standard economics doesn’t handle nonequilibrium. But closer scrutiny shows that such nonequilibrium behavior is always contained within an overall equilibrium wrapper, typically within some overall stochastic process that remains stationary (and hence “in equilibrium”). Such models stretch the neoclassical paradigm by appearing to be “in equilibrium,” but at their core are nonequilibrium processes, so I include them as such under the argument here.
The first is self-reinforcing asset-price changes, or in the vernacular, bubbles and crashes. To see how these are generated consider the Santa Fe artificial
stock market (Palmer et al., 1994; Arthur et al., 1997). In this computer-based model the “investors” are artificially intelligent computer programs that, for
the reasons given earlier, cannot simply assume or deduce a given “rational”
forecasting model, but must individually discover expectations (forecasting
models) that work well. The investors randomly generate (or discover) their
own forecasting methods, try out promising ones, drop those that don’t work,
and periodically generate new ones to replace them. The stock price forms
from their bids and offers, and thus ultimately from agents’ forecasts. Our
market becomes an ecology of forecasting methods that either succeed or are
winnowed out, an ecology that perpetually changes as this happens.18 And we
see several phenomena, chief among them, spontaneous bubbles and crashes.
To see how these appear, we can extract a simple version of the mechanism from our experiment. Suppose some of our investors “discover” a class
of trading forecast that essentially says “If the price has risen in the last k periods, expect it to increase by x% next period.” Suppose also that some investors (they could even be the same investors) “discover” forecasts of the type: “If
the current price is more than y times fundamental earnings (or dividend) value, expect it to fall by z%.” The first forecasts cause bubble behavior: if the price rises for a while, investors will buy in, thus validating it, which may cause a further rise. Eventually this drives the price high enough to trigger the second type of forecast. Investors holding these sell and the price drops, which
switches off the upward forecasts, causing other investors to sell too, and a
crash ensues. The scale and duration of such disruptions vary and they happen
randomly in time, so they cannot be predicted. What can be predicted is that such phenomena will occur, and will have certain probability distributions of
size and scale.
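As a toy illustration of this two-forecast mechanism, far simpler than the actual Santa Fe market, consider the following sketch of my own. The threshold k, the multiple y, and the demand sizes and price impact are all illustrative assumptions.

```python
import random

def run_market(steps=2000, fundamental=10.0, k=5, y=3.0, seed=None):
    """Toy bubble/crash mechanism. Trend forecasts buy after k rising
    periods; value forecasts sell when price exceeds y times fundamental
    value. Net demand moves the price multiplicatively."""
    rng = random.Random(seed)
    prices = [fundamental] * (k + 1)
    for _ in range(steps):
        p = prices[-1]
        demand = rng.gauss(0.0, 0.02)          # background noise trading
        if all(prices[i] < prices[i + 1] for i in range(-k - 1, -1)):
            demand += 0.05                     # trend forecasts: buy
        if p > y * fundamental:
            demand -= 0.15                     # value forecasts: sell
        prices.append(max(0.01, p * (1.0 + demand)))
    return prices
```

Run after run, the price series drifts quietly for long stretches, then a chance run of rises triggers the trend forecasts, a bubble inflates, the value forecasts switch on, and the price collapses, at times and scales that differ from one seed to the next.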
A second temporal phenomenon is clustered volatility. This is the appearance of random periods of low activity followed by periods of high activity. In our artificial market these show up as periods of low and high price volatility.
Low volatility reigns when agents’ forecasts are working reasonably well mutu-
ally; then there is little incentive to change them or the results they produce.
High volatility happens when some agent or group of agents “discover” better
predictors. This perturbs the overall pattern, so that other investors have to change their predictors to readapt, causing further perturbation and further re-adaptation. (This pattern is clearly visible in Lindgren’s study, Figure 1.) The result is a period of intense readjustment or volatility. Such random periods
of low volatility alternating with high volatility show up in actual financial market data, where they are called GARCH behavior.
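One standard diagnostic for this signature, applied here to output from the toy market sketched above, is the autocorrelation of squared returns. This is a rough check, not a fitted GARCH model.

```python
def clustering(prices, lag=1):
    """Autocorrelation of squared returns: positive values mean large
    moves tend to follow large moves, the signature of clustered
    volatility (GARCH behavior)."""
    rets = [(b - a) / a for a, b in zip(prices, prices[1:])]
    sq = [r * r for r in rets]
    m = sum(sq) / len(sq)
    num = sum((sq[i] - m) * (sq[i + lag] - m) for i in range(len(sq) - lag))
    den = sum((s - m) ** 2 for s in sq)
    return num / den if den else 0.0

# e.g. clustering(run_market(seed=1)); positive values indicate clustering
```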
18. Cf. Soros’s (1987) observation that “stock markets are places where different propositions are tested.”
A third phenomenon, more to do with space than with time, we can call sudden percolation. When a transmissible change happens somewhere in a network, if the network is sparsely connected the change will sooner or later peter out for lack of onward connections. If the network is densely connected, the