Complex socio-ecological systems/Complex Adaptive Systems

The concept of social-ecological resilience from week 1 includes adaptation, learning and self-organization. This week's reading is an entrée to fundamental system concepts and sheds light on the concept of self-organization. Next week we will look at how Complex Adaptive Systems adapt and learn.

Although it is a difficult concept to grasp, self-organization was presented in Folke's resilience paper (Week 1) as a contrast to both lack of organization and organization forced by external factors. Lack of organization = chaos, and organization forced by external factors may be akin to order that cannot adapt. In this sense, self-organization occupies an intermediate place between order and chaos. In Lansing's Figure 2, he labels this intermediate place between chaotic and ordered (which can be periodic or fixed) as "complex." This place has been popularly called "the edge of chaos," but perhaps more literally could be called "an intermediate state between totally ordered and chaotic."

It may be worth noting that the "edge of chaos" concept comes from physical science, specifically the "phase transition." For example, if you gradually cool liquid water, the molecules slow down but maintain their chaotic motion. As the water nears the freezing point and cooling continues, there is a sudden change: all of the molecules rapidly switch from chaotic (liquid) to ordered (frozen). This phase transition is non-linear; it is a tipping point, a sudden change from chaotic to ordered.

The key assertion is that complex adaptive systems populate this intermediate area between order and chaos, or perhaps (more specifically, but hypothetically) they are in the ordered state but close to the edge or boundary of this phase transition. Being in this state, the argument goes, is what allows systems to self-organize, adapt and evolve ever-increasing complexity. A rough illustration of the ordered/complex/chaotic distinction appears in the sketch below.
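
The sketch is my own toy, not Lansing's model or his Figure 2; it uses elementary cellular automata of the kind discussed in the complexity literature Lansing surveys. Rule 250 quickly locks into a rigid pattern (ordered), rule 30 looks essentially random (chaotic), and rule 110 produces long-lived interacting structures of the sort usually labeled "complex."

```python
# Elementary cellular automaton demo: a toy illustration of the
# ordered / chaotic / complex distinction, not taken from the readings.

def step(cells, rule):
    """One synchronous update of an elementary cellular automaton."""
    n = len(cells)
    return [
        (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(rule, width=64, steps=32):
    cells = [0] * width
    cells[width // 2] = 1                      # start from a single "on" cell
    print(f"rule {rule}")
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)
    print()

for rule in (250, 30, 110):                    # ordered, chaotic, complex
    run(rule)
```

Comparing the three printouts gives a concrete feel for what "an intermediate state between totally ordered and chaotic" means.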

Required readings:

Lansing, J. S. 2003. Complex Adaptive Systems. Annual Review of Anthropology 32: 183-204.

Lansing starts with a question (from Holland, the pioneer in the field of artificial intelligence) that is the opposite of the Beer Game: why is a supply chain (in this case, for pickled herring) not chaotic?

The first 10 pages of Lansing are a tough read. Nevertheless, it is about as brief and coherent an explanation of the emerging thinking about complex adaptive systems as you will find. To ease your burden, I offer two points. (1) It is true. Well, not "true" in the positivist sense. But I've made a hobby of surveying this literature for the past year (e.g. the Kauffman book described below) and I can tell you there is a deep and very wide-ranging foundation underlying the overview that Lansing offers. (2) You don't need to understand this to enjoy the remaining reading, which is much easier, more familiar, and more applied. But I think it's good to get a flavor of the underlying science, so that when you read that "Both E. coli and IBM coevolve in their respective worlds," you'll have a sense of where that's coming from.

Levin, Simon A. 1998. Ecosystems and the Biosphere as Complex Adaptive Systems. Ecosystems 1: 431-436.

Simon Levin is a famous mathematical ecologist. His article starts with the fundamentals of Complex Adaptive Systems and is probably more understandable than Lansing's -- so you could go back and forth between Lansing's pages 184-192 and Levin's pages 432-433.

By focusing on ecosystems and the loss of biodiversity, the author introduces the concept of the Complex Adaptive System (CAS) as applied to ecosystems, and describes the processes that link elements of the ecosystem, and elements beyond it, which partially determine its future state. Levin builds on Arthur’s work in economics (1994) and on Holland (1995), and proposes a simpler framework for looking at CAS, with these traits: sustained diversity and individuality of components; localized interactions among those components that have implications at higher scales (hierarchical organization within the CAS); an autonomous process that selects from among those components based on the results of local interactions (selection); and a subset of components singled out for replication or enhancement (learning). In the process of adaptation, a CAS acquires hierarchical and aggregation properties that emerge from the structural linkages and feedbacks among elements, and between the system and its environment. A CAS cannot follow completely random trajectories throughout its history; rather, it is constrained by history (resilience interpreted as resistance to change). At the same time, this history provides windows of opportunity for novel configurations. As environmental conditions change, they influence the surrounding matrix in which a CAS is embedded: the extent to which the CAS will retain its nature will depend on the environmental regimes (e.g. disturbance frequency and intensity), as well as on the functional diversity of the system. At the end of the paper, Levin enunciates six questions that are foundational to understanding ecosystems as CAS and that are basic to understanding ecosystem sustainability issues. Some of these have been partially addressed by evolutionary ecologists in the following years (e.g. patterns in the distribution of organisms are determined by local conditions as well as historical and spatial factors). For him, human interventions are not an integral component of ecosystem resilience; rather, he considers those factors exogenous. Claudia 14:39, 27 January 2011 (UTC)
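
As an aside (not from Levin's paper), the four ingredients in that framework can be made concrete with a toy agent-based sketch. Everything in the code below is invented for illustration: agents on a ring carry a numeric trait (diversity and individuality), fitness depends only on immediate neighbours (localized interactions), the locally fitter of a neighbouring pair persists (selection), and it replicates with mutation into the other's place (replication/enhancement), which keeps regenerating diversity rather than erasing it.

```python
import random

# Toy sketch of four CAS ingredients (diversity, local interaction,
# selection, replication with variation); all details are made up.

random.seed(0)
N = 100
traits = [random.random() for _ in range(N)]         # diverse individuals

def local_fitness(i):
    # made-up payoff: being slightly above the average of one's neighbours
    left, right = traits[(i - 1) % N], traits[(i + 1) % N]
    return 1.0 - abs(traits[i] - (left + right) / 2 - 0.1)

for _ in range(5000):
    i = random.randrange(N)
    j = (i + random.choice((-1, 1))) % N             # pick a neighbour
    winner, loser = (i, j) if local_fitness(i) >= local_fitness(j) else (j, i)
    mutated = traits[winner] + random.gauss(0, 0.05)
    traits[loser] = min(1.0, max(0.0, mutated))      # replicate with mutation

print("trait range after selection:", round(max(traits) - min(traits), 3))
```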


Additional reading:

Kauffman, Stuart. 1996. At Home in the Universe: The Search for Laws of Self-Organization and Complexity. Oxford University Press, 321 pp

This is a less mathematical version of Kauffman's 1993 "The Origins of Order: Self-Organization and Selection in Evolution." Kauffman believes that selection is insufficient to explain the degree of development of complex adaptive systems, and that there is some innate tendency for such systems to "self-organize". He looks at systems such as the development of life from a set of ever more diverse organic chemicals; the ontogeny of complex organisms from a single, uniform genome; and the co-evolution of ecological communities. The analysis also applies to technology and economic systems.

Much of his analysis focuses on how such systems explore a "fitness landscape" (applying the concept originally developed for biological evolution by Sewall Wright in the 1930s). He shows that such fitness landscapes are exceedingly large (2^n configurations, where n is the number of possible "alleles" or different "bits" of information, which quickly becomes larger than the number of atoms in the universe). His basic argument is that the "genetic algorithm" based on mutation and selection is not sufficiently powerful to explore this entire fitness landscape, and that there must be other processes leading to complex systems that sit at or near global optima -- self-organization, which he calls "order for free" because it is achieved more quickly than simple selection would permit.
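
To get a feel for the numbers, and for why mutate-and-select alone tends to stall, here is a rough, hedged sketch (a made-up coupled bit-string landscape, not Kauffman's NK model): it prints how large 2^n is for a modest genome, then runs a one-bit-flip hill climber that keeps only improving mutations and eventually settles on a local optimum, with no guarantee of being anywhere near the global one.

```python
import math
import random

# Rough sketch of the combinatorics and of why mutate-and-select stalls.
# The landscape is a made-up coupled bit-string, not Kauffman's NK model.

n = 300                                              # bits in the "genome"
print(f"landscape size 2^{n} is roughly 10^{n * math.log10(2):.0f}")
print("(for comparison, roughly 10^80 atoms in the observable universe)")

random.seed(1)
weights = [(random.random(), random.random()) for _ in range(n)]

def fitness(genome):
    # each bit's payoff depends on whether it matches its right-hand
    # neighbour, so flipping one bit can help here while hurting there
    return sum(w0 if genome[i] == genome[(i + 1) % n] else w1
               for i, (w0, w1) in enumerate(weights))

genome = [random.randint(0, 1) for _ in range(n)]
for _ in range(20000):                               # one-bit-flip hill climbing
    i = random.randrange(n)
    trial = genome[:]
    trial[i] ^= 1
    if fitness(trial) > fitness(genome):             # keep only improvements
        genome = trial

print("hill climber settled at fitness", round(fitness(genome), 2))
```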

Kauffman does not provide the mechanism; he argues that it probably exists and that finding it will be the next great breakthrough in our understanding of nature. He does, however, analyze how several such systems seem to find this "order for free" by working at the "edge of chaos", i.e. systems that are neither chaotic (which would never converge on an optimum in the fitness landscape) nor extremely ordered (which would be frozen at a local optimum and unable to navigate the fitness landscape in search of a near-global optimum). For example, figure 3.4 on page 57 shows that connected webs undergo a "phase transition" when the ratio of links to nodes reaches 0.5: with fewer connections, the system consists of many small, unconnected parts; with more connections, the system gets locked into one large cluster; and there is a non-linear transition at a link:node ratio of 0.5, which is just when one giant interconnected component of the system crystallizes.
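
That threshold is easy to see in a small simulation (in the spirit of the connected-webs figure, though the code below is my own toy, not a reproduction of Kauffman's): links are placed at random among n nodes, and the size of the largest connected cluster is reported as the links-to-nodes ratio is pushed past 0.5.

```python
import random

# Random-web simulation: largest connected cluster vs. links/nodes ratio.
# A toy in the spirit of Kauffman's connected-webs figure, details assumed.

def largest_cluster(n, n_links, rng):
    parent = list(range(n))

    def find(x):                                     # union-find root lookup
        while parent[x] != x:
            parent[x] = parent[parent[x]]            # path halving
            x = parent[x]
        return x

    for _ in range(n_links):
        a, b = rng.randrange(n), rng.randrange(n)
        parent[find(a)] = find(b)                    # merge the two clusters

    sizes = {}
    for x in range(n):
        root = find(x)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values())

n, rng = 10000, random.Random(3)
for ratio in (0.1, 0.3, 0.45, 0.5, 0.55, 0.7, 1.0):
    size = largest_cluster(n, int(ratio * n), rng)
    print(f"links/nodes = {ratio:4.2f}  ->  largest cluster = {size:5d} of {n}")
```

Just below the threshold the largest cluster stays tiny relative to n; just above it, a cluster containing a large fraction of the nodes appears almost at once.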

Kauffman offers a model of how large, constrained systems can effectively explore the fitness landscape, by breaking it down into "patches" of interconnected components (pp 252-271). Such a model could serve as a representation of federalism, profit centers, restructuring, or checks and balances.
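
Again purely as an illustration (details assumed, not Kauffman's actual procedure), the sketch below splits a toy coupled bit-string into fixed patches and lets each patch accept any bit flip that improves that patch's own share of the fitness, even when the flip temporarily hurts a neighbouring patch; that patch-local acceptance rule is the essence of the idea.

```python
import random

# Very loose, self-contained sketch of the "patches" idea; the landscape
# and all parameters are assumed for illustration, not taken from Kauffman.

random.seed(2)
n, PATCH = 300, 30                                   # 10 patches of 30 bits
weights = [(random.random(), random.random()) for _ in range(n)]

def contribution(genome, i):
    # bit i's payoff depends on whether it matches its right-hand neighbour
    w0, w1 = weights[i]
    return w0 if genome[i] == genome[(i + 1) % n] else w1

def patch_fitness(genome, p):
    return sum(contribution(genome, i)
               for i in range(p * PATCH, (p + 1) * PATCH))

genome = [random.randint(0, 1) for _ in range(n)]
for _ in range(20000):
    i = random.randrange(n)
    trial = genome[:]
    trial[i] ^= 1                                    # propose a one-bit flip
    if patch_fitness(trial, i // PATCH) > patch_fitness(genome, i // PATCH):
        genome = trial                               # patch-local acceptance

total = sum(contribution(genome, i) for i in range(n))
print("patch-wise search ended at total fitness", round(total, 2))
```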

Mitchell, M. 2009. Complexity: A Guided Tour. New York: Oxford University Press.
Mitchell addresses what is meant by a complex system: “a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution.” She recounts the history of many important scientific discoveries, from information theory to quantum phenomena to dynamics. The author explains non-intuitive phenomena (Maxwell's demon) and more intuitive ones (entropy, information theory, and computation) with equal clarity. Like many in the field, the author struggles with adaptive systems and with the fact that information theory often does not have direct application to biological and social systems. She notes that no single quantitative measure of complexity has surfaced, which complicates its study. While innovations in Turing machines and cellular automata can result in sophisticated patterns of activity, Mitchell cautions that the necessary starting conditions and computational time are limiting factors in the application of these ideas to the real world. Tegeticula