A few highlights from reading All models are wrong: reflections on becoming a systems scientist; John D. Sterman. This is the paper form of a lecture held in 2002 by John D. Sterman, upon receiving the Jay Forrester award.
What is System Dynamics? It can be seen through multiple lenses, as:
- Science: the modeling process involves forming hypotheses, building a model, running an experiment (simulating the system through that model), and potentially invalidating the model.
- Engineering: building the model itself, and the simulation that runs it, is engineering practice.
- Applied Mathematics: SD is grounded in Control Theory and Nonlinear Dynamics. Stocks & flows compile to differential equations which can be simulated on a computer.
- Social Science: modeling social systems, explaining social phenomena.
- Philosophy: it is a world view. Seeing feedbacks everywhere. A paradigm, Sterman says.
- Consulting: organizations want to achieve goals, stay competitive, etc.
- Theory of Action: models are used to inform policy changes. They offer theoretical grounding for intervention through policy.
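The "Applied Mathematics" lens above can be made concrete. A minimal sketch (mine, not from the paper): a single stock with an inflow and an outflow compiles to the differential equation dS/dt = inflow − outflow, which we can integrate numerically with Euler's method; all names and parameter values here are illustrative.

```python
# Minimal sketch (not from the paper): one stock, one inflow, one outflow.
# A stock-and-flow diagram compiles to dS/dt = inflow - outflow,
# integrated here with a simple Euler step.

def simulate(stock=0.0, inflow=10.0, drain_rate=0.1, dt=0.25, steps=200):
    history = [stock]
    for _ in range(steps):
        outflow = drain_rate * stock       # outflow proportional to the stock
        stock += (inflow - outflow) * dt   # Euler integration step
        history.append(stock)
    return history

trajectory = simulate()
# The stock rises toward the equilibrium where inflow == outflow,
# i.e. stock == inflow / drain_rate == 100.
```

In a real model you would use a purpose-built tool or an ODE solver, but the point stands: the diagram is just notation for a system of differential equations.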
Concepts
- Narrow Model Boundaries. “almost nothing is exogenous”; “invisible fences in the mind”, “infinite sources and infinite absorption capacity in sinks”, “a narrow boundary has cut critical feedback loops”
- I think this is one of the main points of the paper: awareness of model boundaries. This means being aware of what you leave out of the model, how much is assumed exogenous in the sense that it is neither produced by the system nor affected by it, how much is part of “the environment” in which the system exists, and how much is ignored.
- There’s an example in the paper that depicts model narrowness: the economics of petroleum resource exploitation. The first, narrow model presented takes an economics-only viewpoint which ignores the geological aspects of resource exploration & exploitation and treats resources as “exogenous”, meaning they have no beginning and no end and are not affected by the system’s operation.
- Such a model fails to incorporate critical feedback loops, which makes its predictions unreliable, and the policies (or actions) derived from that worldview, ineffective.
- Non-locality of cause & effect. In complex systems, cause and effect can be widely separated in time and space! Intuition serves us well for local cause and effect, but it is far less effective for grasping complex dynamics that span wider space/time intervals.
- Event-oriented worldviews.
- Shift in feedback loop dominance. I love this one because it makes you ask “what is the dominant influence over this variable?” and “how can this influence change over time?” Say we’re interested in a particular quantity, measured in a system governed by multiple feedback loops. Each loop exerts a certain strength of influence over that quantity. Over time, the dynamics can change: a reinforcing loop may be overtaken by a stronger negative (balancing) loop, causing the quantity to start falling.
- Policy Resistance. You have a certain understanding of a system, a given mental model in your head, and you picture an intervention to improve some outcome. Policy resistance happens when, despite you implementing that change, that new policy, that new structure, the system re-adapts over time and renders your solution counterproductive. “Today’s solutions become tomorrow’s problems”. Some quick examples:
- increasing taxes
- intention: accrue more tax revenue
- actual system response: tax revenue falls, as entities migrate to other jurisdictions.
- building more roads
- intention: more fluid traffic, fewer traffic jams
- actual system response: heavy traffic persists, since driving becomes more attractive, more people use cars, and the problem compounds
- Time Delayed feedback. TBD, there’s a whole chapter in Business Dynamics studying the effects of delays in systems with feedbacks.
Quotes
-
“One [student] once told me I was a model professor. I thought this was high praise until I realized that a model is a small imitation of the real thing.”
-
“People frequently talk about unexpected surprises and side effects as if they were a feature of reality.”
-
“Most people, regardless of their background, are comfortable with their current philosophy”.
-
“As system thinkers, we must always strive to break down the false barriers that divide us.”
-
“Today’s solutions become tomorrow’s problems. The result is Policy Resistance, the tendency for interventions to be defeated by the response of the system to the intervention itself. [..] At the root of this phenomenon lies the narrow, event-based, reductionist world view that most people live by”. (the remaining 2 paragraphs are beautiful, but I should stop quoting here.)
-
“We are unaware of the majority of the feedback effects of our actions. Instead, we see most of our experience as a kind of weather: something that happens to us but over which we have no control. Failure to recognize the feedbacks in which we are embedded, the way in which we shape the situation in which we find ourselves, leads to policy resistance in which we persistently react to the symptoms of difficulty, intervening in low-leverage points and triggering delayed and distant but powerful feedbacks.”
-
“School teaches us that every subject is different and knowledge is fragmented. The invisible lines in the mind are the boundaries of our mental models.”
-
“We are continuously pressured by our clients, our students, colleagues and our own egos to slip out of the role of the questioner and learner into the role of expert and teacher. Doing so often fails, by generating defensiveness and resistance. The phrase ‘getting client buy-in’ should be banned from our lexicon. Selling a product is antithetical to the process of inquiry.”
-
“There is no learning without feedback, without knowledge of the results of our actions.”
-
“Developing the capacity to see the world through multiple lenses and to respect differences cannot become an excuse for indecision, for a retreat to impotent scholasticism. We have to act. We must make the best decisions we can despite the inevitable limitations of our knowledge and models, then take personal responsibility for them. Mastering this tension is an exceptionally difficult discipline, but one essential for effective systems thinking and learning.”
-
“It’s by asking these ‘Why?’ questions that we gain insight into how we are both shaped by and shape the world, where we can act most effectively, where we can make a difference and what we are striving for.”
Psychology to succeed in modeling. A healthy mindset. Tips.
- Focusing on the process of modeling rather than results, or historical accuracy/replication.
- A diverse and vast array of tests: statistical tests, sensitivity/robustness to parameter changes, qualitative assessment of model structure, evaluating results at model limits, being inclusive in the learning phase as the model is built and information from stakeholders is integrated, etc.
- Include soft variables, for which data/measurements are missing or more qualitative in nature. Omitting them is stating that their influence is zero, which is false. Qualitative information is still information, knowledge about reality, and it may be crucial to incorporate into the model. If a variable hasn’t been measured yet, probably no one has acknowledged its importance; but critical variables are exposed through the process of inquiry and learning, through challenging and expanding mental models and current understanding.
Concepts to dive deeper into
-
Bifurcation: I finally realized why I had added bifurcation to the list. A few days ago, I read a paragraph in “The Complexity Paradox” which described 3 states a system can be in: equilibrium (dead; no gradients = no activity, no life), near-equilibrium (linearization: a system close to equilibrium can be approximated by linear behavior, think a transistor in small-signal operation), and far-from-equilibrium.
(p.54) A third condition is a system operating far from equilibrium. Here, behavior is no longer predictable because the system behaves in a nonlinear fashion. Unpredictability results from the capacity of the system to take on a number of surprising properties. Chaos, or small changes in one system variable that lead to unexpected changes in other system variables, cannot be modeled with simple mathematical equations. [..]
Far-from-equilibrium conditions mean the system is subject to a large forcing agency that moves the system beyond regions of stability. In many cases, these systems will self-organize into more stable states called dissipative structures. As the system moves farther from equilibrium and becomes increasingly more unstable, a bifurcation or fork in the road is encountered, and the system can “choose” which fork to take. These bifurcations lead to more complex stable states. The system acquires complexity by internal differentiation. Internal differentiation can be thought of as the establishment of novel interaction networks among system components. Accordingly, reductionist approaches to understanding complex systems have limited utility because the system reflects coupled interactions among components that can be understood only by studying the whole rather than individual components.
I’ll stop quoting here, otherwise I might keep on transcribing, the book’s really good!
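The bifurcation-as-a-fork idea can be played with numerically. A toy sketch of my own (not from the book), using the logistic map, a standard example of period doubling: as the forcing parameter r grows, the single stable fixed point splits into a 2-cycle, then a 4-cycle, and eventually chaos.

```python
# Toy illustration (mine, not from the book): the logistic map
# x -> r*x*(1-x). Increasing r plays the role of pushing the system
# farther from equilibrium; at each bifurcation the attractor forks.

def attractor(r, x=0.5, transient=1000, keep=64):
    # Iterate past the transient so the orbit settles on its attractor.
    for _ in range(transient):
        x = r * x * (1 - x)
    # Then collect the distinct values the orbit visits.
    seen = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return sorted(seen)

# r=2.8 -> one fixed point; r=3.2 -> a 2-cycle; r=3.5 -> a 4-cycle.
```

One line of code, and yet the attractor's structure changes qualitatively as a single parameter crosses each bifurcation point, which is the "fork in the road" the quote describes.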
-
Criticality: Self-organized criticality, Critical brain hypothesis. I first noticed this when I watched this Quanta Mag video - Could One Physics Theory Unlock the Mysteries of the Brain?.
-
Non-linear differential equations: List of nonlinear ordinary differential equations. I really have no clue here.
-
dissipative structures: See Ilya Prigogine.
Dissipative structure theory led to pioneering research in self-organizing systems, as well as philosophical inquiries into the formation of complexity in biological entities and the quest for a creative and irreversible role of time in the natural sciences.
-
soliton: Soliton (ok, this is really outside the article, but it’s a good ref)