Learning from Evidence

John Sterman, 2006, American Journal of Public Health, 96(3)

Why is it that when a politician announces a reduction in hospital waiting lists for surgery, the net effect is often to make the situation worse? In this article, John Sterman from MIT (who has written widely on systems thinking) explores why policies to promote health and welfare can end up failing, or even exacerbating the problems they are meant to solve.

This is where systems thinking comes in. Many of the social issues we face are themselves linked to earlier actions, decisions, or system effects. Because of the complexity of social systems, the real impacts of policy interventions might only be seen at a different time, or in a different place, from those intended.

Sterman argues that the things we label ‘side effects’ or ‘unanticipated events’ are in fact simply systems working in ways that we failed to foresee, because our mental models were too narrow or our time horizons were too short.

In complex settings it is unrealistic to expect that continued and diligent application of 'logical' interventions will ensure that 'targeted' policies and programs will be successful. If this were the case, then former Australian Prime Minister Bob Hawke's promise that 'no Australian child will be living in poverty' by 1990 would have been fulfilled.

Complexity, Sterman notes, is dynamic and evolving, so taking a snapshot of the various parts of a system will only ever show part of it.

What often happens is ‘policy resistance’, whereby the intended effects of a policy or program are delayed, distorted or even overturned by the responses of different parts of the system to those interventions.

Sterman cites the example of how the introduction of low-tar, low-nicotine cigarettes actually increased smokers' intake of toxins, because smokers tended to take longer, more frequent drags on their cigarettes to compensate. Antibiotic overuse and the subsequent rise of bacterial resistance is another well-known example.

Like other systems thinkers, Sterman cautions against leaving policy solutions to the experts. People are reluctant to make fundamental changes to their beliefs and behaviours simply because they are told to by experts. The profound changes needed to tackle poverty, homelessness, or global warming, for example, require 'complementary changes in education, incentives and institutions', according to Sterman. Only by educating and involving the full range of stakeholders (including the public at large) can they 'learn' from the evidence presented to them, and hence support more considered interpretations. Contrast the educational initiatives and policies used in Australia to combat HIV/AIDS in the 1980s and 1990s with the current policy approaches and (lack of) education around the treatment of asylum seekers.

Typically, leaders and decision makers don't understand the range of feedback loops surrounding their decisions, which frequently target the kinds of low-leverage points that Donella Meadows tells us are typically ineffective. Even when strong evidence is available, our mental models lead to 'erroneous but self-confirming inferences', Sterman says. Often the response is to do more of the same, rather than take a step back, consult with stakeholders and try to understand the system better.

Sterman also discusses the need for 'double-loop learning' in addressing problems. This idea, first introduced by Chris Argyris, involves not simply acting to correct problems (this is single-loop learning) but taking a step back and looking at the underlying systems, assumptions and mental models that led to the problem in the first place. In doing so it is important to become aware of our own biases and 'defensive routines' – those routine interpersonal behaviours we use to save face, make our untested beliefs seem like facts, and suppress dissent.

All of us fall back on them from time to time, in spite of our best intentions!

Sterman concludes the article by recommending we become more familiar with modelling and simulation tools, and what he calls 'virtual worlds', to augment the evidence we collect from the real world. If we couple a more rigorous and reflective application of the scientific method with collaborative enquiry skills and an openness to multiple perspectives, he is optimistic we will be more likely to find success in leadership for social change.
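A 'virtual world' of the kind Sterman recommends can be as simple as a few lines of code. As an illustration only – the model structure and every parameter below are invented for this sketch, not taken from the article – here is a toy stock-and-flow simulation of the waiting-list example from the opening paragraph, written in Python. A capacity boost cuts the list at first, but falling waits induce latent demand (referrals that were previously managed conservatively), which erodes most of the gain:

```python
# Toy stock-and-flow model of 'policy resistance'. All structure and
# parameters are invented for illustration; this is not Sterman's model.

def simulate(weeks=104, boost_week=26, boost=0.0):
    """Simulate a hospital waiting list (number of patients), week by week."""
    waiting = 1000.0        # stock: patients currently on the list
    base_referrals = 100.0  # inflow: new referrals per week
    capacity = 100.0        # outflow: surgeries performed per week
    history = []
    for week in range(weeks):
        if week == boost_week:
            capacity += boost  # the policy intervention: extra surgeries/week
        # Feedback loop: as the wait shrinks, GPs refer patients they
        # previously managed conservatively, so referrals creep back up.
        latent_demand = max(0.0, 0.15 * (1000.0 - waiting))
        referrals = base_referrals + latent_demand
        surgeries = min(capacity, waiting)
        waiting += referrals - surgeries
        history.append(waiting)
    return history

no_policy = simulate(boost=0.0)    # list holds steady at 1000
with_policy = simulate(boost=20.0) # list falls, then stalls well above zero
```

Without the feedback loop, 20 extra surgeries a week for 78 weeks would clear more than the whole list; with it, the list stalls at roughly 870 patients. The point of such toy models is not prediction but the one Sterman makes: they let decision makers experience the counter-intuitive behaviour of feedback systems before acting on the real one.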
