Systems Thinking Part I: What is the “Systems Thinking Approach?”

Understanding the leading theory of accident causation to improve safety context and risk management work.

 

If you’ve been to a risk management conference lately or dabbled in your own professional development in this area within the past few years, you’ve likely encountered this term: systems thinking, or the systems thinking approach.

But, what is it?

Perhaps the questions many of us have after these types of conference sessions are, “What’s the difference between the various safety science theories?” and “How does this knowledge change my work, exactly?”

In part one of this article I’ll describe the leading theoretical framework in safety science, officially called the systems thinking approach to accident causation, and Rasmussen’s (1997) Risk Management Framework.

In part two, I’ll outline two risk management methods that are gaining more traction among outdoor program professionals, and that I’ve used the most in my work as a program director and a risk management director.

Feel free to skip around or skip ahead to Part II: Using Theory to Drive More Effective Risk Management Work. The goal of this two-part article is that you walk away with a new or deeper understanding of why you should subscribe to a contemporary, leading theoretical framework, and of how that knowledge can make your risk management work more informed and effective.

 

What is the systems thinking approach?

Systems thinking is a school of thought. When applied to risk and safety management, it’s the leading theory that describes how accidents occur (Salmon et al., 2020).

By understanding how incidents happen, we can construct the models and methods that help us to analyze and learn from them. The theory directs us on how to make the system safer.

As a school of thought, systems thinking helps describe the behavior of a system as a whole, not just the interactions of its component parts.

Systems thinking contrasts with linear or reductionist thinking, which focuses on isolating and understanding individual parts (think: multiple contributing factors leading to an incident versus a single root cause).

In other words, the system itself becomes the unit of analysis, rather than just the malfunction of a piece of equipment or the decision of a field instructor.

 

“A system is an interconnected set of elements that is coherently
organized in a way that achieves something.”

– DONELLA MEADOWS, THINKING IN SYSTEMS (2008, P. 8)

 

There is a long history of systems thinkers, extending back to the late 1800s in the West and across millennia in the East, and systems thinking is prevalent in numerous Indigenous knowledge systems.

Donella Meadows was a prominent systems thinker and a leading figure in system dynamics, a problem-solving approach that identifies leverage points: the places in a system where focused improvements will do the most to change how the whole system behaves. Her seminal book Thinking in Systems: A Primer (2008) describes the fundamental concepts of systems thinking, such as how systems operate and how their behavior unfolds over time. I won’t get into it too much here, but I do encourage readers to explore Meadows’ work for an in-depth explanation of systems thinking outside the risk management context.

 

Safety Science Theory: Key concepts from an evolution of thinking

Jens Rasmussen (1997) provides a theory and framework for accident causation.

Rasmussen drew on the work of many theorists. His framework builds on Perrow’s (1984) Normal Accident Theory, which holds that, because of normal and inherent system characteristics, failures and deviations from prescribed work are normal and expected.

This means that deviations from prescribed practices, like a written policy or procedure, are normal. Expanding on this, incidents in safety-critical environments (like outdoor programs) are inherent and should be anticipated.

This likely comes as no surprise to outdoor programming folks who know the inherent dangers of recreating and working outdoors.

Rasmussen’s framework also builds on Reason’s (1990) Swiss Cheese model. Reason’s work describes how accident conditions are latent in an organization and are on a trajectory to influence safety outcomes at the front line, or sharp end, of the system.

In other words, the holes in the Swiss cheese’s layers of defense, like gaps in policy, staffing, equipment, and training, align. This alignment creates the context for accident conditions in the field.

Rasmussen’s Risk Management Framework (RMF; 1997) builds on these and other theorists’ work to outline the hierarchical work system (shown below), where external and internal pressures influence safety context.

Safety context comes from the level above and is further influenced in a cyclical loop by feedback from the level below. This process is called vertical integration:

 

Rasmussen’s Hierarchical Work System Applied
to Outdoor Programs

Each level of the work system creates the risk and safety context for the level below. In turn, each level below gives feedback to the level above, which further influences safety context. Adapted from Rasmussen, 1997.
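To make the loop concrete for readers who think in code, here is a minimal sketch in Python. It is an illustration only, not part of Rasmussen’s framework: the level names and the context and feedback strings are hypothetical, chosen simply to show how safety context flows down the hierarchy while feedback flows back up.

```python
# A minimal sketch of vertical integration in a hierarchical work system.
# Level names and the context/feedback strings are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Level:
    name: str
    context_received: list = field(default_factory=list)   # set by the level above
    feedback_received: list = field(default_factory=list)  # returned by the level below


# Top-to-bottom ordering of a hypothetical outdoor-program work system.
work_system = [
    Level("Government & regulators"),
    Level("Organization leadership & board"),
    Level("Program administration"),
    Level("Field staff"),
    Level("Activity & participants"),
]

# Safety context flows down: each level shapes the context for the level below.
for above, below in zip(work_system, work_system[1:]):
    below.context_received.append(f"context set by {above.name}")

# Feedback flows up: each level informs the level above, closing the loop.
for above, below in zip(work_system, work_system[1:]):
    above.feedback_received.append(f"feedback from {below.name}")

for level in work_system:
    print(level.name)
    print("  context:", level.context_received)
    print("  feedback:", level.feedback_received)
```

The point is the shape of the loop, not the code: no level manages risk in isolation, because every level both sets context for the one below and receives feedback from it.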

 
 

Migration of Work Practices

The internal and external factors that influence safety context are normal economic and workload pressures (think Perrow & normal work). As a result, work practices within an organization migrate over time; they drift (Dekker, 2011).

Migration of work practices occurs at multiple systems levels: among the staff leading groups in the field, the program administrators who feel the season’s crunch to get groups fielded, and the organization’s business leaders who drive sales and partnerships.

Migration occurs across many hierarchical levels and causes the system’s safety defenses to erode (think Reason’s Swiss cheese model). Work practices cross invisible safety boundaries, and it is at this crossing that safety incidents occur.

Safety incidents, then, are caused by a combination of migration and triggering events.

Dekker’s Drift Into Failure model (DIF; 2011) outlines several key drivers of migrating work practices:

a) competition and scarcity,
b) decrementalism (where constant, small changes accumulate into unsafe behaviors),
c) the initial conditions of the system, whose influence can create unsafe behaviors,
d) unruly technologies that behave in unanticipated ways (CrowdStrike, anyone?!), and
e) protective structures that compete for resources in ways that negatively affect safety behavior.

Leveson (2004) adds that, in addition to engineered systems and direct intervention, behavior is controlled by policies, shared values, and other aspects of organizational and social culture. These factors are useful for understanding safety context, and, when measured, they help identify indicators of migration.

The drivers of migration tell us where to look in our assessments and investigations. They also shine a light on the levers leaders can pull to improve safety context in their organizations.

 

Rasmussen’s Migration Model

Migration of work practices is influenced by economic and workload pressures, causing defenses to erode. Incidents happen when work practices cross an invisible safety boundary and are initiated by a triggering event. Adapted from Rasmussen, 1997.
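As a rough way to picture the migration model, here is a toy simulation in Python. Every number and name in it is invented for illustration and is not drawn from Rasmussen or Dekker: normal workload and economic pressure nudge work practices a little each week, and an incident requires both drifting past the (invisible) safety boundary and a triggering event.

```python
# Toy illustration of migration toward an invisible safety boundary.
# All values are made up; the point is the pattern, not the numbers.
import random

random.seed(7)

safety_boundary = 1.0      # position of the invisible safety boundary
work_practice = 0.3        # where work practices sit today
drift_per_week = 0.02      # small, normal drift from workload and economic pressure
trigger_probability = 0.1  # chance of a triggering event in any given week

for week in range(1, 53):
    # Small, individually reasonable adjustments accumulate over time.
    work_practice += drift_per_week * random.uniform(0.5, 1.5)
    boundary_crossed = work_practice >= safety_boundary
    triggered = random.random() < trigger_probability
    if boundary_crossed and triggered:
        print(f"Week {week}: practices drifted past the boundary and a trigger occurred -> incident")
        break
else:
    print("No incident this year, but practices still drifted to", round(work_practice, 2))
```

Two things the toy makes visible: the drift is made of small, individually reasonable steps, and crossing the boundary is silent until a triggering event arrives.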

 
 

3 Principles of Contemporary Safety Science

According to Goode et al. (2019), these contemporary and popular frameworks in safety science all have the following in common:

  • Multiple contributing factors span multiple hierarchical systems levels, from the teacher or field instructor, equipment, and environment at the scene of an incident all the way up to, and including, the government and the regulations governing the program and activity;

  • All actors across all systems levels share responsibility for safety outcomes and incident prevention, from the folks in the field to members of the board;

  • Incident analysis should be free from blame and examine the whole system, rather than the individuals who work in it;

    • Safety improvements and incident prevention strategies should prioritize optimizing the interactions between the system’s component parts, rather than focusing on individual components in isolation.

 

References

Dallat, C., Salmon, P. M., & Goode, N. (2017). The networked hazard analysis and risk management system (NET-HARMS). Theoretical Issues in Ergonomics Science, 19(4), 456–482. https://doi.org/10.1080/1463922x.2017.1381197

Dekker, S. (2011). Drift into failure: From hunting broken components to understanding complex systems. CRC Press. https://doi.org/10.1201/9781315257396

Goode, N., Salmon, P. M., Lenné, M., & Finch, C. (2019). Translating systems thinking into practice: A guide to developing incident reporting systems. CRC Press. https://doi.org/10.1201/9781315569956

Leveson, N. G. (2004). A new accident model for engineering safer systems. Safety Science, 42(4), 237–270. https://doi.org/10.1016/s0925-7535(03)00047-x

Meadows, D. H. (2008). Thinking in systems: A primer (D. Wright, Ed.). Chelsea Green Publishing.

Perrow, C. (1984). Normal accidents: Living with high risk technologies. Basic Books.

Rasmussen, J. (1997). Risk management in a dynamic society: A modeling problem. Safety Science, 27(2–3), 183–213. https://doi.org/10.1016/s0925-7535(97)00052-0

Reason, J. (1990). Human error. Cambridge University Press.

Salmon, P. M., Hulme, A., Walker, G. H., Waterson, P., Berber, E., & Stanton, N. A. (2020). The big picture on accident causation: A review, synthesis and meta-analysis of AcciMap studies. Safety Science, 126, 1–15. https://doi.org/10.1016/j.ssci.2020.104650

Salmon, P., Williamson, A., Lenné, M., Mitsopoulos-Rubens, E., & Rudin-Brown, C. (2010). Systems-based accident analysis in the led outdoor activity domain: Application and evaluation of a risk management framework. Ergonomics, 53(8), 927–939. https://doi.org/10.1080/00140139.2010.489966

 

Wondering how the theory can improve your work?

Continue to Part II: Using theory to drive more effective risk management work, with 5 lessons from contemporary safety science that program directors, school administrators, and risk managers can use to improve their work.

Stuart Slay

Stuart Slay is an independent risk management advisor and safety educator working with schools and outdoor activity programs. He is based in Taipei, Taiwan, and Seattle, Washington.
