By now most of you should be finished or most of the way through our first book, and we will be opening up the discussion portion of the book club. I am posting some discussion questions to get us started, but if there are any other salient points or questions you would like to raise, you are welcome to do so. As always, let’s keep the discussion fact-based, and cite your sources if needed.

1. Charles Perrow introduces us to concepts such as complex vs linear, tightly coupled vs loosely coupled, and ranks different systems based upon their complexity and coupling. Where would you list your organization, or for that matter, emergency management? Why?

2. Perrow waffles somewhat on the concept of risk homeostasis: the idea that we all have a natural risk level, and that new safety features entice us to push the system to higher limits. Sometimes he accepts it as a legitimate cause of behaviour, at other times he dismisses it, and at still others he attributes it more to the macro-scale of production (it is less the individual’s risk homeostasis than the industry’s homeostasis). Which do you think is the more correct formulation? Do you think we are always seeking to push the boundaries, that it’s more the production pressures from the system, or that neither plays a role and it is simply a natural consequence of the system at large?

3. Perrow in the last chapter takes a pretty dim view of what was, at the time, a new class of professionals: risk assessors. Arguably many regulatory agencies are in bed with the groups they oversee, but he takes it one step further, arguing that they exist to overrule the wisdom of the crowd. Expert knowledge, statistics, and mathematical probabilities exist, he feels, as a way to legitimize what would otherwise be unacceptable risks to the world, not as an objective measure of risk. Is he right? What does that mean for our profession (arguably we would fit into the class of risk assessors)?

4. This book was written back in the 1980s. The world has undoubtedly changed since then. What new technologies have arisen that you think fit in that class of risks that we should abandon, modify, or accept?

5. Are there any specific lessons, concepts, or ideas that you felt you could take away and use in your organization?

2 thoughts on “Book Club – Exploring Normal Accidents: Living with High-Risk Technologies”

  1. Well, it’s been a while (I gave it some extra time) and admittedly not too much conversation has been going on, so as a change of pace I’m going to leave my thoughts here:

    1. Charles Perrow introduces us to concepts such as complex vs linear, tightly coupled vs loosely coupled, and ranks different systems based upon their complexity and coupling. Where would you list your organization, or for that matter, emergency management? Why?

    I can’t speak for my current organization (I haven’t been there that long), but speaking from past experience, I worked at Pearson for an airline, which, as Perrow points out, is a pretty tightly coupled system. Every second of the day, security, infrastructure, ticketing, IT, aircraft, ATC, etc. all interfaced in a way that produced a near-seamless product. But with all these components and systems, our organization was rife with bizarre and unintentional outputs.

    I can recall the time that we lost power to all of the carousels and had to bring bags into the Arrivals hall through the security doors. Why? Maintenance was being done in the Terminal, and the construction workers had to shut off the emergency power lines while leaving the main ones on. It turns out, in a slight bit of inefficient efficiency, that the people who designed the Terminal placed the carousels only on the emergency line, probably figuring that if there was ever a situation where emergency power was lost too, there were bigger issues at hand. Well, a seemingly good idea made our night a lot worse, and thankfully there were only three flights left at that hour.

    As for emergency management as a whole: the planning aspect I’d say is rather loosely coupled (what we do is repeatedly tested, and there is often a lag between when we write a plan and when it is put into action), but the response portion can vary. Quick-onset events are tightly coupled by design, and as a response grows larger, it is all we can do to keep it from becoming incomprehensible. While it is different in consequence from a nuclear power plant’s incomprehensibility, there are many stories of crucial supplies sitting idle because of logistical bottlenecks or because the mobilization order was never given: errors that remain unrealized and unknowable until the whole machine kicks into gear.

    2. Perrow waffles somewhat on the concept of risk homeostasis: the idea that we all have a natural risk level, and that new safety features entice us to push the system to higher limits. Sometimes he accepts it as a legitimate cause of behaviour, at other times he dismisses it, and at still others he attributes it more to the macro-scale of production (it is less the individual’s risk homeostasis than the industry’s homeostasis). Which do you think is the more correct formulation? Do you think we are always seeking to push the boundaries, that it’s more the production pressures from the system, or that neither plays a role and it is simply a natural consequence of the system at large?

    My opinion is that it’s a bit of both. I think we all tend to push the boundaries consciously or unconsciously, but I would lean more towards system pressures as root causes. I think most people, when given the chance, will take the safer option, but the nature of the circumstances they’re placed under will dictate the risk as well.

    While anti-lock brakes and seatbelts do let me make stupider moves while I’m in my car, I am less likely to drive like a maniac if I know I have lots of time to make it to my destination, as opposed to when I’m running late. Similarly, in environments where time, especially the company’s time, is the main consideration, shortcuts and corner-cutting become the norm. So I’m more in agreement with Perrow.

    3. Perrow in the last chapter takes a pretty dim view of what was, at the time, a new class of professionals: risk assessors. Arguably many regulatory agencies are in bed with the groups they oversee, but he takes it one step further, arguing that they exist to overrule the wisdom of the crowd. Expert knowledge, statistics, and mathematical probabilities exist, he feels, as a way to legitimize what would otherwise be unacceptable risks to the world, not as an objective measure of risk. Is he right? What does that mean for our profession (arguably we would fit into the class of risk assessors)?

    I think this is very much a product of its time; we tend to have the opposite issue nowadays. Perrow was writing at the tail end of the ’70s and early ’80s, when ‘the expert’ carried a lot more weight and there could be a very easy collusion of expert knowledge and vested interests. However, by the 2010s I’d say the sacrosanct position of the expert has been pretty well eroded, and there is a stronger tendency to appeal to the ‘wisdom of the crowd’ than to ‘the expert’ in order to subvert the most prudent choice. After reading the book you’d be hard-pressed not to agree with Perrow, but I imagine he’s more jaded about the position of experts because the book is pretty much about the failure of experts.

    4. This book was written back in the 1980s. The world has undoubtedly changed since then. What new technologies have arisen that you think fit in that class of risks that we should abandon, modify, or accept?

    The internet, I think, is a technology we’re still struggling with. There really hasn’t been anything like it, and it has represented a dramatic shift in how every aspect of our lives is carried out. While there’s a huge set of technologies that fall under the internet, its ability to suddenly make all voices equal has done untold good and bad for the world. Net neutrality is one way we can modify it, and while on the one hand I appreciate the closer regulation that the end of net neutrality would bring to the internet, on the other it may give too much power to too few.

    5. Are there any specific lessons, concepts, or ideas that you felt you could take away and use in your organization?

    I think the idea of complex systems interacting in unintended ways is a useful concept to keep in the back of your mind. The notion that complexity will, by its nature, breed unintended and unknowable results helps temper some of the decisions we make and the planning we do. It’s rarely what we plan for that manifests, but rather the unexpected.

  2. For next quarter’s books, I’m making the following suggestions:

    The Checklist Manifesto
    Atul Gawande

    I heard one emergency manager call this one of the most important books he has read. We all make mistakes, but not all mistakes are made equally. Some are made because we didn’t know better, some are made even though we do. We’ve heard the stories: surgeons leaving scalpels inside bodies, pilots missing key checks and crashing the plane, a knob left on, a switch left flicked, small errors that have catastrophic import.

    This book is about the power of checklists: a small tool with powerful implications that makes sure we do practice what we do know.

    The Failure of Risk Management
    Douglas W. Hubbard

    This is a book that is going to stir the pot. I’ve read it before, and his main thesis is that a lot of the assessments we do are fundamentally broken. He rails against qualitative assessments that are elastic enough to be meaningless and quantitative methods based upon nothing. Hubbard has some very strong opinions on what constitutes good risk management.

    Thinking, Fast and Slow
    Daniel Kahneman

    I’m a big fan of these kinds of books because I feel a big aspect of our job as emergency managers is helping people make better decisions, and in order to do that, we need to really study how they make them. Based upon his lifetime of work, Kahneman goes into System 1 and System 2 thinking, heuristics, and the kinds of fallacies and shortcuts we fall into, which are sometimes useful but sometimes lead us into some pretty astounding errors.

    I’m open to suggestions until the 15th. If I don’t hear anything else by then, we’ll open a poll based upon the above.
