Human error is cited over and over as a cause of incidents and accidents. The result is a widespread perception of a 'human error problem', and solutions are thought to lie in changing the people or their role in the system: reduce the human role with more automation, or regiment human behavior through stricter monitoring, rules, or procedures. In practice, things have proved not to be this simple. The label 'human error' is prejudicial and hides much more than it reveals about how a system functions or malfunctions.

This book takes you behind the human error label. Divided into five parts, it begins by summarising the most significant research results. Part 2 explores how systems thinking has radically changed our understanding of how accidents occur. Part 3 explains the role of cognitive system factors - bringing knowledge to bear, changing mindset as situations and priorities change, and managing goal conflicts - in operating safely at the sharp end of systems. Part 4 studies how the clumsy use of computer technology can increase the potential for erroneous actions and assessments in many different fields of practice. Part 5 shows how hindsight bias always enters into attributions of error: what we label 'human error' is in fact the result of a social and psychological judgment process by which stakeholders in the system single out one facet of a set of interacting contributors.

If you think you have a human error problem, recognize that the label itself is no explanation and no guide to countermeasures. The potential for constructive change, for progress on safety, lies behind the human error label.
'This book, by some of the leading error researchers, is essential reading for everyone concerned with the nature of human error. For scholars, Woods et al. provide a critical perspective on the meaning of error. For organizations, they provide a roadmap for reducing vulnerability to error. For workers, they explain the daily tradeoffs and pressures that must be juggled. For technology developers, the book offers important warnings and guidance. Masterfully written, carefully reasoned, and compellingly presented.' Gary Klein, Chairman and Chief Scientist of Klein Associates, USA

'This book is a long-awaited update of a hard-to-get work originally published in 1994. Written by some of the world's leading practitioners, it elegantly summarises the main work in this field over the last 30 years, and clearly and patiently illustrates the practical advantages of going "behind human error". Understanding human error as an effect of often deep, systemic vulnerabilities, rather than as a cause of failure, is a necessary step forward from the oversimplified views that continue to hinder real progress in safety management.' Erik Hollnagel, MINES ParisTech, France

'If you welcome the chance to re-evaluate some of your most cherished beliefs, if you enjoy having to view long-established ideas from an unfamiliar perspective, then you will be provoked, stimulated and informed by this book. Many of the ideas expressed here have been aired before in relative isolation, but linking them together in this multi-authored book gives them added power and coherence.' James Reason, Professor Emeritus, University of Manchester, UK

'This updated and substantially expanded book draws together modern scientific understanding of mishaps too often simplistically viewed as caused by "human error". It helps us understand the actions of human operators at the "sharp end" and puts those actions appropriately in the overall system context of task, social, organizational, and e
Contents: Preface; Part I An Introduction to the Second Story: The problem with 'human error'; Basic premises. Part II Complex Systems Failure: Linear and latent failure models; Complexity, control and sociological models; Resilience engineering. Part III Operating at the Sharp End: Bringing knowledge to bear in context; Mindset; Goal conflicts. Part IV How Design can Induce Error: Clumsy use of technology; How computer-based artifacts shape cognition and collaboration; Mode error in supervisory control; How practitioners adapt to clumsy technology. Part V Reactions to Failure: Hindsight bias; Error as information; Balancing accountability and learning; Summing up: how to go behind the label 'human error'; References; Index.