Human error is cited over and over as a cause of incidents and accidents. The result is a widespread perception of a 'human error problem', and solutions are thought to lie in changing the people or their role in the system: reduce the human role through more automation, or regiment human behavior through stricter monitoring, rules or procedures. But in practice, things have proved not to be this simple. The label 'human error' is prejudicial and hides much more than it reveals about how a system functions or malfunctions.

This book takes you behind the human error label. Divided into five parts, it begins by summarising the most significant research results. Part 2 explores how systems thinking has radically changed our understanding of how accidents occur. Part 3 explains the role of cognitive system factors - bringing knowledge to bear, changing mindset as situations and priorities change, and managing goal conflicts - in operating safely at the sharp end of systems. Part 4 studies how the clumsy use of computer technology can increase the potential for erroneous actions and assessments in many different fields of practice. And Part 5 shows how hindsight bias always enters into attributions of error, so that what we label 'human error' is actually the outcome of a social and psychological judgment process by which stakeholders in the system focus on only one facet of a set of interacting contributors.

If you think you have a human error problem, recognize that the label itself is no explanation and no guide to countermeasures. The potential for constructive change, for progress on safety, lies behind the human error label.
David D. Woods, Ph.D. is Professor at Ohio State University in the Institute for Ergonomics and Past-President of the Human Factors and Ergonomics Society. He was on the board of the National Patient Safety Foundation and served as Associate Director of the Veterans Health Administration's Midwest Center for Inquiry on Patient Safety. He received a Laurels Award from Aviation Week and Space Technology (1995). Together with Erik Hollnagel, he published two books on Joint Cognitive Systems (2006).

Sidney Dekker is Professor and Director of the Key Centre for Ethics, Law, Justice and Governance at Griffith University in Brisbane, Australia. Previously Professor at Lund University, Sweden, and Director of the Leonardo Da Vinci Center for Complexity and Systems Thinking there, he gained his Ph.D. in Cognitive Systems Engineering from The Ohio State University, USA. He has worked in New Zealand, the Netherlands and England, been Senior Fellow at Nanyang Technological University in Singapore, Visiting Academic in the Department of Epidemiology and Preventive Medicine at Monash University in Melbourne, and Professor of Community Health Science at the Faculty of Medicine, University of Manitoba in Canada. Sidney is the author of several best-selling books on system failure, human error, ethics and governance. He has been flying the Boeing 737NG part-time as an airline pilot for the past few years. The OSU Foundation in the United States awards a yearly Sidney Dekker Critical Thinking Award.

Richard Cook, M.D. is an active physician, Associate Professor in the Department of Anesthesia and Critical Care, and Director of the Cognitive Technologies Laboratory at the University of Chicago. Dr. Cook was a member of the Board of the National Patient Safety Foundation from its inception until 2007. He is widely regarded as a leading expert on medical accidents, complex system failures, and human performance at the sharp end of these systems. Among many other publications, he co-authored A Tale of Two Stories: Contrasting Views of Patient Safety.

Leila Johannesen, Ph.D. works as a human factors engineer on the user technology team at the IBM Silicon Valley lab in San Jose, CA. She is a member of the Silicon Valley lab accessibility team, focusing on usability sessions with disabled participants and accessibility education for data management product teams. She is the author of The Interactions of Alicyn in Cyberland (1994).

Nadine Sarter, Ph.D. is Associate Professor in the Department of Industrial and Operations Engineering and the Center for Ergonomics at the University of Michigan. Building on her pathbreaking research on mode error and automation complexities in modern airliners, she served as technical advisor to the Federal Aviation Administration's Human Factors Team in the 1990s, providing recommendations on the design, operation, and training for advanced 'glass cockpit' aircraft, and shared the Aerospace Laurels Award with David Woods.
'This book, by some of the leading error researchers, is essential reading for everyone concerned with the nature of human error. For scholars, Woods et al. provide a critical perspective on the meaning of error. For organizations, they provide a roadmap for reducing vulnerability to error. For workers, they explain the daily tradeoffs and pressures that must be juggled. For technology developers, the book offers important warnings and guidance. Masterfully written, carefully reasoned, and compellingly presented.' Gary Klein, Chairman and Chief Scientist of Klein Associates, USA

'This book is a long-awaited update of a hard-to-get work originally published in 1994. Written by some of the world's leading practitioners, it elegantly summarises the main work in this field over the last 30 years, and clearly and patiently illustrates the practical advantages of going "behind human error". Understanding human error as an effect of often deep, systemic vulnerabilities, rather than as a cause of failure, is an important and necessary step forward from the oversimplified views that continue to hinder real progress in safety management.' Erik Hollnagel, MINES ParisTech, France

'If you welcome the chance to re-evaluate some of your most cherished beliefs, if you enjoy having to view long-established ideas from an unfamiliar perspective, then you will be provoked, stimulated and informed by this book. Many of the ideas expressed here have been aired before in relative isolation, but linking them together in this multi-authored book gives them added power and coherence.' James Reason, Professor Emeritus, University of Manchester, UK

'This updated and substantially expanded book draws together modern scientific understanding of mishaps too often simplistically viewed as caused by "human error". It helps us understand the actions of human operators at the "sharp end" and puts those actions appropriately in the overall system context of task, social, organizational, and e