
The system approach

Health systems add complex organisational structures to human fallibility, thus substantially increasing the potential for error. A systems approach to error recognises that adverse events rarely have an isolated cause and that they are best addressed by examining why the system failed rather than who made the mistake. This is clearly outlined in Professor Don Berwick's 2013 report A Promise to Learn – A Commitment to Act, which examined how to improve patient safety in the NHS and placed the emphasis on a system-wide approach to patient safety. However, analysis of organisational safety needs to go beyond understanding why unanticipated events occur, an approach that has been referred to as Safety-I; it also needs to understand the robustness and resilience within systems that arise from human innovation and adaptability, which frequently prevent untoward events, referred to as Safety-II. Safety-I places the emphasis on identifying errors after the event and aims to prevent them from occurring or recurring in the future, whereas Safety-II acknowledges that healthcare work is resilient and that everyday performance succeeds far more often than it fails, because clinicians constantly adjust what they do to match the conditions. Working flexibly, and actively trying to increase clinicians' capacity to deliver more care more effectively, is key to this new approach. At its heart, proactive safety management focuses on how everyday performance usually succeeds rather than why it occasionally fails, and it actively strives to improve the former rather than simply prevent the latter.

Donald Berwick, b. 1946, Professor of Pediatrics and Health Care Policy, Harvard Medical School, Boston, MA, USA.
Herbert W Heinrich, 1886–1962, engineer, Hartford, CT, USA, a pioneer in industrial safety.
James T Reason, b. 1938, Professor Emeritus of Psychology, University of Manchester, Manchester, UK.
The publication ‘From Safety-I to Safety-II: A White Paper’ (2015) expands on this concept and stresses the importance of assimilating these two ways of thinking. Sophisticated healthcare systems need not only to examine what works well but also to examine adverse events and to understand and plan for adverse outcomes. Balancing these concepts should be considered an investment not only in safety but also in improving productivity and patient and staff well-being.

The principles underlying these approaches to risk management stem from theories such as Heinrich’s safety pyramid, which proposes that each major injury within a system masks a multiple of minor injuries and near misses. This model stresses the importance of near-miss reporting in order to fully understand the spectrum of patient safety and allow adequate risk management planning. More recently, James Reason proposed the ‘Swiss cheese’ model of causation to explain how multiple errors combine to result in harm: like the holes in a Swiss cheese, defects in successive layers of defence that happen to align create a path to adverse consequences. For organisational safety to be effective, each layer of the system must fulfil its respective role in order to avoid the summation of error and resultant harm. Consequently, the more layers of responsibility there are, the fewer the chances of adverse events occurring.

Summary box 15.2 Understanding patient safety incidents

● Errors can be viewed from a person-centred or a system approach
● The majority of near misses or adverse events are due to system factors
● Understanding why these errors occur and applying the lessons learnt will prevent future injuries to patients
● It is important to report all near misses or adverse events so that we can constantly learn from mistakes
● Error models can help us understand the factors that cause near misses and adverse events
● Examining what works well may be an additional constructive approach to defining safe patient pathways
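The claim that more layers of defence reduce the chance of adverse events can be made concrete with a minimal sketch. The Swiss cheese model itself is qualitative, but if each defensive layer is treated as a barrier with some probability of failure (a ‘hole’), and the layers are assumed to fail independently, the chance of an error passing every layer falls rapidly as layers are added. All probabilities below are hypothetical values chosen for illustration.

```python
# Minimal sketch of the 'Swiss cheese' intuition, not a validated model.
# Assumption: each layer fails independently with the given probability.

def p_adverse_event(hole_probabilities):
    """Probability that an error passes every defensive layer
    (i.e. all the 'holes' align), assuming independent failures."""
    p = 1.0
    for hole in hole_probabilities:
        p *= hole
    return p

# One layer that fails 10% of the time versus four such layers stacked:
single = p_adverse_event([0.1])       # 0.1 -> 1 in 10 errors reach the patient
stacked = p_adverse_event([0.1] * 4)  # ~1e-4 -> roughly 1 in 10 000
```

In practice the independence assumption is optimistic: Reason’s point is that the holes are dynamic and can be aligned by a common organisational cause, which is why the model argues for examining the system rather than the individual.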
