This study is the first in Hong Kong to examine the effectiveness of RCA since its use became mandatory for investigating SEs in 2007. The results of this study, especially the categorization of root causes and recommendations, can provide meaningful information for Hong Kong to improve the quality of its incident investigations and subsequent risk mitigation actions.
One of the major challenges for the RCA panels is to identify the system vulnerabilities contributing to the underlying latent failures of the organisation.[14, 25] According to our grouping of the results, about 46% of the root causes were related to staff behavioural factors, for example, violation, lack of vigilance and lapse of concentration. In some RCA reports, the Review Team noticed that only staff behavioural factors were identified, with no other system factors. In fact, human failures such as slips, lapses and mistakes are normal human behaviour and can be difficult to eliminate.[26] The identification of such root causes is superficial: it merely demonstrates that humans are imperfect and is not meaningful in solving the problem.[14] The large proportion of staff behavioural factors suggests that the RCA panels were not able to recognize different aspects of system issues such as equipment and workflow design flaws, poor usability of system interfaces, work overload and inadequate safety culture.[10, 14, 16, 18] This observation in the HA can be explained by the fact that the last corporate-led RCA training in the HA was conducted in 2009, and most RCA members have not been formally trained in RCA investigations in the past decade given the lack of training opportunities. This is especially true for the clinical experts who are invited to join an RCA according to their respective clinical expertise. These experts, generally with more than 10 years of clinical experience, would have limited understanding of RCA, safety systems knowledge and improvement science, as these topics have only been taught in the medical undergraduate curriculum in Hong Kong in the recent decade.[27]
The ‘violation’ root causes likely demonstrate a misconception of the term ‘violation’ amongst the RCA panels in our review. From a human factors perspective, ‘violations’ are deviations from safe operating practices, procedures, standards or rules, and have to be deliberately performed by the staff.[26] The Review Team noticed that many RCA reports concluded that the staff had violated a policy or guideline, attributed to causal factors such as the staff ‘having forgotten to perform a checking step’ or ‘not being aware of the situation’. There was no further investigation into the reasons for the ‘violation’, nor was evidence provided of the staff’s intention to deliberately violate the rules. This observation is of particular importance because such a misconception of ‘violation’ might have led to an unfair judgement being made about the involved staff. It is important that all RCA panel members buy into the purpose and principles behind an RCA as an incident investigation method, which is to identify what is wrong at the system level and to promote learning and sharing.[28] Tools like the Culpability Decision Tree or its recently adapted version, the Just Culture Guide by the UK National Health Service, would help RCA panels differentiate violations from other factors that cause staff not to follow policies or guidelines, and would foster a no-blame culture in the organisation.[29, 30]
At the HA public hospitals, the ‘5 whys’ and the fishbone diagram are the most commonly used tools to identify root causes. Though easy to use, both techniques have their drawbacks, and RCA panel members have to use them with caution.[31, 32] Other incident investigation and human factors tools and techniques, including fault tree analysis, cognitive walkthrough, task analysis, heuristic evaluation and interview question banks, are very useful for evaluating workflows, equipment and user interfaces and for supporting data analysis, and should be considered in the investigation process.[12, 33–38] Currently, the HA RCA report template does not provide any of the above tools for reference by the RCA panel. In many studies and organisations’ incident management guides, human factors considerations are key components of conducting a robust RCA, while quality improvement expertise is vital to effective implementation and process monitoring of action plans. It is advised that RCA members should be formally trained in human factors to support incident investigation and the identification of system issues, make use of different tools and techniques to facilitate investigation and analysis, and understand improvement science to implement action plans effectively.[10, 12, 14, 16, 28, 37, 38, 40, 41]
The study results showed that most of the recommendations were weak. A high proportion of weak RCA recommendations has also been reported in other studies using similar methodologies.[10, 14, 15] The Review Team noticed that in most RCAs, when a staff behavioural factor was identified as a root cause, the corresponding recommendations would generally be to share the incident in department meetings, re-educate the involved staff or enhance their awareness through one-off training. These are weak recommendations because they attempt to change human behaviour but do not address the underlying ‘why’ problems.[10, 25] The Review Team also noticed that some recommendations were not clearly linked to the root causes. For example, in an incident involving administration of a wrong dose of Gentamicin, one of the root causes was the unclear content of verbal communication among the nurse, dispenser and pharmacist. The report did not explain how the communication was ‘unclear’, while the corresponding recommendations were to share the incident in a training forum and in the nurses’ meeting. The Review Team believed that the RCA panel in this incident should have further elaborated on how and why the communication had broken down, and that a specific enhancement of that particular communication process between the staff would have been a more appropriate recommendation than solely sharing the incident with staff. Indeed, sharing of incidents and their findings is part of the regular incident management process and should not be a specific recommendation.[37, 38] The recommendations written in RCA reports should be actions inducing system changes.[8] If training is ultimately identified as a recommendation by the RCA panel, as in the example above, the training should explain the risks and consequences of not communicating effectively with other staff during a procedure, and teach the knowledge and skills required to address this.[42]
Weak recommendations were also proposed for other system factor root causes in our review. Studies suggest that the tendency of RCA panels to propose weak recommendations is generally caused by a lack of understanding of RCA and limited knowledge of the hierarchy of controls and human factors.[10, 14, 16, 28] The RCA panels might perceive the investigation to be restricted to within the department, such that organisational issues at a broader level were left out of the discussion. Within such confines, the choice of recommendations implementable solely at the departmental level becomes limited, and additional staff training or more reminders have been the prominent actions arising from RCAs.[19] Stronger recommendations are also known to be costly and to require more attention and monitoring to complete.[37, 38, 43] Though the RCA panels are nominated directly by Hospital Chief Executives in the HA, political considerations for driving organisational change are often required for stronger recommendations, and for recommendations affecting fundamental systems of the whole HA such considerations may be difficult to ignore. For example, all hospitals and institutes in the HA use the same electronic clinical management system for patient documentation and management. When an RCA panel has identified a loophole in the electronic clinical management system, instead of directly asking the Information Technology Department to make the corrective actions, the proposed actions have to go through a series of processes: stakeholder consultation on a number of platforms at hospital and corporate levels, approval from different hospital and organisational committees, and lastly a series of feasibility tests by software technicians. This process usually takes months if not years to complete. During such processes, hospital staff may question the RCA panel’s capability, believing that the panel has not identified the appropriate root causes, and some observers may still assign blame to the involved staff as this is more visible.[16, 19]
In addition, staff may blame the RCA panel for introducing new or perceived unrealistic changes at the organisational level in response to a single adverse event which they believe was contributed to by an individual staff member’s mistake.[17] The organisation might also have difficulty in assessing the vulnerability of the system from a single event.[16] During this review, RCAs investigating individual but similar incidents were found to produce inconsistent recommendations. An example is incidents in which drugs known to cause allergic reactions in the patients were prescribed and administered. The causes were traced to the allergen having been entered in the “free text” section of the electronic medication management system, which therefore could not automatically check drug cross-sensitivity and alert staff. However, in some reports the recommendations were to provide training or reinforce staff compliance in checking allergy histories, while other reports suggested converting the free text allergy entries into structured entries to enable drug allergy checking by the electronic system. These inconsistent recommendations may be a product of multiple RCAs being conducted for individual events with similar causes, rather than the events being collectively reviewed to identify underlying themes. Such variations in recommendations could affect staff’s impression of the quality of RCA as a whole.
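As an illustration only, the following minimal sketch models why a structured allergy entry can support automated cross-sensitivity checking whereas a free-text note cannot. All names, classes and drug mappings here (AllergyRecord, CROSS_SENSITIVITY, check_prescription) are hypothetical and are not drawn from the HA’s actual electronic medication management system.

```python
# Minimal, hypothetical model of structured vs free-text allergy entries.
from dataclasses import dataclass

# Hypothetical cross-sensitivity map: prescribing any drug in the value set
# should trigger an alert for a patient with a recorded allergy to the key drug.
CROSS_SENSITIVITY = {
    "gentamicin": {"gentamicin", "amikacin", "tobramycin"},      # aminoglycosides
    "penicillin": {"penicillin", "amoxicillin", "ampicillin"},   # penicillins
}

@dataclass
class AllergyRecord:
    structured_allergen: str | None = None   # coded entry the system can match
    free_text_note: str | None = None        # narrative entry the system cannot parse

def check_prescription(record: AllergyRecord, prescribed_drug: str) -> str | None:
    """Return an alert if the prescription conflicts with a structured allergy
    entry; free-text notes are skipped because they are not machine-checkable."""
    if record.structured_allergen is None:
        return None  # nothing the system can safely interpret, so no automatic alert
    related = CROSS_SENSITIVITY.get(record.structured_allergen.lower(), set())
    if prescribed_drug.lower() in related:
        return (f"ALERT: recorded allergy to {record.structured_allergen}; "
                f"{prescribed_drug} is cross-sensitive.")
    return None

# A free-text entry produces no alert; a structured entry does.
free_text_only = AllergyRecord(free_text_note="allergic to penicillin group")
structured = AllergyRecord(structured_allergen="penicillin")
print(check_prescription(free_text_only, "amoxicillin"))  # None -> missed alert
print(check_prescription(structured, "amoxicillin"))      # ALERT message
```

Under these assumptions, converting free-text entries to structured entries is what allows the checking logic to run at all, which is the system change proposed in some of the reports above.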
These anticipated difficulties and conflicts in proposing system-modifying recommendations may encourage a shift to weaker recommendations, so that the RCA panel avoids being held responsible for, and taking up a role in, the complicated consultation process.[19] Recommendations like training and education, or ‘to explore the feasibility of implementing an action plan or revising the practice’, have become common, yet are considered weak as the feasibility of implementing the actions is not yet certain or concrete. Even when stronger recommendations leading to some system changes have been suggested, they are usually limited to the departmental level and do not reach the whole organisation. Similar incidents therefore still recur in the same facility.
The results provided the Review Team with insights for proposing suggestions to enhance RCA quality in public hospitals and other healthcare organisations. First, regular training for RCA panel members must be conducted, and systems thinking and human factors should be essential components of that training. The correct concept of ‘violations’, and the human error model more generally, must be understood, and available RCA tools should be promoted and panelists trained in their application.[44] References to investigation tools can also be added to the existing HA RCA report template to facilitate ease of access and serve as a prompt for use. A no-blame culture must also be promoted to all hospital staff to encourage a focus on system issues instead of blaming individuals.[16] Skills in writing RCA reports have to be developed with practice so that root causes and recommendations can be written more specifically, enabling a new reader to immediately understand the issues and solutions.[18]
Second, members with human factors expertise should be invited to join RCA panels, as this may help shift the focus from blaming individuals to identifying system and design flaws. This expertise can also support the design of patient safety initiatives.[10, 16, 44] Inviting staff who understand the involved workflows can also help the RCA panel quickly grasp the detailed nuances of the situation.[12] Training a core group of staff within the organisation who specialize in incident investigations would be a possible solution to the RCA expertise and sustainability problems. This core group of “specialists” can retain their knowledge and skills in RCA, as they would have more opportunities to participate in RCAs.[18] From a broader perspective, an independent third-party institution, positioned similarly to the National Aeronautics and Space Administration of the US or the Healthcare Safety Investigation Branch of the UK,[45, 46] could be developed in the territory, and hospitals could draw on its expertise to support incident investigations. The advantages of such independent institutions include minimising internal conflict; promoting the unbiasedness, accuracy and credibility of RCA findings; and separating incident investigation and learning from disciplinary actions for violations.[47]
Third, the HA must promote a safety culture to all staff, which includes understanding the goals of an RCA, and reduce the barriers that RCA panels face in proposing stronger recommendations focused on organisational change.[43] Fourth, the HA should consider aggregated analysis of incidents to counter the inconsistencies that inevitably arise when multiple RCAs investigate similar issues.[6, 10, 16, 37, 38] The Review Team believes that, by implementing these suggestions, incident investigations and the improvements arising from them will become more effective in the organisation: root causes would be more focused on system defects, and a higher proportion of strong or medium recommendations would be anticipated. Last but not least, Australia reviewed its SE list in 2017 and launched a second version in December 2018.[48, 49] With the current SE and SUE lists having been implemented in Hong Kong for about a decade, they should be reviewed to ensure they align with the goal of effectively monitoring and preventing serious adverse patient harm events.
This study has two limitations. First, the study reviewed only the strength of the recommendations, not their implementation progress or feasibility. Because the action hierarchy categorizes recommendations for ‘additional study / analysis’ as weak, the Review Team believed that more strong or medium actions would be taken once those recommendations are found to be feasible. Second, this is the first study in the HA to review RCA effectiveness, and no other local data were available for benchmarking. If improvements such as conducting RCA training and strengthening the human factors expertise of RCA panel members are implemented, a follow-up study would be beneficial.