
The Quarterly

Now is the Time
Dr Lynne McKinlay 
  
  
"Will you speak up if you observe the sort of behaviour that is unlikely to cause harm right now, but if persistent that behaviour will undermine the culture of your organisation? Will you see this as an opportunity to improve patient safety, or will you leave it and hope someone else also saw what you saw and that they will speak up?" 
 
  
Imagine a common scenario in healthcare. A clinical incident, an adverse event for a patient in our care, is about to occur. We know that patients suffer adverse events at a rate of approximately 10% [1]: one in ten admissions results in an adverse outcome for the patient. About one in three errors in healthcare is a medication error [2]. So, imagine an incident about to occur in which a drug will be given in error. If you observe the incident unfolding and speak up to prevent the patient being harmed, we might call that a ‘good catch’ [3]. The rationale behind graded assertiveness training, which equips staff in all high-risk industries to speak up, is that humans inevitably make errors; some will be serious, and someone else is more likely to detect my error than I am. If my colleague speaks up before I harm the patient, they are helping both my patient and me [4].
 
Our current understanding of the role of human error in patient safety recognises that human factors (people, organisational and system factors) may all contribute to patient harm. James Reason’s creatively described and now famous ‘Swiss cheese model’ tells us that an error will result in an incident when ‘holes’ in the many layers of protection designed to catch it line up and let it through [5]. In 1999, the landmark publication To Err is Human, which changed the way we think about patient safety, drew a conclusion that was surprising at the time: that ‘the majority of medical errors do not result from individual recklessness or the actions of a particular group - this is not a “bad apple” problem. More commonly, errors are caused by faulty systems, processes, and conditions that lead people to make mistakes or fail to prevent them’ [6].
 
By ‘bad apples’ the authors appear to have meant individuals who are incompetent, dangerous, reckless, impaired or uncaring, or even those who deliberately undermine safeguards with criminal intent (p. 169) [7].
 
Since that time we have strenuously investigated clinical incidents, advocated for a blame-free culture, and doggedly tracked down system factors that can be corrected to mitigate risk. Some of these factors sit close in time and space to the actual harm incident; others are more remote, the so-called latent failures that Reason described as ‘pathogens’ within the system [5]. Consider the thumb on the plunger of a syringe containing an intravenous medication about to be given in error. In legal terminology this might be called a proximate cause, because there is a direct and uninterrupted relationship between the action and the harm event. But there is another sort of harm that we are only now coming to grips with: distal harm. Distal harm is complex to understand and may even be difficult to identify, because it results from actions not related directly in time or space to the incident.
 
One cause of distal harm that we are now starting to understand and address in a systematic way is the impact of unprofessional behaviour. A considerable body of evidence points to unprofessional behaviours and deviations in individual performance as factors that seriously undermine team function, the culture within which health professionals operate, and the delivery of safe care; in short, such behaviour can undermine a culture of safety [8].
 
This is not really new. A paper published in 1991 looked at the impact of ‘noncognitive aspects of competence’, what might now be called non-technical skills, on critical incidents in anaesthesia [9]. These non-cognitive aspects included interpersonal skills, willingness to take instruction and professional behaviour. During a two-year period commencing in 1987, 45 anaesthetic trainees were observed by 163 faculty anaesthetists. The authors found that non-cognitive skills were a powerful predictor of the trainees’ overall performance, and that inadequacy of these skills was a predictor of critical incidents: errors that would have caused patient harm had the supervisor not intervened. Publications about professional behaviour around this time were interested in professionalism as a training issue. The CanMEDS framework, developed in Canada but now widely adopted and adapted by training colleges in many countries, including Australia and New Zealand, clearly identifies the importance of professionalism and non-technical skills for students and trainees [10].
 
Fast forward to 2015: a publication by Catron, Hickson and colleagues at the Vanderbilt University Medical Center (VUMC) hypothesised and demonstrated a striking correlation between the risk of surgical complications and the number of unsolicited patient complaints made against the surgeon in charge. For lower-risk patients there was no great difference between high-complaint surgeons and low-complaint surgeons, but when the clinical situation became complex, the surgeons with a higher number of complaints against them (complaints relating to communication, respect or accessibility) had significantly poorer patient outcomes [11]. There are a number of possible explanations for this. As the complexity of the patient or the surgery goes up, the volume and importance of communication and the need for teamwork also rise. A clinician displaying unprofessional behaviour may induce errors in those around them, as well as modelling poor behaviour. This is not only an issue for students and trainees, but for us all.
 
So perhaps ‘bad apples’ are important after all. Not in the way that the authors of To Err is Human originally considered, but a different barrel of apples: those who exhibit unprofessional behaviour, whether disruptive and aggressive, or passive and undermining. They may not intend patient harm, and in fact may be quite unaware of it as an outcome, but the evidence certainly suggests that unprofessional behaviour may have this effect [12].
 
If you witness a fellow worker not washing their hands, avoiding an agreed process, or breaking a rule, will you speak up? What about a co-worker who is always just a bit late, or fails to attend handover, or does not return your calls until you page them three times? Dangerous shortcuts and disrespect are not easily captured by commonly used patient safety tools. In a 2010 study, four out of five nurses had concerns about these ‘undiscussables’, but fewer than a third had spoken with the person who concerned them most [13]. How comfortable would most of us be speaking with a colleague about unprofessional behaviour? Speaking up to a friend might be even harder. Those of us who lack the courage or the words to speak up may be unknowingly contributing to distal harm as well.
  
One of the things that may make speaking up to prevent distal harm more difficult is the relative lack of urgency, compared with a situation in which harm is about to occur right now. ‘What is urgent is seldom important and what is important is seldom urgent.’ These words are attributed to Dwight D. Eisenhower, the 34th President of the United States, who also conceived of a simple time-management tool with which you are probably quite familiar, even if you had not heard of its origin: the Eisenhower Urgency-Importance matrix (below).

                 Not Important         Important
Urgent           ‘Distractions’        ‘Necessities’
Not Urgent       ‘Time Wasters’        ‘Fulfilment’
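
For readers who think in rules, the matrix reduces to a two-variable triage decision. The short Python sketch below is purely illustrative; the function name, labels and example are mine, not part of this article or of any published tool.

# A minimal, hypothetical sketch of the Eisenhower Urgency-Importance
# matrix expressed as a triage rule (illustrative only).

def eisenhower_quadrant(urgent: bool, important: bool) -> str:
    """Map a task's urgency and importance to its Eisenhower quadrant."""
    if urgent and important:
        return "Necessities"    # do now: emergencies, critical deadlines
    if urgent:
        return "Distractions"   # routine interruptions dressed as priorities
    if important:
        return "Fulfilment"     # schedule: training, reading, relationships
    return "Time Wasters"       # minimise or drop

# Speaking up about persistent unprofessional behaviour is important but
# rarely feels urgent, so it tends to land in the neglected quadrant:
print(eisenhower_quadrant(urgent=False, important=True))  # Fulfilment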

 
In the daily life of health professionals, witnessing an event in which patient harm is imminent falls squarely into the Urgent and Important quadrant. Urgency may help us when it comes to speaking up to prevent patient harm: we know it is a now-or-never situation, and the urgency emboldens us, giving us the courage we need to speak up.
 
If it is both urgent and important, it gets done: medical emergencies, crises and critical deadlines. But in our frenetic healthcare environments, with so many competing priorities, there is plenty of evidence that urgency often trumps importance when busy clinicians are trying to prioritise their work. Witness the habitually low rates of discharge summary completion: everyone acknowledges the task as important, but few treat it as urgent. On the other hand, how many unimportant tasks fill our days: interruptions by phone calls about routine matters, interminable email, and meetings at which information is shared that could be provided in other ways?
 
Many opportunities to make clinical care safer fall into the Not Urgent but Important quadrant: keeping up with the literature, undertaking training and skills development such as Advanced Life Support, and building relationships with our professional colleagues. When we spend all our time in the high-urgency quadrants, it is difficult to make time for the activities that might make the biggest difference in the long term, and that are professionally and personally fulfilling.
 
Will you speak up if you observe the sort of behaviour that is unlikely to cause harm right now but that, if it persists, will undermine the culture of your organisation? Will you see this as an opportunity to improve patient safety, or will you let it pass and hope that someone else saw what you saw, and that they will speak up?
 
Is it important? Absolutely. Is it urgent? Perhaps not urgent enough. Do you have the skills? Maybe not. 
 
If it is challenging to speak up when patient harm is imminent, then I suggest it is even more difficult to speak up about behaviour that places patient safety at risk, not proximally but distally. Many of us would run a mile to avoid having a conversation with a peer about their behaviour. We did not choose to enter a caring profession because we thrive on conflict!
 
We need to learn the skills for speaking up using a graded assertiveness model [4]. If we cannot speak up because of fear, or uncertainty, or a power differential, or if we have already tried and failed, then we need another means of acting. One such programme is the Vanderbilt University Medical Center approach to promoting professional accountability, which has recently been adapted for the Australasian healthcare environment [14,15]. Confidential reporting of unprofessional behaviour allows the organisation to speak up, using trained peers, when an individual cannot. The programme builds on the professionalism and commitment of the overwhelming majority of staff, while ensuring that the actions of no single individual can undermine a culture of safety and reliability.
 
Change is precipitated when there is a sense of urgency. The American journalist Mignon McLaughlin wrote, ‘Character is what emerges from all the little things you were too busy to do yesterday, but did anyway’ [16]. Now is the time to change our sense of urgency around how we deal with behaviour that undermines a culture of safety.
 
 
Conflict of Interest Statement
Lynne McKinlay works as a Senior Medical Educator for the Cognitive Institute, the sole licensee for Vanderbilt University’s Promoting Professional Accountability Programme outside of the USA.
 
 
Dr Lynne McKinlay will be presenting an Interact Webinar, Managing Challenging Behaviour with Colleagues, on 19 July, 12.30pm-1.30pm. For more information or to register, see: http://www.racma.edu.au/index.php?option=com_content&view=article&id=449
 
 
References
 
1. Hamilton JD, Gibberd RW, Harrison BT. After the Quality in Australian Health Care Study, what happened? Medical Journal of Australia 2014;201(1):23.
 
2. Belén Jiménez Muñoz A, Muiño Miguez A, Paz Rodriguez Pérez M, et al. Medication error prevalence. International Journal of Health Care Quality Assurance 2010;23(3):328-38.
 
3. Barnard D, Dumkee M, Bains B, et al. Implementing a Good Catch Program in an Integrated Health System. Healthcare Quarterly 2006;9(Sp):22-27.
 
4. Brindley PG, Reynolds SF. Improving verbal communication in critical care medicine. Journal of Critical Care 2011;26(2):155-59.
 
5. Reason J. Human error: models and management. BMJ 2000;320:768-70.
 
6. Institute of Medicine. To Err is Human: Building a Safer Health System (Summary). Washington, DC: National Academies Press, 1999.
 
7. Kohn LT, Corrigan JM, Donaldson MS, et al. To Err Is Human: Building a Safer Health System. Washington: National Academies Press, 2000.
 
8. Webb L, Dmochowski R, Moore I, et al. Using Coworker Observations to Promote Accountability for Disrespectful and Unsafe Behaviors by Physicians and Advanced Practice Professionals. The Joint Commission Journal on Quality and Patient Safety 2016;42(4):149-64.
 
9. Rhoton MF, Barnes A, Flashburg M, et al. Influence of anesthesiology residents' noncognitive skills on the occurrence of critical incidents and the residents' overall clinical performances. Academic Medicine 1991;66(6):359-61.
 
10. Kuper A, D’Eon M. Rethinking the basis of medical knowledge. Medical Education 2011;45(1):36-43.
 
11. Catron TF, Guillamondegui OD, Karrass J, et al. Patient Complaints and Adverse Surgical Outcomes. American Journal of Medical Quality 2015.
 
12. Martinez W, Etchegaray JM, Thomas EJ, et al. 'Speaking up' about patient safety concerns and unprofessional behaviour among residents: validation of two scales. BMJ Quality & Safety 2015;24(11):671-80.
 
13. Maxfield D, Grenny J, Lavandero R, et al. The Silent Treatment: why safety tools and checklists aren't enough to save lives. Provo, UT: VitalSmarts, AORN and AACN, 2010.
 
14. Hickson GB, Pichert JW, Webb LE, Gabbe SG. A complementary approach to promoting professionalism: identifying, measuring and addressing unprofessional behaviors. Academic Medicine 2007;82(11):1040-48.
 
15. Cognitive Institute. Promoting Professional Accountability Programme, 2016. http://www.cognitiveinstitute.org/Courses/PromotingProfessionalAccountabilityProgramme.aspx 
 
16. McLaughlin M. The Neurotic's Notebook. Indianapolis: Bobbs-Merrill, 1960.