I read a comment the other day on Facebook that said, “To err is human, but to blame someone else… that shows management potential.” I have heard this expression before, and it makes me laugh every time.
Humans have made mistakes throughout our existence on earth, and how we synthesize the information surrounding a mistake can be the “make or break” when it comes to reducing the odds of repeating it. When I come home, I have a horrible habit of leaving my keys in the door. This happens probably 2-3 times a week. I have often wondered why my brain does not build some sort of neural network to help me avoid this, but the truth is, Tyler's brain does not really see it as a threat. I live in a nice area, and the concern of someone breaking in while I am home and hurting me is pretty low on the radar. I believe that because this error has little impact on my day-to-day life, it gets pushed lower on my list of priorities.
We have all seen the employee who does not make it through orientation because they keep repeating very similar errors. It may seem like they are just flat-out not learning from their mistakes, and in fact, that may be the case. However, I have a hard time believing that someone orienting for a flight job would not treat learning from their mistakes as a high priority; more likely, the influx of information that comes with starting a new job scatters the priority list. I have found it super helpful, both for myself and when precepting new clinicians, to label the mistakes we make according to their presumed cause.
I am sure I am reinventing the wheel here, but this is how I label my mistakes:
Knowledge Gap - You legit did not know
Human Factor - You knew but forgot in the moment
After typing out my personal labels, I researched “types of mistakes” and found this chart from Ohio State University super interesting. I redid the graphic for simplicity's sake, but you can find the original at the end of this article.
The A-ha mistake outlined above is the same as my knowledge gap mistake, and the “sloppy mistake” is very similar to my human factor mistake. The stretch and high-stakes categories are very interesting to me and admittedly were not on my radar initially. Plotting mistakes along axes of learning opportunity and intentionality is brilliant and immediately made sense to my brain.
When we make a mistake because we are missing a piece of information, it is reasonable to say that the learning opportunity is high and the intentionality is low. The interesting component of this to me is the word "opportunity." Just because a mistake falls into the category of a learning opportunity does not always mean the person will take advantage of that opportunity, and even if they do, the lesson they draw from it may be inaccurate.
Let's take a look at an example.
If I attempt to intubate a patient using a video laryngoscope and miss, did I make a mistake?
We have to examine the micro-skills of the larger task to make an accurate assessment.
A very common issue I have seen in the era of video laryngoscopy is tube delivery, especially with hyper-angulated blades (D Blade, X Blade, King Vision, GlideScope, etc.). If I used a hyper-angulated blade and grabbed a bougie instead of a rigid stylet, my chances of successful intubation would be drastically reduced because of... well, geometry.
Perhaps I should return to the station and ask several colleagues to watch me intubate a mannequin and assess my technique. This shows excellent intent and may be the key to discovering a knowledge gap and unlocking a "learning opportunity." However, if the clinicians I ask share the same knowledge gap, this could also lead to further confusion.
A quick dive into several logical fallacies led me to three biases that we could experience within our close network of resources.
False Consensus -"Everyone on my shift hates etomidate because it causes patients to become hypotensive."
Bandwagon Effect - "The fire department I just started working for thinks the stylet is way better for intubating, and they have seen great success. I have to admit I agree. I never use a bougie unless it's a super difficult airway."
Availability Cascade - "I heard Zofran works better if you give it before a patient pukes"
I believe this type of peer bias has probably been significantly reduced by the FOAMed movement of blogs and podcasts. However, utilizing both inside and outside resources for data collection is always a wise idea during the reconnaissance phase after an error.
Accurate reflection is difficult and often skewed by perception and occasionally ego. I have found that one of the best ways for me to identify mistakes or behaviors is to give them a name. Labeling specific mistakes can help a clinician develop a plan to avoid repeating them in the future.
I have attached the actual graphic from Ohio State's Dennis Learning Center below. This chart not only includes the labels but also offers corrective action suggestions.