Why doctors make mistakes and how to make sure they are fixed

By Dr. Keith Roxo, Guest Writer

One of my general philosophies is to try not to suck. In my typical fighter pilot style, I’ll refer to this as “being less bad.” I have always believed that it is important to recognize the areas where we lack knowledge and experience so we can avoid becoming complacent, arrogant or insecure. One of the things I liked most about military aviation, which also translated well to medicine, was the constant learning required just to stay competitive.

As a young and inexperienced fighter pilot, I was clearly lousy compared to the more experienced pilots. As a doctor in training, I was similarly terrible due to inexperience. When we believe we know everything, regardless of the topic, we close the door to learning. Recognizing that you are terrible, and having a personal drive not to be, is how you keep growing. It also helps keep arrogance in check.

I’ve read a lot of books on personal finance. As I mentioned in my previous post, How to Start a Medical Consulting Business as a Doctor and Top Gun Fighter Pilot, I knew I wanted to be self-employed, so I read a lot of books on business. But I’ve also read a lot of books on personal growth, success and failure, and problem solving. The Checklist Manifesto fits in here, as do many of Malcolm Gladwell’s books.

Without a doubt, the best book I’ve read on how not to suck is Black Box Thinking: Why Most People Never Learn from Their Mistakes, But Some Do by Matthew Syed. While I initially listened to the audiobook, I ended up also purchasing it on Kindle so I could take notes more easily. As a testament to how important I think this book is, I have a quarterly reminder to re-read my notes.

Part 1 – The logic of failure

The beginning of the book talks about both aviation and medicine, and that’s how it caught my attention. The wife of an airline pilot died on the operating table during a routine outpatient procedure. The pilot asked what had gone wrong, received essentially no answer, and was told that the hospital only conducts investigations if someone files a lawsuit. He didn’t want to sue, but since he worked in an industry where every accident is investigated and the results are published for everyone in aviation to learn from, this seemed unacceptable to him.

This section of the book mainly discusses how failure should be viewed as an opportunity to learn. But doing so means that the failure needs to be fully investigated to determine what went wrong and what could have been done differently. This section also discusses monitoring complex processes to detect errors and establishing a no-blame culture to encourage reporting errors and mistakes so others can learn.

Part 2 – Cognitive dissonance and confirmation bias

The second part of the book begins to analyze the traits that lead to repeated failure, often rooted in the concepts of cognitive dissonance and confirmation bias. Cognitive dissonance is the discomfort of confronting ideas or evidence that conflict with your established beliefs and values, which often manifests as an unwillingness to consider them at all. Confirmation bias is an overreliance on results that support our preconceptions and expectations. Together, these concepts can produce significant denial.

An unwillingness to accept that something has failed, to investigate what went wrong, or to challenge a concept to the point of failure rather than simply accepting easy confirmation: all of these behaviors stunt knowledge, personal growth, and further success. But by challenging assumptions, intentionally seeking out contrary positions, and weighing real evidence over anecdotal experience, a person can truly open up the possibility of standing atop Mount Stupid on the Dunning-Kruger curve with a whole Slope of Enlightenment in front of them.

Part 3 – Facing complexity

Many things in life are extremely complex, even if they don’t seem that way at first. In part three, Syed talks about the importance of testing and iterative design changes with your target audience. What you thought would be perfect in the abstract may not be what works in real life. If you spend an eternity and a fortune making a “perfect” product that no one ends up using, you haven’t achieved any level of success. A minimum viable product lets you start selling or using something while seeing how it performs in the real world. Then you can make the necessary changes.

Syed highlights a couple of examples of this. The first involves Unilever, which manufactured washing powder in the 1970s. The process relied heavily on a spray nozzle through which the constituent components were passed to produce the final powder. After the initial designs proved suboptimal, the company hired a team of mathematicians and physicists to design the perfect nozzle. Despite all the money spent on design and development, their new nozzles were a complete failure. Unilever then turned to a team of biologists, who carried out an iterative design process. They would take one aspect of the nozzle, such as the diameter of the discharge end, make a batch of nozzles varying just that one thing, and after testing them all, settle on what seemed to be the best outlet diameter. Then they repeated the same process for the overall length, then the inlet width, and so on. Eventually they iterated their way to the most effective nozzle the company had ever produced. This example combines the concepts of the minimum viable product and randomized trials. Syed also highlights that even fluid dynamics “experts” did not fully understand the real-world application.
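
The biologists’ approach is essentially a one-variable-at-a-time search: build a batch of variants that differ in a single dimension, test them all, keep the winner, and move on to the next dimension. Here is a minimal sketch of that loop in Python; the parameter names and the scoring function are hypothetical stand-ins for physical nozzle tests, not anything from the book:

```python
import random

def test_nozzle(params):
    """Stand-in for a physical test; returns a quality score (higher is better)."""
    # Made-up response surface with a known optimum so the search has something to find.
    targets = {"outlet_diameter": 4.0, "length": 120.0, "inlet_width": 30.0}
    return -sum((params[k] - t) ** 2 for k, t in targets.items())

def iterate_design(params, rounds=10, variants_per_round=10):
    """Vary one parameter at a time, test a batch of variants, keep the best."""
    for _ in range(rounds):
        for key in params:
            variants = [
                dict(params, **{key: params[key] * random.uniform(0.8, 1.2)})
                for _ in range(variants_per_round)
            ]
            variants.append(params)  # the current design stays in the running
            params = max(variants, key=test_nozzle)
    return params

best = iterate_design({"outlet_diameter": 10.0, "length": 50.0, "inlet_width": 10.0})
print(best)
```

The point of the structure is that each pass only ever changes one variable, so the design keeps improving without anyone needing a complete theory of the underlying fluid dynamics.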

Syed also introduces the narrative fallacy: the tendency to simplify complex issues into tidy stories that make them more understandable. This makes us feel better about ourselves and our level of understanding, but in doing so we neglect the bigger picture and the nuances, and fail to devote the time and energy needed for proper understanding.

An example: In the late 1970s, a crime reduction program called “Scared Straight” was developed, where at-risk youth were exposed to hardened criminals in prison. The goal was to shock these teens into changing their ways. It was a public phenomenon. Initial results seemed promising: 90% of the kids involved were still out of trouble three months after their trip to prison. It was an exercise in closed-loop thinking with no randomized controlled trials. They saw the results they expected to see and didn’t do in-depth analysis. But deeper analysis showed that the program created more criminals than it prevented. Most of the kids involved in the program were never at risk to begin with, and those who were actually on the path to criminal behavior were only emboldened.

Intuition is often wrong, specifically due to the tendency to oversimplify.

Parts 4, 5 and 6 – Putting it all together

As the book progresses, fewer new concepts are introduced and more time is spent on how to apply the knowledge already presented. The fourth part analyzes the value of iterative improvement: if the same task is performed 50,000 times, saving even a small amount of time on that task translates into real results. People often look for one big change when what really adds up is the cumulative impact of many small changes. Syed also discusses the value of collecting data, of running randomized controlled trials to determine what works best, and how sacrificing customer service for short-term marginal gains tends to have detrimental long-term effects.
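
As a back-of-the-envelope illustration of why small savings add up (the two-second figure and the ten 1% improvements below are assumed numbers for the sketch, not figures from the book):

```python
# Hypothetical numbers to make the arithmetic concrete.
seconds_saved = 2          # assumed time saved per repetition
repetitions = 50_000       # figure cited in the paragraph above
hours = seconds_saved * repetitions / 3600
print(f"{hours:.1f} hours saved")             # 27.8 hours

# Small improvements also compound: ten independent 1% gains multiply rather than add.
overall = 1.01 ** 10 - 1
print(f"{overall:.1%} combined improvement")  # 10.5%
```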

Part 5 delves into how “The Blame Game” can crush innovation, ruin safety, and create a terrible work environment overall. A just culture, one that seeks to improve everyone, does the opposite.

The last section talks about how expertise is built from study, repetition, and learning from repeated failure. Syed comments that we should all be open to the idea that there is room for improvement no matter what we do, that accepting failure is the most efficient way to learn, and that people should be open to criticism and to questioning their own point of view.

Lessons learned from Black Box Thinking

Much of this book fits with my current philosophy, a personal philosophy I learned the hard way over several decades of training and professional development in both aviation and medicine. What the book helped with most was providing context, but it also highlighted areas where I was still weak.

Before reading Black Box Thinking, I allowed intuition to guide my opinions and decisions outside my areas of expertise. I assumed a product had to be perfect before release and didn’t fully appreciate the value of running randomized trials to make things as scientific as possible. The reason I re-read my notes quarterly is to make sure I remember the areas where I was weak (and where I may have slipped again).

Before starting my consulting business, Wingman Med, I assumed that I had to have all the answers before I started. The concept of the minimum viable product gave me the confidence to begin consulting in my area of expertise, knowing that while I may not have an immediate solution, I do know where and how to get it. The subsequent client interaction forced me to become even more of an expert through repetition and trial and error.

Another concept I’ve used in my business is randomized testing, specifically for advertising. Early on, I ran the same ad with the same budget for the same period of time on both Google and Facebook, for two different aspects of my business. The results showed that for one aspect, Google was significantly better; for the other, Facebook was much more cost-effective. I continue to use this same method to evaluate what other types of advertising might be beneficial as I look for the most cost-effective use of our advertising budget.
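
Once the campaigns have run, the comparison itself is simple. Here is a sketch with invented numbers, since the actual Wingman Med figures aren’t published:

```python
# Hypothetical results for the kind of head-to-head ad test described above:
# the same ad, the same budget, the same period, on two platforms.
campaigns = {
    "Google":   {"spend": 500.00, "conversions": 25},
    "Facebook": {"spend": 500.00, "conversions": 40},
}

for platform, stats in campaigns.items():
    cost_per_conversion = stats["spend"] / stats["conversions"]
    print(f"{platform}: ${cost_per_conversion:.2f} per conversion")
```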

The concepts in Black Box Thinking can open your eyes to things you may not realize you are doing but that may be inhibiting your growth as a person, doctor, and/or business owner. You may be unintentionally falling into the traps of cognitive dissonance, confirmation bias, and narrative fallacy. Being aware of these concepts and how they can affect you is the first step in developing a personal strategy to avoid these mistakes. And then you can start being less bad.

What do you think? What other ways have you found to spot your mistakes and fix them? Are there other personal growth books that have influenced you? Comment below!

(Dr. Keith Roxo is a Top Gun-trained adversary pilot turned aerospace medicine physician. He has over 2,000 hours in a variety of high-performance military aircraft, including the F/A-18, F-16 and F-5, and holds multiple military flight instructor qualifications. He also holds airline transport pilot and CFII certificates. His medical qualifications include board certification in both aerospace and occupational medicine, and he is an FAA-designated senior aviation medical examiner and is HIMS qualified. Keith provides aviation medical consulting through Wingman Med. This article was submitted and approved in accordance with our Guest Post Policy. We have no financial relationship.)