I'm currently in the process of reading the book “Outliers” by Malcolm Gladwell. In it, he looks at some examples of extraordinary human behaviour of one kind or another. Much of this is fascinating, but there was one chapter that really struck home whilst reading it: the chapter on plane crashes. Throughout my anaesthetic training it was fairly common to draw parallels between the aviation industry and anaesthesia. The analogy is not perfect, and sometimes overstated, but I think there are some interesting similarities. What struck me about this chapter is the centrality of human factors in these disasters.

This brought me back to reflections that I have had previously but which I have not thought about in detail for some time: that it is the individual person (and the psychology of that person) that seems to be such an important part of being a clinician. What I mean is that it is less about the knowledge, or perhaps even the skills, of that clinician (or even the specifics of the medication prescribed); rather, it is about the application of all this and the human fallibility of that application. Could it be that the outcomes from complex medical procedures, critical illness, surgery or, to be honest, even pretty mundane activities rest on the human factors of the clinicians involved more than anything else?

I remember finding this fascinating with regard to medical training, especially because so little of our training really focused on it. It still very much treated us as rational, impartial, and calm individuals who would behave rationally and effectively in the real-world environment. But we know this isn't true. We are only partly rational, very vulnerable to biases and illusions, and, ultimately, deeply fallible individuals. We are often in a position to potentially cause severe harm, and I remain of the belief that we do not weigh this component of care heavily enough. I don't think we teach about these illusions and biases in enough detail, and I don't think we practise the implementation of effective strategies adequately.
As Gladwell notes, the implementation of crew resource management was an example of the aviation industry trying to combat these recurring disasters. The world of medicine has implemented similar systems, although not as broadly as aviation has. As such, it was interesting to read this paper about some of the components of such a system. These are not factors that we know nothing about, but they often don't get practised quite as much as they probably should. This is probably because of the simulation and dedicated training needed to deliver them. Indeed, it feels like something that should be a regular and recurring aspect of training, undertaken as part of an overall team, and that is something very tricky to implement. We change teams regularly and there is limited flexibility within the current system to facilitate such a “luxury”. This is such a shame when the benefits seem so notable; indeed, they seem invaluable.

With that in mind, I would love to hear your stories about human factors training. If you have been involved in it, what have you found positive or negative about it? What has worked well and what has not? I would also love to hear any other opinions you have on this topic. Please post below or go to Bluesky @DrTomHeaton.bsky.social to continue the discussion there.