The Art of Clinical Reasoning


This is part of the series What Tech Bros Don’t Tell You About Healthcare. See the world from the view of The Contrarian.

Disclaimer: This does not represent the views of AWS or any of my past employers.

ChatGPT will see you next

I’ve been contemplating this topic for a while, and I think it’s finally time to put my thoughts into writing since I have a lot to say.

Since the launch of ChatGPT, I’ve received responses from people from all walks of life suggesting that AI is going to take over the jobs of doctors. It’s not. The public tends to view doctors as a homogeneous profession, doing nothing more than typing away on keyboards during consultations. While it’s easy to slip into autopilot mode or order numerous tests, this approach does a disservice to patients.

The most important function of a doctor in the next decade, given the aging population and the shortage of staff worldwide, is to listen and ask the right questions to arrive at the correct diagnosis in the shortest amount of time.

Clinical Reasoning

I laughed during year 1 of medical school when one of our professors said that medicine is an art. The advancement of the medical field relies heavily on science and technology… how can the practice of medicine be described as an art? But as I joined the many who came before me, I realized how true that statement is. The best doctors are excellent lie detectors and communicators. They are able to extract the most relevant information in the shortest amount of time and separate noise from genuinely useful information. They are caring, and time seems to stop when you are in a consultation with them.

It was incredibly frustrating when I was in medical school. Clinicians, like athletes, often do not understand the secret sauce that makes them good, or at least they are not able to articulate it in a way that can be appreciated by a novice in the field. The mental muscle memory and near-instant pattern recognition cannot be taught directly; it is something you only acquire once you have seen enough cases.

The conventional approach to clinical reasoning is (a toy sketch of this loop follows the list):
1. Collect data points
2. Note what first comes to mind
3. Identify red flags and diagnoses you must rule out
4. Recognize patterns
5. Compare and generate differentials
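
To make that loop concrete, here is a minimal, hypothetical sketch of steps 1 to 5 as a naive rule-based engine. The knowledge base, findings, and scoring are all invented for illustration, not real clinical rules, and the rest of this section argues why real reasoning does not reduce to a lookup like this.

```python
# A toy, invented knowledge base mapping conditions to typical findings.
KNOWLEDGE_BASE = {
    "drug allergy": {"hives", "new medication", "onset within hours of dose"},
    "environmental urticaria": {"hives", "sensitive skin", "haze exposure"},
    "food allergy": {"hives", "dietary change"},
}

# Things you must rule out first (step 3), e.g. anaphylaxis
RED_FLAGS = {"lip swelling", "difficulty breathing"}

def generate_differentials(findings: set[str]) -> list[tuple[str, float]]:
    """Steps 1-5: take the collected findings, flag what must be
    ruled out, then score each candidate by crude pattern overlap."""
    urgent = findings & RED_FLAGS
    if urgent:
        print(f"Red flags present, rule these out first: {urgent}")
    scored = [
        (condition, len(findings & pattern) / len(pattern))
        for condition, pattern in KNOWLEDGE_BASE.items()
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    findings = {"hives", "sensitive skin", "haze exposure"}  # step 1
    for condition, score in generate_differentials(findings):
        print(f"{condition}: {score:.0%} of pattern matched")
```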

The reality is that doctors probably alternate between deductive and inductive reasoning throughout the whole consultation, which is why it is so hard for my professors to articulate what goes through their minds when they arrive at a diagnosis and recommend treatment.

Here is an encounter I had with one of my patients:

  • Patient: I have hives after I took the antibiotics for H. pylori (shows me pictures of hives). I must be allergic to this antibiotic.
  • Me: But did the hives come on immediately, every time you took the meds?
  • Patient: No, just twice. Not related to the timing of the medication.
  • Me: Did you change what you eat?
  • Patient: No, I did not change anything in my diet.
  • Me: Do you have sensitive skin?
  • Patient: Yes, I do. I have always had sensitive skin, but I never broke out in hives.
  • Me: Any major life event in the past 6 months?
  • Patient: Now that you mention it, I just moved to a new condo a few months ago.
  • Me: Did you do any renovation?
  • Patient: Oh yeah, I really hated the smell from the walls. It is very strong. I am extra itchy these days.
  • Me: Can you specify when these hives happened?
  • Patient: It was particularly bad last week. I was outside when I noticed my arm getting the rash.
  • Me: I think there was haze last week. Was it the day the sky was extra smoggy?
  • Patient: Yes, yes, yes. I was outside for a long time that day, and I noticed that the hives were super itchy after I got home.

The reasonable assumption is that the patient has baseline eczema and was likely reacting to the haze, which triggered her hives. The rash improved once she was indoors, and she had no hives on the days she stayed at home. It had nothing to do with the antibiotics.

From this example, you can see that clinical reasoning does not follow clear rules. It may not present patterns for AI to recognize, and it demands cognitive flexibility. I am not even sure how an AI would correlate the weather with a hives breakout; it would need access to real-time data, as the sketch below suggests.
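
As a thought experiment, here is a hedged sketch of what that correlation might look like if a model did have a real-time air-quality feed. The PM2.5 readings, episode dates, and haze threshold below are invented placeholders, not a real API or real patient data.

```python
from datetime import date

# Hypothetical daily PM2.5 readings (µg/m³) from an air-quality feed
pm25_by_day = {
    date(2023, 6, 5): 35,
    date(2023, 6, 6): 160,   # the "extra smoggy" day
    date(2023, 6, 12): 150,
    date(2023, 6, 20): 30,
}

# Hypothetical patient-reported hives episodes
hives_episodes = [date(2023, 6, 6), date(2023, 6, 12)]

HAZE_THRESHOLD = 100  # assumed cutoff for a "hazy" day

hazy_days = {d for d, pm in pm25_by_day.items() if pm >= HAZE_THRESHOLD}
overlap = [d for d in hives_episodes if d in hazy_days]
print(f"{len(overlap)}/{len(hives_episodes)} episodes fell on hazy days: {overlap}")
```

Even this trivial join assumes the symptom diary and the environmental feed live in the same system, which, given how fragmented healthcare data still is, they rarely do.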

Now let’s play devil’s advocate

The truth is that no single doctor can retain the amount of medical knowledge that has come out in the past decade. That is why we have specialists, and that is why your family medicine doctor is meant to be a jack of all trades, though not necessarily a master of none.

So who will get replaced? Medicine is no longer about having the right answer and equipping doctors with the right knowledge. Knowledge is fluid. Does anyone remember that we were not supposed to eat butter in the 90s? Well, butter is in again. Across white-collar industries, AI is poised to replace lower-skilled roles universally. Those who lack critical thinking and the ability to ask the right questions may find their effectiveness diminished in the fourth industrial revolution.

Technologically-savvy doctors, on the other hand, will harness AI to provide consistently high-quality care at a fraction of the cost of human labor. This is the true power of technology: scalability and continuous availability.

However, not everyone will benefit in this new landscape. The individuals most impacted by the wave of automation will be clerical workers and those who depend on their wages for survival. Their jobs are kept for now simply because digitalization is not complete and data fragmentation is still an issue in healthcare. Jobs like clinic assistants, although not the most exciting, help pay the bills. Nurses, whose work is largely face-to-face, will remain essential for care delivery. Specialists in procedural-based fields will continue to be in high demand.

Hospitals can significantly reduce administrative overhead, but the real question is: where will the existing employees in these roles find their place? Unlike the tech industry, healthcare in Asia rarely witnesses layoffs, making it a sanctuary for IT professionals who may not secure positions at top tech companies or banks. Mediocrity is something that AI will eradicate or cure (depending on your perspective).

The unfortunate reality is that businesses may only retain humans for tasks requiring physical presence. This leads to a recurring question: is technology genuinely propelling us toward a better future? The absence of new industries to absorb those whose roles are at risk remains a pressing concern, and our technology is developing faster than we can rewrite our social contracts.

Babylon Health

I always find it ironic that they named the company Babylon Health. Babylon was the ancient Mesopotamian city famed for its terraced Hanging Gardens (one of the seven wonders of the ancient world), and it fell during the Persian conquest.

The events that unfolded at Babylon Health are a cautionary tale for healthcare leaders who don’t do the appropriate due diligence and accept tech wholesale based on a good story. I highly recommend this article from The Times UK.

A few years ago, I was a critic of Babylon Health when someone wanted to implement an AI symptom checker using similar technology. Now I would like to be petty and say, “I told you so,” but instead of doing that, I think it would be more appropriate to analyze why Babylon failed:

  1. Letting patients choose their symptoms: Any doctor would know that is probably a bad idea. I had a patient who insisted she was vomiting bile when it was leftover salad from last night.
  2. Telehealth: I remember the craze of people trying to do telehealth without crunching the numbers. You can’t price a telehealth visit the same as an in-person visit while your biggest cost (doctors) stays the same, so it is just not a good business model.
  3. Pseudo-health measurements: I said this a lot when I was working in the public sector with an agency. Step counts on their own don’t mean anything: garbage in, garbage out. A step count in a heart failure patient means something; a step count in someone who has undergone rehab means something; but steps alone mean nothing in a clinical context (see the sketch below).
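
To belabor that third point with a trivial, hypothetical sketch: the same step count is clinically meaningful or meaningless depending on the question being asked. The contexts and messages here are invented for illustration.

```python
# Toy illustration: identical data, different clinical value.
def interpret_steps(daily_steps: int, context: str | None) -> str:
    if context == "heart failure":
        # Activity trends can track exercise tolerance over time
        return "meaningful: compare against the patient's baseline"
    if context == "post-rehab":
        return "meaningful: tracks functional recovery over time"
    # A number without a clinical question answers nothing
    return "uninterpretable: garbage in, garbage out"

for ctx in ("heart failure", "post-rehab", None):
    print(f"6,000 steps, context={ctx!r} -> {interpret_steps(6000, ctx)}")
```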

The biggest elephant in the room is the fake-it-till-you-make-it mindset.

The tech industry likes to sell things that are not ready in order to drum up interest. That is a reasonable thing to do in unregulated B2C industries, but not in a regulated industry like healthcare.

Start-ups get themselves deep in the mud when they realize the tech cannot catch up with the dream they have sold. In healthcare, too much (lives) is at stake, so sooner or later the bubble bursts when patients or doctors come knocking because that s#$@ just doesn’t work. Babylon’s facial-analysis tool is a fantastic illustration of this phenomenon.

It was a show. The facial-analysis tool, a prop for a demo, never made it to market. The patient in the video was an executive assistant at Babylon, according to former colleagues and her employment record on LinkedIn. This sleight of hand was a small example of a culture fixated on form over substance, a trait common in Silicon Valley but dangerous in healthcare. And it was by employing Silicon Valley techniques that Babylon secured NHS work and won the support of senior politicians, using its deals with the health service to sell itself to private investors and expand globally, including in the US, Rwanda and beyond.

I have more questions than answers at this point. But one thing I do know is that AI, like all technology used in healthcare, should be regulated and validated, so we can avoid the false prophets and the too-good-to-be-true dreams sold by tech bros/gals (Theranos).

© 2024 Petty Chen