The Big Story: When Doctors Use a Chatbot to Improve Their Bedside Manner
“Doctors are famous for using language that is hard to understand or too advanced. It is interesting to see that even words we think are easily understandable really aren’t.” The fifth-grade level script, he said, “feels more genuine.”
The Chatbot Who Cares: Coaching Caregivers on Empathy
By David Jarrard
This is not a story about chatbots. Not really.
This is a story about thoughtful physicians using a new tool to help them be better communicators and, through this, better healers.
“Most doctors are pretty cognitively focused, treating the patient’s medical issues as a series of problems to be solved,” says one of the physicians quoted in the story above. As a result, he said, they may fail to pay attention to “the emotional side of what patients and families are experiencing.”
Clarity is compassion, they say. True communication is not merely the transmission of data. It’s honoring your audience by employing the right language in the right spirit of delivery, not only to be understood but to come across as compassionate as you are. It’s an act of translation.
Insert cliché: Clear is kind.
Clear is hard work, too, and one of healthcare’s greatest challenges.
In our frantic, how-many-patients-have-you-seen-in-the-last-hour business, slowing down to translate your care is a luxury even if it leads to better outcomes. Easier to use the Latin and hope the pharmacist helps ensure medication compliance.
The challenge is not limited to physicians, either. The movement for “transparency” – in prices and business practices – is as much a cry for translation as for information.
The flood of complexity and healthcare acronyms can puree in the listener’s ear into a word soup of condescension mixed with mistrust.
Instead, the priority should be a desire to be understood, not to impress, and not to shrug off the cumbersome work of translation. Transparency cultivates trust. Being clear is good medicine and a competitive advantage.
And, among other good avenues, the algorithm may be just what the doctor ordered.
More chatting about Chat
So we’re clear: The ethics and accuracy of using a chatbot for clinical diagnostics and treatments remain very much a work in progress. That’s a different use of the tool and not the point here, but for more on that angle, be sure to check out the AI in Real Time series from our colleagues at Chartis.
Every caregiver who cares – and every system that supports them – wants to engage well and richly with patients. It leads to improved treatment and medical compliance. Better outcomes overall. Improved patient experience scores and stronger patient loyalty. It’s better care.
Trying ChatGPT is an acknowledgment that the way they’re communicating now may not be fully resonating. That a different approach is needed. And a little outside help never hurts.
AI can be used to create good draft responses quickly, saving time and reducing the burden on the messenger.
But what else does this development tell us about improving communications in healthcare?
Language is a differentiator. Physicians are in a race to help their patients understand their health and take action to improve it. Likewise, health systems are in competition with those who have a track record of simplicity and clarity and are bringing it to care delivery. The edge goes to those who take language seriously and find ways to use it more effectively.
Tone counts. In high-stakes communication, it can be easy to focus only on the content of the message, not the impact. The tone of the responses spit back by ChatGPT reminds the messenger of what they hope to accomplish. In the case of a physician, it’s not just delivering medical information but also giving hope, offering a reality check or compassionately delivering bad news.
Clarity fosters buy-in. Theoretical physicist Richard Feynman was renowned for his ruthless simplification. This testimonial from a colleague sums it up. The team would have him pick apart – sentence-by-sentence – their technical presentations ahead of time. “Don’t say ‘reflected acoustic wave.’ Say ‘echo,’” he would admonish. According to the colleague, “Nothing made him angrier than making something simple sound complicated.” Essentially, to buy in to an idea, audiences must first understand it.
Tools are coaches, not stand-ins. The biggest value of the chatbot is to help the human communicator learn and refine their own approach. If ChatGPT can point us in the direction of clearer, simpler language, then it’s our responsibility to learn from that guidance and use the lessons to improve the next human-to-human interaction.
Language can improve health and equity. Think of language and vocabulary as medicine. Clarity and simplicity matter because if people don’t understand their diagnosis and their treatment, they won’t engage in their own care. Mistranslation and miscommunication lead to failed medical delivery. If an algorithm can help show a patient simply why taking their meds, doing their PT and eating smart matter, let’s take advantage of it.
The curse of knowledge can be especially damning in this most complex industry. It can seed mistrust in institutions and relationships, especially with those singular individuals anxiously sitting in a smock in your exam room, eager for conversation.