The Big Story: “Now that we can transcribe conversations, should we?”
“It’s great to have a full record of what was said … But there may be a hidden cost.”
How Care Changes with AI as an Audience
3.5-minute read
Step into the exam room—a small, sacred space where a physician, a nurse and a patient play out an intimate drama.
Secrets spill, care unfolds, trust takes root in those fleeting moments they share. It’s raw, human, unguarded. So it has been for generations.
But now, there’s a new player slipping in…an artificial ear, transcribing every word, assisting silently. Maybe even judging the show.
Does this unseen audience, ambient AI, change the script?
Writing about workplace chatter in the article above, The Wall Street Journal’s Alexandra Samuel warned, “The biggest risk of AI transcription is how it affects our ability to trust one another.”
In an exam room, where stakes—and emotions—run higher, that risk feels like a live wire.
“When people know they’re being recorded,” Samuel noted, “they don’t talk the same. Some get careful, knowing there’s a record.”
This isn’t news. Back in the 1920s, the Hawthorne Effect showed factory workers tweaking their hustle when they knew management’s eyes were on them.
Observation shifts behavior—it’s human nature. People change when they know they are being watched.
Now, swap stopwatches for tech. Joshua Meyrowitz’s No Sense of Place saw this coming in the ‘80s: Technology blurs the line between private and public, thrusting us onto a relentless stage. He predicted the TikTok life—always performing, always exposed, fraying authenticity under a digital gaze. Ask a teenager.
Healthcare’s not dodging this spotlight. For physicians, that “on stage” life hits hard.
Picture a doctor mid-chat, mid-empathy, suddenly wondering: How’s this landing with the AI? Too sharp? Too soft? Too edgy? Am I quick enough? Self-doubt sneaks in, turning a real moment into a rehearsed line.
It’s not just nerves—it’s a psychological grind. The exam room, where patients bare their bodies and souls, risks becoming a theater (or a management performance review), every word a bow to an algorithm’s unseen nod. That spontaneity, that confidence defining great care? It could unravel quietly, eroding the trust patients lean on.
And how do patients feel in this new limelight? The Hawthorne Effect sweeps them in, too. It’s challenging enough for caregivers to earn the trust needed for patients to be vulnerable and share intimate insights into their health, including details they may find embarrassing.
Does that patient behavior change when AI is listening? In a survey from 18 months ago—ancient history for AI adoption—half of patients said they would not trust advice from doctors who were using AI. The survey described Americans as “cautiously warming” to AI.
A moment of stage fright for everyone.
AI’s Here—And It’s Growing
This isn’t a maybe. It’s an inevitability. Trust in AI lags, for now. But familiarity and evolutionary surges in the tech itself will close that gap. Some six in 10 physicians already use AI in their workflow, a stat set to soar as the technology improves year-over-year.
The benefits are undeniable. Ambient AI catches the details—meds, histories, fears—that slip when a caregiver is juggling tools and screens. A New England Journal of Medicine report pegs it at 10-20 minutes saved per visit: time handed back for connection, and less after-hours “pajama-time” charting.
With over 26% of doctors eyeing the exit over admin burnout, this isn’t just a perk—it’s a lifeline. Medicine unshackled: listening over typing, seeing the patient, not the chart.
The Messy Now
AI is tearing through healthcare like a storm, but we’re still in its wild early days.
Healthcare Dive asked this week: “Could AI tools meant to ease clinician burden actually pile it on?” Too early to call, they say, but it hinges on how much doctors must babysit the tech and whether fee-for-service incentives turn saved time into more patients crammed in.
“That’s a real worry,” Graham Walker of Kaiser Permanente told HIMSS in Vegas. “The easiest way to juice revenue is pushing doctors to go faster, see one more.” Is that the AI future we’re scripting—caregivers as cogs in a speed-up machine?
Whoa. Is this how we want to use the power of AI? Should we, just because we can?
We must be cautious that AI’s tectonic momentum does not propel us into uses we have not carefully considered or cautiously stepped into, lest we sleepwalk into an unintended future.
“Everybody has their own pace of comfort with adoption of change,” said Brian Hoberman, MD, EVP of the Permanente Federation and CIO of the Permanente Medical Group.
“As leaders and as people who are trying to support care delivery, we have to be really empathetic to that and try to make sure that we reach people as they are, and we work with them as they are… We have to make sure too that when we’re using technology, that it’s purpose-built to do the job that we need to do and that we understand what the risks are when we’re using it.”
It’s the Empathy, Stupid
Last month, Harvard Business Review dropped a gem: 167 execs worldwide used GPT-4 to coach their communication, analyzing talks and serving up feedback. The verdict? “Largely useful,” even “surprising,” challenging assumptions with granular insight.
But here’s the rub: “AI can’t authentically replicate empathy” or catch “cultural nuances, emotional dynamics, and unspoken cues.” The head nod, the distracted gaze, the nervous hands. AI can’t read them today. It’s a tool, not a soul. Imagine that coach in the exam room—sharpening a physician’s clarity, for sure, but also missing the heartbeat of care.
Patients don’t just need facts; they need feeling.
Curtain Call
The promise of ambient AI is that it can lighten loads and elevate care. But not if it turns physicians into actors and patients into props. Get it wrong, and that spotlight torches trust and connection. Get it right—with transparency, training, and a human-first lens—and it’s a trust-builder, proof we can evolve without losing our core. The tech’s here, unstoppable.
Our job? Keep the exam room a sanctuary, not a soundstage.
Contributors: Emme Baxter, Alex Hunter, Jed Lam
Image Credit: Shannon Threadgill