The Big Story: Texas church holds AI-generated services, uses ChatGPT – The Hill
“There’s so many different applications for AI,” said pastor Jay Cooper. “I just had the idea, ‘What would it look like to incorporate this into a worship service?’”
What is sacred?
By David Jarrard
3-minute read
“Can a prayer written by artificial intelligence in some way communicate truth?” asked Jay Cooper, pastor of Violet Crown City Church in Austin, Texas. “Can you experience God through that? What is sacred?”
Since the launch of ChatGPT last November, we’ve all watched this new generation of AI chatbots infiltrate new and sometimes unexpected spaces, including those once reserved for mere humans. Testing boundaries, as children do.
How far is too far…or too soon?
To explore the question of hallowed ground, Cooper’s church recently hosted a Sunday service entirely generated by ChatGPT. While the AI-led worship content was robust and relevant, it lacked, the pastor said, the essential ingredients needed for connection and emotion.
“I think the human touch is critical in life and in ministry,” he concluded. “I think the messiness of humanity should be present.”
How does “the messiness of humanity” meet AI in the sacred space of healthcare delivery?
So much good seems possible as our healthcare industry bounds forward to realize the timesaving, cost-saving, outcome-improving promises proffered by artificial intelligence. The boundaries push back weekly.
But anyone who has had a laptop shoved between them and a caregiver during a clinic visit can understand why most consumers (57 percent, says Pew) are initially cautious about healthcare’s rapid adoption of AI.
The concern is not that the AI’s information will be wrong. In fact, most are inclined to agree that its use is likely to reduce clinical mistakes and improve health equity. Instead, the concern is that the use of AI will interfere with the relationship they have with their doctor, their nurse, their care providers.
The relationship is the sacred space.
We humans can be very protective of our bonds, especially given the epidemic levels of relationship-starved anxiety and loneliness that plague us today.
It’s the “in-between” places in a relationship – between clinical colleagues, between caregivers and patients, between leadership and staff – that serve as the fertile seedbeds for nurturing the culture and brand-building attributes you most prize in your work: trust, belonging, preference, engagement, loyalty, action. These can be powerful, tender and volatile places.
It’s also in relationship where the best healthcare happens.
“For now, diagnostic experts say, no form of AI can replace a human clinician’s ability to strike up a personal bond with a patient, observe subtle cues and nuances in a physical exam or test result, spot gaps in a patient’s story or medical history and revisit things that don’t add up,” writes Laura Landro in The Wall Street Journal.
“As hospitals begin to adopt new algorithms and chatbots, they are also recognizing the limitations and risks of the technology. While AI can process and interpret massive amounts of medical data, there is a human art to diagnosis—and the new technology can’t duplicate many of the nuances that doctors see.”
There it is: The art of medicine, always living in tension with the science of healing. You know this space. You’ve navigated the introduction of new medical technologies throughout your healthcare career. It’s the scale that’s different.
That AI is powerful and teeming with potential for cash-and-time-strapped healthcare organizations is unquestioned. In operations, scheduling, workflow management, clinical diagnostics, development of best practices, pharmaceutical enhancements and so on, the opportunity is rich and will soon be worthy (we all hope) of the massive investment of blood and treasure it has captured.
But, today, the potential is far from realized, and the application of AI’s promised solutions can breathlessly rush ahead of its acceptance by colleagues and patients, especially as it becomes visibly integrated into your important relationships. Integrated…if not developing a “relationship” of its own.
“On Reddit forums, many users discussing mental health have enthused about their interactions with ChatGPT,” reports Scientific American. “ChatGPT is better than my therapist,” one user wrote, adding that the program listened and responded as the person talked about their struggles with managing their thoughts. “In a very scary way, I feel HEARD by ChatGPT.”
In one recent experiment, an online therapy chat service found that users could often tell when responses came from a bot, and once they learned the messages were AI-generated, they disliked them. The discovery “provoked a backlash.”
“It appears that even though we’re sacrificing efficiency and quality, we prefer the messiness of human interactions that existed before,” the researcher said.
Handled poorly, the AI organ risks being rejected. It’s a change management challenge writ large.
How do you honor your patient and colleague relationships while taking advantage of the opportunities AI may present?
“Patients want to know that AI is used as a complement and not as a replacement for the clinician they have a relationship with,” says our own Kevin Phillips, co-founder and COO of Jarrard Inc.
“A key to patient engagement is the key to communications in general: It’s about trust and transparency. There’s trust in the message and in the messenger, and then there’s trust and transparency in the information.
“Doctors are using AI to communicate with greater empathy and to better balance clinical advice with compassion. They’re even using it to help them better deliver bad medical news. Doctors can use AI to translate complex jargon and concepts into a message that’s easy to understand. Used with an empathy angle, AI can really help in patient engagement.”
Integrating AI into the art of care is key to its adoption. We note that the University of Texas Health Science Center this month launched the first known dual degree in medicine and AI. Expect more to follow soon.
The good use of AI will become good medicine. AI is likely busy in your organization today. As it proves its value, it will move closer to center stage in your important relationship work. And as it moves into the limelight, it will require a proper introduction that acknowledges both its great strengths and its limitations.
Back in Austin at Violet Crown City Church, attendee Ernest Chambers said he was able to worship during the AI-generated service, though it fell short.
“I’m not sure that AI can actually express the emotions of love and kindness and empathy,” Chambers said. “I think that we must practice love and express that. Not only feel it, but we must express it.”
Contributors: Kevin Phillips, David Shifrin, Emme Nelson Baxter
Image Credit: Shannon Threadgill