The Big Story:
“Metaphors really matter, and they have shaped the public discourse” for all kinds of new technologies, says Emily Weinstein, a technology researcher at Harvard University’s Center for Digital Thriving, in Scientific American.
Are We Hallucinating by Humanizing AI?
When a generative artificial intelligence program unfurls a solid B-level, 800-word composition within 30 seconds of your one-sentence prompt, is the machine “writing”?
Hell no, says author John Warner in an impassioned defense of human writing and communications in “More Than Words: How to Think About Writing in the Age of AI.”
“Writing is an embodied act of thinking and feeling,” Warner writes. “Writing is communicating with intention. Writing involves both the expression and exploration of an idea. Reading and writing are inextricable, and outsourcing our reading to AI is essentially a choice to give up on being human.”
Does AI “write”? Is it “intelligent”? Does it “hallucinate”?
These words are more than shortcuts. As leaders, you know the power of language to shape how we think and act, especially in moments of transformational change. The right words empower or disempower, inform or misinform, create fear or desire or excitement.
In the late dawn of this new AI age, will we choose to animate AI with the powerful metaphors of humanity?
It’s important we get this language right. And soon.
We won’t belabor here the swift sweep of AI through every aspect of our waking lives or note its rush into the healthcare and communications professions. AI is a technical marvel and it’s terribly exciting, in every sense of the phrase.
Instead, let’s be vibrantly aware of how we choose to think about AI – and the story we tell ourselves – as it walks into our office and settles into our most comfortable chair, as if to stay for a while. Is it eyeing our desk?
Metaphorical Shortcuts for the AI Unknown
Carefully choosing the right metaphors for this moment is a consequential conversation in tech and policy circles today.
One researcher has called for a metaphor observatory to capture the proliferation of metaphors attempting to define AI.
Communicators must join this active chat.
After all, it’s not the tech smarts that will direct our adoption of AI; it’s the stories we tell each other about it.
“Investors, policymakers and members of the general public make decisions on how to treat these machines and how to react to them based not on a deep technical understanding of how they work, but on the often metaphorical way in which their abilities and function are communicated,” according to a trio of researchers in Ethics and Information Technology.
Is AI your “partner” or a “machine”? Is it destined to augment or replace you?
We are all susceptible to stumbling into the same linguistic trap as we try to feel our way through the dark.
“Anthropomorphism, or the innate tendency to attribute human traits, emotions, or intentions to non-human entities, is a mechanism we use to grapple with the unknown, by viewing it through a human lens,” writes Andrea Hak in “AI doesn’t hallucinate — why attributing human traits to tech is users’ biggest pitfall.”
“There is a particular danger that we can fall into this trap with AI, as it’s a technology that’s become so pervasive within our society in a very short time, but very few people actually understand what it is and how it works. For our minds to comprehend such a complex topic, we use shortcuts.”
You know well today’s shortcuts for generative AI. You may use them yourself.
An AI’s results can be wrong, but it’s not “lying” because its algorithm has no agenda. It does not “remember” or “review” or “know” or “care” or “understand” or have “intent.” Humans do these things. The machine on your screen doesn’t.
AI’s math can be in error, but don’t call it a “hallucination.” That’s simply good marketing.
Wrestling to Write Well
Few professions appear as immediately susceptible to the rise of generative AI text production as communicators, marketers and storytellers. Maybe, some say, wordsmiths can be swapped for a strong Wi-Fi connection and a smart prompt.
But what do we call the thing that is happening as sentences magically snake down your screen in response to your question?
It’s not writing.
To write is to wrestle with an idea, to host a roundtable in your mind where your knowledge, experience, memories and feelings grapple with your intentions, the rich palette of words and the rhythms of language so that you can communicate with others.
Writing is a deeply human process. The great struggle to write well long precedes the structured sentences that try to, finally, contain it.
Quite often, writing is an act of discovery, leading you into new places as you chase ideas home and find new ones on the way. You write to learn what you think.
“Large language models do not write,” Warner says.
“They generate syntax. They do not think, feel or experience anything. They are fundamentally incapable of judging truth, accuracy or veracity.”
Generative AI is a transformative technical triumph, but it’s not an idea wrestler. It’s a word predictor and product producer.
It uses statistical prediction, honed on trillions of words scraped from the writings of other humans and, now, other AIs, to guess which words to string together, and in what order, in response to the string of words in your question. It arranges word “tokens” according to the predictable patterns of language to output its word products.
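For the curious, here is a drastically simplified sketch of that pattern-following idea. It is not how a real large language model works under the hood (those use neural networks with billions of parameters trained on enormous corpora); it simply counts which word tends to follow which in a tiny invented sample sentence, then chains the most likely continuations together. Every name and the sample text below are made up for illustration.

# Toy sketch of next-word prediction (illustration only; not a real LLM).
from collections import Counter, defaultdict

sample_text = (
    "writing is thinking and writing is feeling and "
    "writing is communicating with intention"
)

def build_bigram_counts(text):
    """Count how often each word follows each other word."""
    words = text.split()
    counts = defaultdict(Counter)
    for current_word, next_word in zip(words, words[1:]):
        counts[current_word][next_word] += 1
    return counts

def predict_next(counts, word):
    """Return the statistically most likely next word, or None."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

def generate(counts, start_word, length=8):
    """String together the most probable words, one after another."""
    output = [start_word]
    for _ in range(length):
        next_word = predict_next(counts, output[-1])
        if next_word is None:
            break
        output.append(next_word)
    return " ".join(output)

counts = build_bigram_counts(sample_text)
print(generate(counts, "writing"))
# Prints a fluent-looking chain such as "writing is thinking and writing is ..."
# produced purely by pattern-following, with no idea behind it.

The point of the toy example is the same as the point of the paragraph above it: the output can look like language without any thinking having taken place.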
The danger is we and others confuse the product with the process and, thereby, reduce the value of both.
We know what you’re thinking: Does this mean no writing products are best left to AI’s speed and efficiency? There likely are some. We’re not talking here about churning out announcements for the next breast cancer screening.
Instead, we’re reflecting on the seemingly mystical power of AI to produce swaths of words in seconds that feel thoughtful, even if they are merely the output of a fantastic predicting machine.
“Any sufficiently advanced technology is indistinguishable from magic,” said Arthur C. Clarke.
The Magic of Meaning
Which can only lead us now to ponder the power of the Magic 8 Ball.
And here’s the shocker: the Magic 8 Ball isn’t magical, after all.
It’s you who bring the magic to the little ball, through the meaning you give its answer to the yes-or-no question you asked before you shook it and peeked at its window.
“Signs point to yes,” or “My sources say no” or “Reply hazy, try again.”
The words on one face of its 20-sided die, emerging from a bubble of dark blue alcohol, might affirm (or crush) your eight-year-old dreams because of the playful illusion you granted the plastic ball.
For a moment, you pretended it was enchanted. AI is having its Magic 8 Ball moment now.
As writers, communicators and storytellers, we are all meaning-makers. As AI strides into our work and homes, be ever conscious of the meaning with which you cloak this extraordinary machine. Think about it. Write about it.
Contributors: Emme Nelson Baxter
Image Credit: Shannon Threadgill