AI’s Seductive Siren Song
“Now stop your ship and listen to our voices.
All those who pass this way hear honeyed song
poured from our mouths. The music brings them joy,
and they go on their way with greater knowledge.”
So promise the Sirens to Odysseus. Contrary to the classical artwork produced over the centuries, the Sirens do not attempt to persuade the King of Ithaca with lust but with knowledge, which they promise to transfer to him. All he has to do is listen.
As a writer, I have reflected often and extensively on the proliferation of AI tools. In both the professional and creative realms, and often in the space where the two intertwine, AI tools like ChatGPT and Midjourney have infested everyday life. It’s the hot topic, mentioned at every meeting, discussed ad nauseam on every internet platform, and foisted upon every worker.
The proliferation of these tools has been impressive, to say the least. Last June, AI was a laughingstock. Companies that attempted to leverage it to create blog copy were mocked (often rightly so). Then the conversation did a complete 180. Now every company has a “stance.” Now people on Twitter are claiming AI will replace the film industry itself because they saw an AI-generated video that vaguely replicates cinematic form.
The AI conversation has been a rapid and passionate one. Many are excited about the potential of these tools. Indeed, even as ChatGPT continues to exhibit fatal flaws, like giving blatantly incorrect information, companies are encouraging their employees to use it to create content out of a misconception that it saves time and money or that it increases productivity. AI can’t seem to make a picture of a hand to save its nonexistent life (not without turning it into something worthy of a Lovecraft story, at least), but that hasn’t stopped many from attempting to use it to enter artistic competitions.
But it’s curious that AI is given so much leniency in the professional world based on what it might achieve one day, if we dump the world’s supply of drinking water onto it.
After all, would you trust Salesforce if, half the time, it gave you incorrect client information?
And that’s not even getting into the fact that AI is not creating; it’s imitating.
THE IMITATION GAME
AI is playing a magic trick on us. It’s making us think the coin disappeared, and while we gawk at a pale palm it is pocketing that metallic glint. That’s because AI does not create in the traditional sense. It imitates.
AI art is the equivalent of a toddler’s burped-up breakfast, all mealy and off-white. It regurgitates a result based on what it “thinks” you want to see. If you ask it to write a blog about the 2016 Honda Civic, it will provide generic copy that is, at the most basic level, mimicking a thousand other blogs humans have written. Similarly, when Midjourney is asked to create art, it simply takes existing photographs, paintings, and illustrations humans have created and imitates them to try to fit the written description.
AI can stencil Wes Anderson’s style over The Matrix, but it cannot, on its own, achieve the former’s dry wit or the latter’s existentialism.
AI can replicate Kurt Cobain’s voice so it sounds like he is singing “Black Hole Sun,” but we will never know how Kurt would actually sing the song: where he would scream, how he would inflect, how long he would hold a note.
It’s these small details that make art, though. Wes Anderson’s style is not a carbon copy of somebody else’s; it’s something he’s spent his career honing and refining. How Kurt would pause, sing, and scream changes the feel and tone of a song like “Black Hole Sun” (you only need to watch him during Nirvana’s MTV Unplugged performance to recognize that truth).
It’s these small details AI cannot, and will never be able to, create on its own.
PULLING THE (PRODUCTIVITY) RABBIT OUT OF THE HAT
In the professional world, AI is viewed with excitement because of its productivity potential. After all, if a writer can just ask ChatGPT to write a blog, and the chatbot does so in 60 seconds, that is a massive amount of time saved. A writer will often take a decent chunk of time researching and outlining before they begin drafting.
But here we come to another problem. AI is not actually saving time; it’s performing a temporal transference. The front-end work a writer requires is simply transferred to the back end when AI is leveraged. That’s because, as mentioned before, AI tools produce incorrect information with startling confidence. Not to mention, they sometimes use incorrect words, structure sentences in fascinatingly weird ways, and are incapable of injecting much-needed nuance. Essentially, unless you’re asking for a blog about Newton’s three laws, their results are usually shallow and inconsistent (and even that Newton blog should be fact-checked).
To put it another way, ChatGPT and AI tools don’t free up space on the highway so you can cruise home faster. Instead, they’re the equivalent of merging into the left lane only to come to a slow stop right beside the car you were previously behind.
While this is something writers and artists are acutely aware of, there seems to be a disconnect between what some believe AI is capable of and how it can actually be leveraged. If AI is a scalpel, then some are asking artists to hammer nails with it.
There is another facet to the productivity question: the result any AI tool provides is dependent on the input a human types. You will find no shortage of articles online promising to have “5 secret inputs that unleash ChatGPT’s potential.”
The hard truth is that, right now, you are more likely to spend a couple of hours searching for the right inputs and then editing the resultant copy. In that time, a writer could already be well on their way toward completing the piece. Not only that, but their copy will be better and more specifically tailored to the request’s needs.
So, in the end, AI is not saving time, nor is it resulting in better copy. The same is true for art.
But what of the argument that a writer who uses AI is more productive than a writer who doesn’t? There is some truth to this. But, again, I think there is a fundamental misunderstanding about what a writer does and who a writer is.
A writer is not just a keyboard monkey, tapping out words in a pretty line. A writer’s process is just as important as the work they deliver. ChatGPT can easily turn from a helpful tool that spurs inspiration to a crutch that severely hinders a writer’s ability to, well, write. And when that happens, whatever productivity benefits the writer saw at the beginning of the process will be diminished.
This is not to say writers shouldn’t use ChatGPT to aid their process. But the tool is not some secret solution to fast, perfect copy, and nobody should be under the impression that it is.
THE HUMAN ELEMENT
There is yet another fatal flaw with AI — it’s artificial.
So much of writing is about voice. Without a compelling and engaging voice, an author is the equivalent of a chef with a blunt knife.
AI doesn’t just lack a voice; it will never be capable of gaining one. Because an author’s literary voice is individual to their style, their perspective, and their work, a chatbot will never be able to produce truly unique content. It will only ever be able to copy other writers’ voices.
Once again, imitation rears its head.
This is also the hidden ingredient that proves the age-old adage: content is king. At the end of the day, we like being able to relate to other people. In the professional space, a compelling and authentic voice is what gets people to click a “Register” button, or what persuades them to provide their information in exchange for a piece of long-form content.
In the creative space, a writer’s voice performs a peculiar alchemy. It’s how King makes us afraid of Pennywise the Clown. It’s how Gaiman makes us believe Mr. Wednesday and his foes are real. It’s how Shakespeare’s tragedies have stood the test of centuries. These voices are born from struggle, pain, insight, and tireless work. More importantly, they are the result of human honesty, of an author’s desire and ability to pour their soul and their heart onto the page, to weave it as a seamstress would a blanket for their newborn child.
Inherently, AI cannot and will never be able to replicate this authentically, let alone create it.
AI may be able to appropriate a certain style or voice, but it cannot make me believe the words it regurgitates. It can write a scary scene, but it cannot scare me. This is the flaw at the core of AI content that will remain, no matter how successful ChatGPT becomes at replicating human behavior. And this is the core of the AI siren song.
For all its potential, it will never be human.
WHY DOES IT MATTER?
There will be, of course, those who defend AI with their chest.
“What does it matter if it can’t write like a human? It can produce more work.”
“Readers don’t care if something was written with AI; they just want a good story.”
“I don’t care if content quality dips; I care about quantity and cost.”
These are all, of course, opinions that, however much I may disagree with them, individuals can stand by. For some folks, content quality may not matter as much as the amount of content they can push out (and at what cost). To some, an author’s ability to weave sentences together, or an artist’s ability to paint, draw, or design, may not matter as much as the speed at which the result is produced.
But I think there is another fundamental disconnect. Many people don’t like AI content. When ads for various AI copy generators go live on Facebook and Twitter, the comments and replies, respectively, are not full of joy and hope. They are disdainful, angry, and more than a little skeptical.
Similarly, while many authors have “written” AI books, those stories have been met with just as much criticism as praise, if not more. Editors of literary magazines have begun banning stories written with AI — and editors can tell because these stories aren’t well written. They are, quite often, conventional retreads of other, better stories.
I cannot, in good faith, argue with the person who claims content quality does not matter as much as quantity, nor with the person who places more value on what a piece of content costs to make than on the results it will produce. There will always be those who seek speed over effort, who aim to cut corners and add a few cents to their bottom line.
But this gets to the heart of AI’s false promise, of its siren song, and it’s the thing that will lead many astray.
ODYSSEUS’S SURVIVAL
In The Odyssey, Odysseus is warned of the Sirens by the goddess Circe. She advises him to plug his crew’s ears with beeswax so they can drown out the Sirens’ song and avert certain death; if he wishes to hear it himself, he must be bound. So the Ithacan King orders his crew to tie him to the mast, and while they plug their ears, his remain bare, susceptible to the song.
It’s then that the Sirens attempt to sway Odysseus with their knowledge. It’s their knowledge that’s the key here. Dr. Emily Wilson, who recently translated Homer’s epic poem, describes the Sirens’ temptation in exactly these terms: their seduction is cognitive, a promise of knowledge rather than of pleasure.
This cognitive seduction, of course, is nothing but a red herring. The Sirens are attempting to draw Odysseus and his crew to their doom, to drive them wild and have them sink their ship, never to see Ithaca again.
While the crew steers the ship through the waters, Odysseus goes mad. He begs to be freed, but his crew ties him tighter. Why? Because they know that Odysseus, as strong and noble as he is, will send them to their death as fast as a new sailor with no brine in his shirt.
I think a lot of people who are excited by and drawn toward AI have good intentions. This is a new technology, and in many ways, the cat is out of the bag (or, to keep with the Greek mythology, Pandora has already opened her box). We cannot stuff AI back into the depths of non-existence any more than we can other technological advancements.
But we can be wary of what it proffers.
AI sings a sweet song, indeed. It promises unlimited potential, cost savings, time savings, and, most of all, accessibility that was previously unthinkable. Some may look at Midjourney and wonder why they should pay painters and artists at all. Others may look at ChatGPT and wonder why authors toil away for months, even years, at a time on a single book.
But to what end? Why are so many driven rapturous by its purported promise? Why are so many willing to turn their back on proven, capable experts for a tool that has a serious aversion to truth and quality?
I am, of course, biased. There is an allure to AI that I cannot argue against. But I wonder if that allure, much like the Sirens’ song, is drawing us toward craggy, rock-ridden shores. I wonder if we’ve forgotten to block up our ears with beeswax. I wonder if those who ravenously post about how AI can produce reams of copy at a moment’s notice realize that their ship is veering dangerously toward ruin, that their future promise is not endless productivity but rather lungfuls of water and broken timbers.
I can only hope they have asked their crew to tie them to the mast.
After all, that’s the only reason Odysseus survived.