The Human Touch

Illustration by Michael DiMilo

By Geoff Carter

Revelations about the latest advances in artificial intelligence are causing widespread concern across the entire spectrum of the creative community. For example, ChatGPT, an AI writing generator that mimics human writing well enough to fool some teachers and college instructors, is one of the bones of contention in the Writers’ Guild strike in the motion picture industry. Writers want guarantees that AI writing generators will only be used for certain low-level tasks (although it’s not unbelievable that an AI generator could figure out the formula for a typical superhero or action film pretty easily; maybe that’s why the writers are concerned).

AI technology is more widespread than we might think. Those chatbots you access to help you fix that computer glitch, the navigation tools in your car, the facial recognition software you use instead of a password, the seemingly random reasons some posts come up in Facebook or Instagram, and some parts of your children’s assignments in school are all managed by artificial intelligence.

Last year, “Théâtre D’opéra Spatial,” a piece created with the AI program Midjourney, took first place in the digital arts category of the Colorado State Fair’s fine arts competition. The win provoked a decidedly negative reaction from other artists, who argued that typing in a few lines of text does not an artist make. Some also accused the piece’s creator of cheating, maintaining that using artificial intelligence to create a painting defiles the artistic process.

While AI writing assistants and organizers have been around for years, recent advances in the technology have created concern among academics, journalists, and authors everywhere. Because the bots generate text so much more quickly than humans can, many companies are incorporating these generators into their systems and requiring new workers to be fluent in AI technology. AI writing has gotten to the point where it is very difficult, if not impossible, to distinguish from human writing. It is even starting to invade the realm of the novelist and the poet.

Jasper, Sudowrite, Chapterly, and Writr are programs that generate character concepts, story templates, content, and title ideas for novelists, and they even gauge audience reactions. No wonder the screenwriters are nervous. Since these AI programs typically draw on hundreds—if not thousands—of databases to determine story arcs, character reactions, or dialogue, what we end up with might be frighteningly similar to the work of a human writer, which we typically assume would be better—more genuine.

If that is the case, differentiating between human and digitally conceived work should depend on what makes people intrinsically human. In other words, what do we have that the AI generators don’t? What do we know about life and the human condition that they could never fathom? And could it be that AI has, or might someday attain, consciousness and self-awareness?

Of course, it could be argued that since these writing and art generators are programmed by humans, they have a vast knowledge of all the joys, pains, agonies, and fears that humans experience, and so can access the human experience. Maybe. But does reading about love offer the same emotional complexities, the joys, the disappointments, the ecstasies, and the pains, that living through a relationship entails? Is the gnawing hunger of Pablo Neruda’s “Love Sonnet XI” or the creepy devotion of Edgar Allan Poe’s “Annabel Lee” understandable to a machine? Or does absorbing those words give a lifeless collection of data bits enough understanding to grasp them? In other words, is the underlying emotional cognition secondary to the art?

The human condition is the underlying principle and problem here. If it’s assumed—and it typically is—that art is an intrinsically human product, what is it about being human that makes our species, and by extension, our art, so special? Why can’t it—and shouldn’t it—be imitated? 

First off, people are biological beings. Like all living creatures on Earth, we need to eat, drink, eliminate waste, sleep, protect our young, and reproduce. These basic needs are foreign to a machine, which has no first-person experience of them. It doesn’t feel hunger, lust, or fatigue. It doesn’t dream. But does that even matter? A reader can feel the despair of Raskolnikov in Crime and Punishment without bearing the guilt of having killed someone; everyone (everyone human, that is) has felt some degree of anguish and can relate to the character. An AI generator—as far as we know—cannot, except through the programmed words of past writers.

Human beings are also—presumably—the only species on Earth that is aware of its own mortality. We alone among the fishes and birds and trees and rats know that we’re going to die someday. This knowledge is arguably the foundation for religion, ambition, and art, along with other human endeavors. Does an AI generator know the fear of death, of oblivion, of the unknown? Or, if it is assigned to write about “Annabel Lee,” will it understand the speaker’s insatiable grief at the loss of his great love? Will it understand the courage of those who willingly faced death in All Quiet on the Western Front without knowing death? Again, all it can do is reach back and use second-hand experience.

The only creatures on Earth that create art are human beings. A few other animals, like chimps or sea otters, use rudimentary tools, but most of God’s creatures take the world as it comes. AI does not yet create art by itself, but the latest advances in AI technology have shown—to the surprise of their creators—that some of the programs are able to formulate new ideas from the information given to them.

According to Professor Mike Sharples, “It’s not just taking previous words and reusing them, but it’s creating an internal representation, not just at the surface text, but of the ideas and the concepts behind it. It’s creating this neural network, this multilayered network.” (euronews.next).

If these “new ideas” are derivations and reevaluations of past ideas, are they truly new? And if they are, is art itself predominantly derivative? Some believe there are only seven basic narrative storylines from which all literature is derived. While it may be true that there are a finite number of narrative types, the variations on them are endless, unique, and—by definition—human. An AI generator might understand humor, quirkiness, sorrow, and emptiness, but would it ever come up with Gravity’s Rainbow, Catch-22, Ulysses, Asteroid City, Satyricon, Moonrise Kingdom, or Edward Scissorhands?

It might understand beauty, loneliness, and nature, but could an AI program present us with Starry Night, Christina’s World, Nighthawks, or Impression, Sunrise? Is there something so intensely personal, so inherently human in these pieces that it can’t be translated through any number of data bits? Does Van Gogh’s despair, Heller’s irony, or Anderson’s melancholic winsomeness translate into computerese? Is the human condition that one kernel of style and identity that can never be imitated? 

Time will tell. 

This—by the way—was written by a real human. Really.

Notes

  1. https://www.simplilearn.com/tutorials/artificial-intelligence-tutorial/artificial-intelligence-applications
  2. https://www.nytimes.com/2022/09/02/technology/ai-artificial-intelligence-artists.html
  3. https://www.chapterly.com/experiencedwriters/write
  4. https://www.euronews.com/next/2022/11/08/ai-writing-is-here-and-its-worryingly-good-can-writers-and-academia-adapt