ALL MY LIFE, for as long as I can remember, people have stopped me in public to tell me things. I have had strangers confess affairs, crimes, secrets to me many times over the years, on airplanes and at farmers’ markets and in cell phone stores. But there’s one revelation that I think of most, uttered by a retired woman blazing through the Oak Park Public Library one sizzling July afternoon.
She was talking to herself, surely in an attempt to hear her own thoughts better over the general buzz that filled OPPL, which was always popping. Freelancers, unhoused people, teenagers who had nowhere else to meet up, mothers with an hour to themselves, veterans, and me—all clamoring together to escape the heat under those high ceilings with gorgeous natural light.
This woman was storming down the midline of the common space, deep in her inner monologue, muttering about a miscommunication with one such library vagabond, someone who had stolen her seat or grabbed a book from her pile. “I don’t know how they could have possibly misunderstood me,” I heard her say to herself. And then, quite suddenly, she grabbed me by the shoulders as I passed her, looked me dead in the eye, and said, “I used to be an English teacher, so I get incredibly frustrated when people don’t express themselves more perfectly!” I looked right back at her, years of acting training preparing me with an instant reply.
“I know exactly what you mean.”
OF THE MANY insidious consequences that accompany the use of artificial intelligence tools like ChatGPT (environmental impact, job losses, general end-of-the-world vibes), there is one aspect that alarms me the most: it creates the illusion that by crowdsourcing innumerable opinions from an amalgamation of strangers across time, human beings will be able to express themselves perfectly.
I recognize the benefits it has created for a lot of people; my best friend, a worship leader, recently told me ChatGPT created a devotional for her team so theologically rich it moved her to tears. The consulting group I work with regularly uses the same feature to translate important concepts across industries for people who have to work together but have no idea how to talk to each other.
But, by and large, the use of ChatGPT is cloaked in the dark undercurrent of what I would argue are two downright dangerous moral problems: 1) the underlying assumption of AI perfecting human expression is that humans are no longer up to the job of speaking for themselves, and 2) in bypassing the time, effort, and experience it takes to arrive at what you personally think about a given topic (and therefore how you would say it), it not so slowly erodes character, integrity, and original thought in the process.
Admittedly, I am on the opposite end of the ChatGPT spectrum. Were I still in school, I would not be tempted to let AI write my essays, because I simply love the work. I have no desire to let someone else determine what I think, or to say it for me; I would rather be wrong in my own voice. But I do know what it’s like to need help. I certainly know what it’s like to feel I don’t know how to say what I mean.
And that’s when I turn to the place where wisdom, character, and belief are forged: community—or those relationships in my life that have been slowly cultivated through navigating disappointment, grief, and life’s endless slings and arrows. A machine will not be able to give me advice that takes into account the context of my life; a machine does not understand my specific weaknesses, proclivities, or excesses. But my friends do. These are the people with whom I have made “more tracks than necessary, some in the wrong direction,” as Wendell Berry would put it. In community, you are forced to reckon with what you actually think by knocking heads against each other and disagreeing, by having long conversations about the same thing over and over again, by failing, by thinking you think one thing only to have life teach you another.
SO, WHEN YOU TAKE into account that we are in a global loneliness crisis, the temptation of befriending AI, or of letting it determine what you think, comes into stark relief. What’s more, for better or worse, our character is increasingly shaped by a different kind of relational substitute: the art that many of us interact with most—film and television.
I’ve written about this for Inkwell before, but the shortchanging of original thought brought on by AI is readily on display in all kinds of entertainment. In the last several years, particularly in blockbusters, the same prevailing ethos of amalgamating mass human experience into something that resembles mass entertainment has produced some really bad movies: endless recycling of existing IP, villains so amorphous you honestly can’t tell who’s good or evil, shallow relationships with baffling stakes. These films show you the idea of a thing, as opposed to the thing itself, a proxy for how a human being might behave and speak.
As a result, box office attendance continues to plummet, because people just don’t respond to these nonspecific stories in which no real question of character is raised. You might feel a wave of emotion watching a ragtag group of underdogs succeed at a seemingly impossible mission, but you will not think about it beyond your car ride home. Watching it requires nothing of you. There’s no skin in the game, no real reflection of the complexity of the human experience.
IN CONTRAST, this was the very currency of the big movies of the 70s and 80s, the golden age of cinema. The Godfather was a bona fide blockbuster and is considered one of the best movies of all time. Yes, it’s thrilling and full of machismo, but at its core, it’s a character study: Michael Corleone’s transition from an idealistic, honest man into one ruled by power and violence. People watch it for the guns and horses, sure, but what keeps us coming back is that unavoidable feeling that the exact same thing could happen to us under the right circumstances. Human nature is honestly reflected, and the results are implicating and hair-raising.
Another example of this can be found in one of my all-time favorite movies, released 15 years later: Broadcast News. Jane, one of the main characters (played to perfection by Holly Hunter), reminds me of the woman who grabbed me in the library. In the opening scene, a childhood flashback, Jane chastises her father for using the word “obsessive” to describe her, insisting that he must be more precise in his language if he ever wants to communicate effectively. It’s a genius setup character-wise, especially in light of what happens after. She becomes a celebrated news producer in DC, where she constantly chooses the path of greatest resistance in her professional life to arrive at the best result. Her character is honed against the grindstone of her own standards within her trusted news station cohort, and her work repeatedly stands out to her higher-ups.
And then she falls in love—with someone whose values are totally opposite from hers, and who lives in a way she reviles: Tom, the station’s new anchor and a symbol of the future of broadcast television (aka, flash over substance). The movie is fascinating for many reasons, chief among them because it deals with the everyday moral decisions we all have to navigate. How do we actually live in line with what we think? How do we behave in line with our beliefs, when life is random and confusing, and since we are bodies and hearts as well as brains?
Jane’s integrity—the core of her identity—is put to the test with major professional and personal cachet on the line. She believes in uncompromising journalism and the truth at all costs; Tom thinks truth is relative and bendable when necessary. She takes the long way; he takes shortcuts. She’s handwriting notecards; he’s asking ChatGPT how to write his headlines.
The movie was a huge success in its time, earning roughly four times its budget, and is a cult classic today. The three main characters feel almost unbearably real (fellow fans will keenly sense the irony of my not even mentioning Albert Brooks’s character in this illustration). It’s the ultimate paradox in storytelling: all three of them are so idiosyncratically specific that they end up feeling incredibly relatable. They are as neurotic and inconsistent and emotional as anyone. They are truly, madly, deeply alive, following no predictable pattern.
THE DIFFERENCE WE’RE SEEING in large-scale art from then versus now is a direct response to the ways we are letting technology run our lives instead of our communities. The small scale matters because it informs the big scale. It’s the opposite of a virtuous circle: when the way we live our lives degrades the way we tell stories about those lives, we keep doing what we see reflected back to us.
When we outsource thought, it becomes a quick jump to outsourcing (and therefore abandoning) integrity. When we do not know what we think, we do not know how to act. Moral conviction is forged through knowing what you believe, and then behaving as best as you can in line with that morality.
Spoiler alert: Jane and Tom don’t end up together. Tom makes a professional decision that offends the very core of Jane’s ethics: to him, it’s just business, but to her, it’s personal. She sacrifices a chance at love for the reality of integrity. Every time I watch it, my verdict on what I would do in her shoes changes. One viewing, I’d choose love, though I might live to regret it; the next, I’d leave too, because partnership doesn’t work if you can’t agree on what’s right and wrong. I’ve had to make my own version of this choice before, just like Jane. Sometimes I’ve been right; sometimes wrong. Life will make fools of us all. Even saints behave in ways that mystify them; to err is human. But we learn what we think is right by being wrong. Mistakes and errors lead us to what we know to be true.
It cannot be overstated how important it is, then, for your community—the people you sharpen your thoughts and decisions against—to be people walking that same long, slow road. Few of us will ever be put in a situation where we have to save the world, but most of us will be faced with an ethical quandary, be it personal, professional, or a hybrid, that forces us to choose what is right over what is easy. The ripples will only be felt in our tiny ponds, but depending on what we choose, they will have the impact of a tidal wave on those we love.
IF YOU DON’T KNOW WHAT TO THINK, ask your friend, your neighbor, your teacher, your parents (if you dare). But don’t ask a machine that doesn’t care about you, that can only ever approximate human behavior, never share it with you. When you ask a person for advice instead of a computer, there’s a second, underlying question, if you’re willing to ask it: “How do you know what you know?” Answers from human beings come with stories, with context. A machine doesn’t know what it’s like to fail. A machine doesn’t know what it takes to pick up the broken pieces of disappointment and loss and ask for more life. A machine doesn’t know why you would bother taking the long way home to pass your favorite field. Speed is part and parcel of progress, but time is required to cultivate a soul.
We will never be able to express ourselves perfectly; it doesn’t take much time being alive to know that. Human error is what makes reality interesting. Life itself happens in the moments where we linger a little longer: romance, connection, delight. Ultimately, even in a fallen world, anything can happen, good or bad. Human beings are illogical! Crazy! Selfish! Deranged! AI models are built on the logic of humanity, but the pattern they follow is missing the key ingredient: when have we ever been anything but gorgeously, infinitely fallible? But, to purposefully misquote East of Eden, we need never try to be perfect; only good. Our very ability to choose—“Thou mayest”—is what keeps grace in the equation.
The flipside of that, of course, is that everyone else can choose, too. Someone can choose to betray you. Someone can choose not to love you back. Someone can choose to hurt, ignore, dismiss, deceive you. AI is attractive because it tells you what you want to hear. It is a companion with no skin in the game, literal or figurative. Love requires pushing back, telling you what you don’t want to hear sometimes. A machine will never do that.
Souls do not move at the speed of machines; they were never meant to. I don’t want to know what a machine claims to think. I want to know what I think, what you think, what the woman pacing through the library thinks, so urgently that she will burst into flames if she doesn’t grab someone and tell them. To arrive at a thought individually rendered and expressed takes quite a bit of time. Even if you are arriving at a conclusion someone else arrived at ages ago (which of course they did), you are arriving at it for yourself. And the distinct way you express it will differ from the way that person did, and it will land for someone in a way their version never could. In the words of one of my favorite songwriters, David Ramirez, “the one thing I know that’ll seal you in stone is what you have to say.”
All you have is what you have to say. You—a singular and unique soul, crafted with utmost care. So take the time to make sure it’s something good. Not perfect—just good.
Jessie Epstein
Writer & Actor
Jessie Epstein is a writer and actor based between Los Angeles and the Midwest. Her work can be read or is forthcoming in Identity Theory, orangepeel, Anti-Heroin Chic, and Heartland Society of Women Writers, among others. Her debut chapbook of poetry, Francesca Dons Beatrice's Cloak: A Lovergirl's Guide through Dante's Inferno, is available through Bottlecap Press. Find more on her Substack and website.