It was dark outside, a late afternoon in the fall of 1986. As I did each week during my last year of graduate school, I was sitting with my thesis director, the poet James Dickey. Campus was lively with classes convening and dismissing, but the darkness pooling outside made me feel we were isolated, marooned together in a place where words were life-or-death matters.
That’s the way I felt, anyway. I couldn’t say how Mr. Dickey felt. Our relationship was strictly that of teacher and student. I thought of him as an old man. He was exactly the age I am now.
I remember that particular meeting because of one ill-chosen word. In a poem that was otherwise finished, a single adjective was clearly wrong. We batted alternatives back and forth across the desk, but none was right. I was determined to find the word that belonged there, the one that clicked into place like the halves of a locket.
Hours later, sometime around 10 o’clock, the right word came to me, popping up out of the depths while my mind was occupied with something else. It was so apt, and I was so exultant, that I went straight to the kitchen, opened the phone book, and looked up Mr. Dickey’s number. When he answered, I said, “‘Pale.’ The word is ‘pale.’”
It didn’t dawn on me at the time that 10 o’clock is awfully late to be calling anyone, let alone an aging professor. But Mr. Dickey was overjoyed about that word, every bit as jubilant as I was. If only for a moment, the world made a kind of sense it hadn’t made before.
I had not thought about that phone call, much less that poem, in many years, but I’ve begun to think about it often. A flurry of “A.I. assistants” has suddenly colonized my inboxes and Word documents and texts. This month they appeared out of nowhere, like a swarm of fruit flies around an overripe banana. Everything I type now is thick with hovering robots suggesting unwelcome robot words.
Outlook supplies “an intelligent email companion.” Yahoo provides “email summaries, messaging-inspired interface and a gamified experience.” Google offers to “supercharge” my ideas. Now when I call a corporation’s customer-service department, I get a robot who asks, “Can I text you a link to chat with our virtual assistant?” The robots that answer phones, it seems, are being sunsetted by robots that can text. But then Apple’s robot takes over to summarize the corporate robot’s message before ever delivering the text itself.
(Microsoft is now informing me that “sunsetted” is not a word. It suggests using “unsettled,” “sonneted” or “unwetted” instead.)
In this brave new world, the search for a word like “pale” has been outsourced to a robot that will never suggest such a word. The yoking of unlikely adjective and noun is still, for now, the province of unwetted poets.
I have spent hours trying to kill these ghosts in my machine. I can sometimes adjust my settings to disable the A.I. assistant, but the next software update turns it right back on again. In some cases, I can’t turn it off at all. The robots are relentless.
The writing teachers I know struggle to persuade their students not to use these tools. They are everywhere now, impossible to swat away. Who could blame a young writer for wondering how using these “assistants” is any different from using spell check or letting Siri supply the next word in a text? Besides, if they don’t use these tools, won’t they be falling behind the many students who do? It’s a fair point.
But letting a robot structure your argument, or flatten your style by removing its quirky elements, is dangerous. It’s a streamlined way to homogenize human thought. We know who we are, at least in part, by finding the words — messy, imprecise, unexpected — to tell others, and ourselves, how we see the world, a world that no one else sees in exactly that way.
Who was it who first said, “I don’t know what I think until I see what I write”? Versions of this statement have been attributed to writers as various as Joan Didion, William Faulkner, Stephen King and Flannery O’Connor. Google’s robot doesn’t know who actually said it, but almost anybody who writes, whatever they write, will tell you it’s true.
In “I, Robot,” the 2004 film loosely inspired by Isaac Asimov’s classic short-story collection of the same name, one robot is unlike all the others of its model. It has feelings. It learns to recognize human nuance, to solve problems with human creativity. And with those attributes come the questions inevitably raised by being human. Twenty-six minutes into the film, the robot asks, plaintively, “What am I?” This is a question writers ask every day. I suspect everyone else does, too.
Sure, there’s a difference between writing a poem and cleaning up a garbled email, between writing a love letter and a Google ad. For some tasks, using an A.I. assistant might save time without levying a commensurate cost in humanity. Maybe.
I’m still not sure. The practice involved in rote writing tasks may be the very thing that inspires us to open a journal or write a letter or commit to paper a memory from the distant past. “A robot may not injure a human being or, through inaction, allow a human being to come to harm,” reads Asimov’s first law of robotics. But what if the very existence of robots is what robs us of our humanity? Is that not a way of bringing humans to harm?
Somewhere in my house there is a bound copy of the master’s thesis I spent two years writing. I remember very little about that poetry collection. I know its title (“Small Comforts”), and I know it included a poem about the nuptial flight of ants. Probably there was one about the taste of ripe figs, too, and at least one about a rat snake. I have such delightful memories of those things, but I’m only guessing that I turned them into poems. So much from that time is lost to memory.
But I remember one poem in which the word “pale” figured prominently. And what I learned in struggling to find it has lasted through nearly four decades. The search for the right word to fill the right place can occupy a lifetime. And, I’m convinced, make a self along the way.