“I woke up as the sun was reddening; and that was the one distinct time in my life, the strangest moment of all, when I didn't know who I was.”
When I first read this passage, in Jack Kerouac’s On the Road, I cried. I myself had been traveling alone for several months, and was emotionally vulnerable to the point where a well-written paragraph could bring me to tears.
I say “well-written” because I doubt I would’ve cried if the sun hadn’t been reddening. There’s something in that particular turn of phrase that made the quote feel so special to me. I’m not sure what it was. The relentless indifference of nature? The oppressive warmth I felt radiating from the page?
Anyway, I woke up this morning as the sun took on a shade of muted tangerine. I had enough sense of self to know that I wanted a cappuccino, so I searched up “coffee wifi laptop” in Google Maps, found a cute-looking spot, and rolled out of bed.
By the time I was on the street, however, I realized the cafe wasn’t supposed to open for another twenty minutes, and Google Maps was telling me I was scheduled to arrive in ten. I felt frustrated. What was I supposed to do, awkwardly wait around for the owner to let me in? I started walking slower, but remained anxious.
Then I remembered I didn’t have to follow the Google Maps route at all. I put my phone back in my pocket, took a few extra turns, and had a really lovely walk—arriving exactly when I intended to.
In retrospect, it’s silly that I didn’t think of that option sooner, but it also makes sense that I forgot it existed. Google Maps is designed to suggest the most efficient path, and the efficient option is usually what I want. Over time, I got conditioned into thinking that the Google Maps path is “best”—even when another, less efficient route might actually serve my purposes better.
The same is true of any “algorithm,” which at its core is a process optimizing for a specific outcome. That process is usually aligned with what we want, but not always, because human desires are complex. This means that machines will necessarily miss the mark at times, even when they’re programmed with good intentions.
Take ChatGPT, which is designed to predict the next most likely word based on the prompt. A service like that is always going to miss improbable sequences that might actually work better in writing. For example, no matter how many different ways I prompted it, ChatGPT couldn’t figure out that it’s important for the sun to be reddening (or even red in the first place).
That’s because the sun “reddening” isn’t necessarily predictable. The model is far more likely to spit out words like “rising” or “bright”—losing the emotional weight of the unlikely phrase.
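To make that concrete, here’s a toy sketch of what “predict the next most likely word” looks like under the hood. The words and probabilities are entirely made up for illustration; a real model scores its whole vocabulary this way, but the habit of picking the likeliest word is the point:

```python
# Toy illustration: imagine the model has just read "I woke up as the sun was"
# and must choose the next word. These probabilities are invented for the
# example; a real model estimates them over its entire vocabulary.
next_word_probs = {
    "rising": 0.41,
    "bright": 0.22,
    "setting": 0.18,
    "warm": 0.12,
    "reddening": 0.01,  # the word I actually wanted sits near the bottom
}

# Pick the single most probable word, the way a likelihood-driven system tends to.
prediction = max(next_word_probs, key=next_word_probs.get)
print(prediction)  # prints "rising"; the likely word wins, the striking one loses
```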
Don’t get me wrong: ChatGPT’s output is fine, and in many cases quite helpful. I happily rely on it to spell-check my essays, and the predictive algorithm is surprisingly good at catching my typos. But sometimes it’s optimizing for something I don’t want; sometimes I want to write unpredictably.
The real problem is when we lose that distinction. It’s easy to surrender our agency to these tools when they make life easier. As I learned with Google Maps, though, once you surrender enough you forget you had agency in the first place. The easy choice becomes the default choice, and you lose sight of what you actually wanted—until the day you wake up and forget who you were.
If you liked this essay, you might like my book Algospeak!! Pre-order it here or register for my NYC book launch on July 14.