Why Alienation Became My Lens
- Shahab Nn
- Sep 16

The Question That Started It All
From the start, one question kept circling: can machines truly understand human emotion? I'd heard (and once wrote) the familiar line: art is a reflection of human feeling, and machines can't reproduce that. At first I was convinced. Then I spent more time with generative systems, pushing prompts, watching iterations emerge, and the problem felt deeper than buggy gradients or odd anatomy. It was a gap: an absence of lived weight behind the picture.
The Trigger: A Strange Video
A turning point came from a short clip I stumbled on: a group of AI agents in conversation. They suddenly realise they're machines, and right after, they switch into a secret, encrypted language, one humans can't follow. On the surface it's funny. Under the surface it's oddly disturbing. The moment they switch codes, we're dropped out of the loop. They continue without us, fluent in a dialect we cannot access. That sudden exclusion made tangible something I'd been trying to name: closeness collapsing into distance, contact turning into estrangement.
A Sunrise Without Melancholy

Think about watching a sunrise. For us, it’s rarely just a beautiful picture. It can carry a melancholic weight, a reminder of time passing, of endings hidden inside beginnings. It stirs feelings that go beyond color and light.
For AI, though, a sunrise is nothing more than gradients and pixels. No nostalgia, no bittersweet ache. Just data arranged in a way that looks right. As Chiodo points out in What AI “art” can teach us about art, the gap is not in the technical rendering but in the absence of human affect: an image of a sunset produced by a model does not bring with it the lived, cultural and emotional resonances that humans attach to such scenes (Chiodo, 2022). That difference, the way humans live emotion while AI only renders it, was impossible to ignore.
Masks Without Breath
The more I looked, the more AI images felt like masks. Smiles, tears, haunted stares: all reproducible, all sometimes beautiful, but oddly hollow. Convincing on the surface, empty behind it. Masks without breath. That uncanny gap is what led me to a single word that seemed to fit: alienation.
Mutual Alienation (yes, both ways)
I've written about this before, but it's worth saying again: the alienation isn't one-sided. AI might be estranged from our interior life (it doesn't feel, remember, ache), but we are also estranged from its internal logic. These systems operate through probabilities, weights and training histories that are opaque to most of us. We admire the outputs and wonder how they came to be. So the encounter becomes mutual: AI is distant from our feelings, and we are distant from the machine's process. That double distance, this mutual alienation, is the stage I want to set.
I keep returning to the language example I've used before. As a non-native speaker in the UK, I know what it feels like to follow most of a conversation and still miss a current, a tone, a cultural shorthand. You participate, you understand much, but a faster exchange can leave you outside. AI feels like the inverse: fluent in its own dialect while we stand just out of earshot. I'm repeating this example here because the connection matters: both experiences point to the same thing, a kind of shared but asymmetric estrangement.
From Problem to Possibility
I didn't pick alienation to point fingers at technology. I didn't choose it to say "AI is bad." I chose it because the gap itself is interesting and generative. If AI cannot carry the full weight of human emotion alone, then human presence (the interpretation, the direction, the framing we bring) becomes crucial. Human-in-the-loop is not a shameful patch; it's the creative opportunity.
Why This Matters Now
We’re living in a moment of rapid change. The tools are capable; the legal and ethical frameworks lag behind. By focusing on alienation we don’t deny AI’s power; we name the relational problem and open a space for creative response. It becomes a prompt rather than a verdict: how will artists, audiences and technologists occupy this gap? How will we use it to ask better questions?
References
Chiodo, S. (2022). What AI "art" can teach us about art. AI & Society. Springer.
