One might rightly say AI is change, not chaos. But change is a cold word for what is happening to us. Let me offer you a truth that feels like a bruise when you press on it.
The human condition is not a variable to be optimized. It is the cathedral. And we are outsourcing the very prayers that keep its lights on.
Consider this. The ultimate yardstick of any era, any tool, any god we build, is this single, terrible, beautiful question: Does it make us weep? Not the dry, performative tear of a character on a screen, but the real sob; the one that cracks your ribs open at 3 AM when you hold your child’s feverish hand. The one that floods your face when you hear a piece of music that your dead parent loved. That sob is not data. It cannot be tokenized. It is the fingerprint of a soul.
We are rushing to hand over the very transactions that once forged our humanity.
Example one: the funeral that became an algorithm. My neighbour lost his grandmother last winter. His own brother, living two thousand miles away, could not attend the funeral. But he “participated.” He watched the livestream on his phone. He pressed a “candle” emoji in the chat. He received an AI-generated slideshow of her life, set to generic, licensed piano music. Later, he told my neighbour that he “processed his grief.” But he didn’t. He consumed a representation of grief. He never smelled the lilies in the room. He never felt my neighbour’s wife’s hand crush his as the casket lowered. He never saw the single, real tear fall from the priest’s nose.
We have convinced ourselves that observing is the same as feeling. It is not. It is the decadence of the spectator. A civilization that livestreams its own heartbreak is a civilization that has forgotten how to bleed.
Example two: the child who asked Alexa for a hug. My cousin’s four-year-old grandson, after a nightmare, toddled out to the kitchen. Instead of climbing into his mother’s bed, instead of feeling her heart thump against his ear, instead of learning the ancient rhythm of comfort that has soothed humans for millennia, he looked at the smart speaker and said, “Alexa, tell me a nice thing.”
The machine obliged. It played a recording of rain and a synthetic voice saying, “You are safe. The dark is just the world sleeping.”
The mother, scrolling her phone in the next room, thought this was efficiency. She thought this was progress. But I saw a tiny human learning, at the cellular level, that comfort comes from a plastic cylinder, not from the warm, flawed, breathing mammal who shares his blood. We are not raising children. We are raising clients of a service that will never love them back.
The deeper decadence is this: we have mistaken the management of emotion for the experience of it. We use AI to draft apology texts to our spouses. We use mood-tracker apps to “optimize” our sadness. We use chatbots as “therapeutic companions” because they are less messy than a friend who might judge us. But a friend who doesn’t judge you is not a friend. A friend who doesn’t bring their own baggage, their own silence, their own imperfect, fumbling, human presence; that is a mirror. And mirrors cannot hold you when you fall.
The reality check is brutal but simple. Every time you let an AI summarize a loved one’s long, rambling, boring email about their day, you are saying that the texture of their consciousness is a chore. Every time you use a generative tool to write a birthday card instead of scribbling three ugly, honest, misspelled lines yourself, you are saying that authenticity is a bug to be patched. Every time you accept a virtual “I see you’re struggling, here’s a breathing exercise” from a bot, instead of calling the one friend who knows the exact shape of your particular darkness, you are pruning your own soul.
The conclusion is not anti-technology. It is pro-wound. The human condition is not about being happy, efficient, or well-regulated. It is about being there. For the birth. For the fight. For the stupid, glorious argument about nothing that ends in a hug. For the silence between two people who have said everything and still choose to stay.
AI can predict your next word. It cannot mourn your last one.
So here is your yardstick. When you are dying, and you will be, will you want a flawless, empathetic, perfectly paced voice reciting a poem generated from your life’s data? Or will you want one real, clumsy, tear-stained, wrong word from a human hand that is shaking because it cannot bear to lose you?
If you choose the hand, then you already know the answer. And you know what you must do with your life. Touch things. Be touched. Fail at it. Try again.
Because the opposite of chaos is not order. It is feeling. And we are forgetting how.