The boss called us into a meeting today and encouraged us to use AI to do our jobs faster. He even had a formula for it. Use AI 1% of the day, which adds up to 7% a week, and from there, we can safely embark on a land battle in Asia. I freely admit I hadn’t had coffee yet, and any talk of artificial intelligence that early in the day puts me in an ill-fitting suit, standing in a bread line while the world around me turns black and white. Still, he had a point, which was annoying of him.
AI is useful. At this point, pretending otherwise feels a little performative. It’s not some passing gimmick anymore, and it’s not just a weird toy for people who enjoy making machines write sonnets about soup. It’s here, it’s capable, and it can make work move faster. At ThoughtLab, where speed and clarity actually matter, I get why that would get everybody’s attention. If there’s a tool that helps smart people get to strong work more quickly, of course the boss is going to tell us to use it.
My resistance to AI has never really been about whether it works. It does work, at least in the practical sense. My resistance is more instinctive than logical. Something in me still recoils a little when I’m told to hand part of the writing process over to a machine. Maybe that makes me old-fashioned. Maybe it makes me dramatic. I can live with either. Writers are allowed one or two unreasonable reactions a year, and I’ve decided to spend one of mine here.
Even so, I’m not blind to reality. I’m not interested in pretending the thing is useless just because I find it eerie. A blank page is still a blank page. Deadlines still show up. Work still has to get done. If AI can help clear a path through the early fog of a project, I’d be stupid to ignore that completely. The problem for me begins a little later, once the work stops being merely functional and starts asking for something more human.
That’s where my discomfort really starts. Not with usefulness, but with depth. Not with whether AI can produce language, but with what kind of language it can produce when the subject turns inward. It can help with speed, order, and momentum. What I’m less sure about is whether it can help with anything that requires an actual soul behind the words.
The Seduction of a Useful Tool
To be fair, AI does solve real problems. A lot of writing work is less romantic than writers like to pretend. Sometimes you’re not waiting for inspiration. Sometimes you’re trying to name the thing, structure the thing, or simply get moving before the day slips away from you. On those days, AI can be genuinely helpful. It can offer a rough framework when your thoughts are still scattered. It can help you sort the shape of an idea before you’ve fully found the language for it. It can get you past that awful first moment when the page looks bigger than your mind.
That’s not a small gift. Momentum matters. So does clarity. There are days when the hardest part of writing isn’t emotional truth or artistic risk, but simple forward motion. A useful tool that reduces friction has value. It can help you cut through clutter, tighten a loose thought, or take the foggy shape of an idea and give it enough form to start working with it. In practical terms, that can save a writer a lot of time and unnecessary suffering.
I understand why so many people have embraced that. I understand why companies want their employees to get comfortable with it. The modern workplace loves efficiency because efficiency is measurable, and anything measurable tends to become a virtue. If a task can be done in one hour instead of three, that starts to look like wisdom, or at least productivity, and productivity has a way of dressing itself up as moral seriousness. Some work really does benefit from that kind of speed. There’s no heroism in wasting time just because slowness feels more noble.
That’s the seduction of AI, and I don’t mean seduction in some sinister way. I mean that it offers something appealing and often legitimate. It promises relief from the heavy lifting of beginning. It gives shape to vagueness and gives language to half-formed thoughts. On bad days, that can feel almost miraculous. The temptation isn’t just that it helps. The temptation is that after a while, help begins to look a lot like understanding.
That’s where I start to hesitate. Usefulness and depth aren’t the same thing. Speed and insight aren’t the same thing. A system can produce fluent sentences without knowing what any of them cost. That distinction matters everywhere, but it matters especially in writing, where polished language can sometimes hide an empty center. The smoother a machine becomes, the easier it is to mistake that smoothness for wisdom. Sometimes the sentence lands cleanly and still leaves nothing behind.
I can live with that when the work is mechanical. I can even appreciate it. If the task is structural, organizational, or exploratory, AI may be exactly the right kind of assistant. But once the writing moves toward feeling, memory, longing, or faith, I begin to want something else. I begin to want the friction of an actual person trying to say what they mean. That effort, awkward as it can be, is often where the real thing begins.
The Cost of a Sentence
Part of what gives writing its force is that it costs the writer something. Not always in some dramatic, blood-on-the-page way, but in the quieter ways that matter just as much. Time. Attention. Uncertainty. The long, frustrating effort of trying to say something true without flattening it into something easy. A sentence may look effortless when it’s finished, but that doesn’t mean it arrived effortlessly. Often, the finished line is only the visible part of a much messier process.
That cost matters because it leaves a trace. The reader may not know exactly what the writer went through to arrive at a certain line, but they can often feel that something real was paid for it. The words carry a kind of pressure. They’ve been tested and survived hesitation, revision, second thoughts, and whatever private experience made the sentence necessary in the first place.
A machine doesn’t know any of that. It can produce fluent sentences without knowing what any of them cost. It doesn’t wrestle with truth or fear saying the wrong thing. It doesn’t revisit a paragraph three hours later and realize the clean version was less honest than the awkward one. It can generate the appearance of completion, but it cannot participate in the human struggle that sometimes gives writing its depth.
That doesn’t make every difficult sentence good, and it doesn’t mean suffering automatically creates art. But when a piece of writing has real weight, part of that weight often comes from the fact that somebody had to reach for it. Somebody had to stay with the thought long enough for it to become theirs. AI can imitate the finished sentence. What it cannot imitate is the inward cost of arriving there.
Where the Machine Stops Short
The more I use AI, the more I notice the same absence. The structure may be strong. The phrasing may be competent. The result may even sound thoughtful at first glance. What’s missing, again and again, is the inner pressure that gives language its life. It can produce writing that resembles feeling, but resemblance isn’t the same as contact. It knows how emotion is typically described. It doesn’t know what it is to have one.
That difference matters more than style. A good sentence doesn’t move us only because it’s elegant. It moves us because somebody means it. It carries a trace of private life, even when the subject isn’t overtly personal. You can feel when a person has actually suffered, loved, feared, and then found words equal to the experience. You can also feel when language has been arranged without ever having been earned. AI is excellent at arrangement. It’s much less convincing when the writing asks for lived weight.
This becomes obvious as soon as the subject approaches emotional truth. Ask AI for help with a practical task, and it can be terrific. Ask it for sorrow, shame, or awe, and something curious happens. The surface often looks fine. The words make sense. The tone sounds appropriate. Yet the deeper you read, the more you notice that nobody is at home inside the language. The machine can imitate the furniture of feeling, but the room itself stays empty.
Human beings don’t write from neutrality, even when they try. We write from our own damage, from our hopes, and from the parts of ourselves we can barely explain. We write from memory, from embarrassment, or from private grief that’s still looking for a form it can survive inside. Some of the best sentences a person ever writes come from places they’d rather not have visited in the first place. The sentence matters because it carries that journey inside it, whether the reader can name it or not.
A machine has no such journey. It doesn’t wake up uneasy for reasons it can’t locate, miss the dead, or lie awake replaying one sentence from ten years ago and wondering why it still hurts. It doesn’t know what it means to love badly, regret deeply, or feel hope return after it had every reason to stay gone. It can reproduce the language that usually surrounds those experiences, but reproduction isn’t experience. Description isn’t an encounter.
That’s why I still don’t trust AI with the center of emotionally serious writing. I can use it around the edges. I let it help me sort, tighten, and start. I use it as a sounding board when my own thoughts are still tangled. But I don’t want it to carry the emotional core of the work, because it can’t know what that core is made of. It reaches fluency faster than it reaches truth, and if you’re not careful, fluency starts to pass for something more profound than it really is.
The problem isn’t that AI is bad at language. In some ways, it’s disturbingly good at language. The problem is that language without life eventually gives itself away. You begin to feel the absence even when the sentence is clean. The words are in the right order, the meaning is technically available, yet something in you stays untouched because the writing never crosses the distance between sounding human and being human. Very much like my dating life.
A Lyric Knows More Than an Explanation
I noticed this most clearly when I asked AI to interpret one of my favorite song lyrics. The line was from "Eleanor Rigby": "Wearing a face that she keeps in a jar by the door." I’ve loved that lyric for years, partly because I still don’t feel finished with it. It stirs loneliness in me, yes, but also curiosity and a faint, Russian novel-like dread that’s hard to name. The image is strange enough to bypass ordinary understanding and land somewhere deeper. I don’t simply think about it. I feel it.
What I love most is that it never settles. Depending on the day, the line opens in a different direction. Sometimes I hear makeup, the sadness of somebody preparing a public self before she steps outside. Sometimes I hear a social mask, the version of herself the world will tolerate. Now and then, the image feels harsher than either of those, as if she’s lifting an entire identity out of storage before she opens the door. However I hear it, the line keeps its mystery. It doesn’t flatten itself into one clean meaning, and that’s part of its power. I picture Father McKenzie saying, “We have company, Eleanor,” and her opening the jar, attaching a fully formed “company face” to her blank mannequin-type head, and then going out to meet the guests. Equal parts deep sadness and terror. Also, I still wonder where someone buys a jar of face.
When I asked AI what the lyric meant, the answer wasn’t wrong. That may have been the strangest part. It could identify themes that made obvious sense. It could talk about isolation, performance, and the split between inner life and outward appearance. Those are reasonable observations. You could hardly call them false. Yet the answer still felt dead to me. It described the lyric in the way a person might describe a room they’d only seen in a photograph. The shape was there, but the atmosphere was gone.
That atmosphere is everything. A line of music can enter you before you understand it, and in some cases, it keeps working on you because you never understand it completely. It remains alive because you remain alive. You hear it differently when you’re younger than when you’re older. You hear it differently when you’re lonely than when you’re in love. Some lines travel with you, and over time, they gather your own life into themselves. The meaning isn’t fixed because the hearer isn’t fixed.
A machine can’t bring that kind of inward history to a line. It has no old wound that suddenly answers to an image, no private memory waiting to be touched. It can’t feel recognized by art. It can interpret symbolism, but it can’t have that strange and intimate experience of being known by something that was written decades before you heard it. It can’t live in the space between confusion and recognition where so much of art does its real work.
That’s why I think explanation is often the least interesting thing art can offer us. The point isn’t only what a line means; the point is what it does. It’s the way it alters the weather inside you. A lyric, a poem, or a prayer can stay with a person for years because it refuses to finish itself. It leaves room for return and room for the hearer to bring more of themselves to it the next time. AI can help decode that process, but it can’t participate in it. It can’t feel a line opening some hidden room inside the mind and then standing there with the door still half open. AI doesn’t have more of itself to bring; its self is finite until somebody reprograms it.
That may sound like a small distinction, but it isn’t small at all. It’s the difference between extracting meaning and receiving it. One is useful, the other is transformative. Once I noticed that difference in a song lyric, I couldn’t stop seeing it everywhere else, especially when the conversation turned toward spiritual writing.
What Changes When the Subject Is the Soul
The gap becomes much more serious when the writing moves from art into spirituality. A lyric can survive a flat interpretation because the lyric itself still shimmers beyond any explanation. Spiritual writing is more vulnerable than that. Once it loses contact with lived feeling, it can become thin very quickly. The words may remain familiar. Grace, mercy, faith, surrender, all the expected language can still be present. Yet without a human being inside those words, they begin to feel decorative rather than inhabited. To inhabit something, you have to put pieces of yourself into it. AI has no pieces to give.
That’s what troubles me when people talk too confidently about using AI for spiritual material. The machine doesn’t lack information. It has access to more theological language than any one person could hold in memory. It can summarize scripture, compare doctrines, imitate meditative tones, and produce reflections that sound calm, intelligent, and even moving at first pass. But information isn’t the same as witness. Fluency isn’t the same as inward authority. It can arrange the vocabulary of spiritual life without ever having one.
Spiritual writing doesn’t come only from ideas. It comes from contact with mystery, doubt, and longing. It comes from confession, silence, and those moments when a person is trying to make sense of why God feels close, absent, or impossible to name at all. Even deeply faithful writing usually carries some trace of struggle inside it, because faith isn’t merely a concept. It’s lived through fear, hope, and need. The language carries those conditions whether the writer intends it to or not.
A machine can’t enter that condition. It doesn’t fear death, need forgiveness, or kneel beside a bed and wonder if anyone is listening. It can’t know what it means to pray into silence and keep praying anyway. It can’t feel spiritual hunger or be pierced by grace. When it writes about transcendence, it does so without risk. Risk is a distinctly human event. When it writes about mercy, it lacks need. When it writes about redemption, it’s missing anything to redeem.
That seems central to me. Spiritual writing isn’t powerful because it contains the right words. It’s powerful because somebody has gone somewhere inward and come back with language that cost them something to find. Sometimes that language is polished. Sometimes it’s rough. Often it’s both. But what gives it force isn’t elegance alone. What gives it force is that a person has wrestled with something larger than themselves and managed, however imperfectly, to bear witness to it. Jacob wrestled with the angel, and the angel was overcome.
This is why I keep returning to the difference between sounding true and being true. A spiritual sentence can be polished within an inch of its life and still carry no life at all. It can say everything it’s supposed to say and still leave behind the faint impression of emptiness. The best spiritual writing does the opposite. It may be simple. It may even be clumsy in places. Yet it feels inhabited, as if someone has staked part of themselves on what’s being said.
None of this means AI has no place in the process. I do think it can be useful around the edges. It can help organize thoughts, compare passages, and clear away clutter that obscures the real point. Used carefully, it may even help a writer find their way back to what matters. But the center of the work still has to come from a person. Not just a mind, but a person with memory, conscience, contradiction, and need. Somebody whose language has been shaped by the fact that they’re alive and won’t be forever.
If spiritual writing is, at bottom, one soul reaching toward another, then the absence at the heart of AI isn’t a small limitation. It’s the whole question.
The Takeaway
So yes, I’ll use AI. Working at ThoughtLab, I’d be ridiculous not to. We’re paid to think clearly, move quickly, and make strong work, and any tool that genuinely helps with that deserves a place on the desk. I can use it to get unstuck, to test an angle, or to give form to a half-built draft. I can use it to save time rather than burn it in the name of artistic purity. None of that feels threatening to me. It feels practical, and there’s nothing wrong with practicality.
What I don’t want to do is confuse practical help with human depth. That’s the line I keep coming back to. AI can support work that deals with meaning, but it can’t replace the human center of work that asks spiritual questions. It doesn’t know what it means to feel hollow and keep searching. It can never know what it means to receive mercy and realize you didn’t earn it. It has no idea what it means to carry doubt for years and still feel some small ember of faith refusing to go out. Those aren’t abstract themes. They’re lived conditions, and they shape the language before the language ever reaches the page.
The real danger, at least as I see it, isn’t that AI becomes sentient and replaces us. The danger is more ordinary than that. We may start accepting language that sounds true in place of language that’s actually been tested by life. Companies will drop the writer because, well, AI can do that! We may begin mistaking fluency for wisdom, polish for insight, competence for soul. In the workplace, that confusion can make us faster while also making us thinner (not Ozempic-thinner, soul-thinner), and that feels like a terrible bargain when the work depends on saying something real.
For me, this is where the boundary sits. I’m happy to let AI help with the scaffolding. I’m not willing to let it occupy the sanctuary. The inward life of a person, the place where belief, fear, longing, guilt, wonder, and mercy keep colliding, isn’t mechanical. It doesn’t move according to efficiency, or yield its meaning on command. The spiritual life asks for attention. It asks for humility. It asks that we bring our actual selves into the room.
So I’ll go on using AI at ThoughtLab because there’s no prize for pretending the tool is useless when it plainly isn’t. But when the writing turns toward the soul, I still want a human being there first. I want the uncertainty, the private ache, and the half-formed recognition that sometimes leads to a sentence worth keeping. A machine may be able to imitate the shape of that language, and sometimes imitate it well. But it’s still a vague simulacrum of the human experience, and I still don’t believe it can carry the weight of it. And the soul has weight.