I just sat in a meeting where my boss laid out all the ways AI is going to help us move forward, and at first, it was really exciting. You could feel people leaning into the possibilities, thinking about how much faster they could move and how much easier certain parts of their work were about to become. I felt that too, which is probably why the discomfort caught me off guard.
I’m not writing this from some pure position where I get to defend the written word while refusing to touch the machine. I use AI all the time because it helps, it speeds things up, and because pretending otherwise would feel dishonest. That’s part of what makes this so unsettling. I’m not watching the change from outside it. I’m participating in it, benefiting from it, and still wondering what it’s doing to the thing I care about most.
I’m a copywriter. I’ve spent years believing that writing matters, not just as a way to move information around, but as a craft. The right sentence can clarify something tangled. It can shift the emotional temperature in a room, make someone feel understood in a way they didn’t a moment before. I still believe that. What feels less stable is the world around that belief.
Once a machine can produce language quickly, people start talking about writing differently. They start describing it as though it were mostly output, as though the work were simply arranging words in a passable order, as though fluency and writing were basically the same thing. That’s the part I can’t stop thinking about. Not the existence of the tool, but how quickly it can make a human skill look simpler than it really is.
That’s why the Luddite thing never quite lands for me, especially when people toss it out like a joke or a warning. The Luddites weren’t furious at invention itself. They were reacting to the way management used new machines to cheapen labor, flatten expertise, and treat hard-earned skills as if they were suddenly optional. The threat wasn’t only technological change. It was the eagerness to see the machine as a reason to value the worker less. That part doesn’t feel distant to me at all. It feels current.
Not because AI and a loom are the same thing, and not because I think writers are about to become some tragic class of displaced artisans, but because I recognize the feeling of hearing a craft discussed as though its deepest qualities barely count. I know what it’s like to sit in a room full of optimism and realize that the thing you’ve given your working life to is now being described in terms that make it sound quick, automatic, and easy.
And maybe that’s the real tension in this for me. I’ll keep using AI. I’m not interested in pretending it has no value, and I’m not trying to write some noble refusal into existence. What I can’t stop thinking about is what it means to rely on a tool that also shifts the way people talk about your work. It helps me, yes, but it may also be helping create a world that’s less able to recognize why the work mattered in the first place.
That’s the question I left the meeting with. Not whether AI is good or bad, and not whether I’m supposed to embrace it or resist it, but what happens when the thing you’ve spent your life getting good at becomes something the world suddenly describes as easy, instant, and automatic.
Fluency Isn’t the Same as Writing
Part of what makes this moment so slippery is that AI really is useful. It can give you structure fast, smooth rough phrasing, and turn a foggy idea into something readable in a matter of seconds. That matters. It helps. Pretending otherwise would make this whole conversation feel false from the start.
What unsettles me is what happens after that. The minute a tool becomes good at producing language, people start talking as though language were the whole job. Writing starts to sound like a throughput problem, something measured by speed and surface polish instead of thought, judgment, or feel.
That’s where the confusion sets in, because writing that matters has never just been about getting to a clean sentence quickly. It comes from noticing what matters beneath the obvious point and feeling when something is technically fine but still dead on the page. It depends on forms of judgment that are hard to quantify and easy to dismiss, especially in a culture that loves whatever moves faster.
Maybe that’s the difference I keep coming back to. Writing isn’t just the delivery of language. It’s a way of seeing, choosing, holding back, pressing harder, and staying with something a little longer when a quicker version would probably do. Two passages can say roughly the same thing and still feel completely different because one of them carries the presence of a mind that actually wrestled with it.
That’s why I don’t think the real risk is that AI will fill the world with bad writing. We’ve always had plenty of language that was empty, forgettable, and just good enough to pass. The risk is that we get so used to fluent output that we stop asking more of it. We stop looking for the pressure of thought inside the sentence, and we stop noticing the difference between something that has been written and something that has merely been produced.
That’s the part I can’t shake. Not that machines can generate language, but that they may be helping create a culture that expects less from it.
What Human Writing Is Actually Doing
This is the part that gets lost whenever people talk about writing as if it were just words arranged well enough to function. The visible part of writing is the sentence, but the real work starts long before that. It starts in the act of paying attention. A writer is taking in tone, tension, mood, context, all the small signals that tell you what a moment can hold and what it can’t. Good writing doesn’t just deliver language. It reads the room, even when the room is a market, a client, a culture, or a single person on the other end of a sentence.
That’s why the best writing always feels a little harder to explain than people want it to be. You can describe the message, the strategy, even the structure, and still miss what made it land. Sometimes the difference is restraint. Sometimes it’s rhythm. Sometimes it’s the decision not to say the obvious thing because the obvious thing would flatten the feeling instead of deepening it. Those choices don’t always announce themselves, but they shape everything.
And that’s what makes this moment so strange for people who care about language. AI can imitate the visible result well enough to make the deeper work disappear from view. It can produce something that sounds finished, and in many situations, that may be enough. But enough has a way of lowering the ceiling. Once people get used to language that does the job cleanly and quickly, they stop noticing the difference between writing that functions and writing that actually sees.
I think that difference matters more than ever, especially in a culture already drowning in language. We’re not suffering from a shortage of words. We’re surrounded by explanations, captions, posts, pitches, emails, headlines, statements, and summaries, oh my. The problem is not that there isn’t enough language; the problem is that so much of it passes by without touching anything. It fills space, but it doesn’t change the temperature. It doesn’t sharpen a thought or make a person feel newly aware of what they already half-knew.
That’s what human writing still does when it’s good. It doesn’t just move information from one place to another. It creates recognition and gives shape to something blurry. It says the thing underneath the thing. And when that happens, people feel it, even if they can’t always explain why.
Which may be why I keep resisting the idea that writing is becoming less important just because language is becoming easier to generate. If anything, the opposite may be true. The easier it gets to produce sentences, the more valuable it becomes to know which sentences are worth keeping.
The Risk Isn’t Just Professional
Maybe that’s why this feels bigger to me than a writer worrying about his job. Of course, there’s a professional anxiety inside it. I’d be lying if I said otherwise. When a machine starts doing something adjacent to the thing you do for a living, it’s hard not to feel that in your chest. But the more I sit with it, the more I think the deeper risk has less to do with employment and more to do with expectation.
Tools don’t just change workflows. They change standards. They teach people what to expect, what to settle for, and what starts to feel normal. And once a culture gets used to language that is fast, polished, and mostly good enough, it starts losing its appetite for the slower thing that actually carries a person inside it.
That shift won’t always look dramatic. It’ll look practical. It’ll sound efficient. It’ll arrive dressed up as common sense. Why spend more time on this? Why labor over a sentence when the tool can get you 80 percent of the way there in ten seconds? Why insist on nuance when fluency is already on the page?
Those are fair questions right up until they become the only questions.
Because what that question misses is that the last 20 percent is not a small gap when that’s where the meaning lives. The last part is where tone becomes trust, where rhythm becomes persuasion, where a sentence stops sounding correct and starts sounding true. If we get too comfortable treating that difference as negligible, we don’t just make writing worse. We make communication flatter, thought thinner, and the world a little less precise about what it means.
That’s what I keep coming back to. This isn’t only about whether AI can help us write. It can. It already does. And, if you think about it, it has for a while. It’s about whether constant access to instant language starts training us to accept approximation in places where approximation used to matter less. And maybe that’s the real cultural shift hiding underneath all the excitement. Not that machines will speak, but that people may gradually stop demanding that language do more than function.
For anyone who works with words, it’s hard not to feel it personally. But it also belongs to everyone else, because language shapes how we explain things, sell things, comfort people, argue, apologize, lead, teach, and make sense of what’s happening. Once we start lowering the bar there, the loss doesn’t stay in marketing copy or blog posts. It moves outward into everything.
Maybe This Is Where Writers Still Matter Most
The strange thing is that none of this has made me believe writers matter less. If anything, it’s made me think the opposite. The easier it becomes to generate language, the more important it is to have someone in the room who can tell the difference between what works on the surface and what actually lands. That difference may not always look dramatic, but it’s still the difference between noise and meaning.
Maybe that’s where writers still matter most now, not as guardians of some sacred process, and not as the only people allowed to touch language, but as the people who can feel when something is off even before they can explain why. They know when a sentence is doing too much, when a paragraph is hiding instead of saying, or when a piece of writing sounds finished without ever becoming alive. Those are not flashy skills, and they don’t always defend themselves well in conversations about efficiency. But they shape the final thing more than people realize.
That may be part of the discomfort I felt in that meeting. It wasn’t only fear that the tool was getting stronger. It was the realization that the skills writers rely on most are often the hardest to describe in a room that wants measurable gains. Speed is easy to point to, and so are volume and even fluency. Taste is harder. Restraint is harder. Emotional precision is harder. But harder to measure doesn’t mean less real. In a lot of cases, it means the opposite.
So maybe the question isn’t whether AI can write. At this point, that’s almost beside the point. Maybe the better question is whether we’ll get better at recognizing the human part of writing once the mechanical part becomes available to everyone. Because if everybody has access to quick language, then what stands out won’t be the ability to produce words. It’ll be the ability to know which words matter, which carry weight, and which ones only look like they do.
That doesn’t solve the tension for me, but it does change its shape. I still use AI. I still will. I’m still uneasy. But maybe the writer's role now is not to pretend the tool doesn’t exist or to romanticize suffering at the sentence level. Maybe it’s to bring more judgment, more feeling, and more discernment into a world that is about to be flooded with language that arrives faster than ever and means less than it appears to.
If that’s where this is heading, maybe writers aren’t becoming irrelevant. Maybe we’re being pushed toward the part of the work that was ours all along.
The Takeaway
I’m still going to use AI. That much feels settled. I’m not interested in pretending it has no value, and I’m definitely not interested in performing some kind of purity around a tool that is already changing how people work. It helps. I know that because I use it. The contradiction is still there, but at this point I trust that discomfort more than I did at the start. It means I’m paying attention.
What I keep coming back to is something simpler than the panic and bigger than the hype. The problem isn’t that machines can generate language; the problem is what happens when we start confusing generated language with writing itself, and in the process forget how much of good writing was never visible on the surface to begin with. That’s the part I don’t want to lose.
Because writing was never just about getting words onto a page. It was about knowing what to say, what to leave out, what a moment could hold, and how to make language feel as if it came from an actual mind rather than a system trained on patterns. That kind of judgment still matters. In a world that can produce words endlessly, it may matter even more.
So that’s where I’ve landed. AI is not the enemy of writing, but it does create a new kind of pressure around it. It forces us to ask whether we still know the difference between something that reads smoothly and something that feels true. It asks whether we can still recognize craft once its appearance becomes cheap and widely available.
That question goes well beyond copywriting. It touches anyone whose work depends on taste, interpretation, emotional precision, or the ability to make something land with another human being. The tool may speed up parts of the process, but it doesn’t erase the part that makes the work worth doing. If anything, it throws that part into sharper relief.
I left the meeting with less certainty than some of the people around me, but with a more useful question. Not whether AI can do more. It can, and it will. The real question is whether we’ll keep making room for the human abilities that were always doing more than people knew how to name.
That’s the part ThoughtLab should care about, too. Not just what AI makes faster, but what it makes easier to overlook.