If you’ve been even half-watching AI lately, you’ve probably run into Matt Shumer’s “Something Big Is Happening” essay, or, at minimum, the tidal wave of takes it kicked up. Shumer’s basic claim is simple: his own coding workflow has shifted from writing code to prompting, reviewing, and signing off on AI output that’s close enough to “done” to feel uncanny. It’s framed as a warning to knowledge workers everywhere: AI has effectively absorbed my job, and yours is next.
There’s already a small library’s worth of response essays picking apart what Shumer gets right and where he leaps too far, and I’m not trying to add another spine to the shelf. But journalism is knowledge work, too, and it recently had its own—slightly less viral—brush with the same existential questions.
The editor of Cleveland.com (a.k.a. The Cleveland Plain Dealer), Chris Quinn, wrote a column describing how a college student who had applied for a reporting job withdrew their application when they found out how the publication uses AI. Besides leveraging the tech to help generate story ideas, the newsroom developed an “AI rewrite specialist” to write stories based on the material that reporters gather. By ditching writing, Quinn says, its reporters have been able to reclaim an extra workday each week.
The backlash was predictably vicious. On X, Axios reporter Sam Allard earned a lot of likes by comparing what Cleveland.com is doing to being an “AI content farmer,” while veteran journalists on Substack expressed varying degrees of outrage and dismay. Most of the reaction was along the lines of this piece from journalist Stacey Woelfel: “Writing is an integral part of the reporting process.”
The newsroom’s new fault line
That last line is true, but it’s also not the whole story. What Quinn describes can’t be waved away quite so cleanly, because newsrooms have been unbundling reporting work for decades. Reporters regularly collaborate on one article, with one person taking the lead on the draft while others supply interviews, documents, and context; nobody argues the supporting reporters somehow didn’t do “real” reporting. And in breaking-news moments, reporters often text, email, or phone in their notes to an editor or writer who turns the raw feed into publishable copy.
We all understand, at least implicitly, that reporting and writing aren’t the same skill—even if the best journalists make them feel inseparable. What Quinn and Cleveland.com seem to be doing is using AI to make that separation explicit, formal, and scalable.
This also fits the popular, almost comforting story people tell about “responsible” AI in the workplace: let machines take the repeatable work they can do faster, so humans can spend their limited hours on the parts that actually require judgment and presence. For reporters, that’s the human stuff: calling sources, learning what’s new, asking the second question, and earning trust over time.
And here’s the uncomfortable part: AI is now legitimately good at writing. A lot of what we’ve seen over the past few years hasn’t helped its literary reputation (yes, we’re all tired of the rampant em-dashes and the “it’s not X—it’s Y” bits). But if you use the strongest models—and you’re even mildly intentional about prompting and editing—they can deliver clean, coherent, competent prose.
If we’re being honest, “competent prose” is exactly what a large chunk of daily news requires. Many, if not most, reported stories are built to transmit basic information about what happened, with minimal interpretation, and they’re often written in AP style—a set of constraints that’s effectively a template. It’s not quite code, but it’s functional writing, optimized for speed, clarity, and accuracy. The job is to get the facts right, add context, and move.
Seen that way, the reporter isn’t removed from the process so much as repositioned inside it. Shumer describes becoming a supervisor to an AI that builds software; journalists may find themselves supervising writing bots, making sure a story is shaped correctly out of the material they’ve gathered. In Quinn’s newsroom, reporters have final say over the copy.
What gets lost when nobody writes
None of this guarantees a happy ending. Some writers can’t report, some reporters can’t write, and plenty of people are good at both. So what happens when the job is redesigned to force a choice? Do you become a feature or opinion writer, where voice and craft are the value, or do you specialize in the reporting side and let an “AI rewrite specialist” (or whatever comes next) handle the draft?
This leads to the biggest worry: skill-building. Even if Quinn is right and this system truly buys back time, how do junior journalists become better writers if they aren’t writing every day? When Woelfel says writing is integral to reporting, I think he means it’s integral to storytelling—the act of deciding what matters, what comes first, what gets emphasized, and what gets left out, all in service of an audience. That’s curation and prioritization as much as expression.
This is the point Ben Affleck was getting at when he drew his famous line between AI as a craftsman and AI as an artist. Craft can be taught, outsourced, templated; artistry is harder to mechanize. But it’s also hard to become an artist if you never get reps as a craftsperson.
The irony of Shumer’s essay is that even as it argues AI will soon disrupt most knowledge work—and even name-checks journalism as an industry in the crosshairs—it’s written in a distinctly human voice. I honestly don’t know whether he used AI to fully or partially write the piece, but I’m certain that if he did, he was also meticulous about every word.
That’s the sliver of optimism here. Even if we push some of the craft of writing onto machines, we may not lose as much as the most alarmed reactions assume. Audiences still want a human touch; if that touch moves upstream—from drafting sentences to shaping the narrative and deciding what’s true and important—it’s still a touch. It’s true that no one wants to read AI slop. But it might turn out that the most valuable reporting skill in the future will be the ability to turn slop into stories.
A version of this column appeared in Fast Company.







