The Media Copilot


How AI is changing media, journalism and content creation

What critics get wrong about Cleveland.com’s AI rewrite experiment

The Cleveland Plain Dealer isn’t “replacing reporters with AI” so much as separating reporting from writing. That still raises hard questions.

Is Cleveland.com’s AI workflow a slop factory, or does it represent the future of newsrooms, where reporting and writing are separated fully? (Credit: Midjourney)
Mar 3, 2026

By Pete Pachal

If you’ve been even half-watching AI lately, you’ve probably run into Matt Shumer’s “Something Big Is Happening” essay, or, at minimum, the tidal wave of takes it kicked up. Shumer’s basic claim is simple: his own coding workflow has shifted from writing code to prompting, reviewing, and signing off on AI output that’s close enough to “done” to feel uncanny. It’s framed as a warning to knowledge workers everywhere: AI has effectively absorbed my job, and yours is next.


There’s already a small library’s worth of response essays picking apart what Shumer gets right and where he leaps too far, and I’m not trying to add another spine to the shelf. But journalism is knowledge work, too, and it recently had its own—slightly less viral—brush with the same existential questions.

The editor of Cleveland.com (a.k.a. The Cleveland Plain Dealer), Chris Quinn, wrote a column describing how a college student who had applied for a reporting job withdrew their application when they found out how the publication uses AI. Besides leveraging the tech to help generate story ideas, the newsroom developed an “AI rewrite specialist” to write stories based on the material that reporters gather. By ditching writing, according to Quinn, their reporters have been able to reclaim an extra workday each week.

The backlash was predictably vicious. On X, Axios reporter Sam Allard earned a lot of likes by comparing Cleveland.com to an “AI content farmer,” while veteran journalists on Substack expressed varying degrees of outrage and dismay. Most of the reaction was along the lines of this piece from journalist Stacey Woelfel: “Writing is an integral part of the reporting process.”

The newsroom’s new fault line

That last line is true, but it’s also not the whole story. What Quinn describes can’t be waved away quite so cleanly, because newsrooms have been unbundling reporting work for decades. Reporters regularly collaborate on one article, with one person taking the lead on the draft while others supply interviews, documents, and context; nobody argues the supporting reporters somehow didn’t do “real” reporting. And in breaking-news moments, reporters often text, email, or phone in their notes to an editor or writer who turns the raw feed into publishable copy.

We all understand, at least implicitly, that reporting and writing aren’t the same skill—even if the best journalists make them feel inseparable. What Quinn and Cleveland.com seem to be doing is using AI to make that separation explicit, formal, and scalable.

This also fits the popular, almost comforting story people tell about “responsible” AI in the workplace: let machines take the repeatable work they can do faster, so humans can spend their limited hours on the parts that actually require judgment and presence. For reporters, that’s the human stuff: calling sources, learning what’s new, asking the second question, and earning trust over time.

And here’s the uncomfortable part: AI is now legitimately good at writing. A lot of what we’ve seen over the past few years hasn’t helped its literary reputation (yes, we’re all tired of the rampant em-dashes and the “it’s not X—it’s Y” bits). But if you use the strongest models—and you’re even mildly intentional about prompting and editing—they can deliver clean, coherent, competent prose.

If we’re being honest, “competent prose” is exactly what a large chunk of daily news requires. Many, if not most, reported stories are built to transmit basic information about what happened, with minimal interpretation, and they’re often written in AP style—a set of constraints that’s effectively a template. It’s not quite code, but it’s functional writing, optimized for speed, clarity, and accuracy. The job is to get the facts right, add context, and move.

Seen that way, the reporter isn’t removed from the process so much as repositioned inside it. Shumer describes becoming a supervisor to an AI building machine; journalists may find themselves supervising writing bots, making sure a story is shaped correctly out of the material they’ve gathered. In Quinn’s newsroom, reporters have final say over the copy.

What gets lost when nobody writes

None of this guarantees a happy ending. Some writers can’t report, some reporters can’t write, and plenty of people are good at both. So what happens when the job is redesigned to force a choice? Do you become a feature or opinion writer, where voice and craft are the value, or do you specialize in the reporting side and let an “AI rewrite specialist” (or whatever comes next) handle the draft?

This leads to the biggest worry: skill-building. Even if Quinn is right and this system truly buys back time, how do junior journalists become better writers if they aren’t writing every day? When Woelfel says writing is integral to reporting, I think he means it’s integral to storytelling—the act of deciding what matters, what comes first, what gets emphasized, and what gets left out, all in service of an audience. That’s curation and prioritization as much as expression.

This is the point Ben Affleck was getting at when he drew his famous line between AI as a craftsman and AI as an artist. Craft can be taught, outsourced, templated; artistry is harder to mechanize. But it’s also hard to become an artist if you never get reps as a craftsperson.

The irony of Shumer’s essay is that even as it argues AI will soon disrupt most knowledge work—and even name-checks journalism as an industry in the crosshairs—it’s written in a distinctly human voice. I honestly don’t know if he used AI to fully or partially write the piece, but I’m certain that if he did, he was also meticulous about every word.

That’s the sliver of optimism here. Even if we push some of the craft of writing onto machines, we may not lose as much as the most alarmed reactions assume. Audiences still want a human touch; if that touch moves upstream—from drafting sentences to shaping the narrative and deciding what’s true and important—it’s still a touch. It’s true that no one wants to read AI slop. But it might turn out that the most valuable reporting skill in the future will be the ability to turn slop into stories.

A version of this column appeared in Fast Company.

Contributors

  • Pete Pachal: Author

    Pete Pachal is the founder of The Media Copilot. In addition to producing the site’s newsletter and podcast, he also teaches courses on how journalists and communications professionals can apply AI tools to their work. Pete has a long career in journalism, previously holding senior roles in global newsrooms such as CoinDesk and Mashable. He’s appeared on Fox Business, CNN, and The Today Show as a thought leader in tech and AI. Pete also puts his encyclopedic knowledge of Doctor Who to good use on the popular podcast, Pull To Open.

Category: AI media analysis
Tags: AI content | newsroom automation | newsroom AI | journalism | generative AI

The Media Copilot

The Media Copilot is an independent media organization covering the intersection of AI and media. Founded by journalist Pete Pachal, we produce journalism, analysis, and courses meant to help newsrooms and PR professionals navigate the growing presence of AI in our media ecosystem.
