McClatchy Media has been using AI to generate new articles on its Northwest news sites, summarizing existing reporting into listicle-style roundups — often without informing the reporters whose work is being repackaged.
Key Takeaways
- McClatchy uses AI to repurpose reporters’ work into listicles without notice.
- Washington State NewsGuild is negotiating contract language on AI use.
- A leading example of unions stepping in where company AI policies don’t yet exist.
Now the Washington State NewsGuild is negotiating contract language to establish what AI can and can’t do in McClatchy newsrooms.
How reporters discovered it
Kristine Sherred, food reporter for the Tacoma News Tribune and union co-chair, said colleagues first noticed the practice in summer 2024 when a photographer saw an intern’s article had been turned into a shorter, AI-generated version.
More recently, reporters have seen their stories compiled into AI-generated roundups with headlines like “Six recent criminal sentencings in Pierce County Superior Court.”
Peter Talbot, the News Tribune’s criminal justice reporter, said he often learns about these compilations only when he gets an automated email notification.
“I’ll suddenly see like, ‘Hey, there’s a roundup of the sentencing stories I’ve written from court.’ And it’s like, ‘Oh, OK.’ I guess suddenly there was an article published last week of the last dozen stories I wrote, summarized by AI, I didn’t know that was happening,” Talbot said.
The AI-generated stories tend to have lower page views and minimal comment engagement compared to original reporting, according to Talbot.
What the union wants
Idaho and Washington State NewsGuild members are negotiating a new collective bargaining agreement with AI protections as a priority. Other McClatchy properties, including the Miami Herald, have already secured AI clauses in their contracts.
The union has won agreement that AI will not perform core newsroom tasks such as gathering news, filing public records requests, or impersonating reporters.
But disagreement remains over a key provision. The union proposed that any AI-generated content require direction and editorial review by humans. McClatchy countered that this would only apply when content “substantially relies” on human work.
“It is unclear what that means,” Sherred said.

The broader context
McClatchy job postings ask prospective reporters to “take advantage of opportunities to ethically harness and leverage artificial intelligence and other automation to enhance and elevate their work.”
Sherred said she uses some AI tools to support her reporting, such as pulling data on restaurants applying for liquor licenses. But the company’s focus, she said, appears to be on content generation.
McClatchy published an article stating that “Editors have complete control and oversight of content and can make adjustments at any time.” The company did not respond to an interview request from Northwest Public Broadcasting.
What it means
The McClatchy situation illustrates a common pattern: companies implementing AI for content generation without clear policies on when reporters should be consulted or how their work will be reused.
Contract language matters. Vague terms like “substantially relies” leave room for interpretation that may not favor journalists’ control over their work.
For other newsrooms considering AI-generated compilations or summaries, the key questions are:
- Who decides when and how AI is used?
- Do reporters have input or at least notification?
- What editorial review happens before publication?
- How does this serve readers versus serving page-view metrics?
The answers to those questions increasingly appear in union contracts, not company policies.