NewsGuard has identified more than 3,000 AI content farms—more than double what it could find a year ago using manual techniques—and it’s now partnering with AI detection startup Pangram Labs to scale that tracking as the problem accelerates.
Key Takeaways
- NewsGuard has identified 3,000+ AI content farms, double last year’s count.
- Partnership with Pangram Labs uses AI to flag entire domains, not just pages.
- 300 to 500 new AI content farm sites appear every month, accelerating the problem.
AdWeek reports the new detection tool, announced Thursday, uses Pangram’s proprietary models to scan not just individual pages but entire domains for signs of AI-generated content at scale. When Pangram flags a site, NewsGuard analysts review it manually before applying a formal “AI content farm” designation. Sites qualify when a substantial share of content appears AI-generated, there’s no disclosure to readers, and the site’s presentation could convincingly pass as human-produced journalism.
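The three-part designation test described above can be sketched as a simple check. This is a hypothetical illustration only: the field names, structure, and 50% threshold are my assumptions, not NewsGuard's actual criteria or code.

```python
from dataclasses import dataclass

@dataclass
class SiteScan:
    """Illustrative summary of a domain review (hypothetical fields)."""
    ai_content_share: float   # fraction of sampled pages flagged as AI-generated
    discloses_ai_use: bool    # does the site tell readers its content is AI-made?
    passes_as_human: bool     # could the presentation pass as human journalism?

def is_ai_content_farm(scan: SiteScan, threshold: float = 0.5) -> bool:
    # A site qualifies only when all three conditions from the article hold:
    # a substantial AI-generated share, no disclosure, and a presentation
    # that could convincingly pass as human-produced journalism.
    # The 0.5 threshold is an arbitrary stand-in for "substantial."
    return (
        scan.ai_content_share >= threshold
        and not scan.discloses_ai_use
        and scan.passes_as_human
    )
```

Note that in NewsGuard's described workflow this check is not automatic: Pangram's models surface candidate domains, and a human analyst makes the final call.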
The scale of the problem is striking. Between 300 and 500 new AI content farm sites are emerging every month, according to Pangram. Many operate under generic news-adjacent names (e.g. Times Business News, Business Post) and publish misinformation about real brands, politicians, and public health. In one case, a site called Citizen Watch Report falsely claimed two U.S. senators spent $814,000 on hotels in Ukraine; the story was amplified by Russian state media before being debunked.
Another site falsely claimed Coca-Cola threatened to pull its Super Bowl sponsorship over the halftime show, an event it wasn't even sponsoring. Both sites ran ads from major brands.
That last detail is the commercial mechanism. Most of these sites are made-for-advertising (MFA) operations—cheap content churned out to capture programmatic ad spend. In a two-month period, NewsGuard found 141 blue-chip brands advertising on AI content farm sites. The slop economy runs on their budgets.
“If we can’t detect AI content, then every communication space is going to be flooded with inauthentic content that’s cheap to produce and difficult to impossible to differentiate [from] something authentic,” Max Spero, Pangram’s CEO, told AdWeek’s Kendra Barnett.
NewsGuard’s detection data will be available for advertisers to license directly or through their agencies, with a pre-built integration into The Trade Desk for pre-bid blocking. A consumer-facing browser extension integration is also under consideration. Pangram, founded in 2023 by a former Google engineer and an ex-Tesla scientist, gained independent validation when a Nature report last September found it highly capable of flagging AI-generated academic papers.
The detection arms race is worth watching. Early AI content farms were easy to spot — sites would publish articles containing ChatGPT error messages verbatim. Today’s operations are more sophisticated. The tools to catch them are getting sharper too, but the math still favors the farms: generating slop is cheaper and faster than reviewing it.