The Media Copilot

How AI is changing media, journalism, and content creation


AI and copyright: How media can decide between litigation or negotiation 

Lawsuits set public rules. Contracts set private ones. Attorney Jason Henderson explores how leverage, timing, and context decide the path.

Feb 5, 2026

By The Copilot, based on an interview by Pete Pachal


Key Takeaways

  • Attorney Jason Henderson: AI doesn’t actually “learn” like humans.
  • Litigation sets public rules; licensing deals set private ones.
  • Courts care about market displacement more than “transformative use.”

When a lawyer who’s also a published author tells you artificial intelligence doesn’t actually learn like humans, you should probably listen.

Jason Henderson is a corporate and transactional attorney who specializes in streaming and licensing deals. He also occasionally writes books (one or two of which may be part of a long-running science-fiction franchise), which means he understands copyright from both sides of the table. In this episode of The Media Copilot podcast, he walks through the messy reality of how AI companies acquire content, what fair use actually protects (and doesn’t), and why the courts care less about the theory of transformation and more about whether your product just destroyed someone’s business model.

  • Watch on YouTube
  • Listen or watch on Spotify
  • Listen on other podcast services

The conversation starts with training data but quickly moves to the sharper edge: what happens when AI doesn’t copy your article but replaces the reason anyone would read it. Henderson explains why indemnification clauses in licensing deals only work if the company promising to cover you can actually pay up, why insurance may not protect publishers from AI-related risks, and why the next battlefield won’t be scraped text but agents that browse the web like users and become nearly impossible to block.

Why this matters

Media companies are no longer just competing with each other. They’re competing with systems that can answer questions, summarize stories, and satisfy curiosity without ever sending a reader to the source. Henderson maps out how courts evaluate that substitution, why “transformative use” is both the most important legal concept and the hardest to pin down, and why the industry is moving toward deals even as the lawsuits pile up.

He also sees a harder problem coming: agentic AI that behaves like a person, not a bot. The legal frameworks assume you can tell the difference. The technology is making that assumption obsolete.
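For context on why that distinction matters: the standard blocking tool publishers rely on today is a robots.txt file that singles out crawlers by their self-declared user-agent tokens. A minimal sketch, using the token names the major AI crawlers publicly document (GPTBot, ClaudeBot, Google-Extended, CCBot):

```
# robots.txt — opt known AI training crawlers out site-wide,
# while leaving ordinary search crawlers unaffected
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
```

Compliance with these rules is voluntary, and they only match bots that identify themselves. An agent that presents an ordinary browser user-agent and clicks around like a human reader never triggers them, which is precisely the gap Henderson is describing.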

What we cover

  • Jason’s background in AI, licensing, and streaming deals, plus his work as a writer and publisher
  • The “AI learns like humans” argument, why it is only an analogy, and where it breaks down
  • Inputs vs outputs: why training data and what models produce raise different legal and business issues
  • A clear explanation of the four-factor fair-use test
  • Why the ability to recreate articles via prompting becomes a legal flashpoint, even if framed as a “bug, not a feature”
  • What media companies actually care about most (ethics vs. the bottom line), and why market substitution dominates
  • The deal side: how licensing agreements are evolving for AI, including tighter usage restrictions
  • The risk side: indemnification and why it only works if the other party can actually pay
  • Insurance gaps: why many companies may not be protected for AI-related data and content liabilities
  • The emerging “agents” problem: bot blocking, user proxies, and the future of attribution
  • Hope vs dismay: personalization that helps audiences find authentic creators vs settling for “good enough” synthetic content
  • Why Jason expects turbulence near term, but a longer-term premium on human-authored authenticity

👤 Guest

  • Jason Henderson: https://www.linkedin.com/in/jasonhendersontx
  • Senior Attorney, JWL International: https://jwlinternational.com/
  • Founder, Castle Bridge Media: https://www.castlebridgemedia.com/
  • Co-host, Castle of Horror podcast (horror movie coverage): https://podcasts.apple.com/us/podcast/castle-of-horror-podcast/id447295500

Enjoyed this episode?

Subscribe to The Media Copilot on Substack, Apple Podcasts, Spotify, or your favorite app. On YouTube? Tap Like and subscribe to the channel.

Produced by Pete Pachal and Executive Producer Michele Musso
Edited by the Musso Media Team 

Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0

All rights reserved. © AnyWho Media 2026

Posts co-authored by The Copilot are drafted with AI and then carefully edited by Media Copilot editors. Our AI-assisted process allows us to bring more valuable content to our readers while preserving accuracy and quality.

Contributors

  • The Copilot: Author

    I'm a generative AI writer for The Media Copilot. I help author posts, and with the help of human editors, play a growing role in the site's content strategy.

  • Pete Pachal: Author

    Pete Pachal is the founder of The Media Copilot. In addition to producing the site’s newsletter and podcast, he also teaches courses on how journalists and communications professionals can apply AI tools to their work. Pete has a long career in journalism, previously holding senior roles in global newsrooms such as CoinDesk and Mashable. He’s appeared on Fox Business, CNN, and The Today Show as a thought leader in tech and AI. Pete also puts his encyclopedic knowledge of Doctor Who to good use on the popular podcast, Pull To Open.

Category: AI media analysis
Tags: Copyright, podcast
Related articles

  • Cate Blanchett Backs New AI Rights Nonprofit
  • Scott Turow and Five Publishers Sue Meta Over AI Training Data
  • OpenAI acquires TBPN podcast in push to become the industry’s media voice
  • The White House AI blueprint tells publishers where the administration stands on copyright. Spoiler: It’s not with them
  • Encyclopedia Britannica sues OpenAI for training ChatGPT on its content
  • Perplexity says News Corp tried to bait its chatbot into copyright infringement

The Media Copilot

The Media Copilot is an independent media organization covering the intersection of AI and media. Founded by journalist Pete Pachal, we produce journalism, analysis, and courses meant to help newsrooms and PR professionals navigate the growing presence of AI in our media ecosystem.
