Cate Blanchett has co-founded a new nonprofit aimed at solving one of AI’s most urgent unsolved problems: the ability for any person to control how their work, likeness, and identity are used by artificial intelligence systems.
RSL Media launched Tuesday as a public benefit nonprofit, built around a deceptively simple idea: Human consent should function like a traffic light—allowed, allowed with terms, or prohibited—that AI systems can actually read and respect. The organization has already secured support from a roster of A-list entertainment figures including Javier Bardem, George Clooney, Viola Davis, Tom Hanks, Dame Helen Mirren, Steven Soderbergh, Kristen Stewart, Meryl Streep, and Dame Emma Thompson, as well as Creative Artists Agency and the Music Artists Coalition.
“AI technologies are expanding rampantly, essentially unchecked and unregulated,” Blanchett said in a statement. “In order for humans to remain in front of these technologies, consent must be the first consideration.”
The launch comes less than a year after Really Simple Licensing released version 1.0 of its open protocol, which lets publishers define machine-readable terms for AI training on their content. RSL Media builds on that same architecture, extending the principle of machine-readable rights from content licensing to the protection of human creative expression, identity, and likeness.
“The right to decide whether AI can use your work or identity should not be reserved for only those who can afford lawyers or have platforms big enough to be heard,” said Nikki Hexum, co-founder and CEO of RSL Media. “It is a basic human right.”
The organization covers four distinct rights areas: creative works (songs, films, books, art, photography), identity (name, image, voice, movement, personal attributes), characters (fictional figures, including their names, voices, and visual depictions), and marks (logos, trademarks, trade dress, brand identifiers). Alongside Blanchett and Hexum, its co-founders include Doug Leeds and Eckart Walther, the RSS co-creator behind the original RSL standard.
The legal framework was co-authored by James Everingham, former head of engineering at Instagram and CEO of Guild.ai; Jacqueline Sabec, a partner at King, Holmes, Paterno & Soriano; and Francesca Amfitheatrof, former artistic director of watches and jewelry at Louis Vuitton and design director at Tiffany & Company.
A free, public registry launches in June. Once live, anyone will be able to verify their identity through RSLMedia.org, declare permissions for their work and likeness, and have those preferences translated into machine-readable signals that AI platforms can query before use.
The approach mirrors the logic behind IAB Europe’s Transparency and Consent Framework (and the earlier RSL protocol) in aiming to turn consent into infrastructure. RSL Media’s registry goes further, however, by applying that model to individuals rather than publishers, covering identity and likeness alongside creative work.
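To make the traffic-light idea concrete, here is a minimal sketch of how an AI platform might interpret a registry declaration before using someone's work or likeness. The record structure, field names, and use categories below are illustrative assumptions, not RSL Media's published schema.

```python
# Hypothetical sketch of the "traffic light" consent model: each declared
# use maps to one of three signals, and anything undeclared defaults to
# prohibited. Field names and categories are assumptions for illustration.
ALLOWED = "allowed"
ALLOWED_WITH_TERMS = "allowed_with_terms"
PROHIBITED = "prohibited"

def consent_signal(record: dict, use: str) -> tuple[str, dict]:
    """Return the traffic-light signal (and any attached terms) for a use.

    `record` maps a use category (e.g. "ai_training", "voice_synthesis")
    to a declared permission entry; undeclared uses are prohibited.
    """
    entry = record.get(use, {"signal": PROHIBITED})
    return entry["signal"], entry.get("terms", {})

# Example: a hypothetical declaration by one rights holder.
record = {
    "ai_training": {
        "signal": ALLOWED_WITH_TERMS,
        "terms": {"attribution": True, "license": "per-use"},
    },
    "voice_synthesis": {"signal": PROHIBITED},
}

signal, terms = consent_signal(record, "ai_training")
```

A platform querying the registry would branch on the returned signal: proceed freely on green, negotiate the attached terms on yellow, and skip the material entirely on red.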






