"I’ve resigned from my role leading the Audio team at Stability AI, because I don’t agree with the company’s opinion that training generative AI models on copyrighted works is ‘fair use’," announced former Stability AI executive Ed Newton-Rex on X (formerly Twitter) after leaving the company.
"Ethical concerns" and "generative AI" appear in the same sentence more and more by the day, and one industry executive who hasn't been afraid to say out loud that generative AI has an ethics problem is Ed Newton-Rex, formerly TikTok's head AI designer and Stability AI's head of audio.
To address the ethical problem in generative AI, Ed has launched a nonprofit organisation called Fairly Trained, which offers a certification program, the Licensed Model (L) Certification, for generative AI companies that prioritise obtaining consent for their training data.
"It takes three key resources to build generative AI models: AI talent, GPUs, and training data. AI companies pay millions of dollars for the first two. It makes no sense that some expect the third for free."
— Ed Newton-Rex (@ednewtonrex) January 9, 2024
Fairly Trained's certification process requires companies to demonstrate that their training data is either explicitly licensed for such purposes, in the public domain, offered under an appropriate open license, or owned by the company itself. The certification has already been awarded to nine companies, most of them AI music companies: music generation platform LifeScore Music, generative AI music platforms Boomy and Soundful, AI-powered royalty-free music service Beatoven.ai, music licensing company Rightsify, Somms.ai, which provides AI model training solutions for record labels, and Tuney, which builds AI music tools.
To apply for certification, companies pay a fee ranging from $500 to $6,000, depending on the organisation's annual revenue.
"There is a divide emerging between two types of generative AI companies: those who get the consent of training data providers, and those who don’t, claiming they have no legal obligation to do so. Right now it’s hard to tell which AI companies take which approach. Fairly Trained aims to change this," Newton-Rex wrote in a blog post. "We hope the Fairly Trained certification is a badge that consumers and companies who care about creators’ rights can use to help decide which generative AI models to work with."
The initiative has garnered support from notable figures in the AI and music industries. Its advisers include Tom Gruber, co-founder of the startup that became Apple's Siri; Elizabeth Moody, a partner at law firm Granderson Des Rochers who previously worked with Pandora and YouTube; Maria Pallante, CEO of the Association of American Publishers (AAP); and musician Max Richter.
As the AI industry grapples with questions of ethical data use, Fairly Trained positions itself as a neutral player seeking to establish a standard for fair practice in training generative AI models. The nonprofit's certification program aims to bridge the gap between the AI sector and creators' rights advocates, offering one possible answer to the ethical questions raised by a technology that, however impressive, remains equally unsettling when it comes to the future of art.
"It’s not like most people working on LLMs [large language models] are writers, or most people who work on AI image generators are designers or photographers. But in music a lot of them are musicians. I wonder whether that’s one of the reasons you have more companies in music generation who are taking this [consent-focused] approach. They really don’t want to do what I would call the wrong thing. The unethical thing," Newton-Rex shared in an interview with Music Ally.