After a long battle, the SAG-AFTRA union has secured protections against artificial intelligence (AI) exploiting its artists' work.
The US trade union SAG-AFTRA and the major record labels Warner Music Group, Sony Music Entertainment, Universal Music Group, and Disney Music Group have reached a tentative agreement on the use of AI to create imitations, or clones, of Hollywood artists, as well as protections governing the release of digital replications of music artists' voices.
Over 200 music artists recently signed an open letter claiming that "AI poses an existential threat to their livelihoods". Pearl Jam, Nicki Minaj, Billie Eilish, Stevie Wonder, Elvis Costello, and a number of others called upon AI developers, tech companies, and platforms to stop using AI to infringe on their creative art.
The deal that was struck covers the five-year period from 2021 to 2026 and has been unanimously approved by the executive committee of SAG-AFTRA. The union represents over 160,000 actors, artists, and other media professionals.
In a statement, Duncan Crabtree-Ireland, national executive director and chief negotiator, said: “SAG-AFTRA and the music industry’s largest record labels have reached a groundbreaking agreement establishing, for the first time, collective bargaining guardrails assuring singers and recording artists ethical and responsible treatment in the use of artificial intelligence in the music industry.
“It is a testament to our mutual unwavering commitment to work together to safeguard the rights, dignity and creative freedom of our members.”
AI-generated imitations of music have been circulating online over the past year. Just two days ago, an AI-generated sample of a Drake diss track went viral on X (formerly Twitter), with many listeners unable to tell the difference between the real Drake and the AI version.
This is the danger many artists feared: their work being corrupted and distributed to an audience without their approval and without any benefit to them. Universal Music Group shared its core principles for music creation with AI:
- We believe music is central to humanity.
- We believe humanity and music are inseparable.
- We believe that technology has long supported human artistic expression, and if applied sustainably, AI will amplify human creativity.
- We believe that human-created works must be respected and protected.
- We believe that transparency is essential to responsible and trustworthy AI.
- We believe the perspectives of music artists, songwriters, and other creators must be sought after and respected.
- We are proud to help bring music to life.
In line with these ideals, the agreement includes an explicit provision that the terms "artist", "singer" and "royalty artist" refer only to humans.
“SAG-AFTRA stands firm in the belief that while technology can enhance the creative process, the essence of music must always be rooted in genuine human expression,” Crabtree-Ireland said.
“In this agreement, clear and conspicuous consent, along with minimum compensation requirements and specific details of intended use, are required prior to the release of a sound recording that uses a digital replication of an artist’s voice.”
Regulating AI has become contested terrain. Many recognise the potential threat to fields such as art, yet garnering support for such regulations and protocols can be difficult, as few can keep pace with the rapid technological advancements AI makes on a daily basis.
Giving artists a wall of protection, so to speak, is a step in the right direction. However, it is merely a Band-Aid on a bullet hole: AI regulations will need constant re-evaluation to keep pace with the technology's continual advancements. Only then will the arts be truly protected.
Kace O'Neill
Kace O'Neill is a Graduate Journalist for HR Leader. Kace studied Media Communications and Māori studies at the University of Otago, and has a passion for sports and storytelling.