The SAG-AFTRA and Writers Guild strikes — lasting 118 days and 148 days respectively — may have ended, but concerns about artificial intelligence (AI) taking over jobs in the entertainment industry are only beginning.
AI has emerged with capabilities that range from generating songs that replicate music artists' voices to creating deepfakes that spread misinformation and damage the image of people whose creative talent is essential to their jobs. As a result, AI touches multiple facets of the industry, including TV, film, and music.
In response, entertainers are pushing back and fighting for their rights by demanding stronger regulations.
Leading experts in the legal and entertainment fields reveal who is most at risk from AI, what protections are being put in place for entertainers, and what these AI tools mean for the future of the entertainment industry.
Entertainment jobs at the highest risk from AI use
AI has the power to impact all types of jobs in the entertainment industry, but some roles are more vulnerable than others.
“I think that anyone who uses their voice or is a writer is most concerned. I think other performers have concerns, but I think writers and those who use their voices to earn a living are already seeing the potential threat that AI poses,” Philippa Loengard, executive director of the Kernochan Center for Law, Media and the Arts at Columbia Law School, told ZDNET.
Other roles at risk are background actors, gaffers, and grips, who help with set design and production. Loengard gave the example of a set designer who might spend three weeks creating the background for a set, a task an AI tool could generate almost immediately.
Louise Nemschoff, a Los Angeles-based entertainment lawyer, also noted that visual artists such as graphic designers and storyboard artists may be impacted by AI. "I think it's likely that some sort of generative AI tool will be added to the general filmmaking toolkit — certainly in the editing arena," Nemschoff added.
Workers whose roles may be hit the hardest are calling for stronger protections. The International Alliance of Theatrical Stage Employees (IATSE) is a union of over 170,000 craftspeople in the entertainment industry, including hair and makeup artists, broadcast technicians, and animators. Following negotiations with the Alliance of Motion Picture and Television Producers (AMPTP) in May, the union reached a tentative agreement with Hollywood's studios and streamers in late June. Though it still needs ratification, the provisional deal includes a provision that members cannot be required to use AI in a way that could displace other roles.
“It will be very interesting to see whether or not the production companies and the union are able to come to any sort of agreement, whether or not there will be a strike, or whether the guardrails and models of the Writers Guild and SAG-AFTRA agreements can be applied to those [agreements],” Nemschoff said.
On the other hand, directors may have a slight upper hand over AI. “Directors are, perhaps, less at risk of losing their jobs due to AI. As a director, you have to be there, you have to be looking at what you’re seeing in real time, you have to make judgment calls that AI has much more trouble doing right now,” Loengard said.
Along with the TV and film industry, the music industry is also at risk because of AI's ability to replicate voices and create fake collaborations and mashups of artists. For instance, last year, an AI-generated song featuring rapper Drake and singer The Weeknd circulated on TikTok. Although the song was removed, it had already garnered millions of plays on platforms like YouTube, TikTok, and Spotify, according to The New York Times. TikTok accounts dedicated solely to these AI-generated songs have surfaced, featuring the voices of deceased music artists covering hits from present-day performers, such as mid-century crooner Frank Sinatra singing pop star Dua Lipa's 2020 hit "Levitating." The practice also raises questions about whether it is ethical to use AI to revive the voices of the deceased without consent.
One of the biggest concerns AI has sparked for music artists is the loss of royalty opportunities, as platforms circulate AI-generated songs that are neither copyrighted nor licensed recordings. As recently as January, Universal Music Group (UMG) accused TikTok of "sponsoring artist replacement by AI." TikTok countered that UMG had put its greed above artists' interests, but the two companies have since resolved the dispute.
All in all, it seems any type of performer — in TV, film, or music — cannot escape the tentacles of AI’s rapid advancement.
Writers: AI and protections for their creative work
Last fall, the Writers Guild of America (WGA) successfully negotiated with the AMPTP to end the writers' strike. WGA members subsequently voted to ratify the agreement, which runs through May 1, 2026. Nemschoff noted the agreement's four core pillars: consent, credit, compensation, and disclosure.
Consent and disclosure: She explained that AMPTP production companies agreed to disclose any use of generative AI to writers and obtain their consent before using it. Likewise, writers who want to employ AI must disclose it to, and get consent from, the companies. The AMPTP also will not require writers to use the technology.
Credit: An important consideration for any entertainment industry worker, credit could "influence how much they're offered for future work," Nemschoff said. "For example, if you are a credited writer on a hit film, you're likely to be able to negotiate a higher fee for your next project." Credit can also affect whether a worker receives additional compensation in the form of residuals or rights to publish their scripts.
Compensation: The WGA negotiated minimum compensation for all film and TV writing, from the first draft of scripts to subsequent revisions and polishes. Nemschoff added that generative AI could threaten writers’ livelihoods if it creates first drafts (which pay higher rates) and leaves writers with only revision and polish opportunities (which are typically lower fees). The agreement guarantees that writers can still receive compensation for the full script even if they’re revising or polishing AI-generated first drafts.
The WGA agreement aims to take precautions so that humans do not risk losing credit and compensation opportunities because of AI-generated material.
Moving forward, the agreement also states that signatory production companies must meet with the Writers Guild at least semi-annually if the union requests a discussion or review of a production company’s use of generative AI in the development and production processes of motion pictures.
Screen actors: AI, digital replicas, and synthetic performers
Actors are forced to navigate a complex future where generative AI can create alternate versions of them.
Digital replicas are artificial copies created from a human's voice and/or visual likeness. Similar to writers, actors want to ensure they get paid when those replicas are used. According to the 2023 TV/Theatrical Contracts, performers must give informed consent and receive compensation for digital replicas.
What’s alarming is that it’s “likely these technologies will make new art forms like new genres or entertainment types that depend on them,” resulting in less reliance on human actors, said John Footen, managing director of Deloitte’s media and entertainment consulting practice.
On a lighter note, though digital replicas aren’t going away, they’ll improve and be less about “replacing the human touch but more about orchestrating a harmonious blend of the real and the virtual,” said Footen, who is also a Fellow of the Society of Motion Picture and Television Engineers (SMPTE).
Even with digital replicas, key attributes of an actor's performance may not be so easily replaced by AI. "Adlibbing, gestures, facial expressions, and tones of voice are part of what the actor brings as an artist to the productions," Nemschoff added.
Like digital replicas, synthetic performers raise concerns over consent. Synthetic performers are not digital replicas; instead, they are created by generative AI using the names, voices, or likenesses of multiple people amalgamated into one person. For example, AI could take Jason Momoa's hair, Pedro Pascal's eyes, and Julia Roberts' smile to create a digitally produced person in which neither the whole nor the individual parts are recognizable as the people whose traits were used.
The question of whether this counts as stealing from actors led to an agreement that the producer of a synthetic performer must give notice to the union and negotiate with it for appropriate payments.
A-listers and character actors paid significantly more than the negotiated minimum are exempt from these additional compensation provisions. However, the agreement states that these performers can individually bargain for more payment if a film uses digital replicas of them.
“The assumption is that they have agents, lawyers, managers, all of whom are able to work with the actor to negotiate their own deals,” Nemschoff said.
Recognizing that the landscape of AI continues to evolve and requires ongoing reevaluation, the contracts also mentioned that producers agreed to meet with the union to continue discussions on this topic throughout the term of the contract.
Voice actors: Use of AI for audio
Voice actors have taken strides to use digital replicas in an ethical manner. At CES 2024, SAG-AFTRA and Replica Studios, an AI voice technology company, announced a groundbreaking AI voice agreement. Under the agreement, voiceover actors can, with fully informed consent and fair compensation, license their voices for interactive media projects such as video games, from pre-production through final release, according to SAG-AFTRA.
The update also noted that, under the minimum terms and conditions, voice actors can withdraw from continued use of their digital voice replicas in new projects. Rather than allowing AI to be trained on data collected without a voice actor's permission, the agreement ensures that all work is licensed and that the actor consents to the use of their voice.
In May, legislators in Albany, New York, joined SAG-AFTRA and representatives from other entertainment industry labor unions to support three bills demanding guardrails around AI use. The bills include requirements that advertisements disclose the use of synthetic media, protections against job displacement, and requirements for informed consent and proper legal representation before a digital replica of a voice or likeness is licensed in place of physical work.
In June, SAG-AFTRA announced to members the new Dynamic AI Audio Commercials Waiver, which gives performers a new employment opportunity "to create highly personalized, audio-only digital ads" under terms with AI protections. Those terms include informed consent for creating a digital voice replica and additional consent for using that replica in any ad.
The waiver also mentions that performers must give prior written consent for the use of a digital voice replica under the agreement’s terms. Furthermore, when the employment relationship ends, producers must delete all copies of the actor’s voice that were used in the ad and any materials that helped with the creation of the digital voice replica.
The agreements, bills, and waiver represent another step toward the consensual and ethical use of digital replicas. They give actors an opportunity to explore the possibilities of AI with their voices while receiving fair compensation for their work.
Still, actors, among other creative artists, are fighting against the use of their voices without consent. Scarlett Johansson threatened legal action against OpenAI after it rolled out a demo of Sky, a ChatGPT voice she alleged mimicked her own. She said OpenAI asked her at least twice to license her voice for its newest AI system, but she refused. On the day of the release, CEO Sam Altman tweeted "her," referencing the movie "Her," in which Johansson played an AI voice assistant. OpenAI had hired a voice actor before it reached out to Johansson, according to The Washington Post.
“This was not a smart move on Sam Altman’s part,” Nemschoff said. “Soundalikes (or voice replications) have been considered violations of California law since 1992 when Bette Midler won her lawsuit against Ford Motor Company for hiring someone to sound like her in recordings for a series of Ford commercials.”
Since Johansson's threat of legal action, OpenAI has paused Sky, and the company is once again embroiled in legal issues. Loengard said Johansson could bring a right of publicity claim, which prevents unauthorized commercial uses of an individual's name, image, or likeness associated with their identity.
Johansson's "right of publicity claim would rest on whether it is her voice that was used or the 'style' of her voice," Loengard said. The right of publicity is a matter of state law, and Johansson's actual voice, though not the style of her voice, may be protected in many states.
Musicians: AI in the music industry
The music industry is also an area in which AI can potentially exploit the work of musical artists without their consent, especially on platforms like TikTok.
Tensions between Universal Music Group and TikTok over AI-generated recordings and fair royalty payments to original artists boiled over earlier this year. The companies' existing licensing contract was set to expire on Jan. 31, 2024, and they were unable to reach an agreement on artist compensation. The dispute led TikTok to remove music from UMG artists, muting existing clips and making the songs unavailable for new ones.
But on May 1, 2024, the companies reached a new licensing agreement stating that "TikTok and UMG will work together to ensure AI development across the music industry will protect human artistry," as well as compensation for artists and songwriters. About a week later, TikTok announced it was partnering with the Coalition for Content Provenance and Authenticity (C2PA) as the first video-sharing platform to use Content Credentials technology. As a result, content made with AI tools on certain other platforms will now also be labeled as AI-generated on TikTok.
Footen said the C2PA offers a “technical solution” for music artists who may fall victim to unauthorized AI copies of songs on TikTok or other platforms.
Content Credentials has rolled out for images and videos and will soon be implemented for audio-only content, according to TikTok. In the coming months, the platform will also attach Content Credentials to content made on TikTok itself, so anyone can see which posts were made with AI, according to a May 9, 2024 update.
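For readers curious how such a label works under the hood, Content Credentials attaches a cryptographically signed C2PA manifest to a piece of media, and AI involvement is declared inside that manifest. The sketch below is illustrative only, not a description of TikTok's actual pipeline: it assumes a manifest has already been extracted into a plain Python dictionary (real applications use a C2PA SDK for extraction and signature verification), and the example values are hypothetical, though the assertion label and the trained-algorithmic-media source type come from the public C2PA and IPTC vocabularies.

# Illustrative sketch: checking an already-extracted C2PA manifest for an AI flag.
# The manifest structure and values below are hypothetical examples.

AI_SOURCE_TYPE = (
    "http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"
)

def is_ai_generated(manifest: dict) -> bool:
    # Scan every "c2pa.actions" assertion and report whether any recorded
    # action declares a trained-algorithmic-media (AI) digital source type.
    for assertion in manifest.get("assertions", []):
        if assertion.get("label") != "c2pa.actions":
            continue
        for action in assertion.get("data", {}).get("actions", []):
            if action.get("digitalSourceType") == AI_SOURCE_TYPE:
                return True
    return False

# Hypothetical, already-extracted manifest for an AI-generated clip.
example_manifest = {
    "claim_generator": "example-ai-video-tool/1.0",
    "assertions": [
        {
            "label": "c2pa.actions",
            "data": {
                "actions": [
                    {"action": "c2pa.created", "digitalSourceType": AI_SOURCE_TYPE}
                ]
            },
        }
    ],
}

print(is_ai_generated(example_manifest))  # prints: True

In practice, a platform applying a label this way would also verify the manifest's signature chain before trusting it; the point is simply that the "made with AI" flag travels with the file as structured metadata rather than as a guess made after the fact.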
SAG-AFTRA is also seeking AI protections from record labels. In mid-April, the union said it had reached a tentative agreement with companies including Warner Music Group, Sony Music Entertainment Group, Universal Music Group, and Disney Music Group, covering the term from Jan. 1, 2021, to Dec. 31, 2026. SAG-AFTRA approved the contract at the end of April, ratifying the 2024 Sound Recordings Code.
Guardrails include that “the terms ‘artist,’ ‘singer,’ and ‘royalty artist’ under this agreement only include humans; and clear and conspicuous consent, along with minimum compensation requirements and specific details of intended use, are required prior to the release of a sound recording that uses a digital replication of an artist’s voice.”
Even as legal parameters are established to protect musicians and their work from AI, some performers are exploring ways of using AI to their advantage. Musician FKA twigs finds the technology can help her focus on her music while she lets a digital replica handle the less creative aspects of her work. On April 30, the British singer and songwriter — while testifying before the U.S. Senate’s Judiciary Subcommittee on Intellectual Property — announced that she was developing a deepfake version of herself.
“I will be engaging my AI twigs later this year to extend my reach and handle my online social media interactions, whilst I continue to focus on my art form from the comfort and solace of my studio,” FKA twigs told the committee. “These and similar emerging technologies are highly valuable tools both artistically and commercially when under the control of the artist.”
Whether more music artists will follow suit is unclear. That said, Tennessee has become the first state to adopt the ELVIS Act (the Ensuring Likeness Voice and Image Security Act), which goes into effect July 1, 2024. The act "builds upon existing state rule protecting against unauthorized use of someone's likeness by adding 'voice' to the realm it protects," the Office of Governor Bill Lee said in a statement.
It is the “first-of-its-kind legislation” to protect performers from unauthorized copies of their voice and likeness and offers another layer of security from fraudulent content.
Other proposed AI legislation
Federal protection for creatives in the entertainment industry could be the next step if either of two bills passes Congress.
The first is the bipartisan Nurture Originals, Foster Art, and Keep Entertainment Safe Act, or NO FAKES Act, which "would protect the voice and visual likeness of all individuals from unauthorized recreations from generative artificial intelligence." It was introduced in the Senate in October 2023, as AI tools were taking off, and has gained many supporters in the entertainment industry. Its protection would last for 70 years after the death of the individual.
Second, in January this year, the No Artificial Intelligence Fake Replicas and Unauthorized Duplications Act, or No AI FRAUD Act, was introduced in the House of Representatives and would "protect Americans' individual right to their likeness and voice against AI-generated fakes and forgeries." The No AI FRAUD Act focuses on broad federal protection of the right of publicity and would also target anyone who disseminates AI-generated content. Its protection would last for 10 years after the death of the individual.
If either act passes, it could give the entertainment industry a level of federal protection that is currently lacking. It remains to be seen, however, whether the acts can balance protection against the problems of overly broad language; the No AI FRAUD Act, with its wider reach, appears more likely to face challenges.
What is AI’s next role in the entertainment industry?
In the future, these agreements and proposals will likely keep adapting to technological advances, a kind of adjustment that workers have lived through many times before and are experiencing again now.
A January study cited in The Hollywood Reporter surveyed 300 entertainment industry leaders, and found that “three-fourths of respondents indicated that AI tools supported the elimination, reduction, or consolidation of jobs at their companies. Over the next three years, it is estimated that nearly 204,000 positions will be adversely affected.”
However, even as AI drives potential job cuts in the entertainment industry, new positions centered on the ethical use of AI tools are likely to appear. Demand for AI ethicists is already climbing in the job market, and the use of AI in entertainment will still require humans to lay the groundwork for decisions.
“Just like how the switch from film strips to digital formats rendered assistant editor’s traditional work unnecessary, it birthed a host of new roles in post-production, from digital FX wizards to animation virtuosos. It’s not about losing jobs; it’s about upgrading the show,” Footen told ZDNET.
As much as AI tools could revolutionize the entertainment industry, lawmakers must consider the ethical implications of this technology. The creative arts have always prided themselves on being rooted in humanity. If AI takes the driver's seat, human expression risks being diminished to an artificial art form that lacks connection with the audience. Nevertheless, while tech giants continue to tout these emerging technologies, the human voices of creative talent are rising above the noise of AI.