Throughout his acting career spanning more than six decades, James Earl Jones’ voice became an indelible part of his work as a performer.
On screen, Jones, who died Monday at 93, brought to life a reclusive writer who was thrust back into the spotlight in “Field of Dreams” and a haughty king of a fictional country in “Coming to America.” On stage, he won two Tony Awards for “The Great White Hope” and “Fences.” His work as a voice actor — the regal dignity of his portrayal of Mufasa in “The Lion King” and the brooding, deep tones he brought to Darth Vader in “Star Wars” — helped cement his place as a legendary actor among generations of fans.
But since his death, one aspect of Jones’ career has drawn renewed attention: his decision to allow artificial intelligence to replicate his performance as Darth Vader after he stepped down from the role. Skywalker Sound and Ukrainian company Respeecher used AI to recreate Jones’ villain for the 2022 show “Obi-Wan Kenobi” on Disney+. Mark Hamill’s voice was also “de-aged” with the help of Respeecher for his appearance as Luke Skywalker in “The Mandalorian.”
Voice actors say they fear AI could reduce or eliminate jobs because the technology could be used to replicate one performance in a number of other projects without their consent, a concern that prompted video game performers represented by the Screen Actors Guild-American Federation of Television and Radio Artists to go on strike at the end of July.
Hollywood video game artists have announced a work stoppage — their second in a decade — after more than 18 months of negotiations over a new interactive media deal with gaming industry giants collapsed over protections for artificial intelligence. Union members have said they are not opposed to AI. But artists worry the technology could provide a way for studios to squeeze them out.
Concerns about how film studios will use AI sparked four months of union strikes in the film and television industry last year.
For some, Jones’ decision to have AI replicate his voice raises questions about the future of voice acting as an art form. But it also potentially lays the groundwork for transparent AI agreements that fairly compensate actors for uses of their performances to which they have consented. Zeke Alton, a voice actor and member of SAG-AFTRA’s Interactive Media Agreement Negotiations Committee, said it’s “amazing” that Jones was involved in the process of having his voice replicated.
“If the game companies, the movie companies, would give every actor the same level of consent and compensation transparency that James Earl Jones did, we wouldn’t be striking,” Alton said. “It shows they can do it. They just don’t want to do it for people who they don’t feel have the power to negotiate for themselves.”
Jones, who overcame a childhood stutter, has said in previous interviews that he was “glad that I could talk at all, because there were times when I couldn’t.” He said his goal was for his voice to be clear. Speaking to The Associated Press in 1994, he said he wanted to make Darth Vader “more human and more interesting.” But George Lucas, the filmmaker who created “Star Wars,” advised him to “go back to a very narrow band of expression” because the villain’s mechanical body parts would make it difficult for him to sound more human.
Neither Skywalker Sound nor Respeecher responded to a request for comment. But a sound editor at Skywalker Sound told Vanity Fair that Jones agreed to the use of archival recordings to keep Darth Vader alive and that he oversaw Darth Vader’s performance for the Disney+ show as “a benevolent godfather.”
Voice actor Brock Powell said the ability to use the voice of an actor like Jones in perpetuity could eliminate the need for actors who specialize in voice matching. That kind of work provides steady employment for many artists, he said, who can impersonate a famous voice for video games, animated series and other types of media.
“To quote ‘Jurassic Park,’ the scientists were so busy asking whether they could do it, they didn’t even take the time to ask whether we should do it,” Powell said.
Powell said such AI use could also reduce “ingenuity” in acting, as new actors may never get the chance to breathe new life into a role.
Crispin Freeman, an actor whose voice-matching work included imitating Orlando Bloom in “Pirates of the Caribbean,” said the technology may make voice matching obsolete, but it doesn’t hurt “the ability of future performers to forge their own path” in new roles.
“We should always be creating new stories as we move forward, and not just rely on the old stuff,” he said. “Instead of worrying, ‘Oh, someone else can be Darth Vader,’ why don’t we create a new ‘Star Wars’ character that’s just as compelling as Darth Vader?”
Jones’ contract could be an example of properly negotiating with an actor about their likeness, said Sarah Elmaleh, chair of SAG-AFTRA’s interactive negotiating committee. Elmaleh, a voice actor, said there’s a chance these tools could be used in “meaningful, smart artistic decisions.”
“I worry about a world where we conflate the superficial qualities of someone’s voice with their performance,” she said. “I can’t help but step away from the metaphor that’s baked into the character itself, which is that when you conflate the human with the machine, you become an instrument for other forces, other powers that be.”
Alton, the voice actor, wondered what it would mean if Jones’ voice as Darth Vader was used for another 100 years and people didn’t remember “all the different things that made him the iconic character that he was.”
“It’s just a disembodied voice at that point. It’s part of the neutralization of art that generative AI can do, and it’s kind of a heavy topic, but it’s really important for us as a world to consider what we want our entertainment and our art to be in the future,” he said. “Do we want it to be human, or do we want it to be boring?”