
THE AI REPORT

Don't believe everything you hear about AI

AI tech that allows you to use the voices of David Attenborough, Morgan Freeman or Anthony Bourdain already exists, but what are the morals behind its use and is it even legal?

Sir David Attenborough voiced his AI concerns in 2023

You’re doom-scrolling a social media app when a video catches your eye. Or rather, a voice catches your ear. It is, unmistakably, Sir David Attenborough, but he’s not saying Sir David Attenborough things. Instead, he is extolling the virtues of drilling in the Arctic and burning what we find there, heating the planet so wine grapes can grow in Inverness.

The artificial intelligence (AI) technology to achieve this deepfake already exists, and it’s getting more sophisticated all the time. In 2023, Attenborough told Business Insider: “The fact I find this personally distressing may count for nothing in the minds of people who freely share the ability to create false versions of me regardless of my feelings. But it is of the greatest concern to me that one day, and that day may now be very close, someone is going to use AI to deceive others into believing that I am saying things contrary to my beliefs or that misrepresent the wider concerns I have spent a lifetime trying to explain and promote.”

The factual sector has already grappled with controversy in this area. Morgan Neville, director of the 2021 doc Roadrunner: A Film About Anthony Bourdain, revealed he fed 10 hours of the late chef’s voice into AI to create 45 seconds of voiceover for the film following his death by suicide in 2018. It sparked outrage among critics, fans and family members. Neville rather flippantly told the New Yorker: “We can have a documentary ethics panel about it later.”

The time for that panel is probably now. Perhaps in response to Scarlett Johansson lawyering up earlier this year after OpenAI released a chatbot with a voice eerily similar to the one she provided for the 2013 film Her, Facebook-owner Meta recently made sure it licensed the voices of Awkwafina, John Cena, Dame Judi Dench and many more stars for its AI chatbots.

“It’s a good thing to shine a spotlight on because the short answer to who owns David Attenborough’s voice is not Sir David, unfortunately,” says digital consultant and AI specialist Dan Taylor-Watt. “The current legal provisions in the UK don’t have something that directly addresses voice likeness and there is a need for new legislation like we’re seeing in the US.

“Last year there was a UK indie band that put out an album of songs in the style of Oasis, with an AI-generated Liam Gallagher ‘singing’ the lyrics under the name AISIS. Fortunately, Liam was really positive about it and they were using original songs written by themselves, so there was no underlying copyright around the track. It was just a clone of the voice and essentially there isn’t any real legal recompense for that at the moment.”

Content creators are effectively being relied on not to do such things because they are wrong and could damage relationships with talent, Taylor-Watt believes, adding: “When it comes to big, established broadcast and streaming outfits they would be a bit foolish to do that without authorisation.

“The worry comes with online creators where there aren’t those same concerns about established talent relationships and less fear of reputational damage. It’s already happening if you look for David Attenborough AI videos on YouTube. They’re already out there. The genie is out of the bottle.”

The tech has been used fairly harmlessly for comedic purposes – Gareth Southgate’s ‘alternative’ press conferences at Euro 2024 did brisk business on X – but there are already big concerns.

Roadrunner: A Film About Anthony Bourdain controversially incorporated AI

AI start-up Dubformer specialises in delivering dubbing and voiceover services to the media industry and is using AI voices to do it. Voiceover artists can take their voice and suddenly speak any language in the world using the tech.

On the tech side of the business, company founder Anton Dvorkovich says: “This is not a completely new issue in AI because the voice is just one aspect of human likeness and we’ve been dealing with deepfake visuals, photographs and videos for quite some time now. This kind of tech can be used for good and it can be used for bad. Fraud is the first thing that comes to mind. They could clone your child’s voice, ring you in distress and ask for money.

“It’s a powerful tool and to defend against it everyone needs to know about these possibilities. Education is crucial. Basically, with the current state of technology, you shouldn’t believe what you hear just as you shouldn’t believe what you see.”

The tech is so far being used mainly by distributors who want to internationalise their library and influencers/content creators who want to reach a broader audience by speaking more languages.

Irina Divnogortseva, Dubformer’s head of media, adds: “First of all, we understand the responsibility. We understand we’re changing the market. We’re very transparent with our clients. We can clone the voice, that’s true, but all the rights are guaranteed by the producer and we have consent. When we do clone a voice, we guarantee it will only be used in the client’s content, nowhere else.”

A Morgan Freeman voiceover can elevate a project

Dvorkovich believes the traditional part of the content industry will recognise the intrinsic value of a celebrity voice and play by the rules. But, of course, in the wild west of social media it’s unlikely people will – and that presents a problem.

“The boundary between social media and traditional media is becoming blurred. A lot of companies that previously used to only do traditional media are exploring new platforms – mostly YouTube, but some are also starting production on TikTok. It will be interesting to see if social media becomes less and less like the wild west.”

Meanwhile, voiceover talent remains a big business. BBC Studios (BBCS), the commercial arm of the BBC, shops Attenborough franchises like Planet Earth, Blue Planet and Frozen Planet, to which his voice adds considerable value. A Morgan Freeman voiceover elevates a project’s standing substantially and he can charge accordingly.

These organisations and people are not going to want to see that value drain away due to deepfakes. But equally, the BBC, Netflix and others may want to use the tech themselves to continue reaping those financial rewards once these icons have passed on. Freeman, who made The Story of Us with Nat Geo, is 87, Attenborough is 98 and another legend of modern voiceover, James Earl Jones, took his incredible baritone with him when he died in September.

Muslim Alim

Muslim Alim, the BBC commissioning editor for daytime and entertainment, is a rare breed in that he regularly preaches the virtues of AI and encourages prodcos to dive into the tech, rather than urging caution and warning it’s going to kill the industry.

He says: “If you’re in the performance sphere you will have to think about every aspect of your likeness – your physical likeness, your visual likeness, your audio likeness and maybe even your thought process. That may sound a bit out there, but if you look at the assistants companies like Microsoft and Google are working on, if you feed into a model on a weekly basis it will get to know your personality and therefore almost act like you.

“What people will find acceptable in the future all depends on uptake. Would it be acceptable to do stuff in the manner of David Attenborough or not? It would all have to be licensable, contracted, permissions given, etc. We’ve already seen actors getting scans of their bodies done so they can appear in more than one project at a time or not have to dedicate six months to filming one thing. Will that come to fruition or will people find it weird? The audience is going to be the judge of all this.

“Fire was the very first technology – very dangerous, but also very useful. So that’s the analogy I would use; AI is no different. [Some] people will be bad actors who will use it for the wrong reasons and it will do some amazing things as well.”

In the US, legislation is moving faster than elsewhere. Tennessee’s ELVIS Act aims to protect artists’ voices, names, images and likenesses from AI misuse. In September, California’s AB 1836 was signed, requiring estate permission to use AI to recreate a deceased person’s voice or likeness.

Nationally, the proposed No Fakes Act would protect individuals from unauthorised AI recreations, while the proposed No AI Fraud Act targets cloning and impersonation. The ‘right of publicity’ also safeguards a person’s likeness, name and voice.

Benjamin Field

However, in the UK, the Online Safety Act 2023 only addresses fake pornographic content. Existing ‘passing off’ laws can be circumvented by disclaimers like those used by AISIS. Cases like The New York Times vs OpenAI may offer BBCS a route if it can prove its documentaries were used to train AI models to mimic Sir David.

It’s time to talk to a lawyer. Caitlin McGivern, a senior associate at Harbottle & Lewis, says that while there is no UK equivalent to the US ‘right of publicity’ and AI-specific legislation is still coming down the track, the country isn’t starting from scratch. “There are already a number of avenues that would potentially be open to someone to bring a claim to stop that kind of copycat voice,” McGivern says.

“David Attenborough could take action against a producer under a tort called passing off. It protects him, or anybody else, from somebody pretending he was involved in something when he wasn’t. There are requirements to bring a claim, such as proving there is goodwill in your name or voice, which for David Attenborough, as a household name, wouldn’t be too difficult to convince a court.

“There also has to be misrepresentation, so if the producer has been pretending David Attenborough was involved. But it could get tricky if the producer has been open from the beginning that it’s an AI voiceover that just sounds like him. It’s counterintuitive because it’s much cheekier, but it would protect the producer from a passing off perspective.

“If he can prove misrepresentation then he has to prove he has suffered harm or damage. If it was a controversial documentary about fossil fuels or gun rights then, again, that’s quite easy to prove. But if it’s just a poor documentary then that’s a matter of subjective taste and could be tricky.”

McGivern also believes a claim of defamation could be brought suggesting reputational damage, depending on what they got the AI to say – for instance, using Attenborough’s voice to make out he had a change of heart about burning fossil fuels and that the climate emergency was a hoax. But again, if it was made clear the doc was using AI to generate the voice or it was clearly a parody it would be harder to bring a claim. Bringing a claim under privacy laws would probably also rely on the voice being used to reveal personal details.

As for the idea of chasing down AI for scraping copyrighted Attenborough docs to teach the machine how to speak like him, McGivern points out there is also a terms of trade issue here. Attenborough passes the rights to the recordings to the production company making the show. They then pass it to Netflix if they’re on a work-for-hire basis with the streamer or keep them if they’re working with a public service broadcaster like the BBC.

“Given the volume of data involved on the input side and the lack of clarity about what’s happening in the machine, it would be very difficult to prove what was used to train it and who is best placed to bring a claim on that,” she adds.

If BP wanted to produce a ‘spoof’ Attenborough documentary about how beneficial the oil company is to the life of sea turtles, it does seem there is little recourse to the law outside the US if it was made clear to the viewer it was a computer-generated voice and not really him.

It is starting to look like an ethical question rather than a legal one, which might be an appropriate place to bring in Deep Fusion – an AI-focused production company launched by Jamie Anderson and Benjamin Field last year. It started out by generating the appearance of Jamie’s late father, Thunderbirds creator Gerry Anderson, for Gerry Anderson: A Life Uncharted and aims to make the ethical use of the technology its calling card. It has also recently received a commission from Night Train Digital to produce a podcast series Virtually Parkinson, hosted by an AI version of late UK chat show host Sir Michael Parkinson.

“There’s a could you do it, should you do it and is it legal to do it [question],” Field says. “The question of whether or not you own your own voice is new territory. I don’t know of any laws in the UK that would immediately stop you, but morally you’re immediately on the wrong side. It’s less a legal responsibility and more the responsibility of the filmmakers and the production company to do the right thing for the benefit of the industry and creatives.

“At Deep Fusion, we want to hold the ethical, moral high ground of saying we don’t do anything without full consent. The Gerry Anderson project was about recreating somebody’s face to go with some pre-existing audio. There is a very different conversation to be had when what you’re doing is making somebody say something they didn’t say.

“At no point during that documentary did we ever try to trick the audience. It was there in black and white that it was a recreation. If the audience knows that, actually, it’s not David Attenborough and you’re clear it is an AI recreation of David Attenborough’s voice, then you’re being honest about the dishonesty and moving closer to voice mimicry, akin to [impressionist] Jon Culshaw. The Anthony Bourdain example was just horrific because they took somebody’s voice, they made him say something he never said and they didn’t make that clear. The audience found out and they felt tricked.

“It’s about positioning. There is a world in which you can legally and ethically use a person’s voice, whether they’re alive or have passed, and create some new engaging content with that. But it’s about consent and how you present it to the audience.”

So can you? Essentially, yes. Should you? A deeply divisive question. Will you? Oh God yes, we’re about to be inundated.
