The human voice, Gen-AI and Australian law

11.08.2025 | Jaimie Wolbers and John Fairbairn

Are the qualities of an individual's voice protected under Australian law? Although yet to be tested, well-known individuals may be able to take action.


Key takeouts


  • Choosing to use Gen-AI to imitate the voice of a living person for a commercial purpose, without their consent, is a risky proposition.
  • Actions under the Australian Consumer Law and/or in passing off may be available to individuals whose voices are appropriated by third parties, but only where they have established a reputation in the sound of their voice.
  • Singers, actors and broadcasters should closely review any agreements they are entering to ensure they are comfortable with any proposals to use their voice, including to train Generative AI models.

Are the qualities of an individual's voice protected under Australian law?

In June 1988, the Ninth Circuit of the US Court of Appeals found in the case of Midler v Ford Motor Co. (1988) 849 F.2d 460 that:

"… when a distinctive voice of a professional singer is widely known and is deliberately imitated in order to sell a product, the sellers have appropriated what is not theirs and have committed a tort in California."

The case was remanded for trial, where a jury found that an advertising agency (on behalf of its client, the Ford Motor Company) had deliberately imitated Bette Midler's voice in producing a television commercial to promote the Ford Lincoln Mercury. As part of the campaign, the advertising agency had hired a singer to record a "sound-alike" version of Ms Midler's recording of "Do You Want to Dance?" (a song which appeared on her 1973 album, "The Divine Miss M"). There was evidence that the advertising agency had originally approached Ms Midler's management to license the original recording. There was also evidence from the singer, Ms Ula Hedwig (who had previously worked as a back-up singer to Ms Midler), that she had been instructed to:

"…sound as much like Midler as possible while making the recording."

The jury awarded Ms Midler $400,000 in damages.

The rise of Generative AI

In 2025, Generative AI is in the spotlight. The rapid proliferation and uptake of the technology presents many risks and opportunities for individuals and industries, while policymakers grapple with how to regulate the use of the technology, including within existing legal frameworks.
Some Generative AI programs, such as Google's MusicLM, can create audio content. Others, such as AI vocal emulators, can transform someone's singing voice into that of a well-known artist. Where that content mimics or imitates the voice of an individual, is there a basis for that individual in Australia to take action to restrain the use of the content, or to seek compensation for any damage they may suffer?

As noted by the Ninth Circuit in the Midler v Ford case, in comparing Ms Midler's claim to a copyright claim:

"What is put forward as protectible here is more personal than any work of authorship."

It is not clear how far the Californian tort extends – would an individual who is not a professional singer, but who otherwise has a recognisable and distinctive voice have a cause of action if their voice was used without their permission?

The Australian legal position

In Australia, whether copyright law would prevent the emulation will depend heavily on whether and how a recording of a well-known artist's voice was used. If a pre-existing sound recording of an artist's voice is used, then the owner of the copyright in that recording may be able to restrain the use. An example of this was Universal Music compelling the takedown of a song that purported to simulate Drake and The Weeknd. However, putting to one side the training of the AI, if the AI-generated song does not reproduce a pre-existing sound recording but instead generates a new voice or vocal that merely sounds like a well-known artist, copyright law is unlikely to provide a way to stop the sound-alike or impersonation.

The most direct equivalent to the tort identified in the Midler proceedings is the tort of passing off or an action pursuant to the provisions of the Australian Consumer Law (Schedule 2, Competition and Consumer Act 2010 (Cth)), which prohibit engaging in misleading or deceptive conduct (section 18) or making false or misleading representations (section 29).

The action of passing off is founded on the defendant's interference with the plaintiff's reputation or goodwill. An action in passing off is typically pursued where elements of a business' branding, including unregistered trade marks or get-up are adopted by a third party. In order to establish the tort of passing off, it is necessary to establish the "classical trinity" of elements, namely:

  1. Reputation – the applicant must be able to prove that they have a reputation in the relevant brand elements in the relevant market;
  2. Misrepresentation – the respondent must have used the relevant brand elements in a manner that gives rise to a misrepresentation; and
  3. Injury or damage to goodwill – the applicant must have suffered (or be likely to suffer) damage to the goodwill in their identity or brand because of the respondent's conduct.

However, an applicant seeking to rely on the tort of passing off in the context of the use of an imitated version of their voice (whether by another human or by AI) is likely to face two difficulties unless the performer is well-known in Australia: the first will be establishing that they have a reputation in their voice, and the second will be establishing damage to their goodwill. If the individual is not a person with a distinctive voice, known for the use and commercialisation of their voice (e.g. a singer, actor, podcaster or keynote speaker), they are unlikely to satisfy either of these elements.

The issue of "sound-alike" recordings has previously been considered by Australian Courts. In the case of CBS Records Australia Ltd and Others v Telmak Teleproducts (AUST) Pty Ltd (1987) 8 IPR 473, the Federal Court recognised in an application for an interlocutory injunction that there was at least a "serious question to be tried" under what was then section 52 of the Trade Practices Act 1974 (Cth) (equivalent to what is now section 18 of the Australian Consumer Law), as to whether the promotion and sale of a compilation album consisting of sound-alike recordings constituted misleading or deceptive conduct. In this case, the respondent had produced a compilation album with the title "Chart Sounds 16 Hit Songs No 1". It consisted entirely of "sound-alike" recordings, and the applicant alleged that the manner in which the album was packaged and promoted misrepresented that it embodied performances by the original recording artists. (The matter ultimately settled, so there is no final judgment on the issue.)

In short, the position in Australia will depend on the extent to which the performer who is emulated without their permission is known in Australia, and on whether the emulation is presented in a way that consumers would consider it to be a recording of the performer rather than an imitation. This will be fact dependent.

Fraud and scams

While the Midler and CBS Records cases reflect classic commercial circumstances where a sound-alike recording of popular music may be produced and published, the circumstances in which a human voice may be imitated are much broader.

Australian philanthropist and mining magnate Andrew Forrest is currently pursuing proceedings in the United States against Meta Platforms in connection with scam activity published on Facebook that involved the use of his name and likeness to lure victims into investment and cryptocurrency scams. The case raises the question of what role platforms such as Facebook should play in protecting an individual's identity, including their voice.

The Sydney Morning Herald recently reported that recording artists are encountering issues with AI-generated music that imitates the artist being uploaded and connected to their profiles on platforms such as Spotify, YouTube, Tidal, Apple Music and Deezer. The article alleges that there are insufficient controls on these platforms to prevent bad actors from linking content to an artist's profile, as there is no mechanism for the artist (or their management) to approve what is or is not linked to the profile – it can simply be uploaded via a digital distributor without checks or verification.

The question remains open as to what role platforms such as Meta and Spotify should play in preventing such fraud and scam activity, and whether specific regulation ought to be enacted to give bodies such as the eSafety Commissioner the power to order the removal of material that imitates an individual's voice, particularly where the individual is not a famous singer, actor or broadcaster.

Commercial considerations

Electing to use Generative AI to imitate the voice of a living person for a commercial purpose in Australia, without that person's consent, is clearly a risky proposition.

At the same time, an overly restrictive approach could stifle creativity and innovation where emulations are used to create new works in a way that is not being passed off as involving the original artists or copying original recordings.

If you are intending to use AI generated sound recordings you should consider:

  • How was the model trained, and have appropriate licences been obtained for the sound recordings used to train it? (We have previously discussed the copyright implications of Gen-AI in further detail.)
  • If the output imitates the distinctive voice of a living person, has that person consented (or been approached for consent) to the creation and proposed use of the output? (In mid-2024, the Screen Actors Guild and the American Federation of Television and Radio Artists (SAG-AFTRA) announced a deal with a third party under which actors can license their digital voice for use in AI-generated digital advertising.)
  • Does the audience need to be told that the content you are publishing has been generated by AI? Would a disclaimer to this effect be sufficient, or would something further be required?


Our leading intellectual property team is ready to assist you with balancing the commercial risk associated with the use of Gen-AI in your business.

https://www.minterellison.com/articles/are-voices-protected-under-australian-law
