โŒ

Normal view

From Taylor Swift to Bollywood, stars turn to the civil courts to fight deepfakes

Music superstar Taylor Swift has applied to trademark her voice and image to head off the threat of AI-generated impersonations. But the problem extends much further than pop royalty.

Anyone can be manipulated by the powerful technology: AI-created videos of you endorsing a politician you despise, images on social media of you in a skin-tight Spiderwoman outfit you never wore, a simulation of your voice allowing users to indulge their sexual fantasies … all possible.

The rapid development of deepfakes is amplifying calls for better legal protections for individualsโ€™ images and likenesses. The notorious rollout of new picture-editing capabilities by Xโ€™s Grok chatbot in late 2025 only added to their urgency.

And the law has begun to respond. Australia now criminalises creating and sharing sexually explicit material online without consent, including digitally created material.

In the US, the 2025 Take It Down Act prohibits non-consensual publication of intimate depictions of individuals, including “digital forgeries”.

In New Zealand, proposed amendments to the Crimes Act and the Harmful Digital Communications Act will improve criminal law responses to sexual deepfakes.

But another legal front is opening up, too: victims are turning to tort law. Part of the civil (rather than criminal) law, tort claims do not require the state to act. People can seek damages and injunctions to shut down or block access to the harmful and humiliating material.

Misappropriation of personality

Some countries, including Canada, South Africa and India, recognise a common law tort of misappropriation of personality.

This targets unauthorised use of a personโ€™s name, likeness and voice, usually for commercial purposes. About half of the states in the US recognise some version of this tort.

Now, the Indian courts are taking the lead in extending the tort to include deepfakes.

Bollywood stars Aishwarya Rai Bachchan and Anil Kapoor have used tort law to shut down websites and other online platforms where deepfakes have been posted – including fake pornographic videos and chatbots.

Elsewhere, including in New Zealand, the United Kingdom and Australia, the law is much more piecemeal because the common law does not recognise a specific tort of misappropriation of personality.

This means protections need to be cobbled together from more established legal claims, including defamation, breach of confidence and “passing off”.

A court battle is currently raging in the UK over whether the digitally assisted resurrection of Peter Cushing in the 2016 Star Wars movie Rogue One is a form of “unjust enrichment”. (Cushing starred in a previous Star Wars episode but died in 1994.)

Anil Kapoor and Aishwarya Rai Bachchan at a screening of their 2018 film Fanney Khan in Mumbai. Azhar Khan/SOPA Images/LightRocket via Getty Images

The right to live with dignity

In the Bollywood cases, the courts explained that deepfakes affect victims’ “right to live with dignity”. The judges linked these tort principles to constitutional protections for “life and liberty”.

Canadian judges have said similar things, linking protections for individuals’ personality to rights in the Canadian Charter of Rights and Freedoms.

Human dignity – essentially the right not to be a means to others’ ends – is at the core of these protections, and it recognises the inherent worth of all people. Deepfakes cut right across these fundamental legal commitments.

In the case of Anil Kapoor, the court acknowledged additional harms beyond those suffered by Kapoor himself. The legal protections were also for “the sake of his family and friends who would not like to see his image, name and other elements being misused, especially for such tarnishing and negative use”.

This recognises an emerging legal concern with connections between people, not only with the rights of individuals. It also aligns with the increasing role of Māori tikanga (law and custom) in New Zealand’s common law.

Another welcome development in the United States is proposed legislation that would enable non-celebrities, not just the rich and famous, to bring damages claims and seek injunctions against deepfakes.

A bill introduced to Congress in April would extend protections to US citizensโ€™ โ€œDNA sequences or traitsโ€ that could be used to replicate or misuse identity in commercial applications.

Protecting victims of deepfakes will require an array of legal responses: criminal, civil, technological and regulatory – including trademark law, as Taylor Swift’s application shows.

Unfortunately, few of us have the financial means to bring a torts claim. Even so, the emphasis on human dignity in the Bollywood cases reminds us of what’s at stake: the inherent worth of all people – celebrities and non-celebrities alike.

The Conversation

Graeme Austin does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.