How Taylor Swift can explain AI's risk to the music industry

Amy Martin
February 20 2024 - 5:30am
Will AI ever write a song as good as Taylor Swift does? And is that even the biggest threat? Picture Getty Images, TAS Management

There was a time when the topic of Taylor Swift and AI would centre around whether the technology would ever write a song good enough to be on one of her albums.

And then, just a few weeks ago, pornographic deepfakes of the superstar, who is currently in Australia for The Eras Tour, found their way onto Twitter.

It's not the first time AI-generated images like these have been created - and circulated - of a celebrity or politician. But the singer's position as one of the most talked-about people in the world saw the issue gain worldwide attention.


Swift's relationship with her fans also meant they jumped into action, many collectively reporting the deepfake images to Twitter while posting other images and footage of Swift under #protectTaylorSwift to drown them out on the platform.

Still, it took an almost literal army to achieve a less-than-quick result. One image, according to The New York Times, had been viewed 47 million times before the account was suspended.

While it's not a situation you would wish on anyone, Professor Lisa Given, Social Change Enabling Impact Platform director and professor of information sciences at RMIT University, says Swift's involvement does put a spotlight on a largely unmonitored issue. Not just that it happens, but how hard it is to remove these photos once they've been posted.

But, she adds, this affects more than just the person whose image is being used. It has a knock-on effect on society as a whole.


"One of the critical things is even when it only happens in isolated cases, or a small handful of cases, it's really about the impact that it has by confusing people because they start to question, is this real?" she says.

"So it's that deep reach of instilling doubt in people and fear around what these technologies can do. But for those individuals, it's a form of abuse. Particularly women are often the targets of these kinds of deepfake campaigns.

"And it's something that the Taylor Swift situation is heightening - it brought it to the world's attention in a very, very clear way. And in a way that's a positive because it means that more people become very familiar with what is possible, but hopefully, it will also kind of push legislators to start to clamp down on how these technologies get used."

So who is affected by deepfakes? Primarily, women. According to a Sensity AI report, 99 per cent of deepfakes feature women and 96 per cent are sexually explicit.

Sadly, women being objectified in images is not a new phenomenon. It's simply the technology that's new.


"Women have been facing image-based abuse for probably centuries - since the creation of cameras. And it used to be that perhaps this was more challenging to do or you had to have technical skill, you had to have certain tools to create these kinds of images and circulate them," Given says.

"What AI does is it puts those tools into more people's hands, it's easy to use, it's becoming more ubiquitous.

"The motivation is often around humiliation, control, abuse, and trying to ruin the reputation of a person. And so even if the photographs are fake and we know that they're all fake, there can still be a lot of shame that the victim feels to be put in that situation."

So does this mean that for Taylor Swift - and others like her - AI poses more of a threat to their image than their music?

In some ways, yes. Given says platforms such as ChatGPT can imitate the work of a songwriter (or any type of writer) when instructed to.

It takes patterns that writers naturally have - a tendency to rhyme or to create certain imagery, for example - and uses them to create a seemingly new song.


"I think that for most people ... it doesn't feel quite right because any kind of artist is also creating something new," Given says.

"They're building on their life experience. They're drawing on new ideas, new concepts they haven't put in front of the audience before.

"And that's the limitation for AI because what it's doing is kind of looking for existing patterns and replicating those patterns."

But also, music is not only created by the Taylor Swifts of the world.

Where it may become a problem, Given says, is the music written for things such as video games or generic music for public spaces. Elevator music, so to speak.

"There's a lot of areas where music is part of our everyday life and we almost don't even notice it, and some of that music may be quite generic, may not even have lyrics, it might just be kind of background music," she says.

"And in the past, those have been created by paid artists. That's something where AI technology may be very adept at actually creating something that serves a purpose."

Amy Martin

Canberra Times lifestyle reporter

As the lifestyle reporter, I love finding out what makes people tick and giving insight into the different ways that you can enjoy the city we live in. Email: amy.martin@canberratimes.com.au