
Well-known Indian celebrities targeted by deepfake porn

Why are Bollywood actresses and well-known Indian celebrities targeted by deepfake porn makers?

One Bollywood star can be seen making obscene gestures towards the camera while the other is seen posing in skimpy clothing.

It’s just that none of this actually happened.

These are the latest in a series of deepfake videos that have gone viral in recent weeks.

Actresses Rashmika Mandana, Priyanka Chopra Jonas and Alia Bhatt are among the Bollywood stars who have been targeted by such videos, in which their faces or voices were replaced with someone else's.

So what’s behind the rise of Bollywood deepfakes?
Deepfakes have been around for a long time, and celebrities have long been among their targets.

Artificial intelligence (AI) expert Aarti Samani told the BBC that Hollywood has suffered from the same problem, with high-profile victims including actresses such as Natalie Portman and Emma Watson.

But she said recent advances in artificial intelligence have made it even easier to fake audio and video of people.

Ms Samani said: ‘Over the last six months to a year these tools have become much more sophisticated, which explains why we are now seeing this type of content in other countries.

‘There are many tools now available, which allow you to create realistic synthetic images at little or no cost, and access is common.’

India also has some unique factors driving such content, Ms Samani said, including a large youth population, heavy use of social media, and an “obsession with Bollywood and celebrity culture”.

She added that ‘as a result, video clips spread rapidly, exacerbating the problem’, and that the motivation for making such videos is twofold.

She said: ‘Bollywood celebrity content makes for attractive clickbait, which generates huge advertising revenue. There is also the possibility of unknown people selling the data collected through this content.’

‘Very scary’
Faked images are most often used to make porn videos, but deepfakes can be created from almost anything.

Recently, a video of the 27-year-old actress Mandana appeared on Instagram in which her face had been placed on the body of another woman wearing a black dress.

The video went viral on social media, but a journalist from the fact-checking platform Alt News reported that the video was a deepfake.

Rashmika Mandana described the incident as ‘very scary’ and appealed to people not to share such content.

A video of megastar Priyanka Chopra Jonas also went viral recently. In her case, instead of her face being changed, her voice was replaced with audio promoting a brand and presenting investment ideas.

Actress Alia Bhatt has also been targeted by such videos. One uses her face on footage of a woman making various obscene gestures.

Apart from her, actress Katrina Kaif and other stars have also been targeted. In Kaif’s case, an image from the film ‘Tiger 3’ in which she is seen in a towel was altered so that the towel was replaced with a different outfit revealing more of her body.

It is not only Bollywood actresses who have been affected by AI. Others have also been targeted recently, among them the well-known Indian industrialist Ratan Tata, of whom a deepfake video was made giving investment advice.

Women on target
But the technology appears to be used overwhelmingly to target women.

Research firm Sensity AI estimates that between 90 percent and 95 percent of all deepfakes are pornographic, and that the vast majority of them target women.

Ivana Bartoletti, global chief privacy officer at Indian technology services and consulting company Wipro, said: ‘I’m starting to fear it.’

She added: ‘For women this is particularly problematic because this medium can be used to create images of pornography and violence, and, as we all know, there is a market for that.

‘It has always been a problem, but it is the rapid proliferation and availability of these tools that makes the situation so striking now.’

Ms Samani agreed, saying the deepfake problem is “definitely worse for women.”

“Women are often judged on beauty standards, and women’s bodies are presented as products,” she said.

‘Deepfakes take this even further. The non-consensual nature of deepfakes robs women of a dignified portrayal of their bodies and of their autonomy. It puts power in the hands of criminals.’

Call for action
As deepfake videos continue to spread, governments and tech companies are being asked to crack down on such content.

For its part, the Indian government is cracking down on deepfakes heading into a general election year.

After Mandana’s video went viral, the country’s IT minister Rajeev Chandrasekhar spoke out against deepfakes, saying: ‘These are the latest and even more dangerous and harmful form of disinformation, and need to be dealt with by the platforms on which they are spread.’

Under India’s IT laws, social media platforms have to ensure that ‘no false information is posted by any user.’

Platforms that do not comply can be taken to court under Indian law.

But Ms Bartoletti said the problem is much broader, and countries around the world are now focused on tackling it.

She said: ‘It’s not just Bollywood actors. Deepfakes are also targeting politicians, business people and others. Many governments around the world have begun to express concern about the impact of deepfakes on democratic processes, including elections, among other things.’

She said social media platforms need to be held accountable and must proactively identify and remove deepfakes.

Ms Samani said the support of men also plays a ‘very important role’ in tackling the problem.

She said: ‘Victims are rightly raising concerns and demanding action, but very few men are speaking out on the issue.’

‘There needs to be more support from men.’
