
Let’s talk about the Rashmika Mandanna deepfake situation and how to spot such videos

A recent deepfake video featuring the actor Rashmika Mandanna is raising questions about how AI should be regulated. Let’s explore what happened and how you can recognize such deepfake videos.


On November 5, a video appeared on the internet that seemed to show Rashmika Mandanna entering an elevator, and it quickly spread on social media. It turned out to be a highly realistic fake: a deepfake that superimposed Mandanna’s face onto a video of a British-Indian influencer. The incident has sparked concerns about the dangers of AI-powered deepfake technology, how to identify and combat it, and how individuals can safeguard themselves against being impersonated.

Before delving further, it’s essential to understand what deepfakes are. Deepfakes use AI to manipulate media such as photos, videos, and audio so convincingly that the result appears genuine. Mandanna is the latest high-profile victim of this technology.

The Rashmika Mandanna deepfake controversy

A short six-second video clip of the actor Rashmika Mandanna was widely shared on the internet. The original poster is unknown, but the clip quickly gained traction. In it, Rashmika Mandanna appeared to be entering an elevator. However, Abhishek Kumar, a journalist at AltNews, soon pointed out that the video was in fact a deepfake. In a series of posts, he stressed the urgent need for India to establish a legal and regulatory framework to address deepfakes.

Abhishek also clarified that the viral video, which many people believed to be real, was a deepfake of Zara Patel, a British-Indian influencer with 415,000 followers on Instagram. Patel had originally posted the video on Instagram on October 9. The deepfake was convincing enough to easily deceive ordinary social media users.


Amitabh Bachchan and Rajeev Chandrasekhar’s responses to the video

After the deepfake video of Rashmika Mandanna was exposed, it drew reactions from celebrities and government leaders. Amitabh Bachchan, who starred alongside Mandanna in the 2022 film “Goodbye,” called for legal action against such incidents. Union Minister Rajeev Chandrasekhar emphasized the government’s commitment to ensuring the safety and trust of internet users, describing deepfakes as an especially dangerous form of misinformation that online platforms must act against.

Rashmika Mandanna herself shared her concerns, stating how terrifying it is not only for her but for anyone vulnerable to the misuse of technology. Zara Patel, the woman whose video was deepfaked, issued a statement expressing her distress and concern for the future of women and girls on social media. She urged people to fact-check information on the internet, emphasizing that not everything online is real.

This incident has brought attention to the potential harm and misuse of deepfake technology, sparking discussions about the need for legal regulations and safeguards against such deceptive content.

How to identify deepfake videos and safeguard yourself against them

The Massachusetts Institute of Technology (MIT), which is known for its AI and ML research, has shared useful tips to help people distinguish between deepfake and real videos. Here are a few of these tips:

1. Look at the face: High-quality deepfake manipulations are almost always facial transformations, so pay close attention to the cheeks, forehead, and skin texture.

2. Observe blinking: Check if the person in the video blinks naturally; excessive or insufficient blinking can be a sign of a deepfake (see the sketch after this list).

3. Lip movements: Some deepfakes rely on lip-syncing, so assess if the lip movements appear natural.
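To make the blinking tip concrete, here is a minimal illustrative sketch in Python. It is not MIT’s method or a production deepfake detector; it assumes the opencv-python package is installed and crudely approximates blink detection by counting frames where a face is found but no open eyes are, using OpenCV’s bundled Haar cascades.

```python
# Rough blink-rate check for a short video clip, inspired by tip 2 above.
# Requires opencv-python; uses the Haar cascades bundled with OpenCV.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def estimate_closed_eye_fraction(video_path: str) -> float:
    """Fraction of face-bearing frames in which no open eyes are detected."""
    cap = cv2.VideoCapture(video_path)
    face_frames = closed_frames = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        for (x, y, w, h) in faces[:1]:  # consider only the first face found
            face_frames += 1
            roi = gray[y:y + h // 2, x:x + w]  # eyes sit in the upper half
            eyes = eye_cascade.detectMultiScale(roi, 1.1, 5)
            if len(eyes) == 0:
                closed_frames += 1  # no open eyes found: possibly mid-blink
    cap.release()
    return closed_frames / face_frames if face_frames else 0.0

if __name__ == "__main__":
    # "clip.mp4" is a hypothetical file name used for illustration.
    fraction = estimate_closed_eye_fraction("clip.mp4")
    print(f"Frames with eyes apparently closed: {fraction:.1%}")
```

Since a real person typically blinks every two to ten seconds, a clip where this fraction is near zero, or implausibly high, deserves a closer look. Haar cascades are noisy, though, so treat the output as a prompt for manual review rather than a verdict.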

In the Mandanna/Patel deepfake video, careful observation reveals these telltale signs even in the brief six-second clip.

It’s also crucial to protect yourself from deepfakes, since scammers have begun using them to deceive victims into believing they are interacting with someone they know on video or audio calls.


To protect yourself

1. Ask the person to wave their hands in front of their face. Real-time deepfakes typically glitch or distort when an object passes in front of the face, because the models struggle with occlusion.

2. Don’t send money hastily if you receive a suspicious video call or message from friends or family. Always verify through another channel, such as calling a different number or another family member.

3. Verify their identity by asking something only the real person would know.

For most people, the risk of being deepfaked is low, because creating a convincing superimposed video requires a large amount of source material. If there are few photos and videos of you online, it is hard for AI to produce an accurate deepfake, especially one that renders your face convincingly from multiple angles.
