Deepfake Video of Rashmika Mandanna Goes Viral, Sparking Calls for Legal Action

In a disturbing case of deepfake misuse, a video in which the face of South Indian actress Rashmika Mandanna has been morphed onto another woman’s body has gone viral. The clip, which shows the woman entering an elevator, has circulated widely on social media platforms, raising concerns about the potential of deepfakes to spread misinformation and damage individuals’ reputations.

The original video was first shared on Instagram on October 9 by Zara Patel, a British-Indian woman with a large social media following. It showed Patel entering an elevator; in the manipulated version, her face was digitally replaced with Mandanna’s. The deepfake was then shared widely on social media, with many users believing it to be authentic footage of the actress.

The viral video caught the attention of Bollywood veteran Amitabh Bachchan, who expressed his support for Mandanna and called for legal action against those responsible for creating the deepfake. “Yes, this is a strong case for legal,” Bachchan tweeted.

The incident has sparked a debate about the dangers of deepfake technology, which can produce highly realistic and convincing videos that spread misinformation or damage someone’s reputation. In Mandanna’s case, the fabricated video could be used to fuel false rumors or even to blackmail the actress.

Experts have warned that deepfake technology is becoming increasingly sophisticated and accessible, making it easier for malicious actors to create and spread fake videos. They have called for stricter regulations and more awareness about the dangers of deepfakes in order to protect individuals from harm.

The Need for Legal Action and Regulatory Frameworks

The viral deepfake video of Rashmika Mandanna highlights the urgent need for legal action and regulatory frameworks to address the misuse of deepfakes. Currently, India has no legislation that specifically addresses deepfakes, although existing laws against defamation and cyberbullying could potentially be applied to such cases.

In other countries, such as the United States, there is growing momentum behind legislation to regulate deepfakes. California, for example, enacted laws in 2019 that target deceptive political deepfakes and give victims of non-consensual deepfake pornography the right to sue those who create or distribute such material.

In addition to legal action, there is also a need for greater awareness and education about the dangers of deepfakes. People need to understand how deepfakes can be used to spread misinformation and harm individuals, and they should be able to identify and report fake videos when they encounter them.

Here are some tips for spotting deepfakes:

  • Pay attention to the quality of the video. Deepfakes often show blurring, graininess, or pixelation, especially around the face, where the manipulation takes place (a crude automated version of this check is sketched after this list).
  • Look for inconsistencies in the person’s movements or speech. Deepfakes are rarely perfect; the face may move unnaturally, blinking may look odd, or the voice may not quite match the lip movements.
  • Be skeptical of videos that seem too good to be true. If a video seems too sensational or shocking, it may be a fake.
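For the technically inclined, the first tip above can be turned into a rough automated check: a spliced or re-encoded face region often differs in sharpness from the rest of the frame. The sketch below is a toy heuristic, not a real deepfake detector. It assumes OpenCV (the opencv-python package) is installed, uses OpenCV’s bundled Haar face cascade, and the file name suspect.mp4 is a placeholder. It samples frames from a video and compares the sharpness of each detected face region against the sharpness of the full frame, using the variance of the Laplacian as a standard single-number sharpness measure.

```python
# Toy deepfake heuristic: compare face-region sharpness to full-frame
# sharpness across sampled frames. A face that is consistently much
# blurrier (or much sharper) than its surroundings can hint at splicing
# or re-encoding. This is illustrative only; "suspect.mp4" is a placeholder.
import cv2


def laplacian_sharpness(gray):
    """Variance of the Laplacian: a common focus/sharpness measure."""
    return cv2.Laplacian(gray, cv2.CV_64F).var()


def face_vs_frame_sharpness(path, sample_every=15):
    """Return face/frame sharpness ratios for faces found in sampled frames."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(path)
    ratios = []
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % sample_every == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Detect frontal faces; tune scaleFactor/minNeighbors as needed.
            faces = cascade.detectMultiScale(gray, 1.1, 5)
            for (x, y, w, h) in faces:
                face_sharp = laplacian_sharpness(gray[y:y + h, x:x + w])
                frame_sharp = laplacian_sharpness(gray)
                if frame_sharp > 0:
                    ratios.append(face_sharp / frame_sharp)
        idx += 1
    cap.release()
    return ratios


if __name__ == "__main__":
    r = face_vs_frame_sharpness("suspect.mp4")  # placeholder file name
    if r:
        mean = sum(r) / len(r)
        print(f"Sampled {len(r)} faces; mean face/frame sharpness ratio: {mean:.2f}")
```

Real deepfake detectors rely on trained models rather than hand-built heuristics like this, and a single sharpness ratio proves nothing on its own; the point is only that the frame-level artifacts described in the tips above are measurable.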

If you think you have seen a deepfake, it is important to report it to the social media platform where you saw it. Deepfakes that involve minors can also be reported to the National Center for Missing and Exploited Children (NCMEC).

The deepfake video of Rashmika Mandanna underscores how serious this problem has become. It is important to raise awareness about the dangers of deepfakes and to develop effective ways to combat them.
