Ratan Tata latest to get deepfaked, scammers promote investment scam using fake video

In a video that has now gone viral, scammers behind a fake Instagram account named Sona Agarwal posted a deepfaked clip of Ratan Tata, offering followers a risk-free opportunity to “exaggerate investments” with a guarantee

Prominent business tycoon Ratan Tata is the latest public figure to be targeted by the growing misuse of deepfake technology.

Usually reserved, the former chairman of the Tata Group recently used Instagram Stories to disavow a video in which he was falsely portrayed giving investment advice.

The video, shared by an Instagram user named Sona Agarwal, featured Tata appearing to offer followers a risk-free opportunity to “exaggerate investments” with a supposed guarantee.


The caption accompanying the video urged viewers to take advantage of this purported opportunity. Tata responded by stamping “FAKE” across both the video and a screenshot of its caption.

In an earlier warning on October 30, Tata cautioned his followers against trusting random videos that used his name or image.

He specifically debunked videos falsely claiming that he provided suggestions to the International Cricket Council regarding fines and rewards for cricketers.

Tata clarified that he had no involvement in cricket and advised against believing similar misinformation unless it originates from his official platforms.

This incident highlights the increasing use of deepfake technology to create convincing yet false content featuring well-known individuals.

This incident follows a recent trend of manipulated videos circulating on the internet, with actresses Rashmika Mandanna, Katrina Kaif, Alia Bhatt and, more recently, Priyanka Chopra also falling victim to digital manipulation.

Unlike previous instances, Chopra’s manipulated video did not superimpose her likeness onto contentious content; instead, her original voice and lines from a legitimate video were replaced with fabricated brand promotion.

As concerns grow over the pervasive use of artificial intelligence in creating deceptive content, public figures like Ratan Tata are taking a stand to protect their identities from exploitation on social media platforms.

The incident serves as a stark reminder for the public to exercise caution and verify the authenticity of information circulating online, especially when it involves prominent personalities endorsing financial schemes.

(With inputs from agencies)

 
