Deepfake Videos Misuse Sudha Murty’s Image and Voice to Lure Investors
Sudha Murty, chairperson of the Infosys Foundation and a member of the Rajya Sabha, has issued a serious warning that social media users would do well to heed. AI-generated videos circulating online falsely depict her promoting investment schemes and easy-money offers. In a post on X, Murty stated that she has no connection to these videos and has never given financial or investment advice.
The situation is serious because the clips appear highly realistic and have clear audio. Produced using sophisticated AI tools, they misuse her image and voice to gain people’s trust and lure them into risky schemes. Murty said that some people known to her have already lost money after believing these false endorsements.
Put simply, this is not merely a scam; it is a warning sign of the growing threat posed by deepfakes. Murty urged people to remain vigilant, verify information only through trusted banks and official agencies, and never base financial decisions on social media videos. Her message is clear and urgent: do not be deceived by digital illusions; protect your money and stay safe.
“I Never Talk About Financial Investments,” Says Sudha Murty
“Many people I know have invested and lost money,” Murty said. “I never talk about financial investments or do anything with money. I talk about work, India’s culture, women, and education,” she added.
She urged people to verify facts through official sources, report fraudulent content, and stay alert. “Remain safe. Jai Hind,” she concluded.
Victims Trapped By Realistic Deepfakes And Fake Investment Links
Sudha Murty made clear that the consequences of AI-driven scams are already tangible. According to her, a few people she knows have been defrauded after trusting fake videos that falsely show her endorsing so-called investment schemes promising unusually high returns. The deepfakes are convincing and articulate, making it easy for an unsuspecting audience to accept the claims.
The videos are often accompanied by links that redirect users to fraudulent sites, where victims give up personal details including banking information and phone numbers, a common trick used by scammers. The use of technologies such as voice cloning and facial manipulation shows the lengths to which scammers go to make their frauds look real by exploiting public trust in well-known personalities. Murty’s alert highlights the harsh reality of these digital traps and reinforces a simple rule: financial decisions should never be based on social media videos, no matter how realistic they appear.