Hyderabad Woman Loses Rs 1.4 Lakh in AI Scam, Receives Call From Fraudster Posing As Her Nephew
AI scam: Woman falls for Rs 1.4 lakh fraud after receiving a call from a fraudster posing as her nephew living in Canada.

In a troubling incident that underscores the darker side of Artificial Intelligence (AI), a woman in Hyderabad fell victim to an AI scam and lost money. In one of the rare such instances reported in India, the 59-year-old fell prey to an AI voice scam, losing Rs 1.4 lakh. According to a report by the Times of India, the scammer, who sounded exactly like the woman's nephew in Canada, claimed to be in distress and urgently in need of money.

The woman received the call late at night. The caller claimed he had been in an accident and faced the imminent threat of being jailed, and, pleading for secrecy, asked her to transfer money discreetly. Recounting the experience, she said, "He sounded just like my nephew and spoke exactly in the Punjabi we speak at home with all the nuances. He called me late in the night and said he had an accident and was about to be jailed. He requested me to transfer money and keep this conversation a secret."


The woman transferred the money into the scammer's account, only realising later that she had been defrauded. City police officials, acknowledging that AI voice scams remain rare, advised residents to exercise greater caution.

Cyber experts also pointed out that individuals with family members in countries like Canada and Israel have recently been targeted by AI voice scams.

Prasad Patibandla, Director of Operations at the Centre for Research on Cyber Intelligence and Digital Forensics (CRCIDF) in Delhi, shed light on how these scams work. He explained, "AI voice imitating tools can mimic a person's voice precisely by utilizing data available in the public domain, such as social media recordings or even sales calls made by fraudsters. Creating a sense of urgency by fabricating a distressed situation in a foreign country adds to the effectiveness of these scams."


The incident is yet another reminder to stay alert in an increasingly sophisticated world of scams.
