Emergency call from a partner asking for money can be fraud: AI voice scams are on the rise, and here's how a passphrase can save you

Imagine this: your phone rings. It’s your spouse. They sound terrified. There’s been an accident. Money is needed immediately. Your heart races. That’s exactly what happened to podcast host Dustin Burnham.
On the call, the scammer, speaking in his wife's voice, claimed their son was in trouble and pressed him to send money. Burnham responded by asking for the family's secret passphrase.
The call was fake. The number had been spoofed. The voice, which sounded exactly like his wife's, had been generated using AI. His son was perfectly safe. That one question about a secret passphrase saved him thousands of dollars.

The rise of AI voice scams

What Burnham experienced is part of a fast-growing global problem known as AI voice phishing. Scammers can now copy someone's voice using just a few seconds of audio taken from social media videos, podcasts, or voice notes.
With modern AI tools, they can create a nearly perfect imitation and use it to make urgent, emotional calls. The most common trick is the "family emergency." Some scammers go even further, combining cloned voice calls with short manipulated videos to create panic and push victims into sending money quickly. These scams are built around one thing: urgency. If you panic, you don't think.

Why experts say it may get worse

With AI tools becoming cheaper and easier to use, experts warn that this is just the beginning. Nikita Bier, head of product at social media platform X, predicted things could escalate quickly: "In less than 90 days, all channels that we thought were safe from spam and automation will be so flooded that they will no longer be usable." He also pointed to open-source AI tools that allow almost anyone to build automated systems capable of sending convincing messages at scale.

It's not just a US problem; India is seeing it too

AI voice scams are rising in India as well. In March 2025, a 72-year-old woman in Hyderabad reportedly lost Rs 1.97 lakh after receiving a cloned voice call. In another case, in Indore, scammers allegedly copied a teacher's voice and duped her of Rs 1 lakh. A 2023 McAfee study found that 38% of Indians could not tell the difference between a real voice and an AI-generated one. The Government of India's Digital India initiative has also warned citizens: "Beware of AI voice cloning! Scammers can mimic the voice of trusted contacts."

The simple solution: a secret passphrase

What saved Burnham was something surprisingly simple: a pre-decided family passphrase. A passphrase is a secret word or sentence that only close family members know. If someone calls in distress asking for money, you ask for the passphrase. If they can't say it, you know something is wrong.

The FBI has recommended that families create a "secret word or phrase" to verify identity during emergencies. Financial institutions like Starling Bank have published similar guidance. Erin Englund, director of threat analytics at BioCatch, explained why this works: "Fraudsters will use manipulation tactics to put the victim in a vulnerable state where they act out of panic… Having a passphrase enables victims to quickly validate the legitimacy of an unusual interaction and take control." It forces a pause, and that pause can prevent financial disaster.

How to create a safe passphrase

A good passphrase is easy for your family to remember but hard for anyone else to guess, and it must be kept private. Rachel Tobac, CEO of SocialProof Security, warned: "Avoid joking about your code word in your text messages or social media posts… It's got to be kept private."

Burnham suggested something clever. Instead of directly asking "What's the passphrase?", which alerts scammers, use a trigger line that sounds unrelated. For example, you might say, "I'm eating banana cream pie."
Your partner would then respond with a pre-agreed coded reply.
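The trigger-and-reply exchange described above is essentially a challenge-response check: the exact reply matters, not the voice delivering it. Below is a minimal sketch of that idea in Python; the trigger line and coded reply are hypothetical examples, and a real family would of course agree on their own pair in advance rather than write it down in code.

```python
# Minimal sketch of the challenge-response idea behind a family passphrase.
# Both phrases here are hypothetical placeholders, not recommendations.

TRIGGER = "I'm eating banana cream pie."          # sounds unrelated to scammers
EXPECTED_REPLY = "Save me a slice of the blue one."  # pre-agreed coded reply

def is_verified(reply: str) -> bool:
    """Return True only if the caller gives the exact pre-agreed reply.

    Comparison ignores surrounding whitespace and letter case, so a
    slightly mistyped or shouted reply from a panicked relative still passes.
    """
    return reply.strip().lower() == EXPECTED_REPLY.strip().lower()

# A cloned voice can mimic tone, emotion, and panic,
# but it cannot know the reply:
assert is_verified("Save me a slice of the blue one.")
assert not is_verified("What are you talking about? Send the money now!")
```

The point of the sketch is the design choice, not the code itself: verification depends on a shared secret that never appears in the audio a scammer could have cloned, so mimicking the voice perfectly gains the attacker nothing.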

Verification is the new trust

We are entering a time when hearing a loved one's voice is no longer proof that it is really them. AI can recreate tone, emotion, even panic. That means trust alone is no longer enough; verification has become an act of care. A simple, secret passphrase, known only to your family, could be the difference between panic and protection, between losing your savings and keeping them safe.
