
AI Voice Kidnapping Scams



What Are AI Voice Cloning Kidnappings?


Unless you’ve been living under a rock (or have healthy social media habits), you’ve heard the recent stories about voice cloning and virtual kidnapping scams. The hot topic in the news is how scammers are using AI voice cloning technology to trick people into thinking a loved one has been kidnapped.


I’m going to explain how this scam works, give you my take on it, and tell you what you can do to combat this type of problem. Then I’ll wrap up with where I think this is going in the future.


How Virtual Kidnapping Scams Work


The typical scam so far goes like this:

  • A parent will get a phone call from an unknown number. When they answer, they think they’re hearing the voice of their child, who is in danger.

  • The child will speak briefly, saying they’ve been taken, and will sound, understandably, scared to death.

  • Then the kidnapper gets on the phone to demand ransom and to work out a payment method. (This could be a digital exchange or a drop in a public place.)

  • When the money has been exchanged, the child will be released.


But here’s the funny thing about the scams in the news over the past two months: they’re not working.


The Reality of AI Voice Cloning Virtual Kidnappings


In the few recent stories about AI voice cloning virtual kidnappings, the parents reached out to the child and quickly discovered their child was safe. I don’t know of a single case where an AI voice cloning virtual kidnapping scheme has resulted in a payment. When I research these stories, I can’t find an article that names the specific website or software that was used, or that reports police have caught the scammers and know how it was done.


There’s only a lot of conjecture about what could have been used.


In legal speak, that means there’s no hard evidence that voice cloning is actually being used in virtual kidnapping scams.

But wait, the FTC has issued a warning about voice cloning in scams, so that makes all of these stories real, right? Back in March, the FTC issued a consumer alert about voice cloning scams involving fake family emergencies. They give a great overview of how the scam would work, but they don’t offer any stats or real cases involving the technology.


Have AI Voice Cloning Scams Happened Before?


There is a case from 2019 in which a UK-based energy firm was targeted by a scam that supposedly used an AI-generated voice to impersonate the CEO. An employee was duped into thinking that the CEO was calling and asking for a payment to be sent to a vendor.


One payment was made before the company caught on and realized there was a problem. But there’s no recording of the call, so we have to take the word of the employee on the phone that it was an AI-cloned voice. If you drill down into this 2019 story, it rests only on the belief that the CEO’s voice was cloned.


And I will fully admit this is a very hard case for law enforcement to make. If victims aren’t recording their phone calls, then police have only the word of the person on the phone that it was a cloned voice. These phone scams are very hard to trace, which makes them even harder for police to investigate.


What Does AI Voice Cloning Mean for Parents?

So what does that mean for us? Do I think AI voice cloning has ever been used to perpetrate a crime? Yes, I do. Is AI voice cloning being used to virtually kidnap children in the suburbs? No, I do not.


What I think is happening here is that when parents are faced with the horrific scenario of a kidnapping, the voice they hear on the phone is cognitively close enough to make them believe it’s their child. We’ve heard this before from our friends Greg Williams and Brian Marren at Arcadia Cognerati.


In fact, when I was discussing this with Gonzalo at Combat MF, he made the point that the amygdala gets hijacked and people believe they are hearing their child’s voice. I think he’s right.


People become convinced they are hearing their kid’s voice, and you can’t talk them out of it. They truly believe it.


Virtual Kidnappings Are Not New


Listen, phone extortion and virtual kidnappings are not new. Criminals use open-source intelligence to find the names of the target’s kids. Maybe they even follow the child on social media and know what they’re up to, which enhances the realism of the scam.


Then they get someone who can roughly mimic the panicked voice of the child and who talks on the phone just long enough to reel the victim in. That’s what the scammers want to happen. That’s their plan.


Generally speaking, it’s a lot easier to recruit a young woman as an accomplice than it is to find audio clips of the target’s kids, run them through AI, and play them back in a real-time conversation that sounds natural. I want you to think critically here. What’s more likely to happen to the average person?


What To Do If You Get An AI Voice-Cloned Kidnapping Call


So for now, what do you do if you get one of these calls? Cloned AI voices aside, what you need to do is confirm the identity of the voice on the other end of the phone. You can do that with the old-school family codeword.


Yes, the same codeword I’ve been asking you to have to identify a safe person for your child. In the past, these words were used when someone tried to pick up your child. The codeword would be a secret but simple word, like pancakes.


When the person showed up, the child was to ask for the codeword, and if it was given, the child knew the parents had sent them. My parents used this once to send someone to pick me up from school. I can attest that it works.


When faced with such a phone call, ask the child what the codeword is. If they don’t know it, hang up. Confirm the location of your child and then call the police to report the scam.

If you haven’t set up a codeword in time, you can always ask for the dog’s nickname. We all know dogs have their real name and like 15 different nicknames. Get one of those nicknames from the voice on the other end. In essence, you’re asking for information that only they would know and that can’t be easily found on social media.


Going forward, I do think AI voice cloning scams will increase. As law enforcement has said for a while, the technology is getting easier to use and more readily available. So you do need to be aware of this problem; just don’t believe all of the hype.


Also know that if you are a podcaster, social media influencer, or YouTuber, there are great examples of your voice on the internet just waiting to be cloned. So if you get a phone call from me, you can rest assured it’s fake. Because I don’t like to talk on the phone.


Watch AI Voice Cloning Virtual Kidnappings on YouTube


Andy Murphy

Andy Murphy founded The Secure Dad in 2016 with the aspiration to help families live safer, happier lives. What started as a personal blog about family safety has turned into an award-winning podcast, an Amazon best-selling book, and online courses. He focuses his efforts in the areas of home security, situational awareness, and online safety.

 

Andy is a husband and father. His interests include coaching youth basketball, hiking, and trying to figure out his 3D printer.

 

TheSecureDad.com
