
It’s Happening…AI Kidnapping Faked For Ransom Money

Published January 19, 2024 in our Security Fraud News & Alerts newsletter.



Many of us wonder how artificial intelligence (AI) will be used for wrongdoing. Well, wonder no more. The unthinkable happened when an Arizona mother received a phone call from her 15-year-old daughter saying she had been kidnapped and was being held for $1 million in ransom. What unfolded exposed a cruel scam that used AI to clone her daughter’s voice. It’s not the first time criminals have exploited AI for their own gain, and it surely won’t be the last.


Just a Few Seconds: AI Voice-Cloning Schemes


The phone number was unfamiliar, but the mother answered it only to hear her daughter pleading for help. She never doubted it was her daughter’s voice, which was exactly the chilling effect the bogus kidnapping was designed to have. Eventually, she was able to confirm her daughter was safe and where she was supposed to be – away on a skiing trip.


AI specialist and Arizona State University computer science professor Jim Stickley of Stickley on Security lends his expertise to help explain the rise in AI voice cloning. He says it takes only a few seconds of a person’s voice to clone it, complete with inflections and emotions, making it sound 100% authentic.



Social media, the FBI has long warned us, is where criminals find targets for their frauds. The Arizona mother says her daughter’s voice isn’t on social media, but it can be found in public interviews about her school activities.


Tools to Prevent and Detect


FBI special agent Dan Mayo offers some sound advice to help prevent and detect AI voice-cloning fraud. He warns that keeping a public profile amounts to “allowing yourself to be scammed by people like this…and when they get ahold of that, they’re going to dig into you.”


It’s a warning worth heeding for anyone posting too much information (TMI) on social media. That TMI includes photos showing personally identifiable information, like a car license plate or a boarding pass for an upcoming vacation.


Stickley also advises asking questions about the alleged kidnap victim that only someone who knows them – and not the scammer – could answer. You can also agree on a “code word,” known only to your family, to exchange whenever there is doubt. Calls from unknown phone numbers and area codes, including international numbers, are another reason for suspicion.


For all the groundbreaking good AI has to offer the world, like most things, there are those who abuse it for personal gain. As for AI’s role in cybersecurity, buckle up for a wild ride and stay informed to stay safe.


Want to schedule a conversation? Please email us at advisor@nadicent.com

