Parents are used to worrying about their kids, but they are not used to hearing a convincing recording of their child sobbing on the phone and demanding money. That is exactly what police say is happening in a wave of new scams that lean on fear, urgency, and even artificial intelligence to squeeze cash and personal data out of families. Local departments are now racing to warn residents before that first terrifying call comes in.
The basic playbook is simple: a stranger claims a child has been kidnapped or is in trouble, then pressures a parent to pay up or hand over sensitive details. What is changing is the tech and the cover stories, from fake AMBER Alert officials to AI voice clones that sound uncannily like a son or daughter. The goal for families is not to memorize every twist, but to recognize the red flags fast enough to hang up and check on their kids for real.
How the “kidnapped child” calls actually work
In one Kansas community, the Olathe Police Department says it has already logged two reports of a scam that is new to the area and aimed specifically at parents. Callers claim a child has been abducted, then demand money while insisting the parent stay on the line. The department is urging residents to share details of the scam to help protect others in the community.
Officers in the same city have gone a step further and laid out a script for what to do if that call comes in. They tell parents to hang up immediately, contact the child or another relative to verify they are safe, and call 911 instead of sending money or sharing bank details. Federal consumer officials describe similar "virtual kidnapping" schemes in which scammers claim to know where a family lives and threaten to kill relatives if the victim does not pay, a pattern laid out in a warning about threatening phone scams.
AI is giving old scams a brutal upgrade
What makes the latest wave so unnerving is the way artificial intelligence is being folded into an old con. Police in one region describe it as a familiar kidnapping script with a new twist: AI technology is used to make it sound as though the call is coming from the supposed victim, a detail highlighted in coverage of a case involving Jonath and his family. In those calls, the voice on the line does not just claim to be a child; it sounds like them, right down to the cadence and background noise.