Brooklyn Park Police Say AI Tool Speeds Child-Exploitation Probes

Brooklyn Park police say a new artificial-intelligence tool is turning marathon digital searches into sprint work, shrinking some child-exploitation and violent-crime phone and cloud reviews from weeks to just hours. Detectives credit the software with surfacing additional victims and reviving stalled weapons cases after uncovering photos and geolocation data. The overhaul followed a roughly $9,975 equipment package, funded largely by a nonprofit donation, that covered a forensic workstation and a software license for the department’s juvenile unit.

How the tool speeds investigations

“Out of 200,000 photos, if I miss one or something, I can hit enter, and it’s going to find me everything that’s visually similar,” Sgt. Jake Tuzinski told KSTP, describing a system that scans thousands of images at once and groups together similar faces, backgrounds, clothing or weapons. The software flags images that have not yet been reviewed and prompts investigators to categorize and tag victims so it can search across a suspect’s phone and cloud accounts. Tuzinski said a full phone extraction that used to tie up a week can now be processed in an hour or two.
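Tools like this typically rely on perceptual hashing or learned visual embeddings to rank images by similarity. Griffeye’s internals are proprietary, so the Python sketch below is only a hypothetical illustration of the general technique the article describes, built on the open-source Pillow and ImageHash libraries; the directory layout, function names and the 8-bit distance threshold are assumptions for the example, not details from the department’s software.

    from pathlib import Path

    from PIL import Image   # pip install pillow
    import imagehash        # pip install ImageHash


    def build_index(photo_dir):
        """Compute a 64-bit perceptual hash for every JPEG in a directory."""
        index = {}
        for path in Path(photo_dir).glob("*.jpg"):
            with Image.open(path) as img:
                index[path] = imagehash.phash(img)
        return index


    def find_similar(query_path, index, max_distance=8):
        """Return paths whose hashes fall within max_distance bits of the query's."""
        with Image.open(query_path) as img:
            query_hash = imagehash.phash(img)
        # Subtracting two ImageHash objects gives their Hamming distance,
        # so a small distance means "visually similar".
        return [p for p, h in index.items() if query_hash - h <= max_distance]

In a workflow like the one Tuzinski describes, an examiner would build the index once per extraction, then re-run a search such as find_similar("query.jpg", build_index("photos")) for each tagged image of interest, which is what makes reviewing hundreds of thousands of photos tractable.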

How Brooklyn Park paid for the software

The tools were funded largely through an Our Rescue donation to the Brooklyn Park Police Department. City council materials show the nonprofit contributed $8,974.52 and the department added a $1,000 match, for a project total of $9,974.52. The donation package included parts for a forensic workstation, a forensic write-blocker field kit and a “Magnet Griffeye Advanced Machine License” for multimedia analysis, according to the City of Brooklyn Park. A memorandum of understanding requires quarterly reporting so donors can track how the equipment is being used.

Why advocates say speed matters

Minneapolis-based CornerHouse, which provides forensic interviews, advocacy and trauma-informed therapy for child-abuse survivors, says faster removal of abusive images and quicker case turnarounds can reduce the mental-health harms victims face. National figures highlight the scale of the problem: roughly one in four girls and one in 13 boys experiences sexual abuse before age 18, according to the National Sexual Violence Resource Center. CornerHouse’s services include a community call line and coordinated forensic interviewing that connect families with legal and mental-health support.

A wider thread: AI misuse and prosecutions

The darker side of AI has been showing up in court documents too. According to a U.S. Attorney’s Office press release this week, a former school employee pleaded guilty to creating morphed AI images of at least 91 minor victims and producing more than 690 illegal images, a case that illustrates why investigators want faster in-house tools. Prosecutors said the case is part of the Department of Justice’s Project Safe Childhood initiative and involved the Secret Service, the state Bureau of Criminal Apprehension and local police departments.

New state law targets ‘nudification’ tools

Gov. Tim Walz signed a bill this week that bans non-consensual “nudification” and other AI-generated intimate images, with the law set to take effect Aug. 1, 2026, according to WDIO. The measure allows victims to sue and gives the attorney general power to seek penalties against companies that host or promote such tools, though experts say going after overseas operators could be a tough lift. Ars Technica has more detail on the potential penalties and the enforcement challenges.
