Rise in AI-Generated Sextortion Scams Fuels Suicide Concerns: Canopy’s AI Solution Offers Protection
Sextortion scams have skyrocketed in recent years, fueling growing concern for the mental well-being and safety of victims. According to the FBI, sextortion cases increased by a staggering 322% between February 2022 and February 2023, with a further uptick since April 2023. In these cases, criminals use artificial intelligence (AI) to twist innocent pictures into sexually explicit deepfakes, which are then weaponized against vulnerable teens and preteens. Tragically, these incidents have resulted in multiple suicides.
In response to this alarming trend, Canopy, an AI-driven company, has developed software to combat sextortion scams. Canopy’s AI platform, refined over 14 years, can quickly detect and block sexually explicit images and videos. Yaron Litwin, an executive at Canopy, explained that the platform acts as an additional layer of protection, preventing children and teenagers from sharing even innocent pictures that criminals could exploit. The software also filters out explicit content in real time, providing a safer online browsing experience.
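Canopy has not published its implementation details, but a real-time image filter of this kind typically wraps an image classifier behind a simple block/allow decision. The sketch below is a minimal illustration of that pattern, assuming a publicly available NSFW classifier from Hugging Face; the model name, label string, and confidence threshold are assumptions for illustration, not Canopy's actual technology:

```python
# Minimal sketch of classifier-based image filtering, NOT Canopy's product.
# Assumptions: the Hugging Face model below exists and labels images
# "nsfw"/"normal"; the 0.85 cutoff is an arbitrary illustrative choice.
from PIL import Image
from transformers import pipeline

# Any off-the-shelf explicit-content classifier could stand in here.
classifier = pipeline("image-classification",
                      model="Falconsai/nsfw_image_detection")

BLOCK_THRESHOLD = 0.85  # assumed confidence cutoff for blocking

def should_block(path: str) -> bool:
    """Return True if the image is classified as explicit above the cutoff."""
    image = Image.open(path).convert("RGB")
    for result in classifier(image):
        if result["label"].lower() == "nsfw" and result["score"] >= BLOCK_THRESHOLD:
            return True
    return False

if __name__ == "__main__":
    # Hypothetical outgoing photo checked before it can be shared.
    print(should_block("outgoing_photo.jpg"))
```

A production filter would also have to run fast enough to intercept uploads and page loads in real time, which is where years of model refinement matter far more than the decision logic shown here.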
Litwin emphasized that the technology is meant to protect children, describing it as “AI for good.” The company is also collaborating with the FBI to help filter sexual abuse material and to give investigators tools that shield their mental health from the distressing images they encounter in their work. By swiftly flagging inappropriate content, AI reduces the need for manual review, protecting both victims and law enforcement personnel.
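The article does not describe how this filtering works internally, but one widely used way to spare investigators from manually reviewing files is perceptual hashing: each image is reduced to a compact fingerprint that can be compared against fingerprints of previously identified abuse material. Here is a minimal sketch using the open-source imagehash library; the hash value, file path, and distance cutoff are made-up examples, and real systems rely on purpose-built hashes such as PhotoDNA or PDQ rather than this generic approach:

```python
# Illustrative sketch of fingerprint matching against known-bad images,
# so no human has to view each file. The hash, path, and cutoff below
# are hypothetical placeholders, not real data.
from PIL import Image
import imagehash

# Hypothetical database of perceptual hashes of previously identified material.
KNOWN_HASHES = [imagehash.hex_to_hash("d1d1d1d1d1d1d1d1")]
MAX_DISTANCE = 8  # assumed Hamming-distance cutoff for a "match"

def matches_known_material(path: str) -> bool:
    """Flag a file whose perceptual hash is close to a known-bad hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(matches_known_material("evidence_file.jpg"))
```

Because hashes tolerate small edits such as resizing or recompression, near-duplicates of known material can be flagged automatically while novel images are routed to classifiers like the one sketched earlier.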
However, as AI technology advances, even developers struggle to distinguish real images from AI-generated fakes. The FBI describes sextortion as a crime in which victims are coerced into providing sexually explicit photos or videos, which criminals then threaten to share publicly or with the victims’ loved ones. Malicious actors exploit content-manipulation tools to create lifelike sexually explicit images from victims’ photos, circulating them on social media and pornographic websites.
Minors, primarily boys aged 10 to 17, are among the most targeted victims of sextortion. Young girls also fall prey to these scams, but the statistics show a higher number of boys being victimized. Tragically, at least a dozen sextortion-related suicides have been reported this year alone. The FBI warns that reported numbers likely understate the true extent of the problem, as many victims feel ashamed and never report the crimes.
To combat this growing issue, organizations like the National Center for Missing & Exploited Children (NCMEC) offer resources to assist victims. NCMEC’s Take It Down service helps victims remove or halt the online sharing of explicit images or videos. Additionally, the FBI provides recommendations for safe content sharing online and resources to support extortion victims.
As the contest between AI developers and the criminals who abuse their tools intensifies, protecting vulnerable individuals must remain the priority. The rise in AI-generated sextortion scams demands a comprehensive response that combines technological safeguards with education and support networks to protect the mental health and safety of potential victims.