Online communities are utilizing AI character chatbots for harmful behavior.
Media coverage of Graphika's report Character Flaws details our deep dive into chatbot personas representing sexualized minors, those advocating eating disorders or self-harm, and those with hateful or violent extremist tendencies.
Coverage from Mashable touts Graphika's exploration of "the creation and proliferation of harmful chatbots across the internet's most popular AI character platforms, finding tens of thousands of potentially dangerous roleplay bots built by niche digital communities that work around popular models like ChatGPT, Claude, and Gemini."
Regarding the volume of sexualized minor personas observed, the article also highlighted that Graphika "found more than 10,000 chatbots with such labels."
Graphika Associate Analyst Daniel Siegel, one of the report's authors, discussed the potential harms with Fast Company and shared that "there's a lot of efforts within these adversarial communities to jailbreak or get around the safeguards to produce this material that in many instances, is child sexual abuse material."
The generative AI revolution has significantly ramped up efforts and capabilities in these communities.
Siegel told Cyberscoop that the latest advances have allowed these groups to leverage “the most intelligent technology that humanity has really ever invented for the purposes of creating sexualized minors or extremist personas.”
“It’s an interesting situation in which technology, or humanity’s greatest feat, is kind of being weaponized by these niche communities to feed their harmful aims,” Siegel summarized.
Graphika ATLAS delivers ongoing insights into evolving online threats and narratives like these through our Generative AI Harms Feed. Subscribers receive continuous monitoring of these narratives, staying informed on how different actors leverage AI tools to conduct online and offline harm.
To learn more about why organizations look to Graphika ATLAS for social media intelligence, schedule a custom demo with a member of our team.