(Don’t) Look at This Photograph

Graphika Report

Monday March 30, 2026

Matthew Patane

Examining the Tactics AI Nudifier and Undressing Services Use for Promotion and Revenue Generation

Overview

Companies and services that provide synthetic, AI-generated nonconsensual intimate imagery (NCII), also known as AI nudifier or AI undressing services, continue to proliferate, expand, and adapt. For several years, Graphika, Indicator, the Institute for Strategic Dialogue, Bellingcat, other research organizations, and news outlets have reported extensively on the AI nudifier industry, highlighting its pervasiveness. At the heart of this industry is a profit-driven motivation that perpetuates harm and harassment against individuals – primarily women and girls – who have not consented to their likenesses being used for sexual purposes.

This report builds on and complements prior research by examining the tactics, techniques, and procedures (TTPs) NCII services use that Graphika documented through our intelligence monitoring between August 2025 and March 2026. It is based on open-source research and investigative techniques, including targeted search engine queries, domain and source code analysis, and ad transparency tools. It highlights how NCII services remain tenacious and adaptive despite government regulation, public scrutiny, and social media platform moderation.

Throughout, we refer to NCII services as AI nudifier or AI undressing services, websites, or apps. These phrases refer to the same practice: online services that enable customers to upload images and then create and disseminate AI-generated NCII of the individuals depicted. This imagery, which includes videos, typically depicts the individuals as nude, as having their clothes removed, or as performing sexual acts.

We also refer to AI companion (e.g., AI girlfriend/boyfriend) and AI adult entertainment services, which we consider distinct from NCII services. AI companion and adult entertainment services may offer users the ability to create synthetic sexually explicit content, but they do not necessarily use images of real people. We defined a service as an NCII service only if we assessed with high confidence that it provided synthetic NCII generation options or explicitly advertised itself as such. We did not test any of the services discussed in this report.
