Investigating explicit non-consensual deepfakes/AI and disinfo w/OSINT
"I am a doctor, but not that kind of doctor."
The footage speaks for itself.
Check out our investigation alongside @evidentmedia.org
Some get only a few thousand views. Some get over 150K views, like this one here.
Most of these videos aren't good enough to pass for the real thing, imho, but they do fool some.
I cannot fathom why people spend time on Sora creating their own AI videos of families being ripped apart to post online.
Clothoff has repeatedly listed innocent companies on their website to point the blame at others. I hope they have more evidence than just that.
Their 'discover' page showed publicly shared AI porn made by users. While no new content is being shared, they still host the older images.
Months later, there is still no new Telegram and no new photos on their 'discover' page.
religionnews.com/2025/10/07/i...
And this makes sense, unfortunately. In my own analysis of websites that offer deepfake porn services, the US and the UK are often the largest markets for users. There is a demand for this content.
Clothoff is a notorious AI-nudifying service that has repeatedly created porn of children.
You HAVE to mention if this is new or not, otherwise people will assume it is recent.
Not from this weekend.
So the WHERE is accurate.