This isn’t about X failing to moderate CSAM, which is an Online Safety Act issue. It is about the company and its technology being actively involved in its generation.
Somehow, I don't think it's going to play out like that, though.
1.2 million children dying.
Abhorrent. Evil. Indefensible.
So you can have things that are clearly a foul on review, or whatever, but because it doesn’t fit some criterion for whether it was a reasonable mistake, they pretend they can’t see it.