This is awesome, but since you mention you used a nudity detector and haven't reviewed them...
I would expect for a popular service you would have gigabytes of random legal nudity for every actual Interpol-worthy image, right? I guess Interpol is OK sifting through all that?
And wouldn't that mean that for all of those nude-but-legal pics you were unnecessarily disclosing regular user PII by sending along access logs to law enforcement? Not to say it wasn't worth it in the end
I asked Interpol upfront whether it would be okay to send them the data for review, and they said yes.
There would probably not be much reason to upload "legal nudity" to the demo site of a self-hostable service, and I also wipe the whole database at irregular intervals.
But actually, now I'm using the CSAM detection tool from Cloudflare, which automatically detects, flags, and reports CSAM, so that takes the hassle out of it for me. I'm not sure about false positives, but after all, it's just a demo site and nobody should be uploading anything other than test content.
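
To illustrate the "automated detection, then a human only reviews the flagged subset" idea being discussed here, a minimal sketch follows. The nsfw_score function, the 0.8 threshold, and the uploads directory are placeholders I'm assuming for illustration; this is not Cloudflare's tool or any specific detector:

    from pathlib import Path

    REVIEW_THRESHOLD = 0.8  # assumed cut-off, tune for whichever detector is used

    def nsfw_score(image: Path) -> float:
        # Placeholder: swap in a real nudity/CSAM classifier here.
        # Returning 0.0 means nothing gets flagged until you do.
        return 0.0

    def flag_for_manual_review(upload_dir: Path) -> list[Path]:
        # Only images scoring above the threshold ever reach a human reviewer;
        # everything else is never looked at or forwarded anywhere.
        if not upload_dir.is_dir():
            return []
        return [
            f for f in sorted(upload_dir.iterdir())
            if f.is_file() and nsfw_score(f) >= REVIEW_THRESHOLD
        ]

    if __name__ == "__main__":
        for image in flag_for_manual_review(Path("uploads")):
            print(f"needs manual review before any report: {image}")
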
The more detailed stories in the link do not mention that the material was forwarded unreviewed. In fact, they specifically mention that this concerns 16 images.
Note that they said "I didn't want to look through all of these images", meaning all uploads, not the ones the detector flagged. They probably manually reviewed the images flagged by the automated system before sending them to Interpol.