How not?
you can’t really just catch CSAM
@The Fishcake🐶🐾 was describing what they do; it’s not perfect, but apparently it works decently well, at least for public uploaders
Yeah, fairly simple indeed. Scan for nudity, then scan the same image for the age of the people in it. Then a human has to verify whether it’s CSAM and report it if it is. Fairly effective given the simplicity, and good at catching AI CSAM.
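For anyone curious what that two-stage scan looks like in practice, here is a minimal Python sketch. The classifiers are stubs and the threshold values are assumptions, not what Fishcake actually runs; a real deployment would plug in actual nudity-detection and age-estimation models and a proper moderation queue.

```python
# Sketch of the two-stage scan + human review pipeline described above.
# nudity_score() and estimated_ages() are hypothetical stubs; the
# thresholds below are assumed values, not anyone's real config.

from dataclasses import dataclass, field

NUDITY_THRESHOLD = 0.8   # assumed cutoff for "contains nudity"
MINOR_AGE_CUTOFF = 18    # flag if anyone in the image looks under this

def nudity_score(image_bytes: bytes) -> float:
    """Stub for an NSFW classifier; returns probability of nudity."""
    raise NotImplementedError("plug in a real model here")

def estimated_ages(image_bytes: bytes) -> list[float]:
    """Stub for an age estimator; returns one estimate per detected face."""
    raise NotImplementedError("plug in a real model here")

@dataclass
class ReviewQueue:
    """Holds flagged uploads until a human moderator verifies them."""
    pending: list[bytes] = field(default_factory=list)

    def flag(self, image_bytes: bytes) -> None:
        self.pending.append(image_bytes)

def scan_upload(image_bytes: bytes, queue: ReviewQueue) -> None:
    # Stage 1: cheap nudity check; most uploads stop here.
    if nudity_score(image_bytes) < NUDITY_THRESHOLD:
        return
    # Stage 2: age estimation only on images that passed stage 1.
    if any(age < MINOR_AGE_CUTOFF for age in estimated_ages(image_bytes)):
        # Never auto-report: a human must verify before anything is filed.
        queue.flag(image_bytes)
```

The point of the ordering is cost: the cheap nudity check filters out the bulk of uploads, age estimation only runs on what remains, and a human makes the final call before any report goes out.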
I don't know what sort of people you follow or what searches you do, but I have never encountered a single instance of child porn on Nostr. If you find illegal stuff, call the cops. It's not your job to police people.
Err, where do you think people upload their media when they use Nostr? Who do you think takes care of hosting it, and who becomes a target for uploads that shouldn’t be happening?
i haven't seen any CP from a nostr account, but i saw one yesterday on the mostrdotpub relay coming from fedi
So call the cops. Pretending that relay owners can do something about it (without destroying Nostr) is 1. typical techie hubris on the part of the relay operators themselves; 2. naive and counterproductive, since it opens the door for law enforcement to make unreasonable demands until 1. happens, i.e., Nostr becomes another Facebook.