Rumor says Nostr has a CSAM problem?
What’s being done to prevent it? Child abusers and sickos will seek refuge and community anywhere online that turns a blind eye to their perversions.
If Nostr actually works, you can't really prevent them from using it. But you don't have to look at it. The best way to censor them is through the media hosts that host that stuff. But if they start using their own hosts, etc...
I’d hate for Nostr as a protocol, or Damus, etc., as a service, to be targeted by law enforcement or legislation limiting access. Or, perhaps worse, by negative public sentiment due to bad press coverage. Consider Silk Road and the pariah it’s been turned into… It’s best we police ourselves as a community and give no appearance of quarter to such bad actors. It’s certainly not easy, but one solution would be for media servers to filter CSAM by default, and for applications to detect whether that default has been changed when connecting to nodes/servers. It’s not perfect but it’s a start… 🤷🏻‍♂️
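To make that idea a bit more concrete, here’s a rough sketch of how a client might check a media host’s advertised filtering policy before trusting it. The `/.well-known/media-policy.json` endpoint and the `csam_filtering` field are hypothetical, not part of any existing NIP or server spec; the point is just that clients could warn or refuse when a host has opted out of the default.

```typescript
// Hypothetical sketch: a Nostr client checking whether a media host
// advertises default CSAM filtering before using it.
// The endpoint and field names below are assumptions, not an existing spec.

interface MediaPolicy {
  csam_filtering?: boolean; // hypothetical flag: true if the host filters by default
}

async function hostFiltersByDefault(mediaHost: string): Promise<boolean> {
  try {
    // Hypothetical discovery document served by the media host.
    const res = await fetch(`https://${mediaHost}/.well-known/media-policy.json`);
    if (!res.ok) return false; // no policy published: treat as "default changed"
    const policy = (await res.json()) as MediaPolicy;
    return policy.csam_filtering === true;
  } catch {
    return false; // unreachable or malformed response: be conservative
  }
}

// Example usage: warn before rendering or uploading media from an unverified host.
async function vetHost(mediaHost: string): Promise<void> {
  if (await hostFiltersByDefault(mediaHost)) {
    console.log(`${mediaHost}: filtering advertised, OK to use`);
  } else {
    console.warn(`${mediaHost}: no filtering policy advertised, flagging to user`);
  }
}
```

Something like this would let clients surface the host’s stance to users without the protocol itself doing any censoring.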
I've never encountered that sort of content, so I'm not sure how big of a problem it is. To my understanding, most social platforms have these issues.
I’d like to think that we, the digital dreamers, have an opportunity to do it better. If perfect money is a solvable problem, this can be solved too.