 Actually, you are right, there seems to be no purpose in speaking about these types of issues with you. 

I’ve heard many others speak about this topic on nostr and they seem more equipped to tackle this topic in a helpful way. Have a nice day. 
 Totally agree, less talking and more doing. 
 You good? Your vibe seems harsh. 
 Sorry, didn’t mean to sound harsh. Just a little drained by the constant battle with the illegal content (cat-and-mouse games). Did not mean to offend you 
 I’m not offended, I just didn’t care for the tone. I think you tapped into something with this comment. It is definitely draining. Devs and mods bear the brunt of some of the world’s most horrific evils. It’s a pressure I wouldn’t wish on my worst enemy. 

One of the saddest mistakes humans made was waiting until the abuse of children got so bad offline that it ended up online, and then blaming the internet for it. That response was wrong. For CSAM to end up online, two crimes have already been committed, possibly three or more. Attacking it online intervenes midway through instead of attacking the root cause. 

My hope is that as a society we can all agree to start working towards preventing the abuse of children before it happens and ends up online in the first place. 

The decentralized/censorship-resistant nature of nostr will also serve survivors in many ways, so there is a bright side! Survivors will be able to share their experience, strength, and hope anonymously if they like, and no abuser or person in a position of power will be able to remove their voice. 

Listening to the voices of survivors gives us a glimpse of how the crime might have been prevented in the first place. 

I hope that thought brings some peace. 🫂 
 Thank you! There is another unfortunate thing going on: AI-generated CSAM, which is limitless in variety and hard to tackle. So everything is evolving, including the crime. 
 That will have an innovative solution. Frankly, AI will probably fix that itself. As wild as that sounds. 
 I have an idea. Let’s hold politicians & law enforcement responsible for CSAM occurring within their jurisdictions. It makes far more sense than holding the owner of a global online platform w/ a billion users responsible.

If we see child porn going down, an executive officer swings from a rope. Kinda sounds nice. 
 There probably is a conversation there but I’d need to think about that. 
 We need technology that simply exists and cannot be stopped by them. Whatever we have to do to create that, we need to do it. Otherwise, they will always find a way to bring it under control and render it exactly the same as all the existing institutions. You are dealing with a mafia here.

Who do you think generates half of this CSAM? Maybe the people using sex with kids on tape for blackmail don’t get to tell us what we’re doing wrong by providing for human freedom?