 The head of Trust and Safety over on Bluesky is being extremely transparent about their challenges with moderation around CSAM since the influx of users. I appreciate the transparency. It’s worth checking out this thread to have an idea of what challenges are expected as Nostr grows. 

https://bsky.app/profile/aaron.bsky.team/post/3l3gerugkbt27 https://i.nostr.build/V3U2eVBkn2EGst09.jpg  
 I’d have to write a long post but what this individual is saying definitely makes sense. The reporting process to NCMEC is difficult to do when there is an unexpected influx of illegal content. It’s difficult for the major platforms to handle at scale on a daily basis. I’ll be interested to see how this goes for Bluesky. I feel bad for the mods. That’s a difficult job. 
 We are already seeing the challenges named here; we just are not talking about it much. It is a problem, with automation or otherwise. 
 Probably should talk about it. 
 To what purpose? Y’all are all screaming “decentralized everything” and “give users all content and let them pick what they want to see,” while some of the media hosts struggle to stay legal and avoid being accused of hosting CSAM with minimal tools. So there is nothing really to talk about, if you ask me. We just try to innovate and moderate as best we can, applying automation where possible without going broke. 
 @Laeserin sheeeeed! You are spoiling me 😂🫂🫂🫂🫠 
 Actually, you are right, there seems to be no purpose in speaking about these types of issues with you. 

I’ve heard many others speak about this topic on nostr and they seem more equipped to tackle this topic in a helpful way. Have a nice day. 
 Totally agree, less talking and more doing. 
 You good? Your vibe seems harsh. 
 Sorry, didn’t mean to sound harsh. Just a little drained by the constant battle with the illegal content (cat and mouse games). Did not mean to offend you. 
 I’m not offended, I just didn’t care for the tone. I think you tapped into something with this comment. It is definitely draining. Devs and mods feel the brunt of some of the world’s most horrific evils. It’s a pressure that I wouldn’t wish on my worst enemy. 

One of the saddest mistakes humans made was waiting until the abuse of children got so bad offline that it ended up online, and then blaming the internet for it. This response was wrong. For CSAM to have ended up online, two crimes have to have already been committed, possibly three or more. Attacking it online is intervening midway through instead of attacking the root cause. 

My hope is that, as a society, everyone can agree to start working towards preventing the abuse of children before it happens and ends up online in the first place. 

The decentralized, censorship-resistant nature of nostr will also serve survivors in many ways, so there is a bright side! Survivors will be able to share their experience, strength, and hope anonymously if they like, and no abuser or person in a position of power will be able to remove their voice. 

Listening to the voices of survivors gives us a glimpse of how the crime could possibly be prevented before it happens. 

I hope that thought brings some peace. 🫂 
 Thank you! There is another unfortunate thing going on: AI-generated CSAM, which is limitless in variety and hard to tackle. So everything is evolving, including the crime. 
 That will have an innovative solution. Frankly, AI will probably fix that for itself. As wild as that sounds. 
 I have an idea. Let’s hold politicians & law enforcement responsible for CSAM occurring within their jurisdictions. It makes far more sense than holding the owner of a global online platform w/ a billion users responsible.

If we see child porn going down, an executive officer swings from a rope. Kinda sounds nice. 
 There probably is a conversation there but I’d need to think about that. 
 We need technology that simply exists and cannot be stopped by them. Whatever we have to do to create that, we need to. Otherwise, they will always find a way to bring it under control and render it exactly the same as all the existing institutions. You are dealing with a mafia here.

Who do you think generates half of this CSAM? Maybe the people using sex with kids on tape for blackmail don’t get to tell us what we’re doing wrong by providing for human freedom? 
 Ah, so the Durov arrest had its intended effect on you. Don’t you feel manipulated? You should. It’s a ridiculous notion that a platform is responsible for every activity of its users, and it won’t hold up in the U.S. because our Founders had the foresight to put red lines down on this.

If it does, who cares about social media because it’s time for war. 
 I swear to God, these subhuman psychopaths roll these ridiculous pretexts out and then everyone tap dances to them at face value. Whatever censorship tools you create, they will pass a bill for “the children” requiring you to also censor exposure & criticism of them. I promise.

If it’s not decentralized, there’s no point in crypto. There’s no point in any of this. VISA and Meta can do it better. We are AT WAR for human freedom. Act like it. 
 Have an account over there I can follow? 
 I used Bluesky at first but I haven’t touched it in well over a year. I’ll probably never go back. Frankly, I’d rather go to Threads if I were going to seek out a platform to use for fun, but I don’t see a need. 😂 
 Fair. 😁 
 Is it fun over there? lol 👀 
 It was fun when it was new and the main apps all run circles around nostr or mastodon in terms of polish and features, but the community is VERY left-wing. Even for my tastes. Lots of pro-Hamas types over there. 
 Who gives a shit what’s happening on that censored piece of shit?