 nostr.build is in this list, with 8 reports of “apparent child pornography”. That’s not a lot, but more than we should want to see. 

While image uploads are more vulnerable to takedowns than other Nostr content (because an image is typically hosted in only one location), there is a growing number of these upload services available for Nostr clients. 

I’m not sure how illegal image content should be restricted on Nostr, but I’m eager to see this conversation unfold.  
 It’s the most heated conversation that Nostr can have, and with good reason. There are a ton of emotions involved on both sides, understandably. 💜 
 I’ll be here with plenty 💜 
and a bit of 🍿 
 I guess the mainstream image hosts can use AI to flag and take down images. This would push any bad material toward hosts that accept it. That could have the effect of 1) removing the majority of bad content from the global feed, and 2) concentrating the material and bad actors into a few image hosts, making it simpler for law enforcement to act (if they even do anything).
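As a rough illustration of the flagging approach described above: a minimal sketch of blocklist matching at upload time. Everything here is hypothetical (the blocklist contents, the function names); production systems such as Microsoft's PhotoDNA use perceptual hashes that match near-duplicates, not the exact cryptographic digest shown here for simplicity.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad images.
# Real deployments use perceptual hashing (e.g. PhotoDNA) so that
# resized or re-encoded copies still match; an exact digest is used
# here only to keep the sketch self-contained.
BLOCKLIST = {
    # SHA-256 of the empty byte string, as a stand-in entry
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def should_reject(image_bytes: bytes) -> bool:
    """Return True if the upload's digest matches a blocklist entry."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST
```

An upload service would run a check like this (plus an ML classifier for previously unseen material) before serving the file; rejected uploads never become addressable, which is what would push bad actors toward hosts that skip the check.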