Hello Nostr, if you are in a great mood, just skip this post; it's depressing.

I had not encountered it before, but yesterday I crossed paths with Child Sexual Abuse Material (CSAM) on Nostr. In my regular internet usage over the years I have rarely come across this stuff, though I guess if I were to look for it I would find it eventually.
That is to say, the status quo is that it does exist, but most people, most of the time, won't have to deal with it. I think it is important to realize that the world is not perfect as it is when reflecting on these matters in the context of Nostr.

It goes without saying, but just to be clear: yes I think we should all learn how to tie nooses and identify adequate oak trees. 

However marginalized CSAM is, some people want governments to go above and beyond to combat it. The prime example currently is the ‘Chat Control’ regulation proposed in the EU, which wants to install Big Brother client-side on your phone to scan every single thing you do, flagging any suspicious behavior or content before it gets encrypted. However understandable the motivation might be, even the advocacy groups and agencies dealing with the CSAM problem are against this type of thing, if only because they are already swamped with the work of processing material as it is; opening the floodgates with false positives won't help anything and will probably make the situation worse. That is aside from the obvious objections to forcibly installing Big Brother on people's hardware, of course.

Back to Nostr. On the one hand we have the end-user, who does not want to be confronted with this material. From this perspective, CSAM is just one of many things a user might want to filter out, along with other material that might not be illegal per se but is NSFW, etc. Whatever means we find to do this, failure by those mechanisms is bad and unwanted, but not a direct systemic risk to Nostr; like I mentioned in the beginning, it is not impossible to accidentally come across this type of stuff on the internet today as it is, and the whole world is still using it.

But it does become a systemic issue from the relay perspective. Here, it is not some incidental bad experience that can be clicked away. It is a crime to host this type of material, which brings in the risk of prosecution for ‘simply running a relay’ that some asshole decided to nuke with CSAM or other illegal material.

But here my optimism comes in. Nostr is pro-censorship: the theory is that every relay can moderate to its heart's content, because users are ultimately always able to route around such obstacles (very much like ‘the internet’ itself). This means that relays should be able to adjust their policies and methods of moderation to their capacity for dealing with unwanted content and their risk appetite. From a locked-down, whitelist-only relay on one side of the spectrum, all the way to an open relay with heavy, sophisticated analytics for assessment and filtering, and everything in between: albeit it won't deliver a perfect solution in all cases, it will remove the dark cloud of systemic risk to the protocol/network, because we are able to sufficiently marginalize the phenomenon.
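To make that spectrum concrete, here is a minimal sketch (in TypeScript) of the kind of write-policy hook a relay operator could tune to their own capacity and risk appetite. The event shape follows NIP-01, but the policy names and checks are illustrative assumptions, not any existing relay's API (though some relay implementations do expose plugin hooks of roughly this shape).

```typescript
// Hypothetical relay write-policy sketch: accept/reject logic a relay
// operator could tune along the moderation spectrum described above.
// The NostrEvent shape follows NIP-01; everything else is illustrative.

interface NostrEvent {
  id: string;
  pubkey: string;
  kind: number;
  tags: string[][];
  content: string;
  created_at: number;
  sig: string;
}

type Policy = (event: NostrEvent) => { accept: boolean; reason?: string };

// Strict end of the spectrum: whitelist-only relay.
const whitelistPolicy = (whitelist: Set<string>): Policy => (event) =>
  whitelist.has(event.pubkey)
    ? { accept: true }
    : { accept: false, reason: "pubkey not on whitelist" };

// Looser end: open relay that still rejects events smuggling raw files
// inline, pushing media to hosts that can be scanned/moderated separately.
const noInlineMediaPolicy: Policy = (event) => {
  const looksLikeInlineFile =
    event.kind === 1063 && !event.tags.some(([name]) => name === "url");
  return looksLikeInlineFile
    ? { accept: false, reason: "kind 1063 without a url tag" }
    : { accept: true };
};
```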

On a last note: when talking about filtering/assessing this content, it gets complicated really quickly. You can imagine some AI performing such a task, or using lists of known content to filter; however you want to do it, you first run into the question of how you construct that tooling in the first place: it requires gathering such content and human eyes looking at it. And then, subsequently, you have produced tooling that can be flipped around and used as a search engine to seek out such material instead of filtering it away. So yeah, there are no graceful, perfect solutions, I am afraid.
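As a toy illustration of the ‘lists of known content’ approach, and of how the same tooling flips around: a sketch that hashes incoming media and checks it against a set of known-bad hashes. The hash set and function names are my assumptions for illustration; real systems use perceptual hashing rather than plain SHA-256, since trivially re-encoding a file defeats exact matching.

```typescript
import { createHash } from "crypto";

// Toy hash-list filter: flags media whose SHA-256 appears in a vetted list
// of known-bad hashes. Illustrative only; production systems use perceptual
// hashes, because re-encoding a file changes its exact hash.
const knownBadHashes: Set<string> = new Set([
  // ...populated from a vetted hash list, itself built by humans reviewing
  // real material, which is exactly the problem noted above.
]);

function sha256Hex(bytes: Buffer): string {
  return createHash("sha256").update(bytes).digest("hex");
}

function isKnownBad(media: Buffer): boolean {
  return knownBadHashes.has(sha256Hex(media));
}

// The flip side: this very same predicate, iterated over a public archive,
// becomes a search engine for the material it was meant to filter out.
```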

Well, there is one of course….
https://cdn.satellite.earth/a92bdd80dbd45e00636a9db615061eef168c3164a0e1bfa1abfb0784e74cd24e.mp3 
I know it's totally irrelevant, but how did you get to this message? What hosting service was it, and what kind of npub was it? Filtering out images of people you don't follow is a smart move; even alt text seems like a smart move that way.
Unfortunately I can reliably reproduce the steps to get to the CSAM I came across, but I won't share them. It was not something that popped up in my feed, but a result of a search query (for something completely unrelated to the CSAM, obviously).

In this case they were kind 1063 events, hosted on one of the bigger relays that a lot of people use.
Normally a kind 1063 event contains a URL, and the content (in this case a picture) is hosted somewhere else. Here it was not a URL but the raw file in base64 encoding, which the client is then supposed to decode into a webp image (though this is not part of the NIP-94 spec).
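For context, NIP-94 (kind 1063 file metadata) expects the file's location in a "url" tag. Below is a hedged sketch of how a client might distinguish a well-formed event from the inline-base64 variant described here; the heuristics (the base64 pattern, checking both the tag and the content) are my assumptions about what was observed, not part of any NIP.

```typescript
// Sketch of how a client might spot the inline-base64 variant of a
// kind 1063 (NIP-94 file metadata) event. NIP-94 expects a "url" tag
// pointing at hosted media; the heuristics below are assumptions about
// the malformed events described above, not part of the spec.

interface NostrEvent {
  kind: number;
  tags: string[][];
  content: string;
}

const base64Pattern = /^[A-Za-z0-9+/=\s]{256,}$/; // long base64-ish blob

function classify1063(event: NostrEvent): "url" | "inline-base64" | "unknown" {
  if (event.kind !== 1063) return "unknown";
  const urlTag = event.tags.find(([name]) => name === "url")?.[1] ?? "";
  if (/^https?:\/\//.test(urlTag)) return "url"; // well-formed per NIP-94
  if (base64Pattern.test(urlTag) || base64Pattern.test(event.content)) {
    return "inline-base64"; // raw file smuggled into the event itself
  }
  return "unknown";
}

// A cautious client could refuse to render "inline-base64" events at all,
// which also avoids the do-nothing download buttons mentioned below.
```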

How clients handle this varies; I happened to be using one at the time that is able to handle this stuff, so it displayed the picture directly. Most other clients I have tried don't, and just render the raw base64 string (luckily, in this case) without transforming it into a webp picture, or show a download button that does nothing (because there is no actual URL there).
 This is why I disabled global search on https://advancednostrsearch.vercel.app 
 What relays were they coming from? 

Was the npub posting it NIP-05 verified?
#Primal, it's time to get a grip on your media-storing servers.
 nostr:nevent1qvzqqqqqqypzqh4yvjqytwcl7g3x2hwaxmndemwugdvscfsfp3yxhmecaazsmfdaqqsrhpgcqtts0pvtl90ny7tf9ap73xt36jm4hzvkxyhly8d40txhhkqev9j38 
On Amethyst, you can't even report some things. They just show up as images with no user or menu to report.
 And the filter/block lists don't work. I'm still seeing notes by npubs I've already blocked due to spammy behavior. 
Block/Mute does work; it's possible another client is overwriting your lists if you swap clients often.
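For what it's worth, the likely mechanism behind that overwriting: NIP-51 mute lists are replaceable events (kind 10000), so relays keep only the newest one per pubkey and whichever client publishes last wins. A sketch of the merge-before-publish behavior a careful client could use; fetchLatestMuteList and publishEvent are hypothetical helpers, not any specific library's API.

```typescript
// NIP-51 mute lists are replaceable events (kind 10000): relays keep only
// the newest one per author, so a client that publishes without reading the
// current list first silently wipes mutes added in other clients.
// fetchLatestMuteList/publishEvent are hypothetical helpers.

interface MuteListEvent {
  kind: 10000;
  tags: string[][]; // ["p", <muted pubkey>] entries, among others
  created_at: number;
}

declare function fetchLatestMuteList(owner: string): Promise<MuteListEvent | null>;
declare function publishEvent(event: MuteListEvent): Promise<void>;

async function addMute(owner: string, mutedPubkey: string): Promise<void> {
  const current = await fetchLatestMuteList(owner);
  const tags = current?.tags ?? [];
  // Merge instead of overwrite: keep every existing entry.
  if (!tags.some(([name, value]) => name === "p" && value === mutedPubkey)) {
    tags.push(["p", mutedPubkey]);
  }
  await publishEvent({
    kind: 10000,
    tags,
    created_at: Math.floor(Date.now() / 1000),
  });
}
```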
Sadly, these images are base64-encoded and don't show up in the normal fashion. I told Vitor about this in the past. I believe we'll get more controls for these types of images in the future.
Andrew Torba, who runs Gab Social, noticed most of this porn stuff was coming from one country... Israel. So he blocked all IPs coming from Israel. That pretty much solved Gab's porn problems.

Of course they falsely twisted that into saying he blocked the IPs because he is racist against Jews.

Gab doesn't allow any form of porn. 
#nostr supports censorship! Moreover, censorship is the relays' responsibility.
In the end, a Meta Llama model will filter the content.
 
Sadly, this is a very complex issue, not only for Nostr but for the entirety of the internet, and offline too. First off, thank you for sharing and bringing this up, because I had the same questions and posted about it on here. Second, I strongly believe that concerned users like us play a massive part, combating this type of content better than any AI or ML. I actively report content like this, and I came across some a few hours ago; I always make an effort to report these things. Like you said, it is in the best interest of any hosting owner to make sure they do not host "illegal" content such as CSAM. Atm I have no better answer or one solution that fits all scenarios, but I have faith that collectively we can use our own discernment to report this kind of content. Ty! ❤
If I'm not wrong, @The Fishcake🐶🐾 some time ago wrote about his experience filtering out CSAM content.
 Yep, a hard problem with no simple solution and a lot of legal baggage. Ask any specific questions if you have them, will try to answer what I can 🫡🫂 
chat control is a scam: if they get everyone to give the government a golden key, that key will be acquired by the exact people it is supposedly meant to fight, and they will bypass the whole system and be able to use it as a weapon against their enemies

web of trust and paid relays are the main defenses we have, and it just doesn't follow that one nest of bullshit that happens to be running nodes in this protocol means anything for the rest of us

it's all the same, criminals gonna crime, and the cost of acting as police is a cost that cannot be forced upon people who are doing their due diligence, using the best filtering tech they can

those who want to actually wage war on these criminals volunteer for the task, and good for them, but it's a hard battle because the enemy is the lowest of the low and will use every weapon against them... anti-sexual-abuse defenders of innocence have to be anonymous, they have to shield themselves from being physically located, and they have to be prepared to physically relocate in order to wage this war

so, yeah, there are a lot of costs involved in being an effective combatant here, and the fact that these criminals have infiltrated the government anyway suggests that the last people we should be allowing or trusting to stop child abuse is the government