Below are excerpts from a transcript of a call some fediverse people had in August on how they hope to address CSAM on ActivityPub. There is good work being done over there.

"IFTAS is a non-profit that has 
recently incorporated; we’ve been working on something since Nov-Dec 
last year. It’s intended to be a way to get foundation money into 
Fediverse, and out to moderation communities in Fediverse and 
decentralized social media. As far as CSAM specifically, we’ve been 
looking at it as something we want to work on. The story from a few 
weeks ago and the Stanford paper really kept motivating us. Right now, 
IFTAS is generally considering building on third-party moderation 
tooling. One of those things is likely going to be some form of CSAM 
scanning and reporting for fediverse service providers. We’re talking 
with Fastly, PhotoDNA, Thorn, some of the Google tools, etc. We’re 
hoping to hit the big hosting companies that can use things like 
Cloudflare’s CSAM scanning tools. For larger providers, we hope to 
connect them with services like Thorn."

"What people are doing is - they’re taking reports from users, and taking
 down the content without reporting to the gov’t. I don’t know what EU 
law is like, but even among above-board instances, we’re not seeing 
legal levels of US operators."

"Yeah, to clarify, for US operators, just deleting is not abiding by the 
law, you actually have to report it to NCNEC. This is an education 
issue; operators don’t know they have to do that."

https://socialhub.activitypub.rocks/t/2023-08-04-special-topic-call-social-web-and-csam-liabilities-and-tooling/3469/9