There is PhotoDNA, which is free of charge and does it well. I don't think we should spend our limited resources on that.
If I were to do it, I could have an age-detection AI model and a nude-detection model in a multi-model setup, but PDNA is definitely better and an industry standard 🐶🐾🫡
DM me if you wish, but consider the above 
Well, you already described a very similar strategy to the one I want to propose. No need to DM anymore 😄

Yes: PhotoDNA for known data, and a multi-model setup using nude-detection + age-detection models for unknown data. That's my proposal idea as well.
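
A minimal sketch of that two-stage flow, with hypothetical placeholder functions throughout (the real PhotoDNA match runs against Microsoft's hosted service, and `nsfw_score`/`estimated_age` stand in for whatever nude-detection and age-estimation models get deployed):

```python
from dataclasses import dataclass

@dataclass
class ModerationResult:
    blocked: bool
    reason: str

def photodna_match(image: bytes) -> bool:
    """Placeholder: a real deployment calls the PhotoDNA matching service."""
    return False

def nsfw_score(image: bytes) -> float:
    """Placeholder for a nudity-detection classifier score in [0, 1]."""
    return 0.0

def estimated_age(image: bytes) -> float:
    """Placeholder for an age-estimation model."""
    return 30.0

def moderate_image(image: bytes) -> ModerationResult:
    # Stage 1: hash match against known-bad content.
    if photodna_match(image):
        return ModerationResult(True, "known-content match")
    # Stage 2: unknown content -> nudity detection first, then run the
    # age model only on media flagged unsafe.
    if nsfw_score(image) < 0.8:  # assumed threshold
        return ModerationResult(False, "safe")
    if estimated_age(image) < 18.0:
        return ModerationResult(True, "unsafe + underage estimate")
    return ModerationResult(True, "unsafe, adult estimate; flag for review")
```

Gating the age model on the nudity score keeps per-upload inference cost low, since most uploads never reach the second model.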
We could work on the multi-model thingy together then. @quentin and I have been experimenting with general detection of safe/unsafe media. I can share my repo (still closed source for now, until I'm sure the code is safe and clean) and we can take it from there. Interested? 🐶🐾🤔
Yes, I'm interested. I think I can probably join in about a month; I need to finish some pending work on nfrelay.app first.
Excellent! The more the merrier! Ping me in the DMs if/when you're available 🐶🐾🫡
 Yes, will do 🫡 
OK, I implemented age prediction for unsafe content; we'll see how it goes 🐶🐾🫡
Oh nice, is it already running for nostr.build and nostrcheck.me? 👀

Hopefully it can help both of your services. 
It runs on NB; @quentin will need to update the API call and it'll work 🐶🐾🫡
 Glad to hear. 🫂 
Nostr content-filtering experts @The Fishcake🐶🐾 @Rif'at Ahdi R @quentin
This is a very good tip for #photoDNA or any self-hosted AI model, @quentin: integrate this FOSS software into the nostrcheck package. (Using an external free/paid indexing or content-searching service means depending on yet another third-party service, i.e. bad by design.) Nothing gives a 100% guarantee in this cat-and-mouse game, but mitigating 80% of it solves a good part of the problem. Also look at POW and/or NIP-42, which limit an attacker's resources.
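
On the POW side, here's a minimal sketch of the NIP-13 difficulty check a relay or upload endpoint could apply before accepting an event; the 20-bit minimum is an assumed policy, not part of the spec:

```python
def leading_zero_bits(event_id_hex: str) -> int:
    """Count leading zero bits of a hex-encoded Nostr event id (NIP-13)."""
    bits = 0
    for ch in event_id_hex:
        nibble = int(ch, 16)
        if nibble == 0:
            bits += 4
        else:
            bits += 4 - nibble.bit_length()  # leading zeros within this nibble
            break
    return bits

def meets_pow(event_id_hex: str, min_difficulty: int = 20) -> bool:
    return leading_zero_bits(event_id_hex) >= min_difficulty

# Example: this id has 12 leading zero bits, so it fails a 20-bit policy.
assert leading_zero_bits("000f" + "a" * 60) == 12
```

Requiring work per event (or NIP-42 auth per connection) doesn't stop a determined attacker, but it raises the cost of bulk abuse, which fits the 80% mitigation framing above.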