@jb55 wen content warnings pls, we have younglings here 💜🥺
Are you being serious? I imagine an AI image/video content screening service is extremely expensive or tough to code
no i am not serious, let the under-16s see this kind of content unwillingly and see what happens
Have you tried this? https://i.nostr.build/zg88.jpg
You think a child should expect flawless content safety from clients less than a year old, dev’d by underpaid and understaffed teams, accessing a relatively brand-new immutable protocol? Will is not responsible for parenting children. There is zero implication of expected child safety here.
damus is also 17+ and blurs images/videos by default. I don’t know how people expect a browser to be able to police the internet. Maybe when AI is good enough to detect “woman fucking a dog” in real time, on-device.
There is an option in settings to not show images automatically. I think it works for videos and for people you don’t follow
What? You don't need AI. There is a NIP for this. https://github.com/nostr-protocol/nips/blob/master/36.md
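(For reference, NIP-36 just marks an event with a `content-warning` tag; here is a minimal TypeScript sketch of how a client could check for it and blur flagged media by default. The names are illustrative, not Damus code.)

```typescript
// Minimal NostrEvent shape for this sketch; real clients carry more fields.
interface NostrEvent {
  id: string;
  kind: number;
  content: string;
  tags: string[][]; // e.g. [["content-warning", "nudity"], ["t", "nsfw"]]
}

// Returns the NIP-36 warning reason if the event is flagged, otherwise null.
// The reason string is optional in the spec, so an empty string can come back.
function contentWarning(event: NostrEvent): string | null {
  const tag = event.tags.find((t) => t[0] === "content-warning");
  if (!tag) return null;
  return tag[1] ?? "";
}

// Example policy: blur any media in events the author self-labeled.
function shouldBlur(event: NostrEvent): boolean {
  return contentWarning(event) !== null;
}
```

The catch, as noted below, is that this only works when the poster labels their own content; it does nothing about users who deliberately post unlabeled “bad” content.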
Damus does not support it AFAIK @jb55
This doesn’t solve anything
it does not solve this problem for this user, but it would still be nice to have. or are you opposed to that nip in general?
No, it was my idea; damus web implemented it first. The NIP’s author just codified it. It just doesn’t solve the problem of always screening “bad” content from users.