 @chadlupkes the thing is I think we need more controls. Ideally Nostr clients wouldn’t display content as replies from people they muted and reported. I know this is a controversial position, but the way I see it, being a reply guy in my threads is like coming into my personal space when I don’t want it. Just like we can kick out a rude houseguest, they aren’t banished from the town, just my small part of it. 

I want nostr to be a space for queer Palestinian drag queens like @Mama Ganuush  who are being shadow banned and deplatformed by Meta and Instagram. I want nostr to be a space for activists who face state surveillance and repression. 

 And I’m fine with there being people on nostr who hate me, so long as I don’t have to listen to them or have them inject themselves into my content and conversations.  
 Controls that would do what you're asking for are called algorithms.  They could be built into a single client, but I wouldn't want them to be built into the protocol layer.  Seeing something I don't like in a conversation thread is unavoidable, because no matter how much we would wish that all humans are reasonable and caring individuals that we would enjoy interacting with, it's never going to be true.  The only way to build a feed that makes us happy is to be able to see the wall that protects that feed, and that means we have to see the ugly to recognize it and have simple solutions in place to deal with it.

The way I understand things, and please correct me if I'm wrong, there are two ways to build a wall.

Mute, which prevents anything from a profile from showing up in our feed.

Block, which mutes and also prevents anything from our profile from showing up on their feed.
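The two walls described above can be sketched as simple set-based filters. This is a minimal illustration only, not any client's actual implementation; the event shape (a dict with `author` and `content`) is a simplified stand-in for real Nostr notes:

```python
# Hypothetical sketch of mute vs. block as feed filters.
# "note" here is a simplified stand-in for a Nostr event.

def visible_to_me(note, my_muted):
    # Mute: hide anything authored by a muted profile from MY feed.
    return note["author"] not in my_muted

def visible_to_them(my_note, their_author, blocked_by_me):
    # Block: a mute that ALSO keeps my notes out of the blocked profile's feed.
    return their_author not in blocked_by_me

feed = [
    {"author": "alice", "content": "gm"},
    {"author": "troll", "content": "reply-guy noise"},
]
my_muted = {"troll"}
print([n["content"] for n in feed if visible_to_me(n, my_muted)])
# prints ['gm']
```

Because Nostr events are public on relays, the "block" half can only be enforced client-side, which is exactly why this logic lives in clients rather than the protocol.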

We should be able to control the wall.  If anyone or anything else controls what we see, we're no different than any of the corporate platforms, allowing others to control our feed.  That's my concern. 
 How about opt-in crowd-sourced mute/block lists that can be shared between friends? Imagine if an abusive person was constantly having to request to be unmuted/unblocked by individuals who used a popular mute/block list. Social shaming is more effective when it comes from the community rather than an authority. They might actually change their behavior. 
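Nostr already standardizes personal mute lists as kind-10000 list events with `p` tags (NIP-51), so an opt-in shared list could be merged client-side. A minimal sketch, assuming that tag format; the list contents here are made up:

```python
# Sketch: merging opt-in crowd-sourced mute lists client-side.
# Assumes NIP-51-style list events: kind 10000 with ["p", <pubkey>] tags.

def muted_pubkeys(list_event):
    # Extract muted pubkeys from a mute-list event's "p" tags.
    return {tag[1] for tag in list_event["tags"] if tag[0] == "p"}

def combined_mutes(my_list, subscribed_lists):
    # Union my own mutes with lists I explicitly opted in to.
    result = muted_pubkeys(my_list)
    for ev in subscribed_lists:
        result |= muted_pubkeys(ev)
    return result

mine = {"kind": 10000, "tags": [["p", "pubkey_troll"]]}
friends = [{"kind": 10000, "tags": [["p", "pubkey_spammer"], ["p", "pubkey_troll"]]}]
print(sorted(combined_mutes(mine, friends)))
# prints ['pubkey_spammer', 'pubkey_troll']
```

The opt-in part is the key design choice: the subscriber picks which lists to merge, so control over the wall stays with the individual rather than an authority.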
 That's good thinking, but again that should be at the client level, not the protocol level.  If a particular community wants to develop an app based on the idea that they know best who their members should or should not see, they can build and maintain it.  We don't need to give up that control, and I don't think most of us would want to.

However, that does identify a need for private groups, where a specific set of people can speak freely on specific topics without being exposed to the rest of the world.  One of the biggest drivers of growth for Facebook and Twitter was the difference in how they handled privacy.  On Twitter, now X, everything is public.  But Facebook created private groups where people could organize and actually conduct business, making decisions with internal consensus rules or governance that worked for them.  This is a separate topic from Mute or Block, but it is a critical piece of how social media has evolved over the last few decades that Nostr needs to catch up with. Even X is trying communities, although they seem to be crap like everything else on there. 
 "how social media has evolved"...