Controls that would do what you're asking for are called algorithms. They could be built into a single client, but I wouldn't want them built into the protocol layer. Seeing something I don't like in a conversation thread is unavoidable: no matter how much we might wish that all humans were reasonable, caring individuals we'd enjoy interacting with, it's never going to be true. The only way to build a feed that makes us happy is to be able to see the wall that protects that feed, and that means we have to see the ugly to recognize it and have simple tools in place to deal with it. The way I understand things, and please correct me if I'm wrong, there are two ways to build a wall:

- Mute, which prevents anything from a profile from showing up in our feed.
- Block, which mutes and also prevents anything from our profile from showing up in their feed.

We should be able to control the wall. If anyone or anything else controls what we see, we're no different from any of the corporate platforms, where others control our feed. That's my concern.
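To make the distinction concrete, here's a minimal sketch of those two primitives as client-side filters. All type and function names here are hypothetical, for illustration only; no real client's API is implied:

```typescript
// Hypothetical model of a user's "wall" -- illustrative, not any client's actual API.
interface FeedEvent {
  authorPubkey: string;
  content: string;
}

interface Wall {
  muted: Set<string>;   // pubkeys whose events never reach OUR feed
  blocked: Set<string>; // muted, plus OUR events never reach THEIR feed
}

// Mute and block both drop the other party's events from our feed.
function visibleToUs(wall: Wall, event: FeedEvent): boolean {
  return !wall.muted.has(event.authorPubkey) && !wall.blocked.has(event.authorPubkey);
}

// Block additionally withholds our events from the blocked party's feed.
function ourEventVisibleTo(wall: Wall, viewerPubkey: string): boolean {
  return !wall.blocked.has(viewerPubkey);
}
```

The point of keeping this at the client level is that the `Wall` lives entirely in the user's own data: nothing in the protocol has to know or enforce it.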
How about opt-in, crowd-sourced mute/block lists that can be shared between friends? Imagine if an abusive person were constantly having to request to be unmuted/unblocked by individuals who used a popular mute/block list. Social shaming is more effective when it comes from the community rather than from an authority. They might actually change their behavior.
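One way such opt-in lists could work, sketched under the assumption that subscriptions are purely local and the user can always override any entry (names are hypothetical, not a real protocol feature):

```typescript
// A friend-published mute list the user has chosen to subscribe to.
// Illustrative shape only -- not a defined protocol structure.
interface SharedList {
  ownerPubkey: string; // who published the list
  muted: string[];     // pubkeys the list recommends muting
}

// Merge the user's own mutes with subscribed lists, then apply local
// overrides last, so the user's explicit un-mutes always win.
function effectiveMutes(
  ownMutes: Set<string>,
  subscribed: SharedList[],
  localUnmutes: Set<string> // pubkeys the user explicitly un-muted
): Set<string> {
  const result = new Set(ownMutes);
  for (const list of subscribed) {
    for (const pk of list.muted) result.add(pk);
  }
  for (const pk of localUnmutes) result.delete(pk);
  return result;
}
```

Because the overrides are applied last, subscribing to a friend's list never takes control of the wall away from the user, which keeps this compatible with the "we control the wall" principle above.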
That's good thinking, but again, that should be at the client level, not the protocol level. If a particular community wants to develop an app based on the idea that they know best who their members should or should not see, they can build and maintain it. We don't need to give up that control, and I don't think most of us would want to. However, that does identify a need for private groups, where a specific set of people can speak freely on specific topics without being exposed to the rest of the world. One of the biggest differences in how Facebook and Twitter grew came down to how they handled privacy. On Twitter, now X, everything is public. But Facebook created private groups where people could organize and actually conduct business, making decisions with internal consensus rules or governance that worked for them. This is a separate topic from Mute or Block, but it is a critical piece of how socialedia has evolved over the last few decades that Nostr needs to catch up with. Even X is trying communities, although they seem to be as crappy as everything else on there.
"how social media has evolved"...