It's the unison and communication of both sides of moderation that I want to emphasize. The simplest way to achieve that is through exclusive relays (e.g. nostrplebs), where the paid aspect is already one barrier; bad actors are then reported by users to the relay operators, who kick them off. When that fails and the relay operator or community mods won't remove the people you don't want to interact with, there's WOT (web of trust), a computational method to filter your feed. When WOT fails, there's manual muting.
Three different scales of filtering that reinforce each other. Relay-level moderation removes 90% of the bad actors headed for your community, which shrinks the WOT calculation, the computationally expensive step. Muting is the last resort, but by that point the pool of accounts left to mute is drastically reduced. No social network will last if users need to manually mute hundreds of accounts.
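That cascade can be sketched roughly as client-side code. Everything here is my own illustration, not a nostr spec: the hop-based `wot_score`, the function names, and the threshold are all assumptions; tier one (relay moderation) is assumed to have already happened server-side.

```python
# Sketch of the three-tier cascade. Tier 1 (relay moderation) happens on the
# server, so the client only runs the two cheaper local checks on what remains.
# The hop-based trust score and all names are illustrative assumptions.

def wot_score(author: str, follows: dict[str, set[str]], me: str) -> int:
    """Crude web-of-trust score: 2 if I follow the author directly,
    1 if someone I follow does, 0 if there is no trust path."""
    if author in follows.get(me, set()):
        return 2
    if any(author in follows.get(f, set()) for f in follows.get(me, set())):
        return 1
    return 0

def filter_feed(events, follows, me, muted, min_score=1):
    # Tier 2: WOT -- drop authors with no trust path.
    # Tier 3: mute list -- the small manual remainder WOT lets through.
    return [e for e in events
            if wot_score(e["author"], follows, me) >= min_score
            and e["author"] not in muted]

follows = {"me": {"alice"}, "alice": {"bob", "troll"}}
events = [{"author": a, "id": i}
          for i, a in enumerate(["alice", "bob", "spammer", "troll"])]
# "spammer" falls to WOT, "troll" to the mute list; alice and bob survive.
print(filter_feed(events, follows, "me", muted={"troll"}))
```

The point of the ordering is cost: by the time the mute list runs, the expensive graph check has already discarded almost everything, which is why manual muting stays tractable.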
The UX isn't there yet because right now we treat nostr itself as one single community: you join the centralized relays and only follow the users you care about. But that means you share the same streets with people opposed to you.
This is not a problem of "diversity in opinions" or "being pro censorship" or however you frame it. I'll talk to anyone willing to engage in conversation, but don't come to me shouting and wishing death upon me. I don't care about the name calling, but you're clearly not in a frame of mind to communicate.
Upholding intolerance of intolerance and removing bad-faith communication and activity from your community will always be important.
The problem is that we don't have tools fine-grained enough for users, relay operators, or community leaders to curate a community and isolate it from the external environment.
When nostr has communities of users completely opposed to each other in ideals, flourishing and unaware of each other, that's when we've hit critical mass: fragmented and decentralized enough that the network is strengthened and can continue to exist even when its users are poles apart in ideals.