Look how much crap is blocked by just requiring the user to have a kind 0 on the relay already. That's like 90% of it. https://image.nostr.build/51e05c9ce00885ea211a0f991015de023dc20649f5934ab23fa1a1a69c53d093.png
Reply guy had a pfp tho right
This gets him, though. https://image.nostr.build/28d6c12870bd69f859d08a31cf4d9cf9ef7b25c48ba1ea319aec7aabe0cac919.png
Nice. What about the reply girl with random female names and the AI pfp?
It reminds me of this: https://www.feministcurrent.com/2019/10/09/there-is-no-problem-with-trans-people-in-bathrooms/ If you look like a human and talk like a human, then there's nothing to solve. It's only annoying when you can tell they're AI. And if you can tell, it should be possible to prevent it.
Unrelated: What's the purpose of the AbortSignal here?
Signals provide a way to cancel requests, whether because of a timeout or for any other reason. They're very powerful, especially when used in a pool, because you can pass one signal down to many children and then abort them all from the top.
The reason it's used here is so we can time out database requests.
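A minimal sketch of that pooled-timeout pattern, assuming a hypothetical `queryDb` helper (the actual database layer isn't shown in this thread): one `AbortController` at the top hands its single signal to several child requests, and aborting the controller cancels them all at once.

```typescript
// Hypothetical stand-in for a database call: resolves after ~50 ms
// unless the shared signal aborts it first.
function queryDb(sql: string, signal: AbortSignal): Promise<string> {
  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => resolve(`rows for ${sql}`), 50);
    signal.addEventListener("abort", () => {
      clearTimeout(timer);
      reject(new Error("query aborted"));
    });
  });
}

// One signal passed down to many children; one timeout at the top
// aborts every in-flight query.
async function runPool(timeoutMs: number): Promise<string[]> {
  const controller = new AbortController();
  setTimeout(() => controller.abort(), timeoutMs);
  const results = await Promise.allSettled([
    queryDb("select 1", controller.signal),
    queryDb("select 2", controller.signal),
  ]);
  return results.map((r) => r.status);
}

runPool(10).then((statuses) => console.log(statuses)); // both "rejected"
```

With a 10 ms budget both queries reject before their 50 ms work finishes; with a generous budget they fulfill normally. Node also ships `AbortSignal.timeout(ms)` as a shorthand for the controller-plus-`setTimeout` pair.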
so they did it. onboarding impossible?
chat gpt helped me out here, i guess:

This Nostr post refers to how effective a simple requirement can be in filtering out unwanted or low-quality content on a relay (a server in the Nostr protocol, which is a decentralized social network protocol). Here's a breakdown of the key parts:

- "Requiring the user to have a kind 0 on the relay": In Nostr, events (messages, posts, updates) are categorized into different "kinds." A kind 0 event typically refers to a profile metadata update (such as name, bio, or avatar). Requiring a user to have a kind 0 event on the relay means that only users who have published their profile metadata to that particular relay can post or engage on it.
- "That's like 90% of it": The author is suggesting that by implementing this basic requirement (having a kind 0 event on the relay), you can block about 90% of unwanted content, spam, or malicious posts.
- "Look how much crap is blocked": The phrase "crap" here refers to low-quality content, spam, bots, or malicious actors. The author is pointing out that many of these issues are filtered out by simply requiring a small commitment from users (publishing their profile metadata).

In essence, the post highlights that a basic filter like requiring users to provide a profile (kind 0 event) can drastically reduce undesirable content on a Nostr relay.
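The policy described above can be sketched as a simple relay-side check. Everything here is hypothetical (the `Event` shape follows NIP-01, but `acceptEvent` and the in-memory `profilesOnRelay` set are illustration only, not any real relay's plugin API):

```typescript
interface Event {
  pubkey: string;
  kind: number;
  content: string;
}

// Stand-in for the relay's storage: pubkeys that already have a
// kind 0 (profile metadata) event on this relay.
const profilesOnRelay = new Set<string>();

function acceptEvent(ev: Event): boolean {
  if (ev.kind === 0) {
    // Always accept the profile event itself, and remember the author.
    profilesOnRelay.add(ev.pubkey);
    return true;
  }
  // Any other kind is accepted only if the author's profile
  // is already stored on this relay.
  return profilesOnRelay.has(ev.pubkey);
}
```

Note the implication: an author must publish their kind 0 to a given relay before anything else they send there gets through, which is exactly the small commitment that filters out drive-by spam.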
But this requires rebroadcasting your profile metadata each time you add a new relay, doesn’t it?
I feel like if you don't even put your profile data on a relay, are you really even using that relay?