 Just sat through the "Scaling Nostr: The Future of Decentralized Relays" panel at #nostriga

I'm very disappointed. Despite this supposedly being a talk about decentralized relays, _nothing_ mentioned in the panel was actually decentralized. Not a single damn thing. 
 I left for a coffee 
 what do you think the approach is for proper decentralisation of nostr? outbox? easy home relays with no network config? mobile relays? in all relays? 
 Truth.  On point, as usual.  But we can make it better.  Less LARPing, more building.  Decentralization is not a slogan, it requires work. 
 this note of yours reads like a LARP slogan with no content 
 I always wonder if ‘decentralization’ is the actual objective. For me, it’s more about redundancy and resiliency. 
 Agreed, 'decentralization' is probably not the right word in this context, since a single truth is not required across relays.
The expectation is not being censored and not losing notes because a relay went down. That sounds quite different from true decentralization. 
 Thoughts on relay implementations by default mirroring other relays in a discoverable mode in order to add better redundancy to Nostr? No potential nukening events. That'd mean some rando relay could fulfill most REQs. I guess sort of like running a full bitcoin node. 
 I don't think you can easily mirror without user consent.  Users should own their own data.  But if users are ok with that, mirroring can be a useful tool, imho.

There's no magic fix; the relay network is in bad shape due to neglect and lack of investment.  A change in mindset, e.g. from OpenSats, could move the needle back in the right direction.  No quick fix there.

Nukenings are bad for nostr, but they will happen from time to time.  I hope it happens when nostr is mature, rather than when it's a baby.  

Caching is important; that will take some of the load off the relays.  Nostr needs a scaling architecture, like the web has, but none is on the horizon yet.  And could it even get through the NIPs?  Probably not.  So, more to be done.  The first thing is to establish that work is needed, which I thank Peter Todd for pointing out. 
 “Users should own their own data”

That is fundamentally impossible in a digital system, especially if you want it to be decentralized and/or open access.

Data can be copied. That's just life.

If you're trying to prevent mirroring, at minimum, put it on a website. Though even then copying isn't that hard.  
 Caching on nostr doesn't work well because nostr left off any kind of blockchain tech to make it simpler. Nostr should have had something like the architecture of Scuttlebutt, with something like a per-user chain or graph. But that's more complex than doing something dumb that doesn't actually work well... 
 As it happens nostr was based on Scuttlebutt, but lots was removed.  IMHO the design was not great, but it just about works, and the simplicity gained developer interest (build something in a weekend).  It's more complex now.

I'll have to look a bit more at SSB and per user chains to fully understand what you mean, but you may be onto something there.  Thanks for the pointer.

There are plenty of possible caching strategies.  Nostr doesn't really have one.  My understanding is that Primal were working on something.  It would be good to see what.  The first thing to establish is that there is work to be done, and I hope you've helped to do that. 
 Fair point.  But you should get consent, no? 
 Publishing on the system is the consent.  
 I thought this too for a long time.  But then I realized if I have exclusive content, and publish it to certain relays, but there are other relays I don't like, I'd be a bit annoyed if they take that content.  There's gonna be a spectrum of opinions on this, I respect yours, but I have come to have a more complex view over time, e.g. things like NIP-70 introduce nuances:

https://github.com/nostr-protocol/nips/blob/master/70.md 
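
For concreteness, a NIP-70 protected event carries a ["-"] tag, signalling that relays should only accept it directly from its author. A minimal sketch in Python (the kind and content below are illustrative placeholders, not real signed data):

```python
# A NIP-70 "protected" event opts out of rebroadcasting via a ["-"] tag.
# pubkey, id, sig, and created_at are omitted for brevity.
event = {
    "kind": 1,
    "content": "exclusive post for my chosen relays",
    "tags": [["-"]],
}

def is_protected(ev: dict) -> bool:
    """Return True if the event carries the NIP-70 protected-event tag."""
    return any(tag and tag[0] == "-" for tag in ev.get("tags", []))

print(is_protected(event))  # a well-behaved mirroring relay should skip this event
```

A relay doing bulk mirroring could check this flag before re-hosting a note, which turns the author's preference into something machine-readable rather than an unwritten rule.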
 The unwritten rules of respecting robots.txt and not scooping up everything have already crumbled due to AI scraping.

https://www.businessinsider.com/meta-web-crawler-bots-robots-txt-ai-2024-8

Seems to be the trend that if you can access it, it will be used. Encrypt and deny access might become the new reality, unfortunately. 
 I normally work on a different principle.  Firstly, I make things I want to use myself, and preferably that I can recommend to my friends, many of whom are non-techie.  So, for example, I can recommend bitcoin, because it's fair, innovative, and a level playing field, as far as possible, imho.  Could I recommend a service where the norm is to take their (possibly exclusive) content and post it elsewhere?  No, that's not OK.

Nostr was built as a small corner of the web that tried to aspire to be a bit better.  So just because, say, OpenAI are doing bad things (thanks for pointing it out, btw), doesn't mean it should translate to nostr.  A reasonable user expectation would be that they publish to the places they want, and if others republish that, they need consent, or the user can be unhappy.

Nostr itself says nothing about storage, replay attacks, or republishing; it just transmits notes, and other stuff, from one user to another.  The basic cypherpunk principle is not to harm the user, imho.  Whether or not republishing does is something that can be debated. 
 That was the way the Internet worked the first decades. 

It's seldom worth the effort to chase after your rights if somebody misuses your copyright.

This is the article I meant to share (and it's not just OpenAI):

https://www.oreilly.com/radar/how-to-fix-ais-original-sin/ 
 I think there is a conflation between enforcement ("How can I stop this?") and ethics ("Should we be doing this?").  You can't stop racism, but you can call it out.  Definitely on nostr there are those that will want to post exclusive content on relays THEY choose.  Perhaps a community relay, like a facebook group relay.  I'm saying that data ownership should be respected, and people own their own data.  You might not be able to stop it, but you can call it out.  We have a web of reputation and trust already.  Things like "antizaps" (negative zaps) are going to be quite important to call out bad actors. 
 Indeed. Robots.txt has worked for many years, and it was just a moral choice, not a technical barrier.

Transparency is a good way to air out bad behaviour allowing others to see it and stop cooperating with a bad actor.

There are ways to have strong “signals” and guardrails for the expected behaviour in the NIPs.

Choice is a strong force. If you can choose the behaving relays, services etc. it goes a long way. 
 I think it's similar to publishing a torrent. You publish it to the network, and it doesn't stay in one particular place. Mirroring is common, and that's what will achieve decentralization of Nostr.

At some point people will run their own relay and it will mirror notes they are interested in for easy access and archival. And maybe provide read access to others.

I run my relay and mirror many things with a simple shell script from cron.

The rule is - if a client can get it, so can a relay, which can act as a client. 
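
As a sketch of what that cron job boils down to (the pubkeys here are placeholders, and the websocket round-trip is left out): the mirroring relay just builds an ordinary NIP-01 REQ, exactly as any client would, then republishes whatever comes back.

```python
import json

# Sketch of a cron-style mirror job: ask a source relay for recent
# kind-1 notes from pubkeys we care about, then re-publish each event
# to our own relay. FOLLOWED holds hypothetical hex pubkeys.
FOLLOWED = ["pubkeyhex1", "pubkeyhex2"]

def mirror_req(sub_id: str, since: int) -> str:
    """Build the NIP-01 REQ message a mirroring relay sends when acting as a client."""
    filt = {"kinds": [1], "authors": FOLLOWED, "since": since}
    return json.dumps(["REQ", sub_id, filt])

msg = mirror_req("mirror-1", 1700000000)
print(msg)
```

Nothing in the protocol distinguishes this from a human reader's client, which is exactly the point being made above.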

I own my data in a sense that I own my identity and that I have the archive of it that no one else can delete. 

But as Peter said, it's digital, can be copied and people have signed it with their keys. Internet does not forget. 
 Agree with most of this.  However, torrents and social posts are not quite the same.  Torrents are files that are designed to be shared.

Posts could have some personal attachment.  You could have a business nostr server.  You could have a friends group.  You could have special articles you want to share only in certain places.  You might share a party photo with some friends that you later regret.

There is absolutely no default social contract in nostr other than you are using a websocket API over HTTP.  

If the system you use gets your consent, that's a different story.  But if not, you can have the right to feel violated.  Or not recommend nostr.

The social contract is largely inherited from the status quo of users owning their own data, imho.  Unless you agree to infinite sharing, which is also potentially forever and undeletable. 
 I understood that the post button in most Nostr clients posts to the whole network. There are no tools in Nostr to limit posting to certain groups. You don't select where you post in most clients; there are preferred relays, not ultimate relays, and there are no per-note relay sets anywhere in the user interface. It might be used like that in the future, when it's the relay's responsibility to limit writing and reading of notes to those who should be authorized. And that means not allowing connections from relays that could copy the notes. 

I see what you are saying as a potential problem in the future, but I think encrypted messengers are much better for sharing within a closed circle. Nostr is for public sharing, it's not encrypted, notes don't have any permissions attached to them except for the - tag, which is a self limitation like robots.txt.

There's even a tool like http://nostrsync.live/ which will download all notes created by an npub from a list of a lot of relays and try to publish it everywhere. Note that authorization is not required, I can do it for your npub and the relays have no way of telling that you did not post that message to the relay, it's all valid signed notes. 
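
Roughly, what such a sync tool does can be sketched as: fetch every signed event from each relay, take the union keyed by event id, and republish everywhere. A toy Python version with stub events (real events would carry pubkey, sig, etc.):

```python
# Sketch of what a nostrsync-style tool does: pull events signed by one
# pubkey from many relays and dedupe by event id. No authorization step
# exists; the signatures alone make the notes valid everywhere.
def merge_archives(*relay_dumps: list) -> dict:
    """Union events from several relay dumps, keyed by event id."""
    seen: dict = {}
    for dump in relay_dumps:
        for ev in dump:
            seen.setdefault(ev["id"], ev)  # first copy wins; duplicates dropped
    return seen

relay_a = [{"id": "e1", "content": "hello"}]
relay_b = [{"id": "e1", "content": "hello"}, {"id": "e2", "content": "gm"}]
archive = merge_archives(relay_a, relay_b)
print(sorted(archive))  # ['e1', 'e2']
```

The merged archive can then be written to any relay that accepts the events, with no way for that relay to tell who submitted them.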
 Can you please elaborate on the web's scaling architecture? 
 Too big a topic; it was largely done in the 1990s through lots of debates.  Here is one output document, which has largely stood the test of time (except for maybe the content negotiation part).  An important aspect is that connection-oriented networks (such as websockets) carry a few overheads wrt scalability, but there's much more to it.

https://www.w3.org/TR/webarch/ 
 Thanks, will read that one. 
 Can you please expand on "connection oriented networks carry a few overheads wrt scalability"? I didn't see anything related in the linked doc. 

Is there any w3c doc specifically about web's scaling/caching architecture/principles?

Also do you have any suggestions of how nostr's cache architecture could look like?
 
 Connection-oriented networks carry overhead because they need to maintain continuous connections, which doesn’t scale well as the network grows. The web scales by focusing on stateless interactions and efficient caching strategies.

For Nostr’s caching, consider distributed caches or edge approaches to reduce server load. 
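
A minimal sketch of that idea, assuming an in-process cache sitting in front of the relay's database (the TTL and the fetch function are placeholders): key the cache on the canonicalized filter, much as an HTTP cache keys on the URL.

```python
import json
import time

# Toy edge cache for relay queries: repeated REQs with the same filter
# are served from memory instead of hitting the relay's database.
CACHE: dict = {}
TTL = 30.0  # seconds; arbitrary for this sketch

def cached_query(filt: dict, fetch) -> list:
    """Serve a filter from cache if fresh, otherwise fall through to fetch."""
    key = json.dumps(filt, sort_keys=True)  # canonical cache key
    hit = CACHE.get(key)
    if hit and time.monotonic() - hit[0] < TTL:
        return hit[1]
    events = fetch(filt)                     # the real relay lookup
    CACHE[key] = (time.monotonic(), events)
    return events

calls = []
def fetch(filt):
    calls.append(filt)
    return [{"id": "e1"}]

cached_query({"kinds": [1]}, fetch)
cached_query({"kinds": [1]}, fetch)
print(len(calls))  # the underlying relay was only queried once
```

The same keying trick works for a shared edge cache (e.g. one process fronting several relay replicas), because the filter fully describes the request.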
 It seems like connections on nostr are not too "stateful". All messages sent to the socket could each be sent in their own separate connection, concurrently, handled by different caching replicas of the same relay or different threads of the same relay. There's no session/user-level state to coordinate and synchronize between replicas/threads that could harm scalability.   
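
To illustrate: everything a replica needs is inside the REQ message itself, so a handler can be written with no session state at all. A toy NIP-01 exchange (the event lookup is a stub standing in for a database query):

```python
import json

# Each nostr subscription is a self-contained JSON message, so any
# replica or thread can answer it without shared session state.
req = json.dumps(["REQ", "sub1", {"kinds": [1], "limit": 1}])

def handle(message: str) -> list:
    """A stateless handler: the REQ carries its own subscription id and filter."""
    _, sub_id, filt = json.loads(message)
    event = {"kind": filt["kinds"][0], "content": "hi"}  # stub lookup result
    return [json.dumps(["EVENT", sub_id, event]),
            json.dumps(["EOSE", sub_id])]

replies = handle(req)
print(json.loads(replies[-1]))  # ['EOSE', 'sub1']
```

Since the reply is addressed by the subscription id echoed from the request, two REQs on the same socket could just as well be served by two different machines.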
 Yes, I think you're onto something.  

NostR = notes and other stuff transmitted by relays.
Nost = Notes and other stuff.   
 No I mean that relay architecture and performance and scalability wouldn't differ much if it was plain http server. 
 Since nostr doesn't have any type of blockchain tech, mirroring isn't reliable. You have no good way of knowing if you're being censored. 
 what did you have in mind when you thought about decentralized relays? 
 Talk is cheap.   I much prefer either tutorials or demos of the tech being presented. 
 ^ 
 how would you do it? 
 I need to write a full blog post on this... 
 pls do, there's a lot of people here wondering 
 Would love to hear your idea! In case you didn’t see my other post, this is my solution https://github.com/baumbit/peercuration?tab=readme-ov-file#peercuration

I did it several years ago, before Nostr existed (in public at least) so I created it for #treebit (a network similar to Nostr, but with spam protection and fully decentralized). 
 Yes. This.
I already solved the decentralization of relays when I designed #treebit, and it is all based on #peercuration  https://github.com/baumbit/peercuration?tab=readme-ov-file#peercuration