 in the past 330 days, the damus relay has transferred 104 TB of notes 
 👀 
 I’ll transfer 104TB of YOUR notes… 
 Is that a lot? 
 yes 
 😀 https://image.nostr.build/b9c9c9ece28bc6e2faf502f9e806f55e42aa7300d66bcded75bf2ee1c307ea4c.jpg  
 lmao 
 All the time lol 
 82 TB of it is GM 
 GM 
 82.00000001 now.

GM! 
 Also how many Sats for a custom meme these days? Lol asking for a friend 
 I don’t even know how to respond to this. 🤣

Meme services are free. 🫂 
 Remembering the 90s when standard hosting packages came with 500 MB of monthly bandwidth 
 Things have scaled a lot since then: 32 TB per month on a €4 budget provider these days. 400 TB a year probably has a marginal cost of under $10.

https://contabo.com/en/vps/ 
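A quick back-of-the-envelope on that cost claim, using the figures from the comment above (a €4/month VPS with 32 TB of included transfer; the exact plan terms are an assumption loosely based on budget-VPS pricing like the Contabo link):

```python
# Back-of-the-envelope bandwidth cost for a hypothetical €4/month
# budget VPS with 32 TB of included monthly transfer (thread figures).
monthly_price_eur = 4.0
monthly_allowance_tb = 32.0

cost_per_tb_eur = monthly_price_eur / monthly_allowance_tb  # €/TB if fully used
yearly_allowance_tb = monthly_allowance_tb * 12             # per-VPS yearly ceiling

print(f"~€{cost_per_tb_eur:.3f}/TB, up to {yearly_allowance_tb:.0f} TB/yr per VPS")
```

So one such box already covers roughly 384 TB/yr at about €0.125 per TB, which is why the marginal cost of extra traffic within the allowance is effectively zero.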
 How much more do you think your home internet could take? 
 my home internet connection could serve the average load of the damus relay. we are still small! 
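As a sanity check on that claim, the 104 TB over 330 days from the top of the thread works out to a fairly modest average rate (burst traffic would of course be higher):

```python
# Average transfer rate implied by 104 TB over 330 days (thread figures,
# using decimal TB). Bursts will exceed this; it's only the mean.
tb_transferred = 104
days = 330

bytes_total = tb_transferred * 1e12
seconds = days * 86_400
avg_mbit_per_s = bytes_total * 8 / seconds / 1e6

print(f"average load ≈ {avg_mbit_per_s:.0f} Mbit/s")
```

About 29 Mbit/s on average, well within most home connections' upload capacity.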
 including media or just plain text notes? 
 just notes 
 damus relay only serves notes, we don't host media 
 Are notes stored on hardware in your home or in cloud storage? Is it doable or easy for a regular person to be a host? 
 cloud. You can run your own relay if you know how to run servers on your own network 
 If a government ordered the website/domain/cloud services for the website to be shut down, would the storage of notes be gone? Could the notes be stored on an SSD at home to prevent a loss in that case? 
 Why so large for just text? I have no knowledge of how big or small it should be, but I assumed text only would take up much less space 🤔 
 This is transfer not storage 
 I wonder how much of that is reply guy 
 no other stuff ? 
 Bruh 
 Way to go! Keep stacking ❤️ 
 
nostr:nevent1qqsyd4rugvhx8swr9u9r7jaycp3kq64xkg7xxa9vv6k63v80ra65gpcppemhxue69uhkummn9ekx7mp0qgsr9cvzwc652r4m83d86ykplrnm9dg5gwdvzzn8ameanlvut35wy3grqsqqqqqp4hppdx 
 Thank you 🙏 
 when is damus coming to android? 
 Get in touch with me now via simpleX🔗 
https://simplex.chat/contact#/?v=2-7&smp=smp%3A%2F%2FenEkec4hlR3UtKx2NMpOUK_K4ZuDxjWBO1d9Y4YXVaA%3D%40smp14.simplex.im%2FSEKmN13uyX4OJlUxco0z9Nm9bPd4pK_K%23%2F%3Fv%3D1-3%26dh%3DMCowBQYDK2VuAyEA7kBBMVqOcbeghKMFplSPoiLQ903hkK3qBKtF_C1-LEo%253D%26srv%3Daspkyu2sopsnizbyfabtsicikr2s4r3ti35jogbcekhm3fsoeyjvgrid.onion 
 Thanks for the reply nostr:nprofile1qqsgydql3q4ka27d9wnlrmus4tvkrnc8ftc4h8h5fgyln54gl0a7dgspzemhxue69uhhyetvv9ujuurjd9kkzmpwdejhgqg5waehxw309aex2mrp0yhxgctdw4eju6t0qy2hwumn8ghj7un9d3shjtnddaehgu3wwp6kyfehcpn! Very cool. I am trying to focus on wrapping my head around Nostr currently - its ecosystem and potential. The project seems promising and will keep it on my short list as something to try out in the future. Might be a decent platform for converting friends and family over too! 
 👏 
 Beautiful 
 🚀 
 Amazing!🥰 
 how is this financed?? 
 servers are cheap 
 the replyguy has been busy 
 Nice! What percentage of the whole network is that? 
 How much is the current average daily bandwidth of a Twitter or Meta relay server nostr:nprofile1qqsgydql3q4ka27d9wnlrmus4tvkrnc8ftc4h8h5fgyln54gl0a7dgspzemhxue69uhhyetvv9ujuurjd9kkzmpwdejhgqg5waehxw309aex2mrp0yhxgctdw4eju6t0qy2hwumn8ghj7un9d3shjtnddaehgu3wwp6kyfehcpn?

I also wonder if Nostr uses Choke and Kad and which specifications. 
 How many sats and how many zapped? So that we can clearly show the world that the micro-economy works in this circle 
 How many notes did the relay store? 
 Let's fill every relay with valuable content. 
 would this sort of load wear out a consumer level SSD ? 
 Probably, though most consumer level drives would still last at least a few years at that rate. 
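For scale, here is a rough wear estimate that treats the 104 TB / 330 days as if it were all writes, against an assumed consumer-drive endurance rating of 600 TBW (both the rating and the write assumption are hypothetical; in practice this traffic is reads, which wear NAND far less):

```python
# Hypothetical SSD-wear estimate: pretend the 104 TB over 330 days were
# writes, and compare against an assumed 600 TBW consumer endurance rating.
tb_per_year = 104 / 330 * 365        # annualised volume from the thread figures
endurance_tbw = 600                  # assumed total-bytes-written rating

years_to_wear_out = endurance_tbw / tb_per_year
print(f"~{tb_per_year:.0f} TB/yr -> ~{years_to_wear_out:.1f} years to rated endurance")
```

Roughly five years even under that worst-case assumption, consistent with the "at least a few years" estimate; the next comment explains why the real wear is far lower still.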
 With LMDB/strfry most notes will already be paged into memory; transfers come directly from RAM. 
 So the notes bypass the internal drives entirely? 
 This post is about data transfer; the ~100 TB is serving notes already stored on disk and paged into memory 
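The "paged into memory" point can be illustrated with a minimal memory-map sketch (Python `mmap`, not strfry's actual code): the file is mapped once, and repeated reads of hot pages are served by the OS page cache rather than fresh disk I/O.

```python
import mmap
import tempfile

# Minimal illustration of memory-mapped reads: after the first access,
# hot pages are served from the OS page cache, not re-read from disk.
with tempfile.NamedTemporaryFile() as f:
    f.write(b'{"kind":1,"content":"GM"}' * 1000)  # pretend these are stored notes
    f.flush()
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        first = mm[:25]       # first "note"; may fault the page in from disk
        again = mm[:25]       # same page: served from RAM by the page cache
        print(first == again)
```

This is the same mechanism LMDB relies on: the database file is memory-mapped, so a frequently requested note costs a RAM copy, not a disk read.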
 Ah I see. Curious, what is the total size of a fully synced relay currently? How close to fully synced is an average relay if you have access to that data? 
 Fully synced doesn't really mean anything. It's not a blockchain. Nodes will never be fully synced with each other. Some nodes are private and disconnected from each other, forming subnetworks. If I were to estimate it, I would guess 500 GB? 
 The gzipped dump of events I recently shared from nostr.band is 512 GB 
 How many notes is this? 
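The thread doesn't give an exact count, but an order-of-magnitude guess can be made from the dump size. Both numbers below are assumptions, not measured figures: a gzip compression ratio of roughly 3:1 on JSON text, and an average serialized Nostr event of about 1 KB.

```python
# Order-of-magnitude guess at event count from a 512 GB gzipped dump.
# The compression ratio and average event size are both assumptions.
dump_gzipped_gb = 512
assumed_compression_ratio = 3      # gzip on JSON text, assumed
assumed_avg_event_bytes = 1_000    # serialized Nostr event, assumed

raw_bytes = dump_gzipped_gb * 1e9 * assumed_compression_ratio
events = raw_bytes / assumed_avg_event_bytes
print(f"~{events / 1e9:.1f} billion events (very rough)")
```

So somewhere in the low billions of events, give or take a factor of a few either way.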
 That seems like so much data, but I wonder how it compares to legacy social media.
 
 Damn! 
 nuke it 💣 
 solid traffic 
 👏👏👏 
 If I knew how, I would add dedicated statistics for the relay on http://ftp.halifax.rwth-aachen.de. Since January (I don't have older data at hand), the server sent out 9 PByte of data. 104 TByte would be 1% of that. 
 Doesn't seem correct the value. To download even 3 PB from a beefy server you'd need a powerful web uplink and that would be running 24/7 at max load.

 It's a 20 Gbit uplink running at an average of 4 Gbit. Seems correct to me. 
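Those figures are mutually consistent, as a quick check shows. The time window is an assumption here (roughly nine months between January and the post; the exact dates aren't stated):

```python
# Consistency check: does an average of 4 Gbit/s over ~9 months plausibly
# add up to ~9 PB? (The 270-day window is an assumption from "since January".)
avg_gbit_per_s = 4
days = 270

pb_transferred = avg_gbit_per_s * 1e9 / 8 * days * 86_400 / 1e15
print(f"≈ {pb_transferred:.1f} PB")
```

That lands at the same order of magnitude as the reported 9 PB (the reported average of "4 Gbit" is itself rounded, which accounts for the gap).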
 Fantastic stuff! I really thought that was a typo but you are indeed correct.

Would reckon that you and the others in Aachen are hosting the fastest Nostr relay on the globe at the moment. Computing at its finest. 
 you can generate stats with goaccess if you have a reverse proxy in front of it   
 Awesome! Now.

BUY BITCOIN
- CryptoCloaks
 
 Which is the best Nostr Android client? Primal, Amethyst? Any other? 
 Sadly I managed to wipe the old logs, but relay.nostr.net has generated 659 GB of traffic in the last 53 days. 

It's tiny compared to damus, but it's serving its users 24/7. 

And I don't have Tor stats. 

If anyone is curious about the data, the latest report can be seen at https://relay.nostr.net/r/report.html 

nostr:note1gm28csewv0quxtc28a96fsrrvp42dv3uvd62ce4d4zcw78m4gsrs2urjn3  
 Why not purge notes older than a month?

nostr:nevent1qqsyd4rugvhx8swr9u9r7jaycp3kq64xkg7xxa9vv6k63v80ra65gpcppemhxue69uhkummn9ekx7mp0qgsr9cvzwc652r4m83d86ykplrnm9dg5gwdvzzn8ameanlvut35wy3grqsqqqqqp4hppdx 
 104 terabytes of text notes! 📖 Ever wondered how massive that is? That's roughly equivalent to...

104 trillion characters

About 208 million average-sized books

Nearly 29.7 million copies of War and Peace by Leo Tolstoy

If someone started reading these notes today without any breaks, it would take over 158 millennia to finish!

Incredible to think about the sheer volume of information we've moved! 📚

#statistic 
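The 158-millennia figure does check out under a plain-text reading-speed assumption (about 250 words per minute at roughly 5 characters per word, i.e. ~1,250 characters per minute, with 1 byte per character):

```python
# Verify the reading-time claim: 104 TB of text at ~250 words/minute.
total_chars = 104e12                 # 1 byte per character, assumed
chars_per_minute = 250 * 5           # 250 wpm x ~5 chars/word, assumed

minutes = total_chars / chars_per_minute
years = minutes / (60 * 24 * 365.25)
print(f"≈ {years / 1000:.0f} millennia of non-stop reading")
```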
 When’s your next Damus relay wipe?