As I uploaded a video to https://satellite.earth/cdn today just to keep a backup of it, I thought about how wasteful that was, since many others have probably already saved that same video in many other places. How good would it be if we could just store a hash of these things and everybody would help save and distribute the same piece of content? Oh, what a wonderful world: far fewer broken links, far less lost media. It would be amazing.
Turns out I just invented IPFS. Now I just need 10 years to try to develop this amazing idea into working software that doesn't consume 500GB of RAM 24/7 on everybody's computer forever and still can't download a single text file from a computer on the same LAN.
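The core idea really is that small. A minimal sketch (using Node's built-in crypto with SHA-256; the function name and file path are mine, for illustration only):

```typescript
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

// Hash the file's bytes; the digest is the content address.
// Anyone holding a copy can recompute it and confirm they have
// the exact same bytes, so any copy can serve any link.
function contentAddress(path: string): string {
  const bytes = readFileSync(path);
  return createHash("sha256").update(bytes).digest("hex");
}

// The same video always yields the same address, no matter who stores it.
console.log(contentAddress("backup.mp4"));
```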
NIP-96 to the rescue. It's not perfect, but it will do for now.
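Roughly what a NIP-96 upload looks like, as a sketch: the well-known discovery document, the multipart "file" field, and the Authorization scheme are per the NIP, but the helper name and the pre-signed nip98Auth token passed in are my own simplification (real clients sign a NIP-98 kind:27235 event and base64-encode it):

```typescript
// Minimal NIP-96 upload sketch.
async function nip96Upload(server: string, file: Blob, nip98Auth: string) {
  // Discover the upload endpoint from the server's well-known document.
  const info = await fetch(`${server}/.well-known/nostr/nip96.json`)
    .then((r) => r.json());

  const form = new FormData();
  form.append("file", file);

  const res = await fetch(info.api_url, {
    method: "POST",
    headers: { Authorization: `Nostr ${nip98Auth}` },
    body: form,
  });
  // The response carries a ready-made NIP-94 event describing the file.
  return (await res.json()).nip94_event;
}
```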
PS: Njump now supports kind:1063.
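For reference, a kind:1063 (NIP-94 file metadata) event is just the hash-plus-URL idea expressed as a note; every value below is made up:

```typescript
// Illustrative kind:1063 (NIP-94) event; all values are fake.
// The "x" tag is the sha256 of the file, so clients can verify
// whatever copy they fetch, from any URL that still has one.
const fileMetadata = {
  kind: 1063,
  content: "backup of the video",
  tags: [
    ["url", "https://example.com/cdn/<hash>.mp4"],
    ["m", "video/mp4"],
    ["x", "<sha256 of the file bytes>"],
    ["size", "123456789"],
  ],
  // ...plus the usual pubkey/created_at/id/sig fields
};
```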
And the bandwidth! Omg the bandwidth.
nostr:nevent1qqszhkc0qkuzq3240vqlwqranq5w7gkfttlpsj20v4g5c00c3989p0spp4mhxue69uhkummn9ekx7mqzyqalp33lewf5vdq847t6te0wvnags0gs0mu72kz8938tn24wlfze6qcyqqqqqqgg3pxna
Is it permanent? Can we have an API for here.news?
Iroh does this, and it uses the BLAKE3 hashing algorithm, whose tree structure lets verification be distributed across chunks.
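The point of a tree hash, roughly: hash fixed-size chunks, then hash the chunk hashes, so a downloader who knows the root can verify each chunk the moment it arrives from any peer, instead of waiting for the whole file. A toy sketch (SHA-256 standing in for BLAKE3's real chaining, which differs; a flat two-level tree rather than BLAKE3's actual construction):

```typescript
import { createHash } from "node:crypto";

const sha256 = (b: Buffer) => createHash("sha256").update(b).digest();

// Split the data into fixed-size chunks and hash each one.
function chunkHashes(data: Buffer, chunkSize = 1024): Buffer[] {
  const hashes: Buffer[] = [];
  for (let i = 0; i < data.length; i += chunkSize) {
    hashes.push(sha256(data.subarray(i, i + chunkSize)));
  }
  return hashes;
}

// The root commits to every chunk; given the root and the chunk-hash
// list, chunk #n can be checked independently as soon as it arrives.
function root(data: Buffer): Buffer {
  return sha256(Buffer.concat(chunkHashes(data)));
}
```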
The biggest problem with IPFS for me is the bandwidth usage. I have this absurd 1.2-terabyte data cap that Xfinity imposed during the pandemic, and I know it's complete garbage because states in the Northeastern US told Xfinity to shut up and roll it back. California didn't, despite being such a "forward-thinking" and "advanced" state.