I’m thinking of hosting a large number of small JSON files. Like over 100k files, each a few KB at most. My go-to would usually be S3 buckets, but I’m wondering if another HTTP host would make sense? Cloudflare maybe? What do you think? #asknostr
seems like a job for a Badger database attached to a simple web interface... i assume you will have them queried by the json file hashes, and those will be formatted in a canonical form based on their data structure? could probably also be done by making the filenames these hashes and using a simple http filesystem server; nginx can do this exact thing, i think
making a simple file webserver in Go is like 20 lines of code, and if you have this collection of json files, writing a thing to hash them and then rename them to the hash would be another 20 lines... or if they're like nostr events, with the id in the json, you could just as easily write a loop that goes through the files, grabs the id field, and renames each file to it
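A rough sketch of that idea, assuming the JSON files live in a hypothetical `./blobs` directory and are already in a canonical serialized form (so identical data always hashes the same); the directory name, port, and `.json` extension are placeholders, not anything from the thread:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"log"
	"net/http"
	"os"
	"path/filepath"
)

// contentHash returns the hex-encoded SHA-256 of the file's bytes.
func contentHash(data []byte) string {
	sum := sha256.Sum256(data)
	return hex.EncodeToString(sum[:])
}

// renameToHashes walks dir and renames every .json file to <hash>.json.
func renameToHashes(dir string) error {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return err
	}
	for _, e := range entries {
		if e.IsDir() || filepath.Ext(e.Name()) != ".json" {
			continue
		}
		p := filepath.Join(dir, e.Name())
		data, err := os.ReadFile(p)
		if err != nil {
			return err
		}
		if err := os.Rename(p, filepath.Join(dir, contentHash(data)+".json")); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	dir := "./blobs" // hypothetical directory holding the JSON files
	if err := renameToHashes(dir); err != nil {
		log.Printf("rename: %v", err)
		return
	}
	// Serve the renamed files, so each is reachable at /<hash>.json.
	log.Fatal(http.ListenAndServe(":8080", http.FileServer(http.Dir(dir))))
}
```

For nostr-style events you'd swap `contentHash` for a small `json.Unmarshal` that pulls out the `id` field instead.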
Stick Caddy on a cheap VPS; you can do lots of other things with it then too
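For reference, a minimal Caddyfile sketch for this, assuming the files live in `/srv/blobs` and you point a domain like `files.example.com` at the VPS (both are placeholders); Caddy provisions TLS automatically:

```
files.example.com {
	root * /srv/blobs
	file_server
}
```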
mongo
I kinda don't want to run my own server for this. If I were going to do that, I'd probably just write a small server with Express that stores the blobs in memory; it's less than 1GB of data total. (And if I were going to use a datastore, I'd probably use CouchDB. My frontend already uses PouchDB to cache content on the client side.)
Just don't put 100k files in one directory ;)