 It all depends on how much data is needed for each custom feed. It will be hard to make good trending topics work with only local info.

It's similar to translations. On-device translation works really well for simple things/notes, but if the text requires advanced knowledge to translate well, online services win.
 All you are trying to do is expose the user to a wide range of topics. @rabble what do you think? Think of the days before personalized responses.
 Good point. I wonder if it makes sense for these scripts to have a negentropy/time-sync initiation step, to make sure they have all the data they need before they run.
 Negentropy / time syncs are not optional, IMHO. We will have to add them to the mandatory core of Nostr if we want to scale this. We can't keep downloading everything over and over again just to check whether the local instance has all the data it needs.
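
A minimal sketch of that initiation step (the types and names here are illustrative stand-ins, not a real client API): reconcile first, download only what's missing, then let the feed script run.

```typescript
type EventId = string;

interface SyncBackend {
  // Stand-in for a NIP-77 / negentropy reconciliation round: returns only the
  // IDs the relay has in [since, until] that the local store is missing.
  missingIds(since: number, until: number): Promise<EventId[]>;
  // Downloads just those events into local storage.
  fetchAndStore(ids: EventId[]): Promise<void>;
}

async function runFeedScript(sync: SyncBackend, buildFeed: () => Promise<void>) {
  const until = Math.floor(Date.now() / 1000);
  const since = until - 24 * 60 * 60; // e.g. the last day for a trending feed

  // Initiation step: reconcile instead of re-downloading everything.
  const missing = await sync.missingIds(since, until);
  if (missing.length > 0) {
    await sync.fetchAndStore(missing);
  }

  // Only now does the script run, knowing its local window is complete.
  await buildFeed();
}
```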

I am getting more into DVMs that return a SNARK/STARK proof these days. I think they could provide some interesting possibilities for verifiable computation.

If you have the IDs of the events you want to include locally, you can send the IDs alone to a verifiable DVM, which computes the result and returns it with a proof that it did what it was supposed to do.
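
A rough sketch of that flow, assuming a NIP-90-style job request; the kind number, tag layout, and the proof verifier below are illustrative stand-ins, not a spec.

```typescript
interface NostrEvent {
  kind: number;
  content: string;
  tags: string[][];
  created_at: number;
}

// Build a job request that ships only the event IDs to be included in the
// computation (e.g. a trending-feed ranking), not the events themselves.
function buildFeedJobRequest(eventIds: string[]): Omit<NostrEvent, "created_at"> {
  return {
    kind: 5300, // hypothetical content-discovery job kind
    content: "",
    tags: eventIds.map((id) => ["i", id, "event"]),
  };
}

// Stand-in for a SNARK/STARK verifier: checks that the DVM's output really is
// the result of running the agreed program over exactly those input IDs.
declare function verifyProof(
  proof: Uint8Array,
  programHash: string,
  inputIds: string[],
  output: string
): boolean;

function acceptResult(
  result: { output: string; proof: Uint8Array },
  programHash: string,
  inputIds: string[]
): string {
  if (!verifyProof(result.proof, programHash, inputIds, result.output)) {
    throw new Error("DVM returned an unverifiable result");
  }
  return result.output; // e.g. the ranked list of event IDs for the feed
}
```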

It gets even better if we can add some homomorphic encryption to event data.

Future is bright.  
 I'm studying zkvm and zkwasm. These are nice techs 👍
 Too bad all of these ZK implementations wasted time in the shitcoin world. They don't need blockchains to offer verifiable computations. 
 I think it's incredibly cool to see the lead Amethyst and Damus developers exchange ideas. I love this!