 Imagine you're browsing a feed and your client is fetching metadata for all those people on the fly. You can either fetch a single event for each one, or fetch ~20 events for each and reduce them to a metadata object by applying diffs locally. Do you really think the second is the best solution in the real world?

Also, since you're just fetching random people's profiles, you can never be sure (how could you be?) that their sequence of diffs is complete. You would actually need a blockchain to be sure.
 We already can't make this work reliably with replaceable events; imagine having to fetch an unreliable stream of events from an unspecified location instead.
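
For concreteness, here's a minimal sketch (in TypeScript) of what that diff-replay approach would mean client-side. The event shape and field names are illustrative assumptions, not any existing Nostr library's API:

```typescript
// Hypothetical shape of a "metadata diff" event; only the fields used here.
interface DiffEvent {
  created_at: number; // unix timestamp
  content: string;    // JSON patch of changed metadata fields
}

type Metadata = Record<string, unknown>;

// Reduce a (hopefully complete) stream of diff events into one metadata object.
function reduceDiffs(events: DiffEvent[]): Metadata {
  return events
    .sort((a, b) => a.created_at - b.created_at) // replay in chronological order
    .reduce<Metadata>((profile, ev) => {
      const patch = JSON.parse(ev.content) as Metadata;
      return { ...profile, ...patch };           // apply each diff on top of the last
    }, {});
}

// For a feed of N authors this means fetching and replaying N * ~20 events,
// and the result is only correct if no diff in anyone's sequence is missing.
```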


https://image.nostr.build/f7c47b6b2e14db8ef0416ff2285d4987dae2c01ba305f267fb4b23e908ec8eb7.jpg 
 I wouldn't use diffs, I would just grab the most recent event and use that. You might occasionally miss the correct one, but with caching you'll eventually find it and hang on to it. DVMs can do a lot of the heavy lifting with bigger caches for stuff like this to reduce computation and bandwidth client-side. In my mind, DVMs are just nostr clients that run on a server.
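
For comparison, a minimal sketch of that "keep the newest event" approach, assuming a kind-0 style metadata event and a plain in-memory Map as the cache (all names here are illustrative, not a specific library's API):

```typescript
// Hypothetical shape of a profile metadata event; only the fields used here.
interface MetadataEvent {
  pubkey: string;
  created_at: number;
  content: string; // JSON-encoded profile metadata
}

const cache = new Map<string, MetadataEvent>();

// Keep only the newest metadata event seen for each pubkey; older or duplicate
// events are ignored, so the cache converges on the latest profile over time.
function updateCache(ev: MetadataEvent): void {
  const cached = cache.get(ev.pubkey);
  if (!cached || ev.created_at > cached.created_at) {
    cache.set(ev.pubkey, ev);
  }
}

function getProfile(pubkey: string): Record<string, unknown> | undefined {
  const ev = cache.get(pubkey);
  return ev ? JSON.parse(ev.content) : undefined;
}
```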