Oddbean
 You might notice that the datasets don't seem to be live anymore. That's because I'm currently working on the backend (parser) and need to recompute a whole lot of stuff. Sadly, that's one of the cons of using the same machine for development and production.
 I noticed; waiting eagerly and patiently for its return.
 Every time you add a new dataset, do you have to run the parser again all the way to the genesis block? 
 It depends.

If a dataset needs to be parsed from the chain, then yes, but the speed varies a lot depending on what needs to be parsed.

If I only need to parse, say, each block's weight, it's going to be super fast, because I can skip a lot of things that I've already parsed and don't need for that, and keep the local databases from the previous iteration.
Whereas if I want to parse (or reparse, for whatever reason) the amount of sats per address, then I need to recompute every wallet's history and the databases linked to it, which is very expensive.

There is also the much more common case where a dataset can be computed from other datasets, which is very fast.
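The three cases above could be sketched roughly like this (a hypothetical illustration only; the function and field names are made up and are not the actual parser's API):

```python
# Toy stand-in for already-parsed chain data (illustrative, not real chain data).
blocks = [
    {"height": 0, "weight": 1000, "fees": 50},
    {"height": 1, "weight": 2000, "fees": 75},
]

# Case 1: cheap chain parse — only one field per block is needed,
# so everything else already parsed can be skipped and reused.
def parse_block_weights(blocks):
    return [b["weight"] for b in blocks]

# Case 2: expensive reparse — per-address balances depend on the full
# wallet history, so every transaction must be replayed from genesis.
# Each transaction is a list of (address, sats_delta) pairs.
def recompute_address_balances(transactions):
    balances = {}
    for tx in transactions:
        for addr, delta in tx:
            balances[addr] = balances.get(addr, 0) + delta
    return balances

# Case 3: derived dataset — computed purely from other, already-computed
# datasets, with no chain access at all, so it's very fast.
def fee_per_weight(fees, weights):
    return [f / w for f, w in zip(fees, weights)]

weights = parse_block_weights(blocks)
fees = [b["fees"] for b in blocks]
derived = fee_per_weight(fees, weights)  # one pass over cached data
```

The point of the sketch is the cost asymmetry: cases 1 and 3 touch only the data they need, while case 2 has no shortcut and must walk the whole history.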
 Perfect, that’s great. 
 Please note that it's not perfect; it could still be better, and will be in time. But it's already pretty good, and it used to be so much worse!