
Notes by quantml

 alright who’s building nostrpedia? 
 one of the original founders of wikipedia has a project to create a "universal network of encyclopedias". see https://encyclosphere.org/intro 
 The encyclosphere project ( https://encyclosphere.org/intro ):
"The encyclosphere will be the universal network of encyclopedias—an ownerless, leaderless, centerless knowledge commons. Like the blogosphere, it will be a decentralized series of “feeds,” but feeds of encyclopedias and individual articles posted anywhere online.

Data from these feeds can be aggregated by different services, then developers will use the aggregated data to build encyclopedia reader apps, with none being privileged or “official.”

The brand-new, non-profit Knowledge Standards Foundation (KSF) is organizing the discussion and formulation of the standards (technical specifications) for this system (at encyclosphere.org—right here).

The KSF will never build an encyclopedia app; instead, it will facilitate development of technical specifications and the tools needed to let others build the network.

The KSF is and will remain 100% independent of any corporation or government and is absolutely committed to other founding principles, including neutrality, credibility, free speech, responsibility, and openness.

For both technical development and funding, we rely on donations from the general public: individuals and families."

nostr appears to be a good fit for the intent of this project. additional information can be found at the following links:

https://larrysanger.org/2023/11/wikipedias-empire-and-how-to-build-a-rebel-alliance-of-knowledge/

https://encyclosphere.org/q3-2023-newsletter

https://larrysanger.org/2023/07/the-encyclosphere-is-greater-than-wikipedia/
 
 Plebs, shill me your favourite resources on Austrian Economics. Bonus points if it’s lite readi... 
 an extensive list of references can be found at https://libertyclassroom.com/learn-austrian-economics/ 
 Optimising Distributions with Natural Gradient Surrogates - 
Jonathan So, Richard E. Turner

Natural gradient methods have been used to optimise the parameters of probability distributions in a variety of settings, often resulting in fast-converging procedures. Unfortunately, for many distributions of interest, computing the natural gradient has a number of challenges. In this work we propose a novel technique for tackling such issues, which involves reframing the optimisation as one with respect to the parameters of a surrogate distribution, for which computing the natural gradient is easy. We give several examples of existing methods that can be interpreted as applying this technique, and propose a new method for applying it to a wide variety of problems. Our method expands the set of distributions that can be efficiently targeted with natural gradients. Furthermore, it is fast, easy to understand, simple to implement using standard autodiff software, and does not require lengthy model-specific derivations. We demonstrate our method on maximum likelihood estimation and variational inference tasks.

https://arxiv.org/abs/2310.11837
#machinelearning #optimization
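As a toy illustration of why natural gradients converge fast (this shows plain natural-gradient ascent, not the paper's surrogate technique), here is maximum-likelihood fitting of a 1-D Gaussian, where the Fisher information is known in closed form:

```python
import numpy as np

# Toy illustration of natural-gradient optimisation itself (not the
# paper's surrogate method): maximum-likelihood fitting of a 1-D
# Gaussian. The Fisher information for (mu, var) is diagonal,
# F = diag(1/var, 1/(2*var^2)), so the natural gradient is the
# ordinary gradient preconditioned by F^{-1}.

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=10_000)

mu, var = 0.0, 1.0  # initial parameters
for _ in range(100):
    # Ordinary gradients of the mean log-likelihood.
    g_mu = np.mean(data - mu) / var
    g_var = np.mean((data - mu) ** 2 - var) / (2 * var**2)
    # Multiply by F^{-1} = diag(var, 2*var^2) and take a damped step.
    mu += 0.5 * var * g_mu
    var += 0.5 * (2 * var**2) * g_var

# mu and var now match the sample mean and variance.
```

Note how the Fisher preconditioning cancels the parameter-dependent scaling: each step moves a fixed fraction of the way to the optimum regardless of the current var, which is the fast-convergence behaviour the abstract refers to. The paper's contribution is getting this behaviour when F is *not* available in closed form, by optimising through a surrogate distribution.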
 
 #XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides a parallel tree boosting (also known as GBDT, GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems beyond billions of examples.

https://xgboost.readthedocs.io/
https://github.com/dmlc/xgboost

#machinelearning #datascience 
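The core GBDT idea XGBoost implements can be sketched in plain NumPy: fit a small tree to the negative gradient of the loss, add a damped copy of it to the prediction, repeat. This is a minimal illustration with depth-1 trees (stumps) and squared error, not XGBoost's actual implementation or API:

```python
import numpy as np

# Minimal gradient-boosting sketch: depth-1 regression trees (stumps)
# fitted to residuals under squared error. Illustrative only.

def fit_stump(x, residual):
    """Pick the threshold whose two-piece constant fit minimises
    squared error on the residual."""
    best_err, best = np.inf, None
    for t in np.unique(x)[:-1]:  # the largest value would empty the right side
        left, right = residual[x <= t], residual[x > t]
        lv, rv = left.mean(), right.mean()
        err = ((left - lv) ** 2).sum() + ((right - rv) ** 2).sum()
        if err < best_err:
            best_err, best = err, (t, lv, rv)
    return best

def boost(x, y, rounds=100, lr=0.5):
    pred = np.full_like(y, y.mean())
    stumps = []
    for _ in range(rounds):
        # For squared error, the negative gradient is just the residual.
        t, lv, rv = fit_stump(x, y - pred)
        pred += lr * np.where(x <= t, lv, rv)
        stumps.append((t, lv, rv))
    return (y.mean(), lr, stumps)

def predict(model, x):
    base, lr, stumps = model
    out = np.full(x.shape, base)
    for t, lv, rv in stumps:
        out += lr * np.where(x <= t, lv, rv)
    return out

x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x)
model = boost(x, y)
mse = np.mean((predict(model, x) - y) ** 2)
```

XGBoost layers a lot on top of this sketch: second-order (Newton) steps, regularised split scores, sparsity handling, and the parallel/distributed tree construction the blurb mentions.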
 Faiss is a library for efficient similarity search and clustering of dense vectors. It contains algorithms that search in sets of vectors of any size, up to ones that possibly do not fit in RAM. It also contains supporting code for evaluation and parameter tuning.

Faiss is written in C++ with complete wrappers for Python. Some of the most useful algorithms are implemented on the GPU.

https://faiss.ai/
https://github.com/facebookresearch/faiss

#machinelearning #knn
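What a flat Faiss index computes can be written down in a few lines of NumPy: exact brute-force L2 search over all database vectors. This sketch shows the operation being accelerated, not Faiss's API (Faiss's value is doing this fast, on GPU, and with compressed indexes for data that doesn't fit in RAM):

```python
import numpy as np

# Brute-force k-nearest-neighbour search in plain NumPy -- the exact
# search that a flat L2 index performs, shown without the library.

def knn_l2(xb, xq, k):
    """Return (squared distances, indices) of the k nearest rows of
    the database xb for each query row in xq."""
    # ||q - b||^2 = ||q||^2 - 2 q.b + ||b||^2, for all query/db pairs.
    d2 = (
        (xq ** 2).sum(axis=1, keepdims=True)
        - 2.0 * xq @ xb.T
        + (xb ** 2).sum(axis=1)
    )
    idx = np.argsort(d2, axis=1)[:, :k]
    return np.take_along_axis(d2, idx, axis=1), idx

rng = np.random.default_rng(0)
xb = rng.normal(size=(1000, 64)).astype(np.float32)
xq = xb[:5] + 0.01  # queries slightly perturbed from known vectors

dist, idx = knn_l2(xb, xq, k=3)
# idx[:, 0] recovers the original rows 0..4
```

The cost here is O(queries × database size × dimension) per search; Faiss's approximate indexes (IVF, HNSW, product quantisation) exist precisely to avoid that full scan at scale.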