 If you had a $300 budget for a GPU or any PCI-E coprocessor SPECIFICALLY FOR #AI / machine learning, no gaming, new or used, what would you buy?  My RTX 3060 Ti with its paltry 8GB of VRAM just doesn't cut the mustard for Stable Diffusion / llama.cpp.

#machinelearning #deeplearning #stablediffusion #llama #chatgpt  
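 For anyone else squeezing models into 8GB in the meantime: one common approach with llama.cpp is partial GPU offload, putting only as many layers on the card as fit and running the rest on CPU. A minimal sketch using the llama-cpp-python bindings; the model path and layer count here are placeholders to tune for your setup:

```python
# Minimal sketch: partial GPU offload with the llama-cpp-python bindings.
# Only n_gpu_layers layers go to VRAM; the rest run on the CPU, so a
# model bigger than 8GB can still work (slower, but it runs).
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-2-13b.Q4_K_M.gguf",  # placeholder path
    n_gpu_layers=20,  # tune down until it fits in VRAM
    n_ctx=2048,       # context length; bigger costs more memory
)

out = llm("Q: Name a 24GB GPU under $300. A:", max_tokens=48)
print(out["choices"][0]["text"])
```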
 I was looking at a second-hand Tesla P40. They go for around $250, have 24GB of VRAM, and are still reasonably relevant in the AI/machine learning space. You might be able to pick up a couple for cheap. I'd avoid the Tesla K series, though. 
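 If you're shopping used datacenter cards, it's worth checking what a card actually supports before committing a pipeline to it. A quick sketch with stock PyTorch calls, nothing vendor-specific: Pascal cards like the P40 report compute capability 6.1 with no BF16, while the Kepler-era K series has been dropped by recent CUDA releases entirely.

```python
# Quick sanity check on a used card before trusting it with a pipeline.
# All stock PyTorch calls; nothing here is vendor-specific.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"{props.name}: compute capability {major}.{minor}")
    print(f"VRAM: {props.total_memory / 2**30:.1f} GiB")
    print(f"BF16 supported: {torch.cuda.is_bf16_supported()}")  # Ampere and newer
else:
    print("No CUDA device visible")
```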
 Thanks!  Set up a lightning wallet address in your profile so I can zap you! 
 I will do, just working on the self-custody thing. 😁 
 I self-custody my cold storage, but trust GetAlby with my lightning wallet. I use CashApp to top up; I just never keep much money on Alby or CashApp. 
 I was looking at running Start9's StartOS. It looks good, but I have to wait until I can move my server. It's currently in storage due to moving provinces. It'll take me a couple of months to get moved properly. In the meantime, everything is in cold storage. 🙂

https://github.com/Start9Labs/start-os 
 I've been running a P40 for both inference and gaming for about a year. $200 for the card and $50 for fans and a holder. It works well for the price, but lacks some newer architectural features required by some pipelines. If you can justify spending more, a 3090 is faster per dollar and easier all around. 
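 On the "newer architectural features" point: the usual workaround for Pascal's very slow FP16 is to run Stable Diffusion in full FP32, which roughly doubles memory use; the P40's 24GB is what makes that viable. A hedged sketch with the diffusers library (the model id is just an example):

```python
# Hedged sketch: running Stable Diffusion in FP32 with diffusers, a
# common workaround on Pascal cards whose FP16 throughput is very poor.
# FP32 roughly doubles memory use, which the P40's 24GB can absorb.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example model id
    torch_dtype=torch.float32,  # skip the usual float16 on Pascal
).to("cuda")

image = pipe("a teapot wired up like a circuit board").images[0]
image.save("out.png")
```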
 I got a 3060 too, but with 12GB of VRAM. Since the 40 series is out, maybe you could get one cheap.