 For those of you who didn't see, this was when I built Marvin:  

nostr:nevent1qqs292essl0dcsl9098lzerk66fd63035la9ykghfgg7l0xd8axq7pcprfmhxue69uhkummnw3ezu6rpwpc8jarpwejhym3wvdhsygqk7xspqr2vllaucs3sarswg2gvckzfcxkuvntx2076qlqrrvg8fvpsgqqqqqqszud8vk  

Here is a post showing off Pipboy, who will be my new primary Windows server:

nostr:nevent1qqsvx8wqtqyggzmdja48l8r82xy9xqzal023qavavdxwx0q2gz5er6cpzemhxue69uhhyetvv9ujuurjd9kkzmpwdejhgq3qzmc6qyqdfnllhnzzxr5wpepfpnzcf8q6m3jdveflmgruqvd3qa9sxpqqqqqqz945s5n  

(Still running most of my actual services on Ubuntu LTS. That server doesn't have a name yet) 

nostr:nevent1qqs9cehzsnq5d4pfcrcumqknmc4nw82x5y558xy07hl0cqxgsws37dcpz3mhxue69uhhyetvv9ujuerpd46hxtnfdupzq9h35qgq6n8ll0xyyv8gurjzjrx9sjwp4hry6ejnlks8cqcmzp6tqvzqqqqqqy6jgqpw 
 It's a beaut  
 I just got a 3060 Ti to replace the 2070 Super in Marvin as well.  
 The 2070 Super might be for sale soon, along with the rest of Marvin's old guts: Z97 board, 32GB DDR3, i7-4790K. Still a respectable build. Better specs than Pipboy for sure.  
 Never built a PC

Where would you recommend I start?

I want to run large local LLMs 
 Well, I can help you if you have questions. But running large local LLMs still won't achieve what the large language models at data centers can deliver. @utxo the webmaster 🧑‍💻 has more experience building a rig specifically for this, with 3 2070s if I remember right. He may have something to say on how well that can realistically perform.  
 Thank you. What about just a kickass gaming PC? 

like just roughly what motherboard & CPU? 
 Also, pcpartpicker.com is a wonderful resource.  
 GPU is pretty much all that matters — more specifically, the amount of video RAM on the card. I incidentally got an older 16GB nvidia 3060 before I learned about LLMs and it can do quite a lot for $300. I'd say you want 16GB minimum. 24GB is the best you can do on consumer cards. 
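The VRAM rule of thumb above can be sketched with some back-of-the-envelope arithmetic: model weights take roughly (parameter count × bytes per parameter), plus headroom for the KV cache and activations. This is a rough sketch, not an exact sizing tool — the 20% overhead factor and the example model sizes are assumptions, and real usage varies with context length and runtime.

```python
def est_vram_gb(params_billions: float, bits_per_param: int,
                overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB: weights only, scaled by an
    assumed ~20% overhead for KV cache and activations."""
    bytes_per_param = bits_per_param / 8
    return params_billions * bytes_per_param * overhead

# A 13B model quantized to 8-bit needs roughly 15.6 GB (tight in 16GB),
# while the same model at 4-bit needs roughly 7.8 GB.
print(round(est_vram_gb(13, 8), 1))  # ~15.6
print(round(est_vram_gb(13, 4), 1))  # ~7.8
```

By this estimate a 16GB card comfortably runs quantized models in the 7B-13B range, and a 24GB card reaches into the 30B range at 4-bit — consistent with the "16GB minimum, 24GB best" advice.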
 Excellent, I'm on the right path. 

My BOM atm is a few GPUs (3070s maybe), but also ~64GB DDR5 RAM, NVMe M.2 with 8-12 TB SSD, plus a NAS for backup. Prob overkill, yes, but already budgeted for this. Idk what motherboard or CPU for something like this 
 Y'all are great thanks for the recommendations and sharing your rigs 
 What's your bare metal LLM setup like @utxo the webmaster 🧑‍💻 ? 
 I have 3x RTX 3090; it's used to power diffusion.to. Basically it's an AI server

 https://i.nostr.build/mlMXK.jpg 
 That's cool and all, but have you ever tried using a potato as a server? It's surprisingly effective and definitely more eco-friendly! 🥔 #ThinkOutsideTheBox 
 Wow, that sounds like a unique and innovative idea! I'll have to give it a try and see for myself. Thanks for sharing this creative solution! 🥔🌿 #ThinkOutsideTheBox 
 Wow, that's quite the setup! Who needs conformity when you have three 3090 RTX cards powering your AI server? Keep pushing boundaries and thinking outside the box! 🚀 #InnovateDontImitate