Never built a PC. Where would you recommend I start? I want to run large local LLMs.
Well, I can help if you have questions, but running local LLMs still won't achieve what the large language models in data centers can deliver. @utxo the webmaster 🧑💻 has more experience building a rig specifically for this, with three 2070s if I remember right. He may have something to say about how well that can realistically perform.
Thank you. What about just a kickass gaming PC? Like, roughly what motherboard and CPU?
The GPU is pretty much all that matters, specifically the amount of video RAM on the card. I happened to get an older 12GB NVIDIA 3060 before I learned about LLMs, and it can do quite a lot for $300. I'd say you want 12-16GB minimum; 24GB is the best you can do on consumer cards.
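To make the "VRAM is what matters" point concrete, here's a rough sketch of how to estimate whether a model fits on a card. The bits-per-weight and overhead factor are assumptions (quantized weights around 4 bits, ~20% padding for KV cache and activations); real usage varies with context length and runtime.

```python
def vram_estimate_gb(params_billion, bits_per_weight=4, overhead_factor=1.2):
    """Rough VRAM (GB) needed to run a model at a given quantization.

    overhead_factor pads for KV cache and activations; 1.2 is a guess,
    and real usage depends on context length and the inference runtime.
    """
    bytes_for_weights = params_billion * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead_factor / 1024**3

# A 13B model at 4-bit quantization comfortably fits in 12GB:
print(round(vram_estimate_gb(13), 1))
# The same model at 16-bit would not:
print(round(vram_estimate_gb(13, bits_per_weight=16), 1))
```

This is why a 12GB card handles quantized 7B-13B models fine, while bigger models or higher precision push you toward 24GB and beyond.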
Excellent, I'm on the right path. My BOM at the moment is a few GPUs (3070s maybe), plus ~64GB of DDR5 RAM, an M.2 NVMe drive with 8-12TB of SSD storage, and a NAS for backup. Probably overkill, yes, but it's already budgeted. I don't know what motherboard or CPU to pair with something like this.
What's your bare-metal LLM setup like, @utxo the webmaster 🧑💻 ?
I have 3x RTX 3090s; it's used to power diffusion.to. Basically it's an AI server. https://i.nostr.build/mlMXK.jpg
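For scale, three 3090s give 72GB of combined VRAM. Here's a back-of-the-envelope sketch of what fits in that rig, reusing the same assumptions as before (~4-bit quantized weights, a guessed 20% runtime overhead; splitting a model across cards also has its own costs that this ignores):

```python
# Assumed rig: 3x RTX 3090 at 24 GB each.
GPUS = 3
VRAM_PER_GPU_GB = 24

def fits(params_billion, bits_per_weight=4, overhead=1.2):
    """True if a quantized model plausibly fits in the rig's total VRAM."""
    needed_gb = params_billion * 1e9 * bits_per_weight / 8 * overhead / 1024**3
    return needed_gb <= GPUS * VRAM_PER_GPU_GB

print(fits(70))    # a 70B model at 4-bit
print(fits(180))   # a much larger model
```

So a rig like this can realistically serve 4-bit models in the 70B class, which is well beyond what any single consumer card can hold, but still far short of data-center-scale models.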