 Is anybody using local AIs like ChatGPT who can recommend one? Preferably open source, and isolated from the internet...

#asknostr

 
 Koboldcpp is a good LLM front-end that I use. The GitHub repo has pretty good instructions for how to install it. You then get an LLM from Hugging Face (I use Mixtral), run koboldcpp with the path to your model as an argument, and open the web front-end in your browser.

https://github.com/LostRuins/koboldcpp 
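
Once it's running, koboldcpp also serves a local HTTP API you can script against instead of using the browser UI. A rough Python sketch, assuming the default port 5001 and the KoboldAI-style /api/v1/generate endpoint (check the docs if your build differs):

import requests  # pip install requests

# Assumes koboldcpp is already running locally with a model loaded.
# Port 5001 and /api/v1/generate are the defaults as far as I know.
resp = requests.post(
    "http://localhost:5001/api/v1/generate",
    json={"prompt": "Explain Nostr in one sentence.", "max_length": 80},
    timeout=120,
)
print(resp.json()["results"][0]["text"])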
 Thanks! 
 Client: GPT4All (there's more out there now)
Been using them for a year+

For general-purpose local LLMs I've found Falcon-7B does pretty well. Run a larger Falcon if you have the RAM for it (rough loading sketch after this post).

I've also spent the time to finetune the smaller models for my use cases. Takes time, but feed it your local data and you'll thank your past self for the work you put in 
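
For the curious, a rough sketch of loading Falcon-7B with the Hugging Face transformers library (model id tiiuae/falcon-7b; needs a decent GPU or a lot of patience on CPU):

from transformers import pipeline  # pip install transformers torch accelerate

# Downloads the weights on first run, then works offline from the local cache.
generator = pipeline(
    "text-generation",
    model="tiiuae/falcon-7b",
    device_map="auto",  # uses the GPU if available, otherwise falls back to CPU
)
out = generator("The easiest way to run an LLM offline is", max_new_tokens=60)
print(out[0]["generated_text"])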
 Very cool, what are your favorite use cases?

my immediate thought is that it would be helpful as an editor and brainstorming engine for some longer pieces I want to write  
 One use-case for me (that I haven't set up yet) is to generate embeddings for all my notes so that I can search them semantically. 
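
Rough sketch of what I have in mind, using the sentence-transformers library (all-MiniLM-L6-v2 is just a common small embedding model, swap in whatever you like):

from sentence_transformers import SentenceTransformer, util  # pip install sentence-transformers

notes = [
    "Grocery list: eggs, oats, coffee",
    "Draft ideas for the long-form piece on local LLMs",
    "Meeting notes: self-hosting a relay",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small, runs fine on CPU
note_vecs = model.encode(notes, convert_to_tensor=True)

query = "what did I write about running models offline?"
scores = util.cos_sim(model.encode(query, convert_to_tensor=True), note_vecs)[0]
best = int(scores.argmax())
print(notes[best], float(scores[best]))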
 Yes, feeding your local docs to your LLM is beyond powerful  
 Yeah.  I'd do it, but my desktop is down right now, and I don't feel like doing it on my Core 2 Duo laptop. 
 Tbh I use it for everything. Been slacking on using Maximus (my local agent's name 😅) as my daily journal, but exactly what you said. 


- Idea generation
- Crafting posts/emails
- Reading & summarizing articles, videos, T&C
- Legal docs (as templates, not final form of course, real humans vet all docs produced and edit accordingly)
- Coding (I switch to better coding LLMs for heavier work)
- Figma layout generation
- Crafting Midjourney prompts
- Etc.

Recently picked back up connecting Maximus to voice for 'speech-to-prompt', where the prompt then triggers switches for local-network IoT devices
Example:
"Hey Maximus, turn off the hallway light please." Etc.

No Alexa, Siri, etc. nonsense 
All local
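
The glue for that part can stay really simple. A toy Python sketch of the 'prompt triggers a switch' step, assuming the speech-to-text stage already produced a transcript; the device URL is made up, use whatever your hub or firmware actually exposes (Home Assistant, Tasmota, etc.):

import requests

# Hypothetical local endpoint -- replace with your real hub/device API.
HALLWAY_LIGHT_URL = "http://192.168.1.50/api/hallway_light"

def handle_transcript(transcript: str) -> None:
    # Very naive intent matching on the transcribed speech.
    text = transcript.lower()
    if "hallway light" in text:
        state = "off" if " off " in text else "on"
        requests.post(HALLWAY_LIGHT_URL, json={"state": state}, timeout=5)

handle_transcript("Hey Maximus, turn off the hallway light please.")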
 
 love this thread. 
 How do you connect it to your lights?  Also, have you seen mycroft.ai? 
 Jan.ai is another great client. Recently been testing this as well. You can load your own LLMs from Hugging Face to use here, just like in GPT4All 
 This is the easiest and the fastest. 
 I use gpt4all all the time.

https://gpt4all.io/index.html 
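
There are Python bindings too, if you'd rather script it than use the desktop app. A minimal sketch (the model filename is just an example, pick whichever model you actually use):

from gpt4all import GPT4All  # pip install gpt4all

# Downloads the model on first use, then runs fully offline from the local cache.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # example model name, swap for your own
with model.chat_session():
    print(model.generate("Give me three offline backup strategies.", max_tokens=200))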
 Perplexity? 
 Thanks, hadn't heard of that one, looks like it is web based? 
 This is a kickass webapp LLM

More so for search, like a better Google, yes? 
 llama.cpp with any compatible LLM weights.  I've used OpenOrca Mistral with it. 
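
If you'd rather drive it from code than from the CLI, the llama-cpp-python bindings wrap the same engine. A quick sketch, assuming you've already downloaded a GGUF build of Mistral-7B-OpenOrca (the path is just a placeholder):

from llama_cpp import Llama  # pip install llama-cpp-python

# The path is a placeholder -- point it at whatever GGUF file you grabbed from Hugging Face.
llm = Llama(model_path="./models/mistral-7b-openorca.Q4_K_M.gguf", n_ctx=4096)
out = llm("Summarize what Nostr is in two sentences.", max_tokens=120)
print(out["choices"][0]["text"])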
 https://lmstudio.ai/
https://github.com/LostRuins/koboldcpp/releases
https://ollama.com/

All work well.  Get an uncensored model here from TheBloke:
https://huggingface.co/TheBloke?sort_models=likes#models 
 That didn't format the way I intended.  Hopefully you can see the 3 separate links to 3 different front ends. 
 Also, in my experience, sometimes these programs need to be connected to the internet momentarily at start-up for some reason, but after they're loaded you can unplug the ethernet cable if you like and stay offline. Either way, none of these communicate anything out and are all local, but I've had issues booting up with the network turned off from the beginning. Probably because some of them use a browser interface, and even though you're connecting locally they don't work if you shut it all down.  
 I can see the three, thanks for the thoughtful answer 
 That sounds weird, you've got something phoning home. 
 This is what Copilot suggested:

Certainly! If you’re interested in offline AI tools, there are several options available that operate locally on your device without requiring a continuous internet connection. Let me introduce a few:

 1. Freedom GPT: Freedom GPT is an open-source AI chat interface that can run locally on your own device. Similar to ChatGPT, it can generate text, translate languages, and answer questions. The key difference is that Freedom GPT ensures your conversations and inputs stay within your computer, providing privacy. It leverages Stanford's Alpaca models and features a simple chat-based interface. Keep in mind that it's not as powerful as ChatGPT due to memory limitations on personal computers.

 2. Offline AI Assistant Apps:

   • Google Assistant: While Google Assistant primarily relies on the internet, some features work offline, such as setting alarms or sending texts.
   • Robin: Robin offers voice-based navigation and local search even without an internet connection.
   • Hound: Hound is a voice assistant that can handle complex queries offline.
   • Siri: Apple’s Siri can perform certain tasks offline, like setting reminders or sending messages.
   • Cortana: Microsoft’s Cortana has offline capabilities for basic tasks.
   • Dragon Mobile Assistant: Dragon supports voice commands offline.
   • Bestee Offline Virtual Assistant: Bestee provides offline voice assistance for various tasks.
 3. Quantized LLMs: These are locally loaded language models (LLMs) that you can use without an internet connection. They're available on platforms like Hugging Face, H2O, Text Gen, and GPT4All. These LLMs offer flexibility and security, making them a great choice for various applications.

Remember to choose the one that best suits your needs and enjoy the benefits of offline AI! 🤖🌟 
 Thanks for the thoughtful answers everyone, raising my takeaways for awareness:

Offline clients & LLMs
- GPT4All seems to be the most popular client, with a few different LLMs behind it
- Koboldcpp was also mentioned as a good front-end
- Hugging Face came up multiple times as a good place to find LLMs

Google/online replacement
- Perplexity


nostr:nevent1qqs8glr6ett29sga3tzafpazhdrsmut9wqyrzmsksu5xds0wt4lrvsqpr4mhxue69uhkummnw3ezucnfw33k76twv4ezuum0vd5kzmp0qgs0fkyh0y2genfytjx4py229p8avtvhev8mdzme0fc0yjsh9dfzmwgrqsqqqqqplpwww9