1B & 3B models 👀 These should be able to run on iPhone models going back 5 years or so, to around the iPhone X https://image.nostr.build/f5e78af8d9d7b08b0b89dc77ea941a70ff5762bc6209e37361c213357112ff8d.jpg
Ooooook
How can I run them locally on an iPhone?
Download GGUF quantizations of the models (Q4 at minimum) and use software like MLC Chat or Layla.ai. I'm not sure what software is available on iPhone, personally.
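If it helps, here's a minimal sketch of grabbing a Q4 GGUF file from Hugging Face with the huggingface_hub Python library, so you have the quantized weights to load into whatever app you end up using. The repo_id and filename are just illustrative examples, not an endorsement of a specific upload; swap in whichever GGUF repo and quant level you actually want.

```python
# Minimal sketch: download a Q4 GGUF quantization from Hugging Face.
# The repo_id and filename below are example values; replace them with
# the GGUF repo and quant file you intend to run on-device.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="bartowski/Llama-3.2-3B-Instruct-GGUF",  # example GGUF repo
    filename="Llama-3.2-3B-Instruct-Q4_K_M.gguf",    # example Q4 quant file
)
print(f"GGUF saved to: {gguf_path}")
```

From there you'd import the file into your chat app of choice (or point a llama.cpp-based runner at it on desktop to test it first).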