Meta released MobileLLM models in 125M, 350M, 600M, and 1B parameter sizes. The 1B model performs well on phones, even on mid-range devices. These models are specifically designed and optimized for mobile hardware, and they work with llama.cpp. Collection: https://huggingface.co/collections/facebook/mobilellm-6722be18cb86c20ebe113e95
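
A minimal sketch of loading one of these checkpoints with Hugging Face transformers and generating a short completion. The repo id `facebook/MobileLLM-1B` and the need for `trust_remote_code=True` are assumptions here; check the model cards in the collection above for the exact loading instructions.

```python
# Sketch: load a MobileLLM checkpoint and generate text with transformers.
# The repo id and trust_remote_code flag are assumptions -- see the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "facebook/MobileLLM-1B"  # assumed repo id from the collection
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("On-device language models are useful because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For actual on-device use you would typically convert the checkpoint to GGUF and run it through llama.cpp rather than transformers, which is where the mobile-friendly sizes pay off.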