Fun fact: You can also run other AI models locally on your phone, including those for summarization, speech-to-text, translation, and more.
Most models built on the Transformer architecture can run on mobile devices, and fortunately these task-specific models are usually small and run very efficiently.
This opens up exciting possibilities for creating private smart note-taking and composition apps.
NostrNet's new 'Aithena' dashboard will become a hub for running multimodal LLMs alongside task-specific SLMs.
You can try these small task-specific AI models yourself, running locally in your browser:
https://xenova.github.io/transformers.js/
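As a rough sketch of what this looks like in code, here is how a browser app might load a summarization model with the Transformers.js library that powers the demo above. The model name is just an illustrative choice from the Hugging Face Hub; any compatible summarization checkpoint would work, and the first call downloads and caches the weights locally.

```javascript
// Sketch: in-browser summarization with Transformers.js (runs as an ES module).
// Assumes the '@xenova/transformers' package; the model name is an example.
import { pipeline } from '@xenova/transformers';

// Load a small summarization model; weights are fetched once, then cached
// by the browser, so subsequent runs are fully local and private.
const summarizer = await pipeline('summarization', 'Xenova/distilbart-cnn-6-6');

const note = `Transformers.js lets web apps run small task-specific models
entirely on-device, with no server round trips, which makes it a good fit
for private note-taking and composition tools.`;

// Generate a short summary of the note.
const [result] = await summarizer(note, { max_new_tokens: 60 });
console.log(result.summary_text);
```

Swapping the task string (e.g. `'translation'`, `'automatic-speech-recognition'`) and the model name gives you the other task-specific pipelines mentioned above.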
Where can I find documentation about running transformers models on mobile devices?