This is badass! I already use Ollama, but that's clearly a different use case from an in-browser LLM. Kudos for the excellent work!