I use my own locally-hosted, un-nerfed GPT model for conversationally exploring/bootstrapping new projects, but I would love a purpose-built portable device for hosting my model. The best I've been able to hack together so far is over-the-network communication with my home GPU server, but that is highly connection-dependent. Rabbit + a self-hosting option would be amazing.
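For context, the over-the-network hack looks roughly like this — a minimal sketch assuming the home server exposes an OpenAI-compatible chat completions endpoint (llama.cpp's `llama-server` and Ollama both do); the hostname, port, and model name are placeholders:

```python
# Minimal sketch: query a self-hosted model over the network, assuming an
# OpenAI-compatible chat endpoint. Hostname, port, and model name below
# are hypothetical placeholders for your own setup.
import json
import urllib.request

SERVER = "http://my-home-gpu-box:8080/v1/chat/completions"  # hypothetical address

def build_payload(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask(prompt: str, timeout: float = 10.0) -> str:
    """Send the prompt to the home server; fails fast when the link is down."""
    req = urllib.request.Request(
        SERVER,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Help me bootstrap a new project idea."))
```

The `timeout` is the whole problem: away from a solid connection, every request is at the mercy of the link back home, which is exactly why on-device hosting would be such a win.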