We're throwing the open-source label around a little too loosely for what's actually being delivered, I find...
> - Local backend server with full API
> - Local model integration (vLLM, Ollama, LM Studio, etc.)
> - Complete isolation from cloud services
> - Zero external dependencies
Seems open source / open weight to me. They additionally offer a cloud-hosted version.
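For what it's worth, the "local model integration" claim is less exotic than it sounds: vLLM, Ollama, and LM Studio all expose OpenAI-compatible HTTP endpoints, so a local backend just points its client at localhost instead of a cloud URL. A minimal sketch, assuming Ollama's default port (11434) and a locally pulled model named `llama3` (both are assumptions; adjust for your setup):

```python
# Minimal sketch: talk to a locally running model server instead of a cloud API.
# Assumes Ollama is serving its OpenAI-compatible endpoint at the default
# http://localhost:11434/v1 and that a model named "llama3" has been pulled;
# vLLM (:8000/v1) and LM Studio (:1234/v1) work the same way with their defaults.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local endpoint -- nothing leaves the machine
    api_key="ollama",  # placeholder; local servers typically ignore the key
)

response = client.chat.completions.create(
    model="llama3",  # whatever model you have pulled locally
    messages=[{"role": "user", "content": "Say hello from a fully local backend."}],
)
print(response.choices[0].message.content)
```

Swap the `base_url` for the cloud-hosted version and the same client code works there too, which is presumably the point of them offering both.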