opencode-go is already a first-class provider in pi-ai (models.generated.js
registers 7 models under the opencode-go namespace: glm-5, glm-5.1,
kimi-k2.5, mimo-v2-{omni,pro}, minimax-m2.{5,7}) and runs against
https://opencode.ai/zen/go/v1 with OPENCODE_API_KEY auth.
It was missing from key-manager's LLM provider registry, so the /sf
config wizard and onboarding flows never prompted users to supply
OPENCODE_API_KEY. Adding it here gives users a discoverable path to
subscribe and surfaces the 7 opencode-go models in list-models.
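A sketch of what the registry addition looks like — field names here are
illustrative, not key-manager's actual entry shape:

```typescript
// Hypothetical registry entry for the opencode-go provider. The id, env
// var, base URL, and model list come from the commit context; the object
// shape itself is an assumption, not key-manager's real API.
const opencodeGoProvider = {
  id: "opencode-go",
  envVar: "OPENCODE_API_KEY",
  baseUrl: "https://opencode.ai/zen/go/v1",
  // The 7 models registered by models.generated.js:
  models: [
    "glm-5",
    "glm-5.1",
    "kimi-k2.5",
    "mimo-v2-omni",
    "mimo-v2-pro",
    "minimax-m2.5",
    "minimax-m2.7",
  ],
};
```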
Research confirmed (DeepWiki sst/opencode + curl probes):
- /zen/go/v1/chat/completions is the OpenAI-compatible endpoint
- OPENCODE_API_KEY is the correct env var
- No /models listing endpoint exists, so hardcoding the model list is
  correct (already done by the generate-models.ts pipeline)
- Sister /zen/go/v1/messages serves Anthropic-compat minimax variants
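The probe above can be reproduced with a minimal request builder — the
Bearer auth scheme is an assumption based on OpenAI-compatible convention,
and "glm-5" is just one of the 7 registered models:

```typescript
// Builds (but does not send) an OpenAI-compatible chat request against
// the opencode-go endpoint. Bearer auth is assumed, as is typical for
// OpenAI-compatible APIs; verify against the live service before relying
// on it.
const OPENCODE_BASE_URL = "https://opencode.ai/zen/go/v1";

function buildChatRequest(model: string, prompt: string, apiKey: string) {
  return {
    url: `${OPENCODE_BASE_URL}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// Usage: const { url, init } = buildChatRequest("glm-5", "hi", process.env.OPENCODE_API_KEY!);
//        const res = await fetch(url, init);
```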
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>