key-manager: surface opencode-go in LLM provider list for onboarding
opencode-go is already a first-class provider in pi-ai (models.generated.js
registers 7 models under the opencode-go namespace: glm-5, glm-5.1,
kimi-k2.5, mimo-v2-{omni,pro}, minimax-m2.{5,7}) and runs against
https://opencode.ai/zen/go/v1 with OPENCODE_API_KEY auth.
It was missing from key-manager's LLM provider registry, so the /sf
config wizard and onboarding flows didn't prompt users to supply
OPENCODE_API_KEY. Adding it here gives users a discoverable path to
subscribe and surfaces the 7 opencode-go models in list-models.
Research confirmed (DeepWiki sst/opencode + curl probes):
- /zen/go/v1/chat/completions is the OpenAI-compatible endpoint
- OPENCODE_API_KEY is the correct env var
- No /models listing endpoint — hardcoding is correct (already done
by the generate-models.ts pipeline)
- Sister /zen/go/v1/messages serves Anthropic-compat minimax variants
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
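
For reference, the endpoint behavior confirmed above can be exercised with a
plain OpenAI-style chat completion call. A minimal sketch, assuming the
gateway accepts a standard request body and a bare model id such as "glm-5"
(the exact id the gateway expects is an assumption; pi-ai addresses the model
under the opencode-go namespace):

  // Hedged probe of the OpenAI-compatible endpoint using OPENCODE_API_KEY.
  // Model id "glm-5" is assumed; substitute any of the 7 registered models.
  const res = await fetch("https://opencode.ai/zen/go/v1/chat/completions", {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${process.env.OPENCODE_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "glm-5",
      messages: [{ role: "user", content: "ping" }],
    }),
  });
  console.log(res.status, JSON.stringify(await res.json(), null, 2));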
parent 58543fdae4
commit a4428ba1ff

1 changed file with 1 addition and 0 deletions
@@ -46,6 +46,7 @@ export const PROVIDER_REGISTRY: ProviderInfo[] = [
   { id: "openrouter", label: "OpenRouter", category: "llm", envVar: "OPENROUTER_API_KEY", dashboardUrl: "openrouter.ai/keys" },
   { id: "mistral", label: "Mistral", category: "llm", envVar: "MISTRAL_API_KEY", dashboardUrl: "console.mistral.ai" },
   { id: "ollama-cloud", label: "Ollama Cloud", category: "llm", envVar: "OLLAMA_API_KEY" },
+  { id: "opencode-go", label: "OpenCode Go", category: "llm", envVar: "OPENCODE_API_KEY", dashboardUrl: "opencode.ai/zen" },
   { id: "custom-openai", label: "Custom (OpenAI-compat)", category: "llm", envVar: "CUSTOM_OPENAI_API_KEY" },
   { id: "cerebras", label: "Cerebras", category: "llm", envVar: "CEREBRAS_API_KEY" },
   { id: "azure-openai-responses", label: "Azure OpenAI", category: "llm", envVar: "AZURE_OPENAI_API_KEY" },
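
For context on how this entry is consumed: the /sf wizard presumably walks
PROVIDER_REGISTRY, checks whether each envVar is set, and prompts with the
label and dashboardUrl when it is not. The sketch below is an assumption
about that flow, not key-manager's actual code; the ProviderInfo shape is
inferred from the entries visible in this hunk.

  // Inferred ProviderInfo shape, based on the registry entries above.
  interface ProviderInfo {
    id: string;
    label: string;
    category: string;
    envVar: string;
    dashboardUrl?: string;
  }

  // Trimmed registry with just the new entry, for a self-contained example.
  const REGISTRY: ProviderInfo[] = [
    { id: "opencode-go", label: "OpenCode Go", category: "llm",
      envVar: "OPENCODE_API_KEY", dashboardUrl: "opencode.ai/zen" },
  ];

  // Hypothetical onboarding pass: report LLM providers whose keys are unset
  // so the wizard can prompt for them (e.g. OPENCODE_API_KEY).
  const missing = REGISTRY.filter(
    (p) => p.category === "llm" && !process.env[p.envVar],
  );
  for (const p of missing) {
    const hint = p.dashboardUrl ? ` (see https://${p.dashboardUrl})` : "";
    console.log(`Missing ${p.envVar} for ${p.label}${hint}`);
  }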