singularity-forge/packages
Jeremy McSpadden 04ebe3f0a0 feat(extensions): add Ollama extension for first-class local LLM support (#3371)
Self-contained extension at src/resources/extensions/ollama/ that
auto-detects a running Ollama instance, discovers locally pulled models,
and registers them as a first-class provider with zero configuration.

Features:
- Auto-discovery of local models via /api/tags on session_start
- Capability detection (vision, reasoning, context window) for 40+ model families
- /ollama slash command with status, list, pull, remove, ps subcommands
- ollama_manage LLM-callable tool for agent-driven model operations
- Onboarding flow with auto-detect (no API key required)
- Non-blocking async probe that doesn't delay TUI paint (see the sketch after this list)
- Respects the OLLAMA_HOST env var for non-default endpoints
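The non-blocking probe might look roughly like the sketch below. This is a minimal illustration only, assuming Node 18+'s global `fetch`; the function and type names (`probeOllama`, `OllamaModel`) are hypothetical, not the extension's actual exports. The `/api/tags` endpoint and `OLLAMA_HOST` variable are the real Ollama interfaces named in the bullets above.

```ts
// Hypothetical sketch of the auto-detect probe; probeOllama and OllamaModel
// are illustrative names, not the extension's actual API.

interface OllamaModel {
  name: string; // e.g. "llama3:latest"
}

async function probeOllama(timeoutMs = 1500): Promise<OllamaModel[] | null> {
  // Respect OLLAMA_HOST for non-default endpoints; fall back to the default port.
  const host = process.env.OLLAMA_HOST ?? "http://127.0.0.1:11434";
  const base = host.startsWith("http") ? host : `http://${host}`;
  const ctrl = new AbortController();
  const timer = setTimeout(() => ctrl.abort(), timeoutMs);
  try {
    // Ollama's real model-listing endpoint: GET /api/tags returns { models: [...] }.
    const res = await fetch(`${base}/api/tags`, { signal: ctrl.signal });
    if (!res.ok) return null;
    const body = (await res.json()) as { models?: OllamaModel[] };
    return body.models ?? [];
  } catch {
    return null; // Ollama not running or unreachable: treat as "no provider"
  } finally {
    clearTimeout(timer);
  }
}

// Fire-and-forget on session_start so the TUI paints immediately;
// provider registration happens whenever the probe resolves.
void probeOllama().then((models) => {
  if (models?.length) {
    /* register "ollama" as a provider with the discovered models */
  }
});
```

Because the promise is fired and forgotten on session_start, the TUI renders immediately and the discovered models appear once the probe resolves, matching the non-blocking bullet above.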

Core changes (minimal, sketched below):
- Add "ollama" to KnownProvider in pi-ai types
- Add "ollama" key resolution in env-api-keys.ts
- Add "ollama" default model in model-resolver.ts
- Add "Ollama (Local)" to onboarding wizard with probe flow
2026-04-01 08:37:31 -06:00
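
The "minimal" core changes listed above might look roughly like this sketch. Everything here is an assumption for illustration: the actual `KnownProvider` members, helper names, and file contents in pi-ai are not visible from this listing.

```ts
// Hypothetical sketch only; every name and member below is an assumption.

// pi-ai types: widen the provider union with the new entry.
type KnownProvider =
  | "anthropic" // existing members unknown from this listing; shown illustratively
  | "openai"
  | "ollama"; // new

// env-api-keys.ts: Ollama runs locally, so "key resolution" means
// "no API key required"; a placeholder satisfies callers that expect one.
function resolveApiKey(provider: KnownProvider): string | undefined {
  if (provider === "ollama") return "ollama"; // placeholder, never sent upstream
  return process.env[`${provider.toUpperCase()}_API_KEY`];
}

// model-resolver.ts: a default model used when the provider is selected.
const DEFAULT_MODELS: Partial<Record<KnownProvider, string>> = {
  ollama: "llama3", // illustrative default, not necessarily the commit's choice
};
```

Treating the local provider as always-configured is what lets the onboarding wizard skip the API-key prompt entirely, which is how the zero-configuration flow described in the commit message becomes possible.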
| Package | Last commit | Date |
| --- | --- | --- |
| daemon | wip: M005 daemon — orchestrator, event bridge, formatter, batcher improvements (#2929) | 2026-03-27 20:22:30 -06:00 |
| mcp-server | feat: Headless Integration Hardening & Release (M002) (#2811) | 2026-03-26 23:33:22 -06:00 |
| native | fix: align @gsd/native module type with compiled output (#3253) | 2026-03-30 13:51:57 -06:00 |
| pi-agent-core | fix: handle pause_turn stop reason to prevent 400 errors with native web search (#2869) (#3248) | 2026-03-30 13:51:18 -06:00 |
| pi-ai | feat(extensions): add Ollama extension for first-class local LLM support (#3371) | 2026-04-01 08:37:31 -06:00 |
| pi-coding-agent | feat(extensions): add Ollama extension for first-class local LLM support (#3371) | 2026-04-01 08:37:31 -06:00 |
| pi-tui | fix: skip TUI render loop on non-TTY stdout to prevent CPU burn (#3095) (#3263) | 2026-03-30 13:49:55 -06:00 |
| rpc-client | feat: Headless Integration Hardening & Release (M002) (#2811) | 2026-03-26 23:33:22 -06:00 |