Self-contained extension at src/resources/extensions/ollama/ that auto-detects a running Ollama instance, discovers locally pulled models, and registers them as a first-class provider with zero configuration.

Features:
- Auto-discovery of local models via /api/tags on session_start (see the probe sketch after these lists)
- Capability detection (vision, reasoning, context window) for 40+ model families (heuristic sketch below)
- /ollama slash command with status, list, pull, remove, ps subcommands
- ollama_manage LLM-callable tool for agent-driven model operations
- Onboarding flow with auto-detect (no API key required)
- Non-blocking async probe, so the first TUI paint is never delayed
- Respects the OLLAMA_HOST env var for non-default endpoints

Core changes (minimal):
- Add "ollama" to KnownProvider in pi-ai types
- Add "ollama" key resolution in env-api-keys.ts
- Add "ollama" default model in model-resolver.ts
- Add "Ollama (Local)" to onboarding wizard with probe flow
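For reference, a minimal sketch of what the auto-discovery probe can look like: it queries Ollama's /api/tags endpoint at the OLLAMA_HOST base URL (defaulting to http://localhost:11434) with a short timeout, so a missing or slow daemon never blocks startup. The names `discoverOllamaModels` and `OllamaModelInfo` are illustrative, not the extension's actual exports.

```typescript
// Sketch only: discoverOllamaModels and OllamaModelInfo are illustrative names,
// not the extension's real exports. Assumes OLLAMA_HOST holds a full base URL.

interface OllamaModelInfo {
  name: string;                  // e.g. "llama3.2:latest"
  size?: number;                 // bytes on disk
  details?: { family?: string }; // model family reported by Ollama
}

// Honor OLLAMA_HOST when set, otherwise use Ollama's default endpoint.
const OLLAMA_BASE = process.env.OLLAMA_HOST ?? "http://localhost:11434";

// Probe /api/tags with a short timeout so a missing daemon can never
// delay the first TUI paint.
async function discoverOllamaModels(timeoutMs = 750): Promise<OllamaModelInfo[]> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    const res = await fetch(`${OLLAMA_BASE}/api/tags`, { signal: controller.signal });
    if (!res.ok) return [];
    const body = (await res.json()) as { models?: OllamaModelInfo[] };
    return body.models ?? [];
  } catch {
    return []; // no local Ollama instance: the provider simply isn't registered
  } finally {
    clearTimeout(timer);
  }
}
```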
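Capability detection can be pictured as a lookup keyed on the model name. The sketch below is a deliberately truncated heuristic, assuming a `detectCapabilities` helper and a few name patterns; the extension's actual table covers 40+ families and is not shown here.

```typescript
// Illustrative heuristic only. detectCapabilities, ModelCapabilities, and the
// patterns below are assumptions for this sketch, not the extension's real table.

interface ModelCapabilities {
  vision: boolean;
  reasoning: boolean;
  contextWindow: number; // tokens
}

function detectCapabilities(modelName: string): ModelCapabilities {
  const name = modelName.toLowerCase();
  return {
    vision: /llava|vision|moondream|minicpm-v/.test(name),
    reasoning: /deepseek-r1|qwq/.test(name),
    contextWindow: /llama3\.[12]|qwen2\.5|mistral-nemo/.test(name) ? 128_000 : 8_192,
  };
}
```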
Extension directory contents: scripts, src, package.json, tsconfig.json