All notable changes to the "vscode-pi-model-chat-provider" extension will be documented in this file.
- Initial release
- Language Model Chat Provider integration for Pi coding agent
- Multi-turn conversation persistence with session pooling
- Support for up to 20 concurrent chat sessions
- LRU eviction and idle timeout cleanup (10 minutes)
- Status bar showing context window utilization
- Dynamic model enumeration from Pi
- Streaming responses with tool execution display
- Cancellation support (stop button)
- Automatic context (environment, workspace) injection
- Session isolation per chat thread
- Context window tracking and display
- Graceful error handling for empty/image-only messages
- Model caching for fast startup
- Persistent metadata bridge for quick model enumeration
- Debug logging configurable via settings (disabled by default)
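The session pooling behavior above (up to 20 concurrent sessions, LRU eviction, 10-minute idle cleanup) can be sketched roughly as follows. This is an illustrative TypeScript sketch, not the extension's actual implementation; the names `SessionPool`, `acquire`, and `sweepIdle` are hypothetical.

```typescript
// Hypothetical sketch of a session pool with LRU eviction and
// idle-timeout cleanup. A Map preserves insertion order, so
// re-inserting on each use keeps the least recently used entry first.

interface PiSession {
  id: string;
  lastUsed: number; // epoch millis of most recent activity
}

class SessionPool {
  private sessions = new Map<string, PiSession>();

  constructor(
    private maxSessions = 20,              // cap from the changelog entry
    private idleTimeoutMs = 10 * 60 * 1000 // 10-minute idle timeout
  ) {}

  // Fetch or create the session for a chat thread, marking it as
  // most recently used. Evicts the LRU session when the pool is full.
  acquire(id: string, now = Date.now()): PiSession {
    let session = this.sessions.get(id);
    if (session) {
      this.sessions.delete(id); // re-insert below to refresh recency
      session.lastUsed = now;
    } else {
      if (this.sessions.size >= this.maxSessions) {
        const lru = this.sessions.keys().next().value;
        if (lru !== undefined) this.sessions.delete(lru);
      }
      session = { id, lastUsed: now };
    }
    this.sessions.set(id, session);
    return session;
  }

  // Remove sessions idle longer than the timeout; returns evicted ids.
  sweepIdle(now = Date.now()): string[] {
    const evicted: string[] = [];
    for (const [id, s] of this.sessions) {
      if (now - s.lastUsed > this.idleTimeoutMs) {
        this.sessions.delete(id);
        evicted.push(id);
      }
    }
    return evicted;
  }

  has(id: string): boolean {
    return this.sessions.has(id);
  }

  get size(): number {
    return this.sessions.size;
  }
}
```

In practice the sweep would run on a timer and each evicted session's underlying Pi process would be disposed.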
Configuration settings:

- `pi.binaryPath` - Path to Pi CLI binary
- `pi.workingDirectory` - Working directory for Pi agent
- `pi.autoRestart` - Auto-restart on crash
- `pi.maxRestartAttempts` - Max restart attempts
- `pi.additionalArgs` - Additional CLI arguments
- `pi.debug` - Enable debug logging (default: false)
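These settings go in the user or workspace `settings.json`. The values below are illustrative assumptions, not shipped defaults (except `pi.debug`, which the changelog states defaults to false):

```jsonc
{
  // Path to the Pi CLI binary (illustrative path)
  "pi.binaryPath": "/usr/local/bin/pi",
  // Working directory for the Pi agent (illustrative value)
  "pi.workingDirectory": "${workspaceFolder}",
  "pi.autoRestart": true,
  "pi.maxRestartAttempts": 3,
  "pi.additionalArgs": [],
  // Debug logging is disabled by default
  "pi.debug": false
}
```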