Releases: Josh-XT/OpticXT
v0.1.0
OpticXT v0.1.0 Release Notes 🎉
September 16, 2025
We're thrilled to announce OpticXT v0.1.0, a major update that revolutionizes how you deploy and use OpticXT! This release introduces groundbreaking remote model capabilities, a comprehensive REST API, and significant improvements to project architecture and documentation.
🌟 What's New
🌐 Remote Model Revolution
The biggest game-changer in this release! OpticXT now supports remote model inference via OpenAI-compatible APIs, enabling deployment on minimal hardware while maintaining full multimodal capabilities.
- 🔥 Run on Raspberry Pi Zero 2W: Deploy OpticXT on ultra-low-power hardware by offloading inference to remote providers
- 🎯 Multiple Provider Support: Seamlessly switch between OpenAI GPT-4o, Anthropic Claude, Groq, or your own self-hosted models
- 🖼️ Full Vision Support: Complete multimodal capabilities (text, image, video) work with remote models
- ⚙️ Same Interface: Zero code changes needed - just update your configuration to switch from local to remote inference
- 💪 Provider Flexibility: Configure multiple providers and failover options for maximum reliability
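As a sketch of what provider switching looks like, here is a config.toml fragment using the key names from the configuration example later in these notes; the self-hosted URL and model name are purely illustrative, not values from the release:

```toml
# Remote inference via an OpenAI-compatible endpoint.
# Key names match the config.toml example in "Getting Started" below.
[model]
use_remote = true

# Hosted provider (OpenAI shown; any OpenAI-compatible API works):
remote_api_url = "https://api.openai.com/v1/chat/completions"
remote_model_name = "gpt-4o"

# Or point at your own self-hosted OpenAI-compatible server instead
# (address is illustrative):
# remote_api_url = "http://192.168.1.50:8000/v1/chat/completions"
# remote_model_name = "my-local-model"

remote_api_key = "your-api-key"
```

Because the interface is unchanged, swapping providers is a matter of editing these values and restarting OpticXT.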
🚀 REST API Server Mode
Transform OpticXT into a powerful multimodal API server with our new REST API functionality:
- 📡 /v1/inference Endpoint: Process text, image, and video inputs via HTTP requests
- 📊 Real-time Status Updates: Track task progress and get live status information
- 🔧 Easy Integration: OpenAPI-compatible endpoints for seamless integration with existing systems
- 📖 Comprehensive Documentation: Detailed API documentation with examples in the new docs/API_DOCUMENTATION.md
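To illustrate how a client might call the new endpoint, here is a hedged example; the JSON field names (`input_type`, `prompt`) and the default port are assumptions for illustration only, so check docs/API_DOCUMENTATION.md for the authoritative request shape:

```shell
# Build the JSON request body first so it can be inspected before sending.
# Field names here are illustrative, not confirmed by these notes.
cat > /tmp/opticxt_request.json <<'EOF'
{
  "input_type": "text",
  "prompt": "Describe what the robot should do next."
}
EOF

# Send it to a locally running OpticXT API server
# (assumes the server was started with --api-server --port 8080):
# curl -X POST http://localhost:8080/v1/inference \
#      -H "Content-Type: application/json" \
#      -d @/tmp/opticxt_request.json

# Show the payload that would be sent:
cat /tmp/opticxt_request.json
```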
📚 Enhanced Documentation & Project Structure
We've completely overhauled the documentation and project organization:
- 🗂️ Clear Project Structure: New detailed section in README outlining code organization
- 📋 API Documentation: Brand new docs/API_DOCUMENTATION.md with complete API reference
- 🔧 CUDA Build Guide: Enhanced CUDA setup instructions for optimal GPU performance
- 🌐 Remote Model Guide: Step-by-step implementation guide for remote model setup
- ✅ Improved Testing: Reorganized test structure with clear instructions for unit, integration, and component testing
⚡ Configuration & Dependencies
- 📦 Modern Dependencies: Added support for HTTP requests, web server functionality, and multipart data handling
- ⚙️ Enhanced Config: Updated config.toml with comprehensive remote model configuration examples
- 🔄 Easy Switching: Simple configuration changes to toggle between local and remote inference modes
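To illustrate the toggle, a minimal sketch assuming the same `use_remote` key shown in the configuration example in the getting-started section:

```toml
[model]
# Flip this single flag to move between local GPU inference
# and a remote OpenAI-compatible provider.
use_remote = false
```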
🛠️ CLI & Development Improvements
- 🖥️ Extended CLI Options: New API server mode, benchmarking capabilities, and enhanced chat/video modes
- 🧪 Streamlined Testing: Standard Cargo test commands with organized test directory structure
- 📜 Example Scripts: Updated scripts and configuration examples for various deployment scenarios
🔧 Technical Improvements
Bug Fixes & Stability
- 🐛 Fixed test references: Resolved issues with test file organization and execution
- ✅ Status check improvements: Enhanced path resolution and status validation
- 🔄 Error handling: Better error recovery and reporting throughout the system
Performance Enhancements
- ⚡ Optimized HTTP handling: Efficient request/response processing for API mode
- 📊 Better resource management: Improved memory usage and cleanup in multimodal processing
- 🚀 Faster initialization: Reduced startup time for both local and remote model configurations
🚀 Getting Started with Remote Models
Ready to deploy OpticXT on minimal hardware? Here's how easy it is:
- Configure your provider in config.toml:

```toml
[model]
use_remote = true
remote_api_url = "https://api.openai.com/v1/chat/completions"
remote_api_key = "your-api-key"
remote_model_name = "gpt-4o"
```

- Run OpticXT - it works exactly the same way:

```bash
./target/release/opticxt --verbose
```

- Or start the API server:

```bash
./target/release/opticxt --api-server --port 8080
```

🎯 Impact
This release transforms OpticXT from a GPU-dependent robot control system into a flexible, scalable platform that can run anywhere - from Raspberry Pi Zero to high-end NVIDIA RTX systems. Whether you're prototyping on minimal hardware or deploying in production with local GPU acceleration, OpticXT v0.1.0 adapts to your needs.
📦 What's Next
- Enhanced provider integrations and failover mechanisms
- Extended API endpoints for robot-specific operations
- Advanced configuration management and deployment tools
- Community provider marketplace and sharing features
Ready to upgrade? Check out our updated README.md and API Documentation to get started with the new features!
Questions or feedback? Open an issue on GitHub or join our community discussions.
The OpticXT Team 🤖✨
Full Changelog: v0.0.1...v0.1.0