BigChat is a WinUI 3 chat application that provides a modern desktop interface for interacting with large language models. It is designed for experimentation, developer workflows, and local/cloud model usage on Windows.
- Native WinUI 3 desktop UI with a responsive chat experience.
- Flexible provider support: local Ollama models and OpenAI-compatible services.
- Conversation management with persistent, on-device storage (SQLite + EF Core).
- Fine-grained model controls: temperature, top-p, max output tokens, frequency/presence penalties, and restore defaults.
- Streaming responses and custom Markdown rendering for readable replies.
- Open-source and extensible — aimed at developers and advanced users.
- Modern WinUI 3 chat interface for creating and managing conversations.
- Multiple AI provider support:
  - Ollama (local model server via OllamaSharp)
  - OpenAI-compatible endpoints (use your API key)
- Local data store using Entity Framework Core with SQLite – conversations and messages are stored on-device (see the sketch after this list).
- Settings page to configure provider, endpoints, API keys, and model parameters.
- Custom Markdown rendering using a modified `MarkdownTextBlock` from the Community Toolkit.
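
Conceptually, the on-device store is a small EF Core model over a SQLite file. The following is a minimal sketch, assuming illustrative `Conversation`/`ChatMessage` entity names and a `bigchat.db` file name; the app's actual schema and context may differ.

```csharp
// Minimal sketch of an EF Core + SQLite store for conversations and messages.
// Entity names, properties, and the database file name are assumptions.
using System;
using System.Collections.Generic;
using Microsoft.EntityFrameworkCore;

public class Conversation
{
    public int Id { get; set; }
    public string Title { get; set; } = "";
    public List<ChatMessage> Messages { get; set; } = new();
}

public class ChatMessage
{
    public int Id { get; set; }
    public int ConversationId { get; set; }
    public string Role { get; set; } = "user";      // "user" or "assistant"
    public string Content { get; set; } = "";
    public DateTimeOffset CreatedAt { get; set; }
}

public class ChatDbContext : DbContext
{
    public DbSet<Conversation> Conversations => Set<Conversation>();
    public DbSet<ChatMessage> Messages => Set<ChatMessage>();

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlite("Data Source=bigchat.db"); // on-device SQLite file
}
```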
- Windows 11 or later.
- .NET 10 SDK.
- Visual Studio (recommended) with the WinUI 3 workload to build from source.
- For Ollama: install and run a local Ollama server before using Ollama models.
- For cloud providers: valid API key(s) and correct endpoint configuration in Settings.
- Clone the repository.
- Restore dependencies and build in Visual Studio or via the .NET CLI.
- Run the app (`BigChat.WinUI`).
- Open Settings and choose your AI provider. Configure endpoint, model ID, and API key if required.
- Start a new conversation and type your message in the input box. Responses stream back as they are generated.
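
To illustrate the streaming flow, here is a provider-agnostic sketch: a hypothetical `IChatProvider` interface yields chunks as they arrive, and the UI appends each one to the current reply. The actual OllamaSharp and OpenAI-compatible client calls behind it will differ.

```csharp
// Hypothetical streaming abstraction; interface and method names are assumptions,
// not the app's real API.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public interface IChatProvider
{
    // Yields response fragments in generation order.
    IAsyncEnumerable<string> StreamReplyAsync(string prompt);
}

public static class ChatLoop
{
    public static async Task RenderAsync(
        IChatProvider provider, string prompt, Action<string> appendToUi)
    {
        // Append each chunk as it arrives so the reply renders incrementally.
        await foreach (var chunk in provider.StreamReplyAsync(prompt))
        {
            appendToUi(chunk);
        }
    }
}
```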
Open Settings to select the active provider and set provider-specific values:
- Ollama: set the local endpoint (default `http://localhost:11434`) and choose a local model.
- OpenAI-compatible: set the endpoint, API key, and model ID.
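
As a rough picture of what the Settings page captures, here is a hypothetical settings shape; property names and defaults are assumptions, not the app's actual model.

```csharp
// Hypothetical settings shape covering provider selection, endpoint/credentials,
// and the model parameters exposed in Settings. Names and defaults are assumptions.
public enum ProviderKind { Ollama, OpenAICompatible }

public class ProviderSettings
{
    public ProviderKind Provider { get; set; } = ProviderKind.Ollama;
    public string Endpoint { get; set; } = "http://localhost:11434"; // Ollama default
    public string? ApiKey { get; set; }   // required for OpenAI-compatible endpoints
    public string ModelId { get; set; } = "";

    // Model parameters exposed in Settings (restore defaults resets these).
    public double Temperature { get; set; } = 0.7;
    public double TopP { get; set; } = 1.0;
    public int MaxOutputTokens { get; set; } = 1024;
    public double FrequencyPenalty { get; set; } = 0.0;
    public double PresencePenalty { get; set; } = 0.0;
}
```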
- Conversation history is stored locally in a SQLite database by default.
- Model inference is performed by the configured provider. If you use a remote provider (OpenAI or Azure), your data is sent to that service according to its policy. Review provider privacy terms before connecting.
- Experimental: the app relies on preview and alpha libraries and is intended for learning and experimentation.
Licensed under the MIT License. See LICENSE.txt for details.
- Built with WinUI 3, .NET 10, Entity Framework Core, and CommunityToolkit components.
Efficient, modern, and open.
