
cookie 🍪

🧪 Early Alpha — a work in progress with frequent updates and improvements. Feedback is welcome!

A lightweight, terminal-based chat client for LLMs, built in Rust. Chat with OpenAI’s ChatGPT or any provider directly from your terminal.

Snapshot

🛠️ Getting Started

Installation

Homebrew:

brew tap zzho325/tap
brew install zzho325/tap/cookie
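
After installation, the Homebrew formula should place a cookie binary on your PATH (an assumption based on the formula name), so you can launch the client with:

cookie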

Build from source:

git clone https://github.com/zzho325/cookie.git
cd cookie
cargo build --release

Configuration

Set your OpenAI API key in the environment:

export OPENAI_API_KEY=your_key_here
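
To avoid re-exporting the key in every new shell, you can append it to your shell profile (the ~/.zshrc path below is just an example; use whichever profile your shell reads):

echo 'export OPENAI_API_KEY=your_key_here' >> ~/.zshrc
source ~/.zshrc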

Usage

If you built from source, run the binary with:

./target/release/cookie
  • Type your prompt, press Enter to send.
  • i / Esc to toggle input mode, q to quit.
  • Ctrl + e to toggle the sidebar; j / k or Down / Up to navigate sessions, d to delete the selected session.
  • s to open model selection; j / k or Down / Up to select, Enter to save or Esc to cancel.
  • Tab to shift focus.
  • n to start a new session.
  • In the editor/messages: e to open an external editor based on the VISUAL or EDITOR environment variable (see the example after this list).
  • In messages: v to toggle line-based visual selection, y to copy the selection.
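
For example, to have the e binding open Neovim (assuming nvim is installed and on your PATH), launch the client with VISUAL set:

VISUAL=nvim ./target/release/cookie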

🛣️ Roadmap

🎯 Milestones

  • Chat UI/UX
    • [Input box] Soft wrap.
    • [Input box] Cursor navigation.
    • [Input box] Scrollable buffer.
    • [Chat messages] Render chat as markdown.
      • Fix color.
      • Fix unsupported syntax.
    • [Chat messages] Scroll.
    • [Chat messages] Cursor navigation.
      • Select range and copy.
    • Mouse event - scroll, navigation, and select range and copy.
    • Embed nvim.
      • [Input editor].
      • [Chat Messages].
  • App:
    • Indicate current focused widget.
    • Help popup.
    • Configurable key bindings.
    • Load config properly.
    • UI to update settings.
    • Error popup.
      • Separate out recoverable or irrecoverable errors.
    • Color theme.
  • Chat Engine:
    • Retain context across chats.
      • Maintain reasoning context.
    • Web: optional search and crawl.
    • Model selection.
    • Other LLM providers and provider selection.
    • Configurable system prompt.
    • Streaming.
    • Track token usage.
    • On shutdown, persist streaming message.
  • Session Management:
    • Sessions.
    • Persist sessions to db.
    • Global search.

⚠️ Limitations

  • Markdown rendering uses the tui-markdown crate, which currently supports only a subset of Markdown features.

License

Copyright (c) Ashley Zhou ashleyzhou62@gmail.com

This project is licensed under the MIT License (LICENSE or http://opensource.org/licenses/MIT).
