kyle-rader

Add tiktoken CLI tool for quick token counting 🚀

Motivation

While building LLM-powered tools and agents, I often need to quickly count the tokens in prompt files and candidate input data to estimate context usage (token cost 💰).

Changes

- Added CLI binary target with clap-based argument parsing
- Updated README with usage examples
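The stderr notice in the usage example suggests the binary falls back to stdin when no text argument is given. A hypothetical stdlib-only sketch of that input selection (`read_input` is an assumed name, not code from this PR):

```rust
use std::io::Read;

// Assumed behavior: prefer a positional text argument; otherwise read
// stdin, printing the notice to stderr so piped JSON output stays clean.
fn read_input(arg: Option<String>) -> std::io::Result<String> {
    match arg {
        Some(text) => Ok(text),
        None => {
            eprintln!("🔎 Reading from stdin...");
            let mut buf = String::new();
            std::io::stdin().read_to_string(&mut buf)?;
            Ok(buf)
        }
    }
}

fn main() {
    let text = read_input(std::env::args().nth(1)).expect("failed to read input");
    println!("{} bytes of input", text.len());
}
```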

Usage

```shell
# Count tokens and show JSON output format
$ bat README.md | tiktoken --json
🔎 Reading from stdin... # shown on stderr
{
  "token_count": 1478,
  "model": "gpt-4.1",
  "context_size": 1047576,
  "remaining_tokens": 1046098,
  "usage_percentage": 0.141
}
```

```shell
# Specific model
tiktoken --model gpt-4o "Your text here"

# List available models
tiktoken --list-models
```
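For reference, the derived JSON fields follow directly from the two inputs. A minimal Rust sketch of the assumed field semantics (not the PR's actual implementation):

```rust
// Tokens of context window left after the counted input.
fn remaining_tokens(token_count: u64, context_size: u64) -> u64 {
    context_size.saturating_sub(token_count)
}

// Percent of the context window consumed, rounded to three decimals.
fn usage_percentage(token_count: u64, context_size: u64) -> f64 {
    (token_count as f64 / context_size as f64 * 100.0 * 1000.0).round() / 1000.0
}

fn main() {
    // Numbers from the example output above (gpt-4.1 context window).
    let (count, ctx) = (1478u64, 1_047_576u64);
    println!(
        "remaining = {}, usage = {}%",
        remaining_tokens(count, ctx),
        usage_percentage(count, ctx)
    );
}
```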

Installation

```shell
cargo install tiktoken-rs
```

The CLI tool will be available as `tiktoken` after installation.
