A collection of tools and examples for working with Large Language Models (LLMs), focusing on Groq for fast inference and Opik for tracing and evaluation.
| File | Description |
|---|---|
| `app.py` | Basic Streamlit chatbot powered by the Groq API. |
| `app_with_opik.py` | Full chatbot with Opik tracing and an in-app evaluation suite. |
| `groq-inference.py` | Minimal script: one Groq completion and print. |
| `hf_router_chat.py` | OpenAI-compatible client calling the Hugging Face router API. |
| `run-opik.py` | Single LLM call instrumented with Opik `@track`. |
| `tests/eval.py` | Opik evaluation example: dataset + metrics + `evaluate()`. |
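As a hint at what `groq-inference.py`'s "one completion and print" pattern looks like, here is a hedged sketch. The model id and prompt are assumptions, not taken from the repo, and the Groq client is only imported when the script is actually run:

```python
# Hypothetical sketch of a single Groq completion; model id and prompt
# are placeholders, not values from groq-inference.py.
import os


def build_messages(prompt: str) -> list[dict]:
    """Build the chat-completions message list the Groq API expects."""
    return [{"role": "user", "content": prompt}]


if __name__ == "__main__":
    from groq import Groq  # third-party client, installed via `uv sync`

    client = Groq(api_key=os.environ["GROQ_API_KEY"])
    resp = client.chat.completions.create(
        model="llama-3.1-8b-instant",  # assumed model id
        messages=build_messages("Say hello in one sentence."),
    )
    print(resp.choices[0].message.content)
```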
- Python 3.12+
- uv (recommended) or pip
- Clone the repository:

  ```bash
  git clone <repository_url>
  cd <repository_name>
  ```

- Install dependencies with uv:

  ```bash
  uv sync
  ```

  Or with pip:

  ```bash
  pip install -r requirements.txt
  ```
- Environment variables: create a `.env` in the project root (see `.env.example` below). Required for most scripts:
  - `GROQ_API_KEY` - Groq API key
  - `OPIK_API_KEY`, `OPIK_WORKSPACE` - for Opik tracing/eval
  - `HF` - Hugging Face API key (only for `hf_router_chat.py`)
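A quick way to fail fast when a variable is missing is a small pre-flight check. This is a convenience sketch, not part of the repo's scripts; the variable names match the list above:

```python
# Pre-flight check for required environment variables (illustrative only).
import os

REQUIRED = ["GROQ_API_KEY"]                       # needed by most scripts
OPTIONAL = ["OPIK_API_KEY", "OPIK_WORKSPACE", "HF"]


def missing_vars(required=REQUIRED):
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]


if __name__ == "__main__":
    absent = missing_vars()
    if absent:
        raise SystemExit(f"Missing environment variables: {', '.join(absent)}")
    print("Environment OK")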
For Opik integration, run once:

```bash
opik configure
# or for local self-host:
opik configure --use-local True
```

- Basic chat (Groq only):

  ```bash
  streamlit run src/app.py
  ```
- Chat with Opik tracing and evaluation:

  ```bash
  streamlit run src/app_with_opik.py
  ```
Run other scripts from the project root with `python src/<script>.py` or `python tests/eval.py` after setting the environment variables.
```env
GROQ_API_KEY="your_groq_api_key"
HF="your_huggingface_api_key"
OPIK_API_KEY="your_opik_api_key"
OPIK_WORKSPACE="your_opik_workspace"
```

Do not commit `.env`; it is listed in `.gitignore`.
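If you prefer not to export these variables manually, a tiny loader can parse the simple `KEY="value"` lines above. In practice the `python-dotenv` package is the usual choice; this stdlib-only sketch is for illustration and handles only the flat format shown:

```python
# Minimal .env loader (illustrative; python-dotenv is the usual choice).
import os


def load_env(path=".env"):
    """Parse simple KEY=value lines and export them, without overriding
    variables that are already set in the environment."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, and malformed lines
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip().strip('"'))
```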