# LLM OSS Playground

A collection of tools and examples for working with Large Language Models (LLMs), focusing on Groq for fast inference and Opik for tracing and evaluation.

## Overview

| File | Description |
| --- | --- |
| `app.py` | Basic Streamlit chatbot powered by the Groq API. |
| `app_with_opik.py` | Full chatbot with Opik tracing and an in-app evaluation suite. |
| `groq-inference.py` | Minimal script: one Groq completion and print. |
| `hf_router_chat.py` | OpenAI-compatible client calling the Hugging Face router API. |
| `run-opik.py` | Single LLM call instrumented with Opik `@track`. |
| `tests/eval.py` | Opik evaluation example: dataset + metrics + `evaluate()`. |
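To give a feel for what `groq-inference.py` does, here is a minimal sketch of a single Groq chat completion. The model id and the exact prompt are assumptions for illustration; the call shape follows Groq's Python SDK.

```python
import os


def build_messages(prompt: str) -> list[dict]:
    """Assemble the messages payload for a single-turn chat."""
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": prompt},
    ]


def ask_groq(prompt: str) -> str:
    """One chat completion against the Groq API (requires GROQ_API_KEY)."""
    from groq import Groq  # imported here so the helper above works without the SDK

    client = Groq(api_key=os.environ["GROQ_API_KEY"])
    response = client.chat.completions.create(
        model="llama-3.1-8b-instant",  # assumed model id; check Groq's current list
        messages=build_messages(prompt),
    )
    return response.choices[0].message.content


if __name__ == "__main__" and os.environ.get("GROQ_API_KEY"):
    print(ask_groq("Explain fast inference in one sentence."))
```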

## Setup

### Prerequisites

- Python 3.12+
- uv (recommended) or pip

### Installation

1. Clone the repository:

   ```bash
   git clone <repository_url>
   cd <repository_name>
   ```

2. Install dependencies with uv:

   ```bash
   uv sync
   ```

   Or with pip:

   ```bash
   pip install -r requirements.txt
   ```

3. Environment variables: create a `.env` file in the project root (see the example below). Required for most scripts:

   - `GROQ_API_KEY`: Groq API key
   - `OPIK_API_KEY`, `OPIK_WORKSPACE`: for Opik tracing and evaluation
   - `HF`: Hugging Face API key (only for `hf_router_chat.py`)
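A script can fail with a confusing SDK error when a key is missing, so it can help to validate the environment up front. A small sketch of that check, using only the standard library (the variable names come from the list above; the helper itself is illustrative, not part of this repo):

```python
import os

# Variable names taken from this README's setup section.
REQUIRED = ("GROQ_API_KEY",)
OPTIONAL = ("OPIK_API_KEY", "OPIK_WORKSPACE", "HF")


def missing_keys(required=REQUIRED, env=None):
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in required if not env.get(name)]


if __name__ == "__main__":
    absent = missing_keys()
    if absent:
        raise SystemExit(f"Missing environment variables: {', '.join(absent)}")
```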

### Opik configuration

For Opik integration, run once:

```bash
opik configure
# or for local self-host: opik configure --use-local True
```
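Once configured, tracing a call is a one-decorator change, as in `run-opik.py`: Opik's `@track` records the decorated function's inputs and outputs as a trace. A sketch of that pattern, with a placeholder body standing in for the real Groq call (the fallback decorator is only there so the snippet runs without Opik installed):

```python
try:
    from opik import track  # records inputs/outputs of decorated calls as traces
except ImportError:
    # Fallback no-op decorator so this sketch runs without Opik installed.
    def track(fn):
        return fn


@track
def summarize(text: str) -> str:
    # Placeholder for the Groq completion run-opik.py would make here;
    # it just returns the first sentence.
    return text.split(".")[0] + "."


if __name__ == "__main__":
    print(summarize("Groq is fast. Opik traces it."))
```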

## Running the apps

- Basic chat (Groq only):

  ```bash
  streamlit run src/app.py
  ```

- Chat with Opik tracing and evaluation:

  ```bash
  streamlit run src/app_with_opik.py
  ```

Run other scripts from the project root with `python src/<script>.py` or `python tests/eval.py` after setting the environment variables.
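Conceptually, `tests/eval.py` pairs a dataset with scoring metrics and averages the results. A library-free sketch of that loop (the real Opik `evaluate()` API differs in names and reporting; the dataset shape and metric here are assumptions for illustration):

```python
def exact_match(output: str, reference: str) -> float:
    """1.0 if the model output matches the reference exactly, else 0.0."""
    return 1.0 if output.strip() == reference.strip() else 0.0


def run_eval(dataset, task, metric=exact_match):
    """Score task(item) against each item's expected output and average.

    `dataset` is a list of {"input": ..., "expected": ...} dicts;
    `task` maps an input string to a model output string.
    """
    scores = [metric(task(item["input"]), item["expected"]) for item in dataset]
    return sum(scores) / len(scores) if scores else 0.0
```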

## .env example

```bash
GROQ_API_KEY="your_groq_api_key"
HF="your_huggingface_api_key"
OPIK_API_KEY="your_opik_api_key"
OPIK_WORKSPACE="your_opik_workspace"
```

Do not commit `.env`; it is listed in `.gitignore`.
