This repository is a hands-on learning lab for LangChain Expression Language (LCEL) and a minimal LangGraph example. It provides:
- A comprehensive script of runnable LCEL examples grouped from foundation to advanced (`src/runnable_api.py`, `src/runnable.py`).
- An interactive CLI to run examples in order and view source + output (`src/runnable_cli.py`).
- A Chainlit chat UI that lets you pick a mode each turn, shows the example's source, and streams the live output (`src/chainlit_app.py`).
- `src/runnable_api.py`: Source of truth. Per-mode functions, public API (`MODES`, `run_mode`, `get_mode_source`, `get_next_suggested_mode`), and runtime key management.
- `src/runnable.py`: Thin CLI wrapper that delegates to `run_mode` (compatible with `--mode`).
- `src/runnable_cli.py`: Interactive terminal UI suggesting the next mode and showing source + streamed output.
- `src/chainlit_app.py`: Chainlit app that behaves like a chat with per-turn mode selection.
- `src/README_CLI.md`: Extra details for the CLI.
- Tests: all tests are under `src/` (e.g., `test_runnable.py`, `test_runnable_units.py`).
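To make the layout concrete, here is a minimal, self-contained sketch of how a registry like `MODES`, `run_mode`, and `get_next_suggested_mode` can fit together. The mode names and function bodies below are hypothetical stand-ins, not the repository's actual implementation:

```python
# Hypothetical stand-ins for the real per-mode example functions.
def _basic():
    return "prompt | model | parser"

def _streaming():
    return "chunk-by-chunk output"

# Ordered registry: drives both dispatch and the suggested learning path.
MODES = {"basic": _basic, "streaming": _streaming}

def run_mode(name):
    """Dispatch to the example function registered under `name`."""
    return MODES[name]()

def get_next_suggested_mode(current):
    """Return the mode after `current` in the learning path, or None at the end."""
    order = list(MODES)
    i = order.index(current)
    return order[i + 1] if i + 1 < len(order) else None

print(run_mode("basic"))                 # → prompt | model | parser
print(get_next_suggested_mode("basic"))  # → streaming
```

Keeping the ordering in a single dict is what lets the CLI and the Chainlit app share one definition of the learning path.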
- Python 3.9+
- A virtual environment is recommended.
```bash
# From repo root
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

```bash
# Create venv and install dependencies
make install

# (optional) Install dev tools
make dev
```

This project supports Anthropic and OpenAI backends via LangChain. Anthropic takes precedence when both are set.
Set keys in your shell or in a local .env file at the project root (loaded automatically):
```bash
# Quick setup using the provided template:
cp example.env .env
# Then edit .env and set your keys
```

```bash
# Or directly in your shell:
# Anthropic (preferred if present)
export ANTHROPIC_API_KEY=your_key

# OpenAI (used if Anthropic not set)
export OPENAI_API_KEY=your_key
```

When you need offline/testing mode (no external API calls):

```bash
export LCEL_TEST_MODE=1
```

In this mode the app uses a lightweight fake model that returns deterministic responses while preserving output shape.
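The selection order described above (test mode, then Anthropic, then OpenAI) can be sketched as a small pure function. This is an illustrative sketch of the precedence rules, not the repository's actual code; `select_backend` is a hypothetical name:

```python
import os

def select_backend(env=None):
    """Pick a backend per the documented precedence (illustrative sketch)."""
    env = os.environ if env is None else env
    if env.get("LCEL_TEST_MODE") == "1":
        return "fake"          # offline deterministic model, no API calls
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"     # preferred when both keys are set
    if env.get("OPENAI_API_KEY"):
        return "openai"
    raise RuntimeError("No API key set and LCEL_TEST_MODE not enabled")

# Anthropic wins when both keys are present:
print(select_backend({"ANTHROPIC_API_KEY": "a", "OPENAI_API_KEY": "b"}))  # → anthropic
```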
Start the chat UI that lets you pick any LCEL mode per turn. It shows the example source and streams execution output.
```bash
# Ensure the venv is active and dependencies installed
chainlit run src/chainlit_app.py -w
```

- Use the mode selector chips to choose a mode. The app suggests the next one in the learning path.
- You can set `OPENAI_API_KEY`/`ANTHROPIC_API_KEY` in the Chainlit settings panel at runtime; the backend reloads automatically.
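"Streaming" here means output is forwarded chunk by chunk instead of being buffered until the chain finishes. A minimal, framework-free sketch of that pattern (the helper names `fake_stream` and `render` are hypothetical, with no LangChain or Chainlit dependency):

```python
def fake_stream(text, chunk_size=4):
    """Yield text in small chunks, the way a chain's streaming call would
    (illustrative; stands in for a real model's token stream)."""
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

def render(stream):
    """Consume chunks as they arrive instead of waiting for the full answer."""
    out = []
    for chunk in stream:
        out.append(chunk)   # a real UI would display each chunk immediately
    return "".join(out)

print(render(fake_stream("Hello, LCEL!")))  # → Hello, LCEL!
```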
Interactive terminal experience with suggestions and progress tracking.
```bash
python src/runnable_cli.py
```

Run an individual example directly (same behavior as before the refactor):

```bash
python src/runnable.py --mode basic
```

```bash
# From repo root (venv activated)
pytest -q
```

- Mode order is defined once in `src/runnable_api.py` (`MODES`) and reused by the CLI and Chainlit.
- All non-streaming log lines are emitted as full lines (newline-prefixed and newline-suffixed) for consistent rendering; streaming keeps raw chunks.
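The full-line convention in the last bullet can be sketched as a tiny helper. `format_log` is a hypothetical name used for illustration, not the repository's actual function:

```python
def format_log(message, streaming=False):
    """Apply the rendering convention described above (illustrative sketch):
    wrap non-streaming messages in newlines so each renders as a full line;
    pass streaming chunks through untouched."""
    if streaming:
        return message            # raw chunk, no added newlines
    return f"\n{message}\n"       # newline-prefixed and newline-suffixed

print(repr(format_log("Running mode: basic")))  # → '\nRunning mode: basic\n'
```

Wrapping every non-streaming message this way keeps terminal and Chainlit rendering consistent regardless of what the previous line printed.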