A self-hosted AI agent for NOAA HPC that connects local LLMs to ocean data,
scientific documentation, and HPC workflows entirely within NOAA's network.
- Ocean data & analysis — Real-time water levels, hurricane tracks, storm surge forecasts, satellite data via ocean-mcp, plus Python code execution for plotting and analysis
- Code & documentation — RAG search over SCHISM/ADCIRC source code, NOAA tech memos, namelists, and NOS workflow configs
- HPC workflows — Slurm and PBS job diagnostics, ecFlow suite monitoring, UFS-Coastal experiment management, disk quotas, FairShare, threshold alerting
- Persistent & portable — Memory across sessions, CLI with slash commands, web UI, deployed on NOAA Ursa and TACC Vista
All self-hosted on Ollama with open-weight LLMs. No external APIs, no cloud dependencies.
```shell
git clone https://github.com/mansurjisan/coral.git
cd coral
pip install -e .
ollama pull qwen3:32b
coral chat --model qwen3:32b --mode multi
```

For HPC deployment, see NOAA Ursa setup or TACC Vista setup.
CORAL connects to 22 MCP servers providing 150+ tools across three categories:
- Ocean data (12 servers) — CO-OPS, NHC, STOFS, ERDDAP, OFS, GOES, USGS, NDBC, WW3, ADCIRC, SCHISM, Hurricane Recon via ocean-mcp
- HPC & workflow (6 servers) — Slurm, PBS (WCOSS2), ecFlow, UFS experiment runner, HPC system admin, NOS workflow configs
- Local tools (4 servers) — NetCDF queries, RAG documentation search, Python execution (sandboxed), threshold alerting
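CORAL's actual sandbox implementation isn't shown here, but the general technique behind the sandboxed Python execution tool — running untrusted code in a subprocess with an isolated interpreter, a stripped environment, and a wall-clock timeout — can be sketched as follows. `run_sandboxed` is a hypothetical helper for illustration, not CORAL's API:

```python
import subprocess
import sys
import tempfile

def run_sandboxed(code: str, timeout: float = 10.0) -> str:
    """Run untrusted Python in a subprocess with a wall-clock timeout.

    Hypothetical sketch only; a real sandbox (like CORAL's) would likely
    add resource limits, filesystem/network isolation, and audit logging.
    """
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run(
        [sys.executable, "-I", path],  # -I: isolated mode, ignores user site-packages
        capture_output=True,
        text=True,
        timeout=timeout,
        env={},  # child process gets an empty environment
    )
    return result.stdout
```

A `subprocess.TimeoutExpired` is raised if the code runs past the timeout, which the caller can surface back to the LLM as a tool error.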
```
╭──────────────────────── CORAL v0.1.0 ────────────────────────╮
│ 🪸 Model qwen3:32b │ Tools 150 │ Mode multi-agent │
│ 🖥️ User mansurjisan │ Host login2.vista.tacc.utexas.edu │
│ 🧠 3 memories loaded │
╰──── Coastal Ocean Research AI Layer · /help for commands ────╯
```
| Command | Description |
|---|---|
| `/help` | Show all commands |
| `/tools` | Tool count per section (DATA/CODE/WORKFLOW) |
| `/status` | Quick dashboard: Ollama health, running jobs, disk usage |
| `/memory` | Show saved memories |
| `/remember` | Save a preference (e.g. `/remember account = coastal`) |
| `/forget` | Remove a memory |
| `/save` | Export conversation to markdown |
| `/report` | Auto-generate HPC status report |
| `/techmemo` | Auto-generate NOAA tech memo draft |
| `/alert` | Set threshold alert (e.g. `/alert 8518750 > 1.5`) |
| `/watch` | Monitor a job (e.g. `/watch 9848988`) |
| `/branch` | Save conversation, start fresh |
| `/audit` | Show tool call history and stats |
| `/clear` | Clear conversation history |
| `@model` | Override model for one query (e.g. `@qwen3:8b What is SCHISM?`) |
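The `/alert` command above takes a station ID, a comparison operator, and a threshold. One way such a command could be parsed is sketched below; `parse_alert` and the `Alert` tuple are hypothetical illustrations, not CORAL's actual parser:

```python
import re
from typing import NamedTuple

class Alert(NamedTuple):
    station: str      # e.g. a CO-OPS station ID like 8518750
    op: str           # ">" or "<"
    threshold: float  # water level threshold

# Matches e.g. "/alert 8518750 > 1.5"
_ALERT_RE = re.compile(r"^/alert\s+(\S+)\s*([<>])\s*([\d.]+)$")

def parse_alert(command: str) -> Alert:
    """Parse a /alert slash command (illustrative sketch only)."""
    m = _ALERT_RE.match(command.strip())
    if m is None:
        raise ValueError(f"not a valid /alert command: {command!r}")
    station, op, threshold = m.groups()
    return Alert(station, op, float(threshold))
```

The parsed tuple could then be handed to the threshold-alerting MCP server, which polls the station and fires when the condition holds.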
- Audit logging — Every query and tool call recorded with timestamps, routing decisions, tool timing, confidence scores, and sandbox usage
- Policy manifest (`src/coral/policy_manifest.json`) — Defines which MCP servers each agent section can access, trust classes, and per-environment sandbox requirements
- Sandbox enforcement — Set `CORAL_REQUIRE_SANDBOX=1` to block host-side Python execution
- Tool caching — 5-minute TTL cache for read-only tools (HPC status, configs)
- Retry with backoff — Automatic retry on transient Ollama connection failures
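The tool-caching and retry behaviors listed above follow standard patterns: a time-to-live cache for read-only results, and exponential backoff on transient connection failures. A minimal sketch of both, with illustrative names rather than CORAL's internals:

```python
import functools
import time

def ttl_cache(ttl_seconds: float = 300.0):
    """Cache a read-only tool's result for ttl_seconds (5 minutes by default)."""
    def decorator(fn):
        cache = {}  # args -> (timestamp, result)
        @functools.wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = cache.get(args)
            if hit is not None and now - hit[0] < ttl_seconds:
                return hit[1]  # fresh cached result; skip the tool call
            result = fn(*args)
            cache[args] = (now, result)
            return result
        return wrapper
    return decorator

def retry_with_backoff(fn, attempts: int = 3, base_delay: float = 0.5):
    """Call fn, retrying on ConnectionError with exponentially growing delays."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries; propagate the failure
            time.sleep(base_delay * (2 ** attempt))
```

Caching only read-only tools (status queries, configs) keeps repeated dashboard refreshes cheap without risking stale results for mutating operations like job submission.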
- ocean-mcp — MCP servers for NOAA ocean data (18 servers)
- nos-workflow — NOS Unified Operational Forecast System workflow
- Ollama — Local LLM inference
- Model Context Protocol — Tool integration standard
If you use CORAL in your research or operations, please cite:
```bibtex
@software{jisan2025coral,
  author = {Jisan, Mansur},
  title  = {CORAL: Coastal Ocean Research AI Layer},
  year   = {2025},
  url    = {https://github.com/mansurjisan/coral},
  note   = {A self-hosted AI agent connecting local LLMs to NOAA ocean data via MCP}
}
```

Mansur Jisan — NOAA National Ocean Service
Apache 2.0