🪸 CORAL — Coastal Ocean Research AI Layer

A self-hosted AI agent for NOAA HPC that connects local LLMs to ocean data,
scientific documentation, and HPC workflows entirely within NOAA's network.

Python 3.10+ · Apache-2.0 · Ollama · MCP

What It Does

  • Ocean data & analysis — Real-time water levels, hurricane tracks, storm surge forecasts, satellite data via ocean-mcp, plus Python code execution for plotting and analysis
  • Code & documentation — RAG search over SCHISM/ADCIRC source code, NOAA tech memos, namelists, and NOS workflow configs
  • HPC workflows — Slurm and PBS job diagnostics, ecFlow suite monitoring, UFS-Coastal experiment management, disk quotas, FairShare, threshold alerting
  • Persistent & portable — Memory across sessions, CLI with slash commands, web UI, deployed on NOAA Ursa and TACC Vista

All self-hosted on Ollama with open-weight LLMs. No external APIs, no cloud dependencies.

Quick Start

git clone https://github.com/mansurjisan/coral.git
cd coral
pip install -e .
ollama pull qwen3:32b
coral chat --model qwen3:32b --mode multi

For HPC deployment, see NOAA Ursa setup or TACC Vista setup.

MCP Servers

CORAL connects to 22 MCP servers providing 150+ tools across three categories:

  • Ocean data (12 servers) — CO-OPS, NHC, STOFS, ERDDAP, OFS, GOES, USGS, NDBC, WW3, ADCIRC, SCHISM, Hurricane Recon via ocean-mcp
  • HPC & workflow (6 servers) — Slurm, PBS (WCOSS2), ecFlow, UFS experiment runner, HPC system admin, NOS workflow configs
  • Local tools (4 servers) — NetCDF queries, RAG documentation search, Python execution (sandboxed), threshold alerting

CLI Features

╭──────────────────────── CORAL v0.1.0 ────────────────────────╮
│  🪸 Model  qwen3:32b  │  Tools  150  │  Mode  multi-agent   │
│  🖥️  User   mansurjisan  │  Host   login2.vista.tacc.utexas.edu │
│  🧠 3 memories loaded                                        │
╰──── Coastal Ocean Research AI Layer · /help for commands ────╯
Command      Description
/help        Show all commands
/tools       Tool count per section (DATA/CODE/WORKFLOW)
/status      Quick dashboard: Ollama health, running jobs, disk usage
/memory      Show saved memories
/remember    Save a preference (e.g. /remember account = coastal)
/forget      Remove a memory
/save        Export conversation to markdown
/report      Auto-generate HPC status report
/techmemo    Auto-generate NOAA tech memo draft
/alert       Set threshold alert (e.g. /alert 8518750 > 1.5)
/watch       Monitor a job (e.g. /watch 9848988)
/branch      Save conversation, start fresh
/audit       Show tool call history and stats
/clear       Clear conversation history
@model       Override model for one query (e.g. @qwen3:8b What is SCHISM?)

Audit & Policy

  • Audit logging — Every query and tool call recorded with timestamps, routing decisions, tool timing, confidence scores, and sandbox usage
  • Policy manifest (src/coral/policy_manifest.json) — Defines which MCP servers each agent section can access, trust classes, and per-environment sandbox requirements
  • Sandbox enforcement — Set CORAL_REQUIRE_SANDBOX=1 to block host-side Python execution
  • Tool caching — 5-minute TTL cache for read-only tools (HPC status, configs)
  • Retry with backoff — Automatic retry on transient Ollama connection failures
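The caching and retry behaviors above can be sketched as two small helpers. These are illustrative implementations under assumed defaults (5-minute TTL, retry on connection errors with doubling delay), not CORAL's actual code.

```python
import time
from functools import wraps

def ttl_cache(ttl_seconds: float = 300.0):
    """Cache a read-only tool call's result for ttl_seconds (5 min default)."""
    def decorator(fn):
        cache: dict = {}
        @wraps(fn)
        def wrapper(*args):
            now = time.monotonic()
            hit = cache.get(args)
            if hit is not None and now - hit[0] < ttl_seconds:
                return hit[1]          # fresh cached result
            result = fn(*args)
            cache[args] = (now, result)
            return result
        return wrapper
    return decorator

def retry_with_backoff(fn, attempts: int = 3, base_delay: float = 0.5):
    """Call fn(), retrying on ConnectionError with exponentially growing delay."""
    for i in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if i == attempts - 1:
                raise                  # out of attempts; surface the error
            time.sleep(base_delay * (2 ** i))
```

Caching only read-only tools keeps repeated dashboard queries (e.g. /status) cheap without risking stale results for state-changing calls.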

Citation

If you use CORAL in your research or operations, please cite:

@software{jisan2025coral,
  author = {Jisan, Mansur},
  title = {CORAL: Coastal Ocean Research AI Layer},
  year = {2025},
  url = {https://github.com/mansurjisan/coral},
  note = {A self-hosted AI agent connecting local LLMs to NOAA ocean data via MCP}
}

Author

Mansur Jisan — NOAA National Ocean Service

License

Apache 2.0
