
Apps SDK Pizzaz Examples

A focused Apps SDK + MCP example centered on Pizzaz widgets.

Project layout

  • index.ts - MCP server entrypoint (SSE + tool wiring)
  • tools/ - MCP tool definitions (one file per tool)
  • utils/ - generic MCP server plumbing/helpers
  • ui/ - React widget source code
  • assets/ - built widget HTML/JS/CSS output
  • build-all.mts - widget production build script
  • vite.config.mts - local widget dev server config

Prerequisites

  • Node.js 18+
  • pnpm

Install

pnpm install

Optional local config:

cp .env.example .env
  • MCP_PORT controls the MCP server port (default 8000)
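
As a sketch of how that default behaves, the port can be resolved the same way in a shell: use MCP_PORT from the environment if set, otherwise fall back to 8000 (the exact fallback logic lives in the server code, so this is illustrative):

```shell
# Use MCP_PORT from the environment if set; otherwise default to 8000.
MCP_PORT="${MCP_PORT:-8000}"
echo "MCP server will listen on port ${MCP_PORT}"
```

Setting `MCP_PORT=9000` in `.env` (or the shell) would switch the server, and the endpoint URLs below, to port 9000.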

OpenAI Docs MCP

We recommend installing the OpenAI developer docs MCP server.

Server URL (streamable HTTP): https://developers.openai.com/mcp

codex mcp add openaiDeveloperDocs --url https://developers.openai.com/mcp
codex mcp list

Alternatively, configure it in ~/.codex/config.toml:

[mcp_servers.openaiDeveloperDocs]
url = "https://developers.openai.com/mcp"

Add this instruction to AGENTS guidance when working with OpenAI platform topics:

Always use the OpenAI developer documentation MCP server if you need to work with the OpenAI API, ChatGPT Apps SDK, Codex,… without me having to explicitly ask.

Smooth local dev

Run everything with one command:

pnpm run dev

This does:

  • initial asset build
  • UI rebuild watch on ui/**
  • MCP server watch/restart on backend changes
  • serves widget assets from MCP server at /assets/*
  • serves latest widget template HTML from disk on each resource read, so UI-only changes are reflected without restarting MCP

To expose MCP publicly, run ngrok separately:

pnpm run dev
# in another terminal:
ngrok http 8000

Common commands

Build widget assets:

pnpm run build

Run MCP server:

pnpm run mcp:start

Run MCP server in watch mode:

pnpm run mcp:dev

Run Vite UI dev server only:

pnpm run dev:vite

Optional separate static asset serving (not needed for normal MCP dev):

BASE_URL=http://localhost:4444 pnpm run build
pnpm run serve

MCP endpoints

Default (MCP_PORT=8000):

  • GET /mcp
  • POST /mcp/messages?sessionId=...
  • GET /assets/*
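
With the server running on the default port, the endpoint URLs can be derived like this (a sketch; the sessionId for the messages endpoint is issued during the SSE handshake, so it stays a placeholder here):

```shell
# Build the base URL from MCP_PORT, defaulting to 8000.
MCP_PORT="${MCP_PORT:-8000}"
BASE="http://localhost:${MCP_PORT}"

echo "${BASE}/mcp"          # SSE stream (GET)
echo "${BASE}/mcp/messages" # JSON-RPC messages (POST, requires ?sessionId=...)
echo "${BASE}/assets/"      # built widget assets (GET)

# With the server up, the SSE endpoint can be probed with:
#   curl -N "${BASE}/mcp"
```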

ChatGPT connector (local)

Run the MCP app locally:

pnpm run dev

In a separate terminal, expose it with ngrok:

ngrok http 8000

Then add this MCP URL in ChatGPT developer mode:

https://<your-tunnel-domain>/mcp

Deploy note

If assets are hosted elsewhere, override BASE_URL when building:

BASE_URL=https://your-server.com pnpm run build
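
BASE_URL ends up as the origin prefix on generated asset URLs; the resulting shape looks roughly like this (the widget filename below is hypothetical — the real names come from the build-all.mts output in assets/):

```shell
BASE_URL="https://your-server.com"
# Hypothetical built asset name, for illustration only.
echo "${BASE_URL}/assets/pizzaz-widget.html"
```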
