A modern template for agentic orchestration — built for rapid iteration and scalable deployment using highly customizable, community-supported tools like MCP, LangGraph, and more.
Visit the Github: [](https://github.com/NicholasGoh/fastapi-mcp-langgraph-template)

> [!NOTE]
> Read the docs with demo videos [here](https://nicholas-goh.com/docs/intro?ref=fastapi-mcp-langgraph-template). This repo will not contain demo videos.
- [](https://github.com/langfuse/langfuse) for LLM Observability and LLM Metrics
- [](https://github.com/prometheus/prometheus) for scraping Metrics
- [](https://github.com/grafana/grafana) for visualizing Metrics
- [](https://auth0.com/docs) SaaS for Authentication and Authorization with OIDC & JWT via OAuth 2.0
- CI/CD via Github Actions
- :dollar: Deploy live demo to [](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/AWS_Fargate.html)
- Provision with [](https://github.com/hashicorp/terraform) IaC
- Push built images to ECR and Dockerhub
## Architecture
This section outlines the architecture of the services, their interactions, and planned features.
### Inspector
The Inspector communicates with each MCP server via the SSE protocol, while each server adheres to the MCP specification.
### Reverse Proxy
The reverse proxy can be extended to route other services, such as a frontend and/or certain backend services that are self-hosted instead of cloud-hosted (e.g., Langfuse).
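As a sketch of such an extension, an Nginx server block might route extra services alongside the API. The upstream names `api`, `langfuse`, and `frontend` below are hypothetical Compose service names, not taken from this repository; adapt them to your own stack.

```nginx
server {
    listen 80;

    # Route API traffic to the FastAPI backend
    location /api/ {
        proxy_pass http://api:8000/;
    }

    # Example extension: a self-hosted Langfuse instead of its cloud offering
    location /langfuse/ {
        proxy_pass http://langfuse:3000/;
    }

    # Example extension: a frontend service
    location / {
        proxy_pass http://frontend:5173/;
    }
}
```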
The setup below runs the repository in both production and development environments.
Build the community YouTube MCP image with:
```bash
./community/youtube/build.sh
```
> [!TIP]
> Instead of cloning or submoduling the repository locally, then building the image, this script builds the Docker image inside a temporary Docker-in-Docker container. This approach avoids polluting your local environment with throwaway files by cleaning up everything once the container exits.
Then build the other images with:
[](https://www.star-history.com/#nicholasgoh/fastapi-mcp-langgraph-template&Date)
> [!NOTE]
> Click above to view the live-updating star history chart. As per their [article](https://www.star-history.com/blog/a-message-to-github-star-history-users):
>
> > Ongoing Broken Live Chart
> >
> > you can still use this website to view and download charts (though you may need to provide your own token).
> By configuring the OpenAI Integration, users can gain valuable insights into token usage rates, response times, and overall costs. This integration empowers users to make data-driven decisions while ensuring optimal utilization of OpenAI APIs.
Learn more [here](https://grafana.com/docs/grafana-cloud/monitor-infrastructure/integrations/integration-reference/integration-openai/)
At its core, a compiled LangGraph is a [Runnable](https://github.com/langchain-ai/langchain/blob/langchain%3D%3D0.3.6/libs/core/langchain_core/runnables/base.py#L108). This template utilizes LangChain’s built-in streaming support through [`astream_events`](https://python.langchain.com/docs/how_to/streaming/#using-stream-events), granting programmatic access to every stage of the Agentic Workflow. You can observe and interact with key components—LLM, prompt, and tool—throughout their full execution lifecycle: start, stream, and end. For a comprehensive list of event types and usage examples, refer to the [Event Reference](https://python.langchain.com/docs/how_to/streaming/#event-reference).
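As a hedged sketch of consuming that event stream: the dictionaries below mirror the event shape documented for `astream_events` (keys `"event"`, `"name"`, `"data"`), but the graph, model name, and sample chunks are placeholders rather than output from this template. The filtering helper is pure Python so it can be read (and run) on its own.

```python
# Hedged sketch: event dicts mirror the documented astream_events shape;
# names and chunk contents are illustrative placeholders.

def extract_llm_tokens(events):
    """Pull streamed LLM token chunks out of an astream_events stream."""
    tokens = []
    for event in events:
        if event["event"] == "on_chat_model_stream":
            tokens.append(event["data"]["chunk"])
    return tokens

# In the real workflow you would iterate asynchronously, roughly:
#
#   async for event in compiled_graph.astream_events(inputs, version="v2"):
#       ...
#
# Sample events in the documented shape:
sample_events = [
    {"event": "on_chat_model_start", "name": "llm", "data": {}},
    {"event": "on_chat_model_stream", "name": "llm", "data": {"chunk": "Hel"}},
    {"event": "on_chat_model_stream", "name": "llm", "data": {"chunk": "lo"}},
    {"event": "on_chat_model_end", "name": "llm", "data": {}},
]
print("".join(extract_llm_tokens(sample_events)))  # → Hello
```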
## Persistence
LangGraph offers built-in state management and persistence via the [AsyncPostgresSaver](https://github.com/langchain-ai/langgraph/blob/0.2.39/libs/checkpoint-postgres/langgraph/checkpoint/postgres/aio.py#L39), enabling faster iteration on agentic workflows. Since LLMs are inherently stateless, chat history must typically be injected as context for each query—but LangGraph abstracts this away, requiring only a `thread_id`. It seamlessly handles chat history and metadata serialization/deserialization, simplifying development. Learn more about its advanced persistence capabilities [here](https://langchain-ai.github.io/langgraph/concepts/persistence/).
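A minimal sketch of that `thread_id` plumbing: only the config-building helper below is runnable as-is; the checkpointer and graph calls are commented out since they require a running Postgres instance and the `langgraph` package, and the URI and thread name are placeholders.

```python
# Hedged sketch: only the thread_id config is shown live; the LangGraph
# calls in the comments need Postgres and the langgraph package.

def thread_config(thread_id: str) -> dict:
    """Build the config that routes a call to one persistent chat thread."""
    return {"configurable": {"thread_id": thread_id}}

# Real usage, roughly (per LangGraph's persistence docs):
#
#   from langgraph.checkpoint.postgres.aio import AsyncPostgresSaver
#
#   async with AsyncPostgresSaver.from_conn_string(DB_URI) as checkpointer:
#       graph = builder.compile(checkpointer=checkpointer)
#       await graph.ainvoke({"messages": [...]}, config=thread_config("user-42"))
#
# Reusing the same thread_id on a later call restores the prior chat history.
print(thread_config("user-42"))
```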
> MCP is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications. Just as USB-C provides a standardized way to connect your devices to various peripherals and accessories, MCP provides a standardized way to connect AI models to different data sources and tools.
Learn more [here](https://modelcontextprotocol.io/introduction).
## Inspector
Explore community and your custom MCP servers via Inspector at [http://localhost:6274](http://localhost:6274) in [Development](./quick-start#development).
Left Sidebar:

Explore the following tabs in the Top Navbar:
- `Resources`
- `Prompts`
- `Tools`
See demo videos to learn more.
## Community MCP Servers
Before building your own custom MCP servers, explore the growing list of hundreds of [community MCP servers](https://github.com/modelcontextprotocol/servers). With integrations spanning databases, cloud services, and web resources, the perfect fit might already exist.
### DBHub

Learn more [here](https://github.com/bytebase/dbhub).
Easily plug this MCP server into an LLM to allow it to:
- Perform read-only SQL query validation for secure operations
- Enable deterministic introspection of the DB
  - List schemas
  - List tables in schemas
  - Retrieve table structures
- Enrich user queries deterministically
  - Ground DB-related queries with DB schemas
  - Provide SQL templates for translating natural language to SQL
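To illustrate what read-only SQL validation can look like, here is a deliberately naive sketch. This is not DBHub's actual validator (a production check should use a real SQL parser); it only shows the idea of rejecting mutating statements before they reach the database.

```python
# Illustrative only: a naive read-only check, NOT DBHub's implementation.
READ_ONLY_PREFIXES = ("select", "show", "explain", "describe", "with")
FORBIDDEN_KEYWORDS = {
    "insert", "update", "delete", "drop", "alter",
    "truncate", "create", "grant",
}

def is_read_only(sql: str) -> bool:
    """Reject statements that could mutate the database."""
    normalized = sql.strip().lower().rstrip(";")
    if not normalized.startswith(READ_ONLY_PREFIXES):
        return False
    # Catch mutations smuggled into CTEs or compound statements.
    tokens = set(normalized.replace("(", " ").replace(")", " ").split())
    return not (tokens & FORBIDDEN_KEYWORDS)

print(is_read_only("SELECT * FROM users"))  # → True
print(is_read_only("DROP TABLE users"))     # → False
print(is_read_only("WITH x AS (DELETE FROM t RETURNING *) SELECT * FROM x"))  # → False
```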
Simply plug in this MCP server to enable the LLM to:
- Fetch transcripts from any YouTube URL on demand
Check out the [demo video](#video-demo) at the top.
## Custom MCP
Should you require a custom MCP server, a template is provided [here](https://github.com/NicholasGoh/fastapi-mcp-langgraph-template/blob/main/backend/shared_mcp/tools.py) for you to reference in development.
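As a hedged sketch of what a custom tool can look like: the tool body below is a plain function so it is easy to unit test without an MCP server, and the commented registration lines follow the MCP Python SDK's FastMCP API, which may differ from the linked template.

```python
# Hedged sketch of a custom MCP tool; the registration comments mirror the
# MCP Python SDK's FastMCP API and may differ from backend/shared_mcp/tools.py.

def word_count(text: str) -> int:
    """Count whitespace-separated words in a piece of text."""
    return len(text.split())

# Registration, roughly:
#
#   from mcp.server.fastmcp import FastMCP
#
#   mcp = FastMCP("custom-tools")
#   mcp.tool()(word_count)  # or decorate word_count with @mcp.tool()

print(word_count("Model Context Protocol standardizes tool access"))  # → 6
```

Keeping tool logic in plain functions, with registration as a thin wrapper, makes the tools testable in isolation.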