Commit 0078986

docs: add agent-api readme
1 parent 03f0cd5 commit 0078986

2 files changed: +99, -12 lines

packages/agent-api/README.md

Lines changed: 97 additions & 9 deletions
@@ -1,17 +1,105 @@
-# Azure Functions API
+<div align="center">
 
-This project uses [Azure Functions](https://learn.microsoft.com/azure/azure-functions/functions-overview?pivots=programming-language-javascript) as a serverless API, and [LangChain.js](https://js.langchain.com/) to implement the AI capabilities.
+# Agent API (Azure Functions + LangChain.js)
 
-## Available Scripts
+[![Open project in GitHub Codespaces](https://img.shields.io/badge/Codespaces-Open-blue?style=flat-square&logo=github)](https://codespaces.new/Azure-Samples/mcp-agent-langchainjs?hide_repo_select=true&ref=main&quickstart=true)
+![Node version](https://img.shields.io/badge/Node.js->=22-3c873a?style=flat-square)
+[![TypeScript](https://img.shields.io/badge/TypeScript-blue?style=flat-square&logo=typescript&logoColor=white)](https://www.typescriptlang.org)
+[![LangChain.js](https://img.shields.io/badge/LangChain.js-1C3C3C?style=flat-square&logo=langchain&logoColor=white)](https://js.langchain.com)
 
-In the project directory, you can run:
+[Overview](#overview) • [API Endpoints](#api-endpoints) • [Development](#development)
 
-### `npm start`
+</div>
 
-This command will start the API in dev mode, and you will be able to access it through the URL `http://localhost:7071/api/`.
+## Overview
 
-You can use the `api.http` file to test the API using the [REST Client](https://marketplace.visualstudio.com/items?itemName=humao.rest-client) extension for Visual Studio Code.
+The Agent API hosts the LangChain.js powered burger ordering agent. It:
 
-### `npm run build`
+- Streams chat completions with intermediate tool and LLM steps
+- Connects to the Burger MCP server (Streamable HTTP) to invoke tools
+- Persists chat history and session titles in Azure Cosmos DB for NoSQL
+- Derives user identity from Azure Static Web Apps authentication
+- Emits OpenTelemetry traces (to Azure Monitor if configured, otherwise a local OTLP exporter)
 
-To build the API for production to the `dist` folder.
+<div align="center">
+  <img src="../../docs/images/agent-architecture.drawio.png" alt="Architecture" />
+</div>
+
+## API Endpoints
+
+| Method | Path | Description |
+|--------|------|-------------|
+| GET | `/api/me` | Returns (and lazily creates) the internal hashed user id for the authenticated user |
+| GET | `/api/chats` | Lists all chat sessions for the current user |
+| GET | `/api/chats/{sessionId}` | Returns the messages for a specific session |
+| DELETE | `/api/chats/{sessionId}` | Deletes a chat session and its messages |
+| POST | `/api/chats/stream` | Streams an agent response for the provided messages; creates or updates the session |
+
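As a quick sketch of how a client might address these routes (the paths come from the table above and the local port from this README; the route map itself is hypothetical, not part of the project):

```typescript
// Hypothetical route map for the Agent API endpoints listed above.
// Only the paths are taken from the table; the helper names are illustrative.
const API_BASE = "http://localhost:7071/api"; // local Azure Functions default

const routes = {
  me: () => `${API_BASE}/me`,
  chats: () => `${API_BASE}/chats`,
  chat: (sessionId: string) => `${API_BASE}/chats/${encodeURIComponent(sessionId)}`,
  stream: () => `${API_BASE}/chats/stream`,
};

// Usage (requires the API to be running):
// const sessions = await fetch(routes.chats()).then((r) => r.json());
console.log(routes.chat("1234")); // http://localhost:7071/api/chats/1234
```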
+### Streaming format
+
+`POST /api/chats/stream` returns `application/x-ndjson`. Each line is a JSON object shaped like:
+
+```jsonc
+{
+  "delta": {
+    "content": "<partial text>",
+    "role": "assistant",
+    "context": { "currentStep": { ... } }
+  },
+  "context": { "sessionId": "<uuid>" }
+}
+```
+
+Tool and LLM steps surface in `context.currentStep` / `context.intermediateSteps`, enabling progressive UI rendering.
+
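A minimal sketch of consuming this NDJSON stream on the client side, assuming the chunk shape shown above (the parsing helper and types are hypothetical, not part of this project):

```typescript
// Parse a buffered NDJSON payload into chunk objects, skipping blank lines.
// The delta/context shape mirrors the streaming format described above.
interface StreamChunk {
  delta: { content?: string; role?: string; context?: unknown };
  context?: { sessionId?: string };
}

function parseNdjson(buffer: string): StreamChunk[] {
  return buffer
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as StreamChunk);
}

// Example: reassemble the assistant text from two streamed chunks.
const sample =
  '{"delta":{"content":"Hel","role":"assistant"},"context":{"sessionId":"abc"}}\n' +
  '{"delta":{"content":"lo","role":"assistant"},"context":{"sessionId":"abc"}}\n';
const text = parseNdjson(sample)
  .map((chunk) => chunk.delta.content ?? "")
  .join("");
console.log(text); // "Hello"
```

In a real client you would read the response body incrementally and keep any trailing partial line in the buffer until the next read completes it.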
+## Development
+
+### Getting started
+
+Follow the instructions [here](../../README.md#getting-started) to set up the development environment for the entire project.
+
+### Run the application
+
+Use the following command to run the application locally:
+
+```bash
+npm start
+```
+
+This command will start the Azure Functions application locally. You can test the endpoints by opening the `api.http` file and clicking **Send Request** on any request.
+
+The agent API needs the Burger MCP server (and the Burger API, if running everything locally) to be running as well. You can start all the services at once by running `npm start` at the root of the project.
+
+> [!NOTE]
+> If you have not deployed the Azure resources, the API will fall back to in-memory data, so you can test it without deploying anything to Azure.
+
+### Available Scripts
+
+| Script | Description |
+|--------|-------------|
+| `npm start` | Start the development server with hot reload |
+| `npm run build` | Build the TypeScript source |
+| `npm run clean` | Clean build artifacts |
+| `npm run update:local-settings` | Create or update the `local.settings.json` file needed by the Functions runtime |
+
+## Configuration
+
+The application uses environment variables for configuration:
+
+| Variable | Required | Purpose | Default / Fallback |
+|----------|----------|---------|--------------------|
+| `AZURE_OPENAI_API_ENDPOINT` | Yes | Azure OpenAI endpoint used for chat completions | None |
+| `AZURE_OPENAI_MODEL` | No | Model name passed to LangChain.js | `gpt-5-mini` |
+| `BURGER_MCP_URL` | Yes* | Streamable HTTP endpoint of the Burger MCP server | `http://localhost:3000/mcp` |
+| `AZURE_COSMOSDB_NOSQL_ENDPOINT` | No | Enables persistent chat history and session titles | In-memory fallback |
+| `APPLICATIONINSIGHTS_CONNECTION_STRING` | No | Sends traces to Azure Monitor | Local OTLP exporter |
+
+> <sup>*</sup> The code defaults `BURGER_MCP_URL` to the local dev port; production deployments must set it explicitly.
+
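As a sketch, the fallbacks in this table could be resolved like so (the variable names come from the table; the helper itself is illustrative, not the project's actual code):

```typescript
// Resolve configuration with the defaults/fallbacks described above.
// Variable names come from the table; this helper is illustrative only.
interface AgentConfig {
  azureOpenAiEndpoint?: string; // required in practice, no fallback
  model: string;
  burgerMcpUrl: string;
  cosmosEndpoint?: string; // undefined => in-memory storage
  appInsightsConnectionString?: string; // undefined => local OTLP exporter
}

function resolveConfig(env: Record<string, string | undefined>): AgentConfig {
  return {
    azureOpenAiEndpoint: env.AZURE_OPENAI_API_ENDPOINT,
    model: env.AZURE_OPENAI_MODEL ?? "gpt-5-mini",
    burgerMcpUrl: env.BURGER_MCP_URL ?? "http://localhost:3000/mcp",
    cosmosEndpoint: env.AZURE_COSMOSDB_NOSQL_ENDPOINT,
    appInsightsConnectionString: env.APPLICATIONINSIGHTS_CONNECTION_STRING,
  };
}

const config = resolveConfig({ AZURE_OPENAI_API_ENDPOINT: "https://example.openai.azure.com" });
console.log(config.model); // "gpt-5-mini"
console.log(config.burgerMcpUrl); // "http://localhost:3000/mcp"
```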
+### Local OpenTelemetry traces
+
+When running locally, spans are exported to `http://localhost:4318/v1/traces`, so you can capture them by running a local OpenTelemetry collector.
+
+For example, you can use the [AI Toolkit VSCode extension](https://marketplace.visualstudio.com/items?itemName=ms-windows-ai-studio.windows-ai-studio), which includes a local OpenTelemetry collector.
+
+After installing the extension, open the **AI Toolkit** panel in the sidebar, go to the **Tracing** tool under **Agent and workflow tools**, and select **Start Collector**. When calling the agent API, you should see traces appear in the panel. You can then inspect each trace in detail: the one named `LangGraph` contains the full sequence of LLM calls and tool invocations, showing how the agent arrived at its final response.

packages/burger-api/README.md

Lines changed: 2 additions & 3 deletions
@@ -88,18 +88,17 @@ Follow the instructions [here](../../README.md#getting-started) to set up the de
 
 ### Run the application
 
-You can run the following command to run the application locally:
+Use the following command to run the application locally:
 
 ```bash
 npm start
 ```
 
-This command will start the Azure Functions application locally. You can test the application by opening the file `api.http` and click on **Send Request** to test the endpoints.
+This command will start the Azure Functions application locally. You can test the endpoints by opening the `api.http` file and clicking **Send Request** on any request.
 
 > [!NOTE]
 > If you have not deployed the Azure resources, it will fall back to in-memory data. You can test the API without deploying it to Azure.
 
-
 ### Available Scripts
 
 | Script | Description |
