Commit fb90322

lara-openai and sabduragimov-openai authored and committed
Improve FastAPI section narratives
1 parent f34766b commit fb90322

File tree: 1 file changed, +3 −3 lines


examples/mcp/databricks_mcp_cookbook.ipynb

Lines changed: 3 additions & 3 deletions
@@ -454,8 +454,6 @@
     "cell_type": "markdown",
     "metadata": {},
     "source": [
-     "The [api_server.py](https://github.com/openai/openai-cookbook/blob/main/examples/mcp/building-a-supply-chain-copilot-with-agent-sdk-and-databricks-mcp/api_server.py) is a FastAPI backend that exposes your agent as a streaming /chat API endpoint. At startup it configures CORS so a local front-end can talk to it, then defines `build_mcp_servers()`, which authenticates to the caller’s Databricks workspace, constructs two HTTP “server tools” (one for vector search, one for Unity-Catalog functions), and pre-connects them for low-latency use. Each incoming POST to /chat contains a single user message. The handler spins up a fresh Agent whose mcp_servers list is populated by those streaming tools and whose model is forced to call a tool for every turn. \n",
-     "\n",
      "To kick off the backend (Fast API), run the following command: "
     ]
   },
@@ -472,7 +470,9 @@
     "cell_type": "markdown",
     "metadata": {},
     "source": [
-     "The API will be available at http://localhost:8000 (for FastAPI docs go to: http://localhost:8000/docs). "
+     "The API will be available at http://localhost:8000 (for FastAPI docs go to: http://localhost:8000/docs). \n",
+     "\n",
+     "The [api_server.py](https://github.com/openai/openai-cookbook/blob/main/examples/mcp/building-a-supply-chain-copilot-with-agent-sdk-and-databricks-mcp/api_server.py) is a FastAPI backend that exposes your agent as a streaming /chat API endpoint. At startup it configures CORS so a local front-end can talk to it, then defines `build_mcp_servers()`, which authenticates to the caller’s Databricks workspace, constructs two HTTP “server tools” (one for vector search, one for Unity-Catalog functions), and pre-connects them for low-latency use. Each incoming POST to /chat contains a single user message. The handler spins up a fresh Agent whose mcp_servers list is populated by those streaming tools and whose model is forced to call a tool for every turn. "
     ]
   },
   {
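For readers skimming the diff, here is a minimal sketch of the pattern the moved paragraph describes: a FastAPI app that configures CORS, pre-connects two streamable-HTTP MCP servers in a build_mcp_servers() helper, and serves a streaming POST /chat endpoint with a fresh Agent per request. It assumes the openai-agents SDK with MCP support; the Databricks endpoint paths, environment variable names, front-end origin, agent name, and instructions below are illustrative placeholders, not the values in the cookbook's actual api_server.py.

import os
from contextlib import asynccontextmanager

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

from agents import Agent, ModelSettings, Runner
from agents.mcp import MCPServerStreamableHttp
from openai.types.responses import ResponseTextDeltaEvent

MCP_SERVERS: list[MCPServerStreamableHttp] = []


async def build_mcp_servers() -> list[MCPServerStreamableHttp]:
    # Authenticate to the workspace with a bearer token, then point one MCP
    # server at vector search and one at Unity Catalog functions. The URL
    # paths and env var names here are assumptions for this sketch.
    host = os.environ["DATABRICKS_HOST"]
    headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
    servers = [
        MCPServerStreamableHttp(
            params={"url": f"{host}/api/2.0/mcp/vector-search/main/supply_chain",
                    "headers": headers}),
        MCPServerStreamableHttp(
            params={"url": f"{host}/api/2.0/mcp/functions/main/supply_chain",
                    "headers": headers}),
    ]
    for server in servers:
        await server.connect()  # pre-connect so the first /chat call is fast
    return servers


@asynccontextmanager
async def lifespan(app: FastAPI):
    MCP_SERVERS.extend(await build_mcp_servers())
    yield
    for server in MCP_SERVERS:
        await server.cleanup()


app = FastAPI(lifespan=lifespan)

# CORS so a local front-end on another port can talk to this API.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:3000"],  # assumed front-end origin
    allow_methods=["*"],
    allow_headers=["*"],
)


class ChatRequest(BaseModel):
    message: str  # each POST to /chat carries a single user message


@app.post("/chat")
async def chat(req: ChatRequest) -> StreamingResponse:
    # A fresh Agent per request; tool_choice="required" forces the model to
    # call a tool on every turn, matching the behavior the cell describes.
    agent = Agent(
        name="Supply Chain Copilot",  # placeholder name
        instructions="Answer using the Databricks tools.",  # placeholder
        mcp_servers=MCP_SERVERS,
        model_settings=ModelSettings(tool_choice="required"),
    )
    result = Runner.run_streamed(agent, input=req.message)

    async def token_stream():
        # Forward raw text deltas to the client as they arrive.
        async for event in result.stream_events():
            if event.type == "raw_response_event" and isinstance(
                event.data, ResponseTextDeltaEvent
            ):
                yield event.data.delta

    return StreamingResponse(token_stream(), media_type="text/plain")

A server like this would typically be started with something like uvicorn api_server:app --port 8000, which matches the http://localhost:8000 address mentioned above; the notebook supplies its own launch command in the cell that follows.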
