
Commit f13e698

Update README.md
1 parent: 5f14829

2 files changed: +8 -37 lines changed

examples/mcp/building-a-supply-chain-copilot-with-agent-sdk-and-databricks-mcp/README.md

Lines changed: 2 additions & 22 deletions
@@ -11,36 +11,22 @@ A full-stack, Databricks-themed conversational assistant for supply chain queries
 - Example agent logic and tool usage
 - Modern UX, easy local development
 
----
-
-## Project Structure
-```
-/ (root)
-├── main.py            # Example CLI agent runner
-├── api_server.py      # FastAPI backend for chat UI
-├── requirements.txt   # Python dependencies
-├── ui/                # React frontend (Vite)
-│   ├── src/components/ChatUI.jsx, ChatUI.css, ...
-│   └── ...
-└── README.md          # (this file)
-```
-
----
 
 ## Quickstart
 
 ### 0. Databricks assets
 
 You can kick start your project with Databricks’ Supply-Chain Optimization Solution Accelerator (or any other accelerator if working in a different industry). Clone this accelerator’s GitHub repo into your Databricks workspace and run the bundled notebooks, starting with notebook 1:
 
-https://github.com/lararachidi/agent-supply-chain/blob/main/README.md
+https://github.com/lara-openai/databricks-supply-chain
 
 These notebooks stand up every asset the Agent will later reach via MCP, from raw enterprise tables and unstructured e-mails to classical ML models and graph workloads.
 
 ### 1. Prerequisites
 - Python 3.10+
 - Node.js 18+
 - Databricks credentials in `~/.databrickscfg`
+- OpenAI API key
 - (Optional) Virtualenv/pyenv for Python isolation
 
 ### 2. Install Python Dependencies
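
As a quick aid for the prerequisites listed in this hunk (Databricks credentials in `~/.databrickscfg` plus the newly added OpenAI API key), here is a minimal pre-flight check. It is not part of the repository; the profile name `DEFAULT` and the idea of running it before `api_server.py` are assumptions for illustration only.

```python
# Hypothetical pre-flight check -- not part of the repo. It verifies the two
# credentials listed under "Prerequisites": an OpenAI API key in the
# environment and a Databricks profile in ~/.databrickscfg (the profile name
# "DEFAULT" is assumed here).
import os
import configparser
from pathlib import Path


def check_credentials(profile: str = "DEFAULT") -> None:
    # The OpenAI Agents SDK reads the key from the environment.
    if not os.environ.get("OPENAI_API_KEY"):
        raise RuntimeError("OPENAI_API_KEY is not set")

    # Databricks credentials live in ~/.databrickscfg (see the cookbook notebook).
    cfg_path = Path.home() / ".databrickscfg"
    cfg = configparser.ConfigParser()
    cfg.read(cfg_path)
    section = cfg[profile] if profile in cfg else {}
    if not section.get("host") or not section.get("token"):
        raise RuntimeError(f"[{profile}] with host/token not found in {cfg_path}")


if __name__ == "__main__":
    check_credentials()
    print("Credentials found; continue with the install and run steps below.")
```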
@@ -87,11 +73,5 @@ npm run dev
 - To update backend agent logic, modify `api_server.py`.
 - UI styling is in `ui/src/components/ChatUI.css` (Databricks red palette).
 
----
 
-## Credits & References
-- Inspired by [Databricks Supply Chain Solution Accelerator](https://www.databricks.com/solutions/accelerators/supply-chain-distribution-optimization)
-- Uses [openai-agents-python](https://github.com/openai/openai-agents-python)
-- Databricks MCP integration via [databricks.sdk](https://github.com/databricks/databricks-sdk-py)
-- Supply-chain scope enforced by a simple LLM guardrail (see `supply_chain_guardrails.py`)
 

examples/mcp/databricks_mcp_cookbook.ipynb

Lines changed: 6 additions & 15 deletions
@@ -87,19 +87,14 @@
     "To create this Databricks configuration profile file, run the [Databricks CLI](https://docs.databricks.com/aws/en/dev-tools/cli/) `databricks configure` command, or follow these steps:\n",
     "- If `~/.databrickscfg` is missing, create it: touch `~/.databrickscfg`\n",
     "- Open the file: `nano ~/.databrickscfg`\n",
-    "- Insert a profile section that lists the workspace URL and personal-access token (PAT) (additional profiles can be added at any time):"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "# Example profile section to add in your Databricks config profile file \n",
+    "- Insert a profile section that lists the workspace URL and personal-access token (PAT) (additional profiles can be added at any time):\n",
+    "\n",
+    "\n",
+    "```bash\n",
     "[DEFAULT]\n",
     "host = https://dbc-a1b2345c-d6e7.cloud.databricks.com # add your workspace URL here\n",
-    "token = dapi123... # add your PAT here"
+    "token = dapi123... # add your PAT here\n",
+    "```"
    ]
   },
   {
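
As an optional follow-up to the profile configured above, the snippet below confirms the credentials actually authenticate. It is not part of the notebook; it assumes the `databricks-sdk` Python package is installed (the README credits `databricks.sdk` for the MCP integration) and that the profile is named `DEFAULT`.

```python
# Optional sanity check (not in the notebook): confirm that the DEFAULT
# profile configured above can authenticate against the workspace.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(profile="DEFAULT")  # reads host/token from ~/.databrickscfg
print(w.current_user.me().user_name)    # prints the user the PAT belongs to
```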
@@ -140,10 +135,6 @@
     "\n",
     "Alternatively, you can accelerate your setup by using a tailored version of the Databricks’ Supply Chain Optimization Solution Accelerator. To do so, you can clone this GitHub [repository](https://github.com/lara-openai/databricks-supply-chain) into your Databricks workspace and follow the instructions in the README [file](https://github.com/lara-openai/databricks-supply-chain/blob/main/README.md). Running the solution will stand up every asset the Agent will later reach via MCP, from raw enterprise tables and unstructured e-mails to classical ML models and graph workloads. \n",
     "\n",
-    "```\n",
-    "git clone https://github.com/lara-openai/databricks-supply-chain.git\n",
-    "```\n",
-    "\n",
     "If you prefer to use your own datasets and models, make sure to wrap relevant components as Unity Catalog functions and define a Vector Search index as shown in the accelerator. You can also expose Genie Spaces.\n",
     "\n",
     "The sample data mirrors a realistic pharma network: three plants manufacture 30 products, ship them to five distribution centers, and each distribution center serves 30-60 wholesalers. The repo ships time-series demand for every product-wholesaler pair, a distribution center-to-wholesaler mapping, a plant-to-distribution center cost matrix, plant output caps, and an e-mail archive flagging shipment delays. \n",
