Merged

Changes from 5 commits
4 changes: 3 additions & 1 deletion .gitignore
@@ -124,4 +124,6 @@ gunicorn.conf.py
# ignore version file
src/databricks/labs/mcp/_version.py

.ruff_cache/
.ruff_cache/
.build/
.databricks
20 changes: 20 additions & 0 deletions README.md
@@ -13,6 +13,7 @@ Table of Contents
- [Usage](#usage)
- [Supported tools](#supported-tools)
- [Developer Tools Server](#developer-tools-server)
- [Deploying the MCP server on Databricks Apps](#deploying-the-mcp-server-on-databricks-apps)
- [Support](#support)
- [Contributing](#contributing)

@@ -76,6 +77,24 @@ the following tools:

This server is currently under construction. It is not yet usable, but contributions are welcome!

## Deploying the MCP server on Databricks Apps
Contributor: @renardeinside sorry I totally missed this, thanks for doing this! cc @aravind-segu. Would you mind moving this up into the Unity Catalog MCP server README section? Since the devtools server doesn't include any code just yet

Contributor Author: Let's keep everything in one readme. I'll later refactor it into proper docs via Docusaurus, same as for DQX. Having separate files is quite inconvenient.

Contributor (@smurching, May 13, 2025): Oh I meant just move this section up - currently it looks like it's under or associated with

> Developer Tools Server
>
> This server is currently under construction. It is not yet usable, but contributions are welcome!

But it only works for the UC server

Contributor: Definitely +1 to a single README

Contributor: I can help make that edit


This server can be deployed on Databricks Apps. To do so, follow the instructions below:
1. Move into the project directory:
```bash
cd /path/to/this/repo
```

2. Deploy the app:
```bash
databricks bundle deploy -p <name-of-your-profile>
```

3. Run the app:
```bash
databricks bundle run mcp-on-apps -p <name-of-your-profile>
```
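
Once the app reports as running, an MCP client can connect to it over the SSE endpoint it exposes at `/sse`. Below is a minimal client sketch using the `mcp` Python SDK; the app URL and bearer token are placeholders (Databricks Apps sit behind workspace authentication, so you would supply a token valid for your workspace).

```python
# Minimal client sketch (not part of this repo): connects to the deployed app's
# /sse endpoint and lists the available Unity Catalog tools.
# APP_URL and TOKEN are placeholders -- substitute your app's URL and a token
# that is valid for your workspace.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

APP_URL = "https://<your-app>.databricksapps.com/sse"  # hypothetical URL
TOKEN = "<oauth-token>"  # hypothetical token


async def main() -> None:
    headers = {"Authorization": f"Bearer {TOKEN}"}
    async with sse_client(APP_URL, headers=headers) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```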

## Support
Please note that all projects in the `databrickslabs` GitHub organization are provided for your exploration only, and are not formally supported by Databricks with Service Level Agreements (SLAs). They are provided AS-IS and we do not make any guarantees of any kind. Please do not submit a support ticket relating to any issues arising from the use of these projects.

@@ -85,3 +104,4 @@ Any issues discovered through the use of this project should be filed as GitHub

We welcome contributions :) - see [CONTRIBUTING.md](./CONTRIBUTING.md) for details. Please make sure to read this guide before
submitting pull requests, to ensure your contribution has the best chance of being accepted.

26 changes: 26 additions & 0 deletions databricks.yml
@@ -0,0 +1,26 @@
bundle:
Contributor: We should move this into the UC server directory, to enable separately deploying individual servers, I can help with that

  name: mcp-on-apps

sync:
  include:
    - .build

artifacts:
  default:
    type: whl
    path: .
    build: uv build --wheel

resources:
  apps:
    mcp-on-apps:
      name: "mcp-on-apps"
      description: "MCP Server on Databricks Apps"
      source_code_path: ./.build
      config:
        command: ["unitycatalog-mcp-app"]

targets:
  dev:
    mode: development
    default: true
49 changes: 49 additions & 0 deletions hooks/apps_build.py
@@ -0,0 +1,49 @@
from typing import Any
from hatchling.builders.hooks.plugin.interface import BuildHookInterface
from pathlib import Path
import shutil


class AppsBuildHook(BuildHookInterface):
"""Hook to create a Databricks Apps-compatible build.

This hook is used to create a Databricks Apps-compatible build of the project.

The following steps are performed:
- Remove the ./.build folder if it exists.
- Copy the artifact_path to the ./.build folder.
- Write the name of the artifact to a requirements.txt file in the ./.build folder.
- The resulting build directory is printed to the console.

"""

def finalize(
self, version: str, build_data: dict[str, Any], artifact_path: str
) -> None:
self.app.display_info(
f"Running Databricks Apps build hook for project {self.metadata.name} in directory {Path.cwd()}"
)
# remove the ./.build folder if it exists
build_dir = Path(".build")
self.app.display_info(f"Resulting build directory: {build_dir.absolute()}")

if build_dir.exists():
self.app.display_info(f"Removing {build_dir}")
shutil.rmtree(build_dir)
self.app.display_info(f"Removed {build_dir}")
else:
self.app.display_info(f"{build_dir} does not exist, skipping removal")

# copy the artifact_path to the ./.build folder
build_dir.mkdir(exist_ok=True)
self.app.display_info(f"Copying {artifact_path} to {build_dir}")
shutil.copy(artifact_path, build_dir)

# write the name of the artifact to a requirements.txt file in the ./.build folder
requirements_file = build_dir / "requirements.txt"

requirements_file.write_text(Path(artifact_path).name, encoding="utf-8")

self.app.display_info(
f"Apps-compatible build written to {build_dir.absolute()}"
)
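
For illustration, here is a hypothetical sanity check of the layout the hook produces after `uv build --wheel` (paths follow the hook above; this script is not part of the project):

```python
# Hypothetical post-build check: verifies that .build/ contains the wheel and
# that requirements.txt names it, mirroring what AppsBuildHook.finalize writes.
from pathlib import Path

build_dir = Path(".build")
wheels = sorted(build_dir.glob("*.whl"))
requirements = (build_dir / "requirements.txt").read_text(encoding="utf-8").strip()

assert wheels, "expected the built wheel to be copied into .build/"
assert requirements == wheels[0].name, "requirements.txt should name the copied wheel"
print(f"OK: .build/ contains {wheels[0].name}")
```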
5 changes: 5 additions & 0 deletions pyproject.toml
@@ -24,10 +24,15 @@ dev-dependencies = [
"ruff>=0.9.4",
"pytest>=8.3.4",
"isort>=6.0.1",
"hatchling>=1.27.0",
]

[tool.hatch.build.hooks.custom]
path = "hooks/apps_build.py"

[project.scripts]
unitycatalog-mcp = "databricks.labs.mcp.servers.unity_catalog:main"
unitycatalog-mcp-app = "databricks.labs.mcp.servers.unity_catalog.app:start_app"

[build-system]
requires = ["hatchling", "hatch-fancy-pypi-readme", "hatch-vcs"]
65 changes: 65 additions & 0 deletions src/databricks/labs/mcp/servers/unity_catalog/app.py
@@ -0,0 +1,65 @@
from mcp.server import NotificationOptions, Server
Contributor: Nice, any reason not to just replace the existing stdio-based server implementation with this one? We can do it in a follow-up PR, but just curious

Contributor: Eventually seems like we'd just have one streamable-HTTP-based implementation (once more clients support streamable HTTP)

Contributor: Also, depending on how much work it is, it's actually now possible to use streamable HTTP to write the server, as of four days ago! https://github.com/modelcontextprotocol/python-sdk/releases/tag/v1.8.0

Contributor Author: I'll update my version then, should work easily.

Contributor Author: Done, updated!

from mcp.types import Tool as ToolSpec
from mcp.server.sse import SseServerTransport
import uvicorn
from databricks.labs.mcp.servers.unity_catalog.tools import (
    Content,
)
from starlette.applications import Starlette
from starlette.routing import Mount, Route
from databricks.labs.mcp.servers.unity_catalog.cli import get_settings

from databricks.labs.mcp._version import __version__ as VERSION
from databricks.labs.mcp.servers.unity_catalog.server import get_tools_dict


server = Server(name="mcp-unitycatalog", version=VERSION)
tools_dict = get_tools_dict(settings=get_settings())


@server.list_tools()
async def list_tools() -> list[ToolSpec]:
    return [tool.tool_spec for tool in tools_dict.values()]


@server.call_tool()
async def call_tool(name: str, arguments: dict) -> list[Content]:
    tool = tools_dict[name]
    return tool.execute(**arguments)


sse = SseServerTransport("/messages/")


# Define handler functions
async def handle_sse(request):
    async with sse.connect_sse(
        request.scope, request.receive, request._send
    ) as streams:
        await server.run(
            streams[0],
            streams[1],
            server.create_initialization_options(
                notification_options=NotificationOptions(
                    resources_changed=True, tools_changed=True
                )
            ),
        )


# Create Starlette routes for SSE and message handling
routes = [
    Route("/sse", endpoint=handle_sse),
    Mount("/messages/", app=sse.handle_post_message),
]

# Create and run Starlette app
app = Starlette(routes=routes)


def start_app():
    uvicorn.run(app, host="0.0.0.0", port=8000)


if __name__ == "__main__":
    start_app()
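
Relatedly, the review thread above discusses switching from SSE to the streamable HTTP transport available in `mcp` >= 1.8.0. A rough, untested sketch of that variant follows; it reuses the `server` object defined above, and the module path, class name, and constructor arguments are assumptions based on the SDK's streamable-HTTP example, so verify them against the installed version.

```python
# Hypothetical streamable-HTTP variant of the app above (assumes mcp >= 1.8.0).
# Module/class names follow the SDK's streamable-HTTP example and should be
# checked against the installed version before use.
import contextlib

from starlette.applications import Starlette
from starlette.routing import Mount
from mcp.server.streamable_http_manager import StreamableHTTPSessionManager

# Wrap the existing low-level `server` in a session manager for streamable HTTP.
session_manager = StreamableHTTPSessionManager(app=server, event_store=None)


async def handle_streamable_http(scope, receive, send) -> None:
    await session_manager.handle_request(scope, receive, send)


@contextlib.asynccontextmanager
async def lifespan(app: Starlette):
    # The session manager owns the background tasks for open sessions.
    async with session_manager.run():
        yield


streamable_app = Starlette(
    routes=[Mount("/mcp", app=handle_streamable_http)],
    lifespan=lifespan,
)
```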
27 changes: 27 additions & 0 deletions uv.lock

Some generated files are not rendered by default.