Important
This feature is in Beta.
Host your own custom or third-party MCP servers as Databricks apps. Custom MCP servers are useful if you already have an MCP server you want to deploy or if you want to run a third-party MCP server as a source of tools.
Requirements
- An MCP server hosted as a Databricks app must implement an HTTP-compatible transport, such as the streamable HTTP transport.
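For example, a minimal Python MCP server built with the FastMCP class from the official mcp SDK can expose tools over the streamable HTTP transport. This is only a sketch, not the reference implementation from the example repo; the server name, tool, and host/port settings are placeholders:

from mcp.server.fastmcp import FastMCP

# Placeholder server name; Databricks apps expect the server to listen on port 8000 by default
mcp = FastMCP("my-custom-mcp-server", host="0.0.0.0", port=8000)

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

if __name__ == "__main__":
    # Serve over the streamable HTTP transport so the server can be hosted as a Databricks app
    mcp.run(transport="streamable-http")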
Host an MCP server as a Databricks app
See the custom MCP server repo for an example of writing your own MCP server and deploying it as a Databricks app.
To host an existing Python MCP server as a Databricks app, follow these steps:
Set up your environment
Before deploying your MCP server, authenticate to your workspace using OAuth.
Run the following in a local terminal:
databricks auth login --host https://<your-workspace-hostname>
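To confirm the OAuth profile works, you can optionally query your identity with the Databricks SDK. This is a quick sketch assuming the databricks-sdk package is installed locally; replace DEFAULT with the profile you created:

from databricks.sdk import WorkspaceClient

# Uses the OAuth token cached by `databricks auth login` for this profile
w = WorkspaceClient(profile="DEFAULT")
print(w.current_user.me().user_name)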
Set up the MCP server
1. Add a requirements.txt to the MCP server's root directory and specify Python dependencies for the server.
   If the server uses uv for package management, add uv to requirements.txt and it handles installing additional dependencies.
2. Add an app.yaml specifying the CLI command to run the MCP server.
   By default, Databricks apps listen on port 8000. If the server listens on a different port, set it using an environment variable override in the app.yaml file.
   Example app.yaml:
   command: [
     'uv',
     'run',
     'your-server-name',
     ...,
     # optionally include additional parameters here
   ]
Deploy the MCP server as a Databricks app
1. Create a Databricks app to host the MCP server:
   databricks apps create mcp-my-custom-server
2. Upload the source code to Databricks and deploy the app by running the following commands from the directory containing your app.yaml file:
   DATABRICKS_USERNAME=$(databricks current-user me | jq -r .userName)
   databricks sync . "/Users/$DATABRICKS_USERNAME/mcp-my-custom-server"
   databricks apps deploy mcp-my-custom-server --source-code-path "/Workspace/Users/$DATABRICKS_USERNAME/mcp-my-custom-server"
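After deployment, you need the app's URL to connect to the MCP server. One way to look it up is with the Databricks SDK, sketched below under the assumption that your databricks-sdk version includes the Apps API; the app name matches the one created above:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# Look up the deployed app and print its base URL
app = w.apps.get(name="mcp-my-custom-server")
print(app.url)

You can also find the URL on the app's page in the workspace UI.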
Connect to the custom MCP server
The following sections show how to connect to the custom MCP server from various environments.
Local environment
Authenticate to your workspace using OAuth as described in Set up your environment.
The following example shows how to connect to the custom MCP server and list available tools:
import asyncio

from databricks_mcp import DatabricksOAuthClientProvider
from databricks.sdk import WorkspaceClient
from mcp.client.session import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Use the Databricks CLI profile you authenticated with OAuth
databricks_cli_profile = "DEFAULT"
workspace_client = WorkspaceClient(profile=databricks_cli_profile)

# Replace with your actual custom MCP server URL
mcp_server_url = "https://<workspace-hostname>/serving-endpoints/mcp-my-custom-server/invocations"

async def test_connection_to_server():
    async with streamablehttp_client(
        mcp_server_url, auth=DatabricksOAuthClientProvider(workspace_client)
    ) as (read_stream, write_stream, _), ClientSession(
        read_stream, write_stream
    ) as session:
        # Complete the MCP initialization handshake before sending requests
        await session.initialize()
        # List available tools
        tools = await session.list_tools()
        print(f"Available tools: {[tool.name for tool in tools.tools]}")

asyncio.run(test_connection_to_server())
Notebook (service principal)
Use a service principal to access the hosted Databricks app in a Databricks notebook.
from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksOAuthClientProvider
from mcp.client.streamable_http import streamablehttp_client as connect
from mcp import ClientSession

async def main():
    # Replace with your deployed app's URL
    app_url = "https://mcp-server.databricksapps.com/mcp"

    # Authenticate as a service principal using its OAuth client ID and secret
    client = WorkspaceClient(
        host="https://workspace_url",
        client_id="client_id",
        client_secret="client_secret"
    )

    async with connect(app_url, auth=DatabricksOAuthClientProvider(client)) as (
        read_stream,
        write_stream,
        _,
    ):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(f"Available tools: {[tool.name for tool in tools.tools]}")

await main()
Agent code (on-behalf-of-user)
Set up on-behalf-of-user authorization. See On-behalf-of-user authentication.
The following example uses ModelServingUserCredentials to enable on-behalf-of-user access to the hosted Databricks app from an agent:
from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksOAuthClientProvider
from databricks.sdk.credentials_provider import ModelServingUserCredentials
from mcp.client.streamable_http import streamablehttp_client as connect
from mcp import ClientSession

async def main():
    # Replace with your deployed app's URL
    app_url = "https://mcp-server.databricksapps.com/mcp"

    # Use the invoking user's credentials (on-behalf-of-user authorization)
    client = WorkspaceClient(credentials_strategy=ModelServingUserCredentials())

    async with connect(app_url, auth=DatabricksOAuthClientProvider(client)) as (
        read_stream,
        write_stream,
        _,
    ):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(f"Available tools: {[tool.name for tool in tools.tools]}")

await main()
When you log the agent model, include the apps.apps scope. See On-behalf-of-user authentication.
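For example, the agent can be logged with an authorization policy that requests the apps.apps scope. The following is a minimal sketch that assumes MLflow's AuthPolicy and UserAuthPolicy classes from mlflow.models.auth_policy; the model name and agent file path are placeholders, and the exact parameters can vary by MLflow version, so verify them against On-behalf-of-user authentication:

import mlflow
from mlflow.models.auth_policy import AuthPolicy, UserAuthPolicy

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        name="mcp_agent",            # placeholder model name
        python_model="agent.py",     # placeholder path to your agent code
        # Request the apps.apps scope so the agent can call the hosted app
        # on behalf of the invoking user
        auth_policy=AuthPolicy(
            user_auth_policy=UserAuthPolicy(api_scopes=["apps.apps"])
        ),
    )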
Agent code (service principal)
Use system authentication with a service principal to access the hosted Databricks app from an agent:
from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksOAuthClientProvider
from mcp.client.streamable_http import streamablehttp_client as connect
from mcp import ClientSession

async def main():
    # Replace with your deployed app's URL
    app_url = "https://mcp-server.databricksapps.com/mcp"

    # With automatic authentication passthrough, the default WorkspaceClient
    # picks up the system credentials configured for the agent's endpoint
    client = WorkspaceClient()

    async with connect(app_url, auth=DatabricksOAuthClientProvider(client)) as (
        read_stream,
        write_stream,
        _,
    ):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(f"Available tools: {[tool.name for tool in tools.tools]}")

await main()
Log the agent model using DatabricksApps as a resource. See Automatic authentication passthrough.
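As a sketch, logging with the app declared as a resource might look like the following. The DatabricksApp resource class from mlflow.models.resources and its app_name parameter are assumptions to verify against Automatic authentication passthrough for your MLflow version; the model name and agent file path are placeholders:

import mlflow
from mlflow.models.resources import DatabricksApp  # assumed class name; verify for your MLflow version

with mlflow.start_run():
    mlflow.pyfunc.log_model(
        name="mcp_agent",            # placeholder model name
        python_model="agent.py",     # placeholder path to your agent code
        # Declaring the hosted MCP server app lets Model Serving provision
        # service principal credentials for it automatically
        resources=[DatabricksApp(app_name="mcp-my-custom-server")],
    )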
Example notebooks: Build an agent with Databricks MCP servers
The following notebooks show how to author LangGraph and OpenAI agents that call MCP tools with custom MCP servers hosted on Databricks apps.
LangGraph MCP tool-calling agent
OpenAI MCP tool-calling agent
Additional resources
The Databricks Apps Cookbook provides end-to-end code examples for integrating MCP servers with different frameworks. For complete source code and additional examples, see the Databricks Apps Cookbook repository.