Important
This feature is in Beta.
Connect Databricks to external Model Context Protocol (MCP) servers to give your agents access to a wider range of tools. External MCP servers are third-party MCP servers hosted outside of Databricks that you connect to through Databricks-managed proxies.
Authentication and security
Databricks uses managed MCP proxies and Unity Catalog HTTP connections to securely handle authentication to your workspace:
- Secure token management: Databricks handles all OAuth flows and token refresh automatically
- No exposed credentials: Tokens are never exposed to end users
- Centralized authentication: Consistent authentication patterns using Unity Catalog connections
Requirements
- A workspace with the Managed MCP Servers preview enabled. See Manage Azure Databricks previews.
- To connect to external MCP servers, you must use a Unity Catalog HTTP connection. You must have CREATE CONNECTION privileges on the Unity Catalog metastore attached to the workspace. See Create a Unity Catalog HTTP connection for more details.
Create a Unity Catalog HTTP connection
To connect to an external MCP server, you must create a Unity Catalog HTTP connection to the server.
- Create an HTTP connection. See HTTP connection.
- When creating the HTTP connection, select the Is mcp connection checkbox to enable MCP functionality.
- Verify that the connection URL points to a valid MCP server endpoint.
After the UC connection is created, Databricks automatically provisions a proxy endpoint that acts as a Databricks managed MCP server. This endpoint securely proxies traffic to the external MCP server, managing token injection and renewal.
Proxy endpoint URLs follow this format:
https://<workspace-hostname>/api/2.0/mcp/external/{connection_name}
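As a quick sanity check, the proxy URL can be assembled from the workspace hostname and the connection name. The helper below is an illustrative sketch, not a Databricks API; the hostname is a placeholder and `github_u2m_connection` is the example connection name used later on this page:

```python
# Illustrative helper: assemble the managed proxy URL for an external MCP
# connection. The hostname used below is a placeholder, not a real workspace.

def external_mcp_url(workspace_host: str, connection_name: str) -> str:
    """Return the Databricks-managed proxy URL for a Unity Catalog HTTP connection."""
    return f"{workspace_host.rstrip('/')}/api/2.0/mcp/external/{connection_name}"

print(external_mcp_url("https://adb-1234567890123456.7.azuredatabricks.net",
                       "github_u2m_connection"))
```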
Connect with AI Playground
You can test external MCP servers directly in AI Playground without writing any code:
- Navigate to the AI Playground in your Databricks workspace.
- Choose a model with the Tools enabled label.
- Click Tools > + Add tool and select MCP Servers from the available tool options.
- In the MCP Servers section, select External MCP servers to browse available external connections.
- Choose the Unity Catalog HTTP connection you created earlier (for example, github_u2m_connection).
- Chat with the LLM to test how it interacts with your external MCP server tools. AI Playground automatically discovers available tools from your external MCP server and makes them available to the LLM.
This allows you to quickly prototype and test external MCP server integrations before building full agents or deploying to production.
Connect to external MCP servers
Connect to an external MCP server using the proxy URL. The Databricks proxy makes external servers behave like managed MCP servers.
Databricks MCP Client
The recommended approach treats external MCP servers as Databricks managed servers by adding the proxy endpoint to your MANAGED_MCP_SERVER_URLS list.
from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksMCPClient
# Initialize workspace client
workspace_client = WorkspaceClient()
host = workspace_client.config.host
# External MCP servers are proxied as managed servers, allowing you
# to use the same API for both managed and external servers
MANAGED_MCP_SERVER_URLS = [
    f"{host}/api/2.0/mcp/functions/system/ai",  # Default managed MCP
    f"{host}/api/2.0/mcp/external/github_u2m_connection",  # External MCP proxy
]
To use the external MCP server in an agent, pass the proxy URL to the managed_server_urls parameter:
# Use with agents - external servers work just like managed ones
import asyncio
from your_agent_code import create_mcp_tools # Your agent's tool creation function
# Create tools from both managed and external (proxied) servers
mcp_tools = asyncio.run(
    create_mcp_tools(
        ws=workspace_client,
        managed_server_urls=MANAGED_MCP_SERVER_URLS
    )
)
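`create_mcp_tools` is your own helper, and its exact shape depends on your agent framework. As one hedged sketch of what such a helper might do (the function names and spec layout below are illustrative, not a Databricks API; only `DatabricksMCPClient` comes from the example above), it can wrap each tool exposed by a server as a plain callable:

```python
from typing import Any

# Hypothetical sketch of a `create_mcp_tools`-style helper. The dict layout
# and helper names are illustrative choices for this example.

def tools_from_client(mcp_client) -> list[dict[str, Any]]:
    """Wrap each MCP tool as a dict holding its name, description, and a callable."""
    specs = []
    for tool in mcp_client.list_tools():
        # Bind the tool name and client via default arguments so each
        # callable invokes the correct tool.
        def call(arguments: dict, _name=tool.name, _client=mcp_client):
            return _client.call_tool(_name, arguments)
        specs.append({
            "name": tool.name,
            "description": getattr(tool, "description", "") or "",
            "call": call,
        })
    return specs

async def create_mcp_tools(ws, managed_server_urls):
    from databricks_mcp import DatabricksMCPClient
    specs = []
    for url in managed_server_urls:
        client = DatabricksMCPClient(server_url=url, workspace_client=ws)
        specs.extend(tools_from_client(client))
    return specs
```

Your framework would then convert each spec into its native tool type (for example, a LangChain tool or an OpenAI function definition).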
You can also call the tool directly using the Databricks MCP Client:
# Direct tool call using DatabricksMCPClient
mcp_client = DatabricksMCPClient(
    server_url=f"{host}/api/2.0/mcp/external/github_u2m_connection",
    workspace_client=workspace_client
)
# List available tools
tools = mcp_client.list_tools()
print(f"Available tools: {[tool.name for tool in tools]}")
# Call a tool
response = mcp_client.call_tool(
    "list_commits",
    {"owner": "mlflow", "repo": "mlflow", "sha": "master"}
)
print(response.content[0].text)
Standard MCP SDK
Connect to external MCP servers using the standard MCP SDK with async/await:
%pip install -U databricks-sdk databricks_mcp tabulate databricks_ai_bridge
%restart_python
import json
from databricks.sdk import WorkspaceClient
from databricks_mcp import DatabricksOAuthClientProvider
from mcp.client.streamable_http import streamablehttp_client as connect
from mcp import ClientSession
from tabulate import tabulate
async def main():
    app_url = "https://<workspace-hostname>/api/2.0/mcp/external/github_u2m_connection"
    client = WorkspaceClient()

    async with connect(app_url, auth=DatabricksOAuthClientProvider(client)) as (
        read_stream,
        write_stream,
        _,
    ):
        async with ClientSession(read_stream, write_stream) as session:
            init = await session.initialize()
            print(json.dumps(init.model_dump(), indent=2))

            tools = await session.list_tools()
            print(json.dumps(tools.model_dump(), indent=2))

            arguments = {
                "owner": "mlflow",
                "repo": "mlflow",
                "sha": "master"
            }
            response = await session.call_tool(name="list_commits", arguments=arguments)
            data = json.loads(response.content[0].text)

            rows = []
            for commit in data:
                author = commit.get("commit", {}).get("author", {}).get("name")
                message = commit.get("commit", {}).get("message", "").split("\n")[0]
                html_url = commit.get("html_url", "")
                rows.append([author, message, html_url])

            # Print the commits as a table
            print(tabulate(rows, headers=["Author", "Message", "Commit URL"], tablefmt="github"))

await main()
Example notebooks: Build an agent with Databricks MCP servers
The following notebooks show how to author LangGraph and OpenAI agents that call MCP tools, including external MCP servers accessed through Databricks proxy endpoints.
LangGraph MCP tool-calling agent
OpenAI MCP tool-calling agent
Limitations
- Private connectivity to resources in your VPC using Private Link is not supported. Contact your support team if you need this functionality.
- Databricks supports only external MCP servers that use the Streamable HTTP transport mechanism.
Next steps
- Use managed MCP servers alongside external servers
- Connect clients to MCP servers for accessing Databricks MCP infrastructure
- Build custom MCP servers for your organization's needs
- Deploy agents that use external MCP servers