Azure AI Foundry support for locally hosted MCP servers?

Gabriel Mousa 0 Reputation points Microsoft Employee
2025-10-02T22:08:54.48+00:00

Does AI Foundry have any plans to support agents using locally hosted MCP servers? I understand that there is currently a private preview of MCP server support for AI Foundry agents, but it covers only remotely hosted servers. If I want to use the Azure MCP server or the ADO MCP server, I simply put the node command and arguments into my MCP user configuration, and the server runs locally on the client side. Does AI Foundry currently support agents using these locally hosted servers somehow, or does it plan to in the future? And if it doesn't, is there a recommended approach for doing so today?

Azure AI services
A group of Azure services, SDKs, and APIs designed to make apps more intelligent, engaging, and discoverable.

1 answer

  1. Nikhil Jha (Accenture International Limited) 2,220 Reputation points Microsoft External Staff Moderator
    2025-10-03T08:21:17.89+00:00

    Hello Gabriel Mousa,

    I understand your question regarding Azure AI Foundry support for locally hosted MCP servers: your goal is to connect agents to an MCP server instance running locally on your client machine for development purposes.

    The architecture of cloud-based services like Azure AI Foundry agents requires a stable, network-accessible endpoint (a URI) to communicate reliably with external tools such as an MCP server.
    A server running on localhost on a developer's machine is generally not reachable from the Azure cloud environment because of NAT, firewalls, and non-static IP addresses. This is the primary reason the integration focuses on remotely hosted, reachable servers.

    1) Workaround for Development & Testing:
    To enable AI Foundry agents to communicate with your local MCP server for development, you can use a tunneling service to expose your local port to the internet with a public URL. A common tool for this is ngrok.

    • Start your local MCP server on a specific port (e.g., port 3000).
    • Install ngrok and run the following command in a new terminal: ngrok http 3000
    • ngrok will provide a public HTTPS forwarding URL (e.g., https://random-string.ngrok-free.app).
    • You can now use this public URL as the endpoint for your AI Foundry agent configuration. The agent will send requests to this URL, and ngrok will securely tunnel them to your local server.
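
    The steps above can be sketched as a shell session. Note that the server start command, file name, and port are placeholders (assumptions) — substitute whatever actually launches your MCP server:

    ```shell
    # 1. Start your local MCP server on a known port.
    #    (Placeholder command — use your server's real entry point.)
    node my-mcp-server.js --port 3000 &

    # 2. In a second terminal, tunnel that port to a public HTTPS URL.
    ngrok http 3000

    # 3. ngrok prints a forwarding line such as:
    #    Forwarding  https://random-string.ngrok-free.app -> http://localhost:3000
    #    Use that HTTPS URL as the MCP server endpoint in your agent configuration.
    ```

    If you need the public URL programmatically (for example in a setup script), the ngrok agent also exposes it through its local inspection API at http://127.0.0.1:4040/api/tunnels while the tunnel is running.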

    For reference documentation:

    Note: The references shared are only for your help; Microsoft doesn't support them.
    This approach is highly recommended for development and proof-of-concept scenarios only and is not suitable for production workloads due to stability and security considerations.

    2) Approach for Staging/Production:
    For a stable and secure setup, the recommended approach is to host your MCP server on an Azure resource. This provides a reliable endpoint that AI Foundry can integrate with seamlessly. Consider the following services:

    • Azure App Service: A fully managed platform ideal for hosting web applications and APIs.
    • Azure Container Instances (ACI): To quickly run your MCP server in a container without managing infrastructure.
    • Azure Virtual Machine (VM): For full control over the hosting environment.
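
    As a sketch of the ACI option, the Azure CLI can run a containerized MCP server with a public, DNS-addressable endpoint. All resource names, the region, and the container image below are hypothetical — replace them with your own registry image that serves the MCP server over HTTP:

    ```shell
    # Create a resource group (name and location are placeholders).
    az group create --name mcp-rg --location eastus

    # Run the containerized MCP server with a public IP on its listening port.
    az container create \
      --resource-group mcp-rg \
      --name my-mcp-server \
      --image myregistry.azurecr.io/my-mcp-server:latest \
      --ports 3000 \
      --ip-address Public \
      --dns-name-label my-mcp-server-demo

    # The resulting FQDN (e.g. my-mcp-server-demo.eastus.azurecontainer.io)
    # becomes the stable endpoint you give to the AI Foundry agent.
    ```

    For production you would typically also put the container behind HTTPS (for example via Azure App Service or an application gateway), since the agent integration expects a secure endpoint.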

    For reference documentation:

    Regarding future plans, I cannot comment on specific unannounced features.
    You can follow these channels for announcements:
    https://devblogs.microsoft.com/foundry/category/whats-new/
    https://azure.microsoft.com/en-us/blog/product/azure-ai-foundry/


    I hope this helps.
    If the answer was helpful, please accept it and upvote so that other members of the community can benefit.

