Integrate Open WebUI with Foundry Local

Important

  • Foundry Local is available in preview. Public preview releases provide early access to features that are in active development.
  • Features, approaches, and processes can change or have limited capabilities before general availability (GA).

This tutorial shows you how to create a chat application using Foundry Local and Open WebUI. When you finish, you'll have a working chat interface running entirely on your local device.

Prerequisites

Before you start this tutorial, you need:

  • Foundry Local installed on your device.
  • At least one model downloaded and loaded in Foundry Local (for example, with foundry model run).

Set up Open WebUI for chat

  1. Install Open WebUI by following the instructions from the Open WebUI GitHub repository.

  2. Launch Open WebUI with this command in your terminal:

    open-webui serve
    
  3. Open your web browser and go to http://localhost:8080.

  4. Enable Direct Connections:

    1. In the profile menu, select Settings, then Admin Settings.
    2. Select Connections in the navigation menu.
    3. Enable Direct Connections by turning on the toggle. This allows users to connect to their own OpenAI-compatible API endpoints.
  5. Connect Open WebUI to Foundry Local:

    1. Select Settings in the profile menu.
    2. Select Connections in the navigation menu.
    3. Select + next to Manage Direct Connections.
    4. For the URL, enter http://localhost:PORT/v1, where PORT is the port of the Foundry Local endpoint. Use the CLI command foundry service status to find it. Foundry Local assigns the port dynamically, so it isn't always the same.
    5. For Auth, select None.
    6. Select Save.
  6. Start chatting with your model:

    1. Your loaded models appear in the dropdown at the top.
    2. Select a model from the list.
    3. Type your message in the input box at the bottom.
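Before (or instead of) wiring up the UI, you can confirm the endpoint responds with a short script. The sketch below assumes the port reported by foundry service status (5273 here is just a placeholder) and that Foundry Local exposes the standard OpenAI-compatible /v1/models listing:

```python
import json
import urllib.request


def base_url(port: int) -> str:
    """Build the OpenAI-compatible base URL for a Foundry Local port."""
    return f"http://localhost:{port}/v1"


def list_models(port: int) -> list[str]:
    """Return the IDs of models the local endpoint reports as available."""
    with urllib.request.urlopen(f"{base_url(port)}/models") as resp:
        data = json.load(resp)
    return [m["id"] for m in data["data"]]


# Example, with the service running (substitute your own port):
#   print(list_models(5273))
```

If this call fails, check that the Foundry Local service is running before troubleshooting the Open WebUI connection.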

That's it! You're now chatting with an AI model running entirely on your local device.
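The same endpoint also accepts chat completions directly, which is useful for scripting against the model outside the UI. A minimal sketch, again assuming a placeholder port and model ID (use the IDs shown in the Open WebUI dropdown or by foundry model list):

```python
import json
import urllib.request


def build_payload(model: str, prompt: str) -> dict:
    """Assemble a minimal OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(port: int, model: str, prompt: str) -> str:
    """Send one chat turn to the local endpoint and return the reply text."""
    req = urllib.request.Request(
        f"http://localhost:{port}/v1/chat/completions",
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Example, with the service running (substitute your own port and model ID):
#   print(chat(5273, "phi-3.5-mini", "Say hello in one sentence."))
```

Because the API is OpenAI-compatible, any OpenAI client library pointed at the same base URL works here too.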

Next steps