Important
- Foundry Local is available in preview. Public preview releases provide early access to features that are in active deployment.
- Features, approaches, and processes can change or have limited capabilities before general availability (GA).
This tutorial shows you how to create a chat application using Foundry Local and Open WebUI. When you finish, you'll have a working chat interface that runs entirely on your local device.
Prerequisites
Before you start this tutorial, you need:
- Foundry Local installed on your computer. Read the Get started with Foundry Local guide for installation instructions.
Set up Open WebUI for chat
Install Open WebUI by following the instructions from the Open WebUI GitHub repository.
Launch Open WebUI with this command in your terminal:
```
open-webui serve
```

Open your web browser and go to http://localhost:8080.
Enable Direct Connections:
- Select Settings, and then Admin Settings, in the profile menu.
- Select Connections in the navigation menu.
- Enable Direct Connections by turning on the toggle. This allows users to connect to their own OpenAI-compatible API endpoints.
Connect Open WebUI to Foundry Local:
- Select Settings in the profile menu.
- Select Connections in the navigation menu.
- Select + by Manage Direct Connections.
- For the URL, enter `http://localhost:PORT/v1`, where `PORT` is the Foundry Local endpoint port (use the CLI command `foundry service status` to find it). Note that Foundry Local dynamically assigns a port, so it isn't always the same.
- For the Auth, select None.
- Select Save.
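Before saving, you can check that the endpoint is reachable. The sketch below is one way to do that, assuming a standard OpenAI-compatible `/v1/models` route on the Foundry Local endpoint; the port value is a placeholder you replace with the one reported by `foundry service status`.

```python
import json
import urllib.request


def foundry_base_url(port: int) -> str:
    """Build the base URL that Open WebUI expects for a direct connection."""
    return f"http://localhost:{port}/v1"


def list_models(port: int) -> list[str]:
    """Query the OpenAI-compatible /v1/models endpoint and return model IDs."""
    with urllib.request.urlopen(foundry_base_url(port) + "/models") as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]


# To try it against a running Foundry Local service (placeholder port):
# print(list_models(5273))
```

If the call returns a list of model IDs, the URL you enter in Open WebUI is correct.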
Start chatting with your model:
- Your loaded models appear in the dropdown at the top.
- Select any model from the list.
- Type your message in the input box at the bottom.
That's it! You're now chatting with an AI model running entirely on your local device.
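Because Foundry Local exposes an OpenAI-compatible API, you can also chat with the same model from a script instead of the Open WebUI interface. This is a minimal sketch using only the Python standard library; the port and model name are placeholders (use `foundry service status` and the model list to find yours).

```python
import json
import urllib.request


def build_chat_request(port: int, model: str, message: str):
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"http://localhost:{port}/v1/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": message}],
    }
    return url, body


def chat(port: int, model: str, message: str) -> str:
    """Send one chat turn to the local endpoint and return the reply text."""
    url, body = build_chat_request(port, model, message)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]


# Example (requires a running service; port and model name are placeholders):
# print(chat(5273, "phi-3.5-mini", "Hello from my local device!"))
```

Everything stays on your machine: the request never leaves `localhost`.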