Codex with Azure OpenAI in Azure AI Foundry Models

OpenAI’s Codex CLI is the same coding agent that powers ChatGPT’s Codex. You can run this coding agent entirely on Azure infrastructure, keeping your data inside your compliance boundary while gaining enterprise-grade security, private networking, role-based access control, and predictable cost management. Codex is more than a chat-with-your-code tool: it's an asynchronous coding agent that can be triggered from your terminal, VS Code, or a GitHub Actions runner. Codex can automatically open pull requests, refactor files, and write tests, using the credentials of your AI Foundry project and Azure OpenAI deployments.

Prerequisites

| Requirement | Details |
| --- | --- |
| Operating system | macOS 12+, Ubuntu 20.04+/Debian 10+, or Windows 11 via WSL2 |
| Git (optional, recommended) | 2.23+ for built-in pull request helpers |
| RAM | 4 GB minimum (8 GB recommended) |

Deploy a model in Azure AI Foundry

  1. Go to Azure AI Foundry and create a new project.
  2. From the model catalog select a reasoning model such as gpt-5-codex, gpt-5, gpt-5-mini, or gpt-5-nano.
  3. To deploy the model from the model catalog, select Use this model; or, if you're using the Azure OpenAI Deployments pane, select Deploy model.
  4. Copy the endpoint URL and the API Key.
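If you prefer to script the deployment, the Azure CLI can create a model deployment on an existing Azure OpenAI resource. The following is a sketch only: the resource name, resource group, model version, and capacity are placeholders you need to replace with your own values.

```shell
# Sketch: create a model deployment with the Azure CLI.
# Resource name, resource group, model version, and capacity are placeholders.
az cognitiveservices account deployment create \
  --name "<your-resource-name>" \
  --resource-group "<your-resource-group>" \
  --deployment-name "gpt-5-codex" \
  --model-name "gpt-5-codex" \
  --model-version "<model-version>" \
  --model-format "OpenAI" \
  --sku-name "GlobalStandard" \
  --sku-capacity 50
```

The deployment name you choose here is what you later reference as model in config.toml.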

Install the Codex CLI

From the terminal, run the following commands to install the Codex CLI and verify the installation:

npm install -g @openai/codex
codex --version # verify installation

Create and configure config.toml

  1. To use Codex CLI with Azure, you need to create and configure a config.toml file.

    The config.toml file must be stored in the ~/.codex directory. Create config.toml in this directory, or edit it if it already exists:

    cd ~/.codex
    nano config.toml
    
  2. Copy the text below to use the v1 Responses API. With the v1 API you no longer need to pass api-version, but you must include /v1 in the base_url path. You can't pass your API key as a string directly to env_key; env_key must name an environment variable. Update base_url with your resource name:

    model = "gpt-5-codex"  # Replace with your actual Azure model deployment name
    model_provider = "azure"
    model_reasoning_effort = "medium"
    
    [model_providers.azure]
    name = "Azure OpenAI"
    base_url = "https://YOUR_RESOURCE_NAME.openai.azure.com/openai/v1"
    env_key = "AZURE_OPENAI_API_KEY"
    wire_api = "responses"
    
  3. Once you have saved the updates to your config.toml file, return to the terminal and set the environment variable that your config file references.

    # Linux, macOS, or WSL 
    export AZURE_OPENAI_API_KEY="<your-api-key>"
    
  4. Now run one of the following commands in the terminal to test if your Codex CLI configuration was successful:

    | Command | Purpose |
    | --- | --- |
    | `codex` | Launch the interactive Terminal User Interface (TUI) |
    | `codex "Initial prompt"` | Launch the TUI with an initial prompt |
    | `codex exec "Initial prompt"` | Run Codex non-interactively ("automation mode") |
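If a command fails, you can check the endpoint and key outside of Codex. This sketch assumes the v1 Responses API and a deployment named gpt-5-codex; replace YOUR_RESOURCE_NAME and the deployment name with your own values:

```shell
# Sketch: call the v1 Responses API directly to verify the endpoint and key.
# YOUR_RESOURCE_NAME and the deployment name (gpt-5-codex) are placeholders.
curl -sS "https://YOUR_RESOURCE_NAME.openai.azure.com/openai/v1/responses" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -d '{"model": "gpt-5-codex", "input": "Say hello"}'
```

A JSON response containing generated output confirms that your base_url and AZURE_OPENAI_API_KEY are correct.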

Use Codex in Visual Studio Code

You can also use Codex directly inside Visual Studio Code with the OpenAI Codex extension.

  1. If you don't already have Visual Studio Code, install it first.

  2. Install the OpenAI Codex extension. The extension relies on your config.toml file that was configured for Codex CLI.

  3. If you're in a new terminal session, set up the AZURE_OPENAI_API_KEY environment variable:

    export AZURE_OPENAI_API_KEY="<your-api-key>"
    
  4. Launch VS Code from the same Terminal session. (Launching from an app launcher can result in your API key environment variable not being available to the Codex extension.)

    code .
    
  5. You'll now be able to use Codex in Visual Studio Code to chat, edit, and preview changes while toggling between three approval modes.

Approval modes

Approval modes determine how much autonomy and interaction you want to have with Codex.

| Approval mode | Description |
| --- | --- |
| Chat | Chat and plan with the model. |
| Agent | Codex can read files, make edits, and run commands in the working directory automatically. Codex needs approval for activities outside the working directory or for internet access. |
| Agent (full access) | All the capabilities of Agent mode without step-by-step approval. Don't use full access mode without understanding the risks and implementing additional guardrails, such as running in a controlled sandbox environment. |

Important

We recommend reviewing OpenAI's guidance on Codex security.

Persistent guidance with AGENTS.md

You can give Codex extra instructions and guidance using AGENTS.md files. Codex looks for AGENTS.md files in the following places and merges them top-down, giving it context about your personal preferences, project-specific details, and the current task:

  • ~/.codex/AGENTS.md – personal global guidance.
  • AGENTS.md at your repository’s root – shared project notes.
  • AGENTS.md in the current working directory – subfolder/feature specifics.

For example, to help Codex understand how to write code for Azure AI Foundry Agents, you could create an AGENTS.md in your project root with the following content, derived from the Azure AI Agents SDK documentation:

# Instructions for working with Azure AI Foundry Agents

You are an expert in the Azure AI Agents client library for Python.

## Key Concepts

- **Client Initialization**: Always start by creating an `AIProjectClient` or `AgentsClient`. The recommended way is via `AIProjectClient`.
- **Authentication**: Use `DefaultAzureCredential` from `azure.identity`.
- **Agent Creation**: Use `agents_client.create_agent()`. Key parameters are `model`, `name`, and `instructions`.
- **Tools**: Agents use tools to perform actions like file search, code interpretation, or function calls.
  - To use tools, they must be passed to `create_agent` via the `tools` and `tool_resources` parameters or a `toolset`.
  - Example: `file_search_tool = FileSearchTool(vector_store_ids=[...])`
  - Example: `code_interpreter = CodeInterpreterTool(file_ids=[...])`
  - Example: `functions = FunctionTool(user_functions)`

## Example: Creating a basic agent

\`\`\`python
import os
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# 1. Create Project Client
project_client = AIProjectClient(
    endpoint=os.environ["PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

# 2. Get Agents Client
with project_client:
    agents_client = project_client.agents

    # 3. Create Agent
    agent = agents_client.create_agent(
        model=os.environ["MODEL_DEPLOYMENT_NAME"],
        name="my-helpful-agent",
        instructions="You are a helpful agent that can answer questions.",
    )
    print(f"Created agent with ID: {agent.id}")
\`\`\`

In the previous example, the backticks in the Python code block are escaped so it renders correctly; remove the backslashes when creating your own file.
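Building on the tool parameters listed in the AGENTS.md example, the following sketch extends the basic agent with the file search tool. The VECTOR_STORE_ID environment variable and the agent name are illustrative assumptions; the import paths follow the Azure AI Agents client library for Python.

```python
import os

from azure.ai.agents.models import FileSearchTool
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# Sketch: create an agent that can search an existing vector store.
# PROJECT_ENDPOINT, MODEL_DEPLOYMENT_NAME, and VECTOR_STORE_ID are assumed
# to be set in the environment.
project_client = AIProjectClient(
    endpoint=os.environ["PROJECT_ENDPOINT"],
    credential=DefaultAzureCredential(),
)

with project_client:
    agents_client = project_client.agents

    # Pass the tool's definitions and resources to create_agent
    file_search_tool = FileSearchTool(vector_store_ids=[os.environ["VECTOR_STORE_ID"]])
    agent = agents_client.create_agent(
        model=os.environ["MODEL_DEPLOYMENT_NAME"],
        name="file-search-agent",
        instructions="Answer questions using the attached documents.",
        tools=file_search_tool.definitions,
        tool_resources=file_search_tool.resources,
    )
    print(f"Created agent with ID: {agent.id}")
```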

Experiment with Codex CLI

Launch codex with the following initial prompt:

codex "write a python script to create an Azure AI Agent with file search capabilities"

Other suggested tests:

# generate a unit test for src/utils/date.ts
# refactor this agent to use the Code Interpreter tool instead

Codex in GitHub Actions

Codex can execute as part of your continuous integration (CI) pipeline. Store your API key in the repository’s secret store as AZURE_OPENAI_KEY and add a job like this to automatically update your changelog before a release:

jobs:
  update_changelog:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Update changelog via Codex
        run: |
          npm install -g @openai/codex
          export AZURE_OPENAI_API_KEY="${{ secrets.AZURE_OPENAI_KEY }}" 
          codex -p azure exec --full-auto "update CHANGELOG for next release"

Troubleshooting

| Symptom | Solution |
| --- | --- |
| 401 Unauthorized or 403 Forbidden | Export your AZURE_OPENAI_API_KEY environment variable correctly and confirm that your key has project/deployment access. Make sure you aren't passing the API key as a string directly to env_key in config.toml; env_key must name a valid environment variable. |
| ENOTFOUND, DNS error, or 404 Not Found | Verify that base_url in config.toml uses your resource name and the correct domain, and that it contains /v1. For example, base_url = "https://<your-resource>.openai.azure.com/openai/v1". |
| CLI ignores Azure settings | Open ~/.codex/config.toml and ensure that model_provider = "azure" is set, the [model_providers.azure] section exists, and env_key = "AZURE_OPENAI_API_KEY" matches your environment variable name. |
| Entra ID support | Entra ID support isn't currently available for Codex. To track the status of this feature, refer to the pull request that adds support. |
| 401 Unauthorized only with the WSL + VS Code Codex extension | When VS Code runs from inside WSL, the Codex extension may check for the API key environment variable on the local Windows host rather than in the shell that launched VS Code. To mitigate this, set the environment variable on the Windows host as well, then open a new WSL terminal and launch VS Code with code .. |