Azure OpenAI Responses API with structured outputs - JSON schema suddenly no longer accepted

Kab 0 Reputation points
2025-10-08T11:01:09.0933333+00:00

We've deployed GPT-4.1 on Azure AI Foundry (OpenAI). We're using the Responses API (https://free.blessedness.top/en-us/azure/ai-foundry/openai/reference-preview-latest#create-response) with structured outputs (a JSON schema), which has been working fine for the past couple of months.
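For context, our call looks roughly like the sketch below (openai Python SDK with the AzureOpenAI client; the endpoint, API version, deployment name, and schema here are placeholders rather than our actual values, and your working client setup may differ):

```python
# Minimal sketch of a Responses API call with structured outputs against Azure OpenAI.
# Endpoint, API version, deployment name, and schema are illustrative placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2025-04-01-preview",  # placeholder preview API version
)

# Placeholder JSON schema following the strict-mode rules
# (all properties required, additionalProperties disallowed).
schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "tags": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["title", "tags"],
    "additionalProperties": False,
}

response = client.responses.create(
    model="gpt-4.1",  # deployment name
    input=[{"role": "user", "content": "Summarise this document ..."}],
    text={
        "format": {
            "type": "json_schema",
            "name": "document_summary",
            "schema": schema,
            "strict": True,
        }
    },
)
print(response.output_text)
```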

We've suddenly started getting this error: Error code: 400 - {'error': {'message': 'There was an issue with your request. Please check your inputs and try again', 'type': 'invalid_request_error', 'param': None, 'code': None}}

After some debugging, we found that the error is caused by the JSON schema we're using (the request succeeds with a very simple JSON schema). However, we've been using the same JSON schema for the past couple of months, and even after simplifying and drastically shortening it we still got the same error.
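For reference, a trivially simple schema along these lines (illustrative, not our actual schema) was still accepted, which is how we narrowed the failure down to the schema itself:

```python
# Stripped-down schema used to isolate the problem; field names are illustrative.
simple_schema = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
    "additionalProperties": False,
}
```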

The same schema also works fine against OpenAI's own API.
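That is, a sketch like the following, with the identical schema and `text.format` payload but using the plain OpenAI client, succeeds (again placeholders, not our exact code), which points at the Azure side:

```python
# Same request sent to OpenAI's API directly for comparison.
from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = openai_client.responses.create(
    model="gpt-4.1",
    input=[{"role": "user", "content": "Summarise this document ..."}],
    text={
        "format": {
            "type": "json_schema",
            "name": "document_summary",
            "schema": schema,  # same schema as in the Azure sketch above
            "strict": True,
        }
    },
)
```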

Has anything changed with how the Azure OpenAI API handles structured outputs in the past 24 hours (7-8 October 2025)?

Azure OpenAI Service

1 answer

  1. Kab 0 Reputation points
    2025-10-12T09:13:59.0533333+00:00

    This issue seems to have resolved itself; thanks, @Sridhar M.

    However, this significantly reduces our confidence in Azure OpenAI model deployments being production-ready. It would also be valuable for other customers to understand why this happened.

