We've deployed GPT-4.1 on Azure AI Foundry (OpenAI). We're using the Responses API (https://free.blessedness.top/en-us/azure/ai-foundry/openai/reference-preview-latest#create-response) with structured outputs (a JSON schema), which has been working fine for the past couple of months.
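For reference, this is roughly the shape of the request we send (a minimal sketch; the schema, the `name` field, and the input text here are placeholders, not our real ones):

```python
import json

# Structured-outputs payload for the Responses API's `text` parameter,
# following the json_schema format described in the reference linked above.
text_format = {
    "format": {
        "type": "json_schema",
        "name": "extraction",  # placeholder name
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {"summary": {"type": "string"}},
            "required": ["summary"],
            "additionalProperties": False,
        },
    }
}

# With the Azure OpenAI client this would be passed roughly as:
#   client.responses.create(model="gpt-4.1", input="...", text=text_format)
print(json.dumps(text_format, indent=2))
```

Our real schema is much larger, but it follows this same structure.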
We've suddenly started getting this error: Error code: 400 - {'error': {'message': 'There was an issue with your request. Please check your inputs and try again', 'type': 'invalid_request_error', 'param': None, 'code': None}}
After some debugging, we narrowed the error down to the JSON schema we're sending: requests with a very simple schema succeed, while ours fails. However, we've been using this same schema for the past couple of months, and even a simplified, drastically shortened version of it produces the same error.
The same request, with the same schema, also works fine against OpenAI's own API.
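While debugging, we also checked our schema against the documented structured-outputs constraints (every object needs `additionalProperties: false`, and every property must be listed in `required`). A small checker like the sketch below (our own helper, not part of any SDK) reported no violations:

```python
def check_strict_schema(schema, path="$"):
    """Recursively collect violations of the documented structured-output
    constraints: every object must set additionalProperties: false, and
    every property must appear in its object's `required` list."""
    problems = []
    if isinstance(schema, dict):
        if schema.get("type") == "object":
            if schema.get("additionalProperties") is not False:
                problems.append(f"{path}: additionalProperties must be false")
            props = schema.get("properties", {})
            missing = set(props) - set(schema.get("required", []))
            if missing:
                problems.append(f"{path}: not in required: {sorted(missing)}")
        for key, value in schema.items():
            problems.extend(check_strict_schema(value, f"{path}.{key}"))
    elif isinstance(schema, list):
        for i, item in enumerate(schema):
            problems.extend(check_strict_schema(item, f"{path}[{i}]"))
    return problems


# Example of a schema that violates both constraints:
bad_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "tags": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["title"],  # "tags" is missing from required
    # additionalProperties is missing entirely
}
for problem in check_strict_schema(bad_schema):
    print(problem)
```

So as far as we can tell, the schema itself still satisfies the strict-mode rules.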
Has anything changed with how the Azure OpenAI API handles structured outputs in the past 24 hours (7-8 October 2025)?