How can I resolve a 500 Internal Server Error after batch processing? Am I being billed for the documents that fail to be processed?

Bret Smith 0 Reputation points
2025-10-24T16:02:06.2333333+00:00

I am using batch processing with a custom extraction model. The API version is '2024-11-30' and all of the documents are around 100 kilobytes. The documentation says the maximum number of documents a batch can process at a time is 10,000. When I run an analysis of more than ~45 documents, the JSON response lists many documents that failed to be processed because of an Internal Server Error. There is nothing wrong with the documents themselves (malformed, etc.), because different documents fail on different re-runs of the program. Moreover, the result JSON suggests the documents are failing in groups (a run of 'failed' documents, followed by a run of 'succeeded', followed by another run of 'failed', and so on).

If I am exceeding some internal limit, is there any way for me to adjust this on the Azure portal? Or is my only recourse to limit my batch size?

Furthermore, is Microsoft's policy to bill for Document Intelligence even when the documents fail to be processed?

Azure AI Document Intelligence

1 answer

  1. Jerald Felix 7,520 Reputation points
    2025-10-25T11:30:21.8966667+00:00

    Hello Bret Smith,

    When you encounter 500 Internal Server Errors during batch processing with Azure AI Document Intelligence, especially when submitting more than ~45 documents at a time (even though the documented limit is 10,000), it typically indicates a backend issue. It is not a problem with document format or corruption, since different documents succeed on different reruns and the failures occur in grouped runs.

    Here’s what you need to know:

    Possible Causes: This error might be triggered by service throttling, temporary internal limits, or constraints on concurrent compute resources. Even though the platform technically supports 10,000 documents per batch, practical throughput may be lower due to backend resource management and how the requests are segmented for processing.
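
    As a quick way to confirm the grouped-failure pattern you describe, here is a minimal Python sketch that tallies consecutive runs of 'succeeded'/'failed' documents from the batch operation result. It assumes the per-document entries live under result.details with status, sourceUrl, and error fields, per the 2024-11-30 batch result schema; verify the field names against your actual response.

    ```python
    import itertools
    import json

    # Load the batch operation result fetched from the Operation-Location URL.
    with open("batch_result.json") as f:
        op = json.load(f)

    # Per-document entries; field names follow the 2024-11-30 batch result
    # schema (result.details[].status / .sourceUrl / .error) -- verify them.
    details = op.get("result", {}).get("details", [])

    # Group consecutive documents by status to expose the "failing in groups"
    # pattern, and collect the failed ones for a retry pass.
    failed = []
    for status, group in itertools.groupby(details, key=lambda d: d.get("status")):
        docs = list(group)
        print(f"{status}: run of {len(docs)} document(s)")
        if status == "failed":
            failed.extend(docs)

    for doc in failed:
        err = doc.get("error") or {}
        print(doc.get("sourceUrl"), "-", err.get("code"), err.get("message"))
    ```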

    Adjusting Limits: Currently, you cannot change internal batch processing limits from the Azure portal. The processing cap is set by the service, not per user, so reducing your batch size is the immediate workaround.

    Best Practices: Limit your batch submissions to smaller sizes (try 40 or fewer documents per batch) and implement retry logic for the documents that fail, as in the sketch below. This helps avoid overload and mitigates issues that arise from service-side batch handling.
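
    Here is a minimal sketch of that chunk-and-retry pattern. submit_batch is a hypothetical placeholder for your own submission-and-polling code, and the batch size, attempt count, and backoff values are assumptions to tune for your workload.

    ```python
    import time

    BATCH_SIZE = 40        # stay under the ~45-document threshold observed
    MAX_ATTEMPTS = 3       # total passes over the failing documents
    BACKOFF_SECONDS = 30   # pause between passes to let the service recover

    def submit_batch(docs):
        """Hypothetical placeholder: submit `docs` via the :analyzeBatch
        operation, poll the Operation-Location URL until done, and return
        the subset of docs that failed. Replace with your own code."""
        raise NotImplementedError

    def process_in_chunks(all_docs):
        pending = list(all_docs)
        for attempt in range(1, MAX_ATTEMPTS + 1):
            still_failing = []
            # Submit small chunks instead of one large batch.
            for i in range(0, len(pending), BATCH_SIZE):
                still_failing.extend(submit_batch(pending[i:i + BATCH_SIZE]))
            if not still_failing:
                return []           # everything succeeded
            print(f"Attempt {attempt}: {len(still_failing)} document(s) failed")
            pending = still_failing
            if attempt < MAX_ATTEMPTS:
                time.sleep(BACKOFF_SECONDS * attempt)  # linear backoff
        return pending              # documents that never succeeded
    ```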

    Billing Policy: In general, Microsoft bills for attempted document processing. If a document is submitted but not processed because of an internal server error, you may not be charged; however, you should confirm this either via your usage dashboard or by contacting Microsoft support, as billing rules may be updated and can differ between preview and production services. Review your invoice or monitor resource usage in your Azure subscription for clarity.

    Further Steps: If the issue persists even with reduced batch sizes, raise it with Microsoft support, sharing your batch payload details, request IDs, and error responses; the sketch below shows one way to capture those identifiers when you submit a batch. You can also review the platform's documentation on batch analysis for additional troubleshooting guidance and follow service advisories for backend improvements or bug fixes.
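
    As a sketch of capturing those identifiers, the snippet below submits a batch via the REST endpoint and prints the response headers a support engineer will ask for. The endpoint, model, and SAS URL values are placeholders, and the request-body field names (azureBlobSource, resultContainerUrl) follow the 2024-11-30 batch analyze schema; check them against the current documentation.

    ```python
    import requests

    # Placeholders -- substitute your own resource values.
    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    MODEL_ID = "<your-custom-model>"
    KEY = "<your-key>"

    url = f"{ENDPOINT}/documentintelligence/documentModels/{MODEL_ID}:analyzeBatch"
    body = {
        # Field names per the 2024-11-30 batch analyze request schema;
        # adjust to match your blob container setup.
        "azureBlobSource": {"containerUrl": "<source-container-sas-url>"},
        "resultContainerUrl": "<result-container-sas-url>",
    }

    resp = requests.post(
        url,
        params={"api-version": "2024-11-30"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json=body,
    )

    # These are the identifiers worth pasting into a support ticket.
    print("HTTP status:       ", resp.status_code)
    print("Operation-Location:", resp.headers.get("Operation-Location"))
    print("apim-request-id:   ", resp.headers.get("apim-request-id"))
    print("Body:", resp.text[:500])
    ```

    If this helps, kindly approve the answer.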

    Best Regards,

    Jerald Felix

