Azure OpenAI API version support
- The v1 generally available (GA) API now provides access to both GA and preview operations. To learn more, see the API version lifecycle guide.
Installation
dotnet add package OpenAI
Authentication
The secure, keyless approach is to authenticate with Microsoft Entra ID (formerly Azure Active Directory) via the Azure Identity library. To install the library:
dotnet add package Azure.Identity
Use the credential type you need from the library. For example, DefaultAzureCredential:
using Azure.Identity;
using OpenAI;
using OpenAI.Chat;
using System.ClientModel.Primitives;
#pragma warning disable OPENAI001
BearerTokenPolicy tokenPolicy = new(
new DefaultAzureCredential(),
"https://cognitiveservices.azure.com/.default");
ChatClient client = new(
model: "gpt-4.1-nano",
authenticationPolicy: tokenPolicy,
options: new OpenAIClientOptions() {
Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1")
}
);
ChatCompletion completion = client.CompleteChat("Tell me about the bitter lesson.");
Console.WriteLine($"[ASSISTANT]: {completion.Content[0].Text}");
For more information about Azure OpenAI keyless authentication, see the "Get started with the Azure OpenAI security building block" quickstart article.
Chat
An example of a chat completion request sent to a reasoning model.
using Azure.Identity;
using OpenAI;
using OpenAI.Chat;
using System.ClientModel.Primitives;
#pragma warning disable OPENAI001 // currently required for token-based authentication
BearerTokenPolicy tokenPolicy = new(
new DefaultAzureCredential(),
"https://cognitiveservices.azure.com/.default");
ChatClient client = new(
model: "o4-mini",
authenticationPolicy: tokenPolicy,
options: new OpenAIClientOptions()
{
Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1")
}
);
ChatCompletionOptions options = new ChatCompletionOptions
{
ReasoningEffortLevel = ChatReasoningEffortLevel.Low,
MaxOutputTokenCount = 100000
};
ChatCompletion completion = client.CompleteChat(
    new ChatMessage[]
    {
        new DeveloperChatMessage("You are a helpful assistant."),
        new UserChatMessage("Tell me about the bitter lesson.")
    },
    options
);
Console.WriteLine($"[ASSISTANT]: {completion.Content[0].Text}");
Embeddings
using OpenAI;
using OpenAI.Embeddings;
using System.ClientModel;
string apiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY")
?? throw new InvalidOperationException("AZURE_OPENAI_API_KEY environment variable is not set");
EmbeddingClient client = new(
"text-embedding-3-large",
credential: new ApiKeyCredential(apiKey),
options: new OpenAIClientOptions()
{
Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1")
}
);
string input = "This is a test";
OpenAIEmbedding embedding = client.GenerateEmbedding(input);
ReadOnlyMemory<float> vector = embedding.ToFloats();
Console.WriteLine($"Embeddings: [{string.Join(", ", vector.ToArray())}]");
Responses API
using OpenAI;
using OpenAI.Responses;
using System.ClientModel.Primitives;
using Azure.Identity;
#pragma warning disable OPENAI001 //currently required for token based authentication
BearerTokenPolicy tokenPolicy = new(
new DefaultAzureCredential(),
"https://cognitiveservices.azure.com/.default");
OpenAIResponseClient client = new(
model: "o4-mini",
authenticationPolicy: tokenPolicy,
options: new OpenAIClientOptions()
{
Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1")
}
);
OpenAIResponse response = await client.CreateResponseAsync(
userInputText: "What's the optimal strategy to win at poker?",
new ResponseCreationOptions()
{
ReasoningOptions = new ResponseReasoningOptions()
{
ReasoningEffortLevel = ResponseReasoningEffortLevel.High,
},
});
Console.WriteLine(response.GetOutputText());
Streaming
using OpenAI;
using OpenAI.Responses;
using System.ClientModel.Primitives;
using Azure.Identity;
#pragma warning disable OPENAI001 //currently required for token based authentication
BearerTokenPolicy tokenPolicy = new(
new DefaultAzureCredential(),
"https://cognitiveservices.azure.com/.default");
OpenAIResponseClient client = new(
model: "o4-mini",
authenticationPolicy: tokenPolicy,
options: new OpenAIClientOptions()
{
Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1")
}
);
await foreach (StreamingResponseUpdate update
in client.CreateResponseStreamingAsync(
userInputText: "What's the optimal strategy to win at poker?",
new ResponseCreationOptions()
{
ReasoningOptions = new ResponseReasoningOptions()
{
ReasoningEffortLevel = ResponseReasoningEffortLevel.High,
},
}))
{
if (update is StreamingResponseOutputItemAddedUpdate itemUpdate
&& itemUpdate.Item is ReasoningResponseItem reasoningItem)
{
Console.WriteLine($"[Reasoning] ({reasoningItem.Status})");
}
else if (update is StreamingResponseOutputTextDeltaUpdate delta)
{
Console.Write(delta.Delta);
}
}
MCP server
using OpenAI;
using OpenAI.Responses;
using System.ClientModel.Primitives;
using Azure.Identity;
#pragma warning disable OPENAI001 //currently required for token based authentication
BearerTokenPolicy tokenPolicy = new(
new DefaultAzureCredential(),
"https://cognitiveservices.azure.com/.default");
OpenAIResponseClient client = new(
model: "o4-mini",
authenticationPolicy: tokenPolicy,
options: new OpenAIClientOptions()
{
Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1")
}
);
ResponseCreationOptions options = new();
options.Tools.Add(ResponseTool.CreateMcpTool(
serverLabel: "microsoft_learn",
serverUri: new Uri("https://free.blessedness.top/api/mcp"),
toolCallApprovalPolicy: new McpToolCallApprovalPolicy(GlobalMcpToolCallApprovalPolicy.NeverRequireApproval)
));
OpenAIResponse response = (OpenAIResponse)client.CreateResponse([
ResponseItem.CreateUserMessageItem([
ResponseContentPart.CreateInputTextPart("Search for information about Azure Functions")
])
], options);
Console.WriteLine(response.GetOutputText());
Error handling
Error codes
| Status code | Error type |
|---|---|
| 400 | Bad Request Error |
| 401 | Authentication Error |
| 403 | Permission Denied Error |
| 404 | Not Found Error |
| 422 | Unprocessable Entity Error |
| 429 | Rate Limit Error |
| 500 | Internal Server Error |
| 503 | Service Unavailable |
| 504 | Gateway Timeout |
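In the .NET OpenAI library these failures surface as System.ClientModel's ClientResultException. The following is a minimal sketch of inspecting the status code; it assumes a ChatClient named client configured as in the earlier examples:
using System.ClientModel;
using OpenAI.Chat;
try
{
    ChatCompletion completion = client.CompleteChat("Tell me about the bitter lesson.");
    Console.WriteLine(completion.Content[0].Text);
}
catch (ClientResultException ex)
{
    // ex.Status carries the HTTP status code from the table above.
    Console.WriteLine($"Request failed with status {ex.Status}: {ex.Message}");
}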
Retries
The client classes automatically retry the following errors up to three times, using exponential backoff:
- 408 Request Timeout
- 429 Too Many Requests
- 500 Internal Server Error
- 502 Bad Gateway
- 503 Service Unavailable
- 504 Gateway Timeout
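If you need different behavior, one option is to supply your own retry policy through OpenAIClientOptions, which inherits RetryPolicy from System.ClientModel's ClientPipelineOptions. The sketch below assumes the tokenPolicy and endpoint placeholder from the earlier examples:
using OpenAI;
using OpenAI.Chat;
using System.ClientModel.Primitives;
ChatClient clientWithRetries = new(
    model: "gpt-4.1-nano",
    authenticationPolicy: tokenPolicy,
    options: new OpenAIClientOptions()
    {
        Endpoint = new Uri("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1"),
        // ClientRetryPolicy takes the maximum number of retry attempts.
        RetryPolicy = new ClientRetryPolicy(5)
    }
);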
Source code | Package (pkg.go.dev) | REST API reference documentation | Package reference documentation
Azure OpenAI API version support
- The v1 generally available (GA) API now provides access to both GA and preview operations. To learn more, see the API version lifecycle guide.
Installation
Install the openai and azidentity modules with go get:
go get -u 'github.com/openai/openai-go/v2@v2.1.1'
# optional
go get github.com/Azure/azure-sdk-for-go/sdk/azidentity
Authentication
The azidentity module is used for Microsoft Entra ID authentication with Azure OpenAI.
package main
import (
"context"
"fmt"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/openai/openai-go/v2"
"github.com/openai/openai-go/v2/azure"
"github.com/openai/openai-go/v2/option"
)
func main() {
// Create an Azure credential
tokenCredential, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
panic(fmt.Sprintf("Failed to create credential: %v", err))
}
// Create a client with Azure OpenAI endpoint and token credential
client := openai.NewClient(
option.WithBaseURL("https://YOUR-RESOURCE_NAME.openai.azure.com/openai/v1/"),
azure.WithTokenCredential(tokenCredential),
)
// Make a completion request
chatCompletion, err := client.Chat.Completions.New(context.TODO(), openai.ChatCompletionNewParams{
Messages: []openai.ChatCompletionMessageParamUnion{
openai.UserMessage("Explain what the bitter lesson is?"),
},
Model: "o4-mini", // Use your deployed model name on Azure
})
if err != nil {
panic(err.Error())
}
fmt.Println(chatCompletion.Choices[0].Message.Content)
}
For more information about Azure OpenAI keyless authentication, see Use Azure OpenAI without keys.
Embeddings
package main
import (
"context"
"fmt"
"os"
"github.com/openai/openai-go/v2"
"github.com/openai/openai-go/v2/option"
)
func main() {
// Get API key from environment variable
apiKey := os.Getenv("AZURE_OPENAI_API_KEY")
if apiKey == "" {
panic("AZURE_OPENAI_API_KEY environment variable is not set")
}
// Create a client with Azure OpenAI endpoint and API key
client := openai.NewClient(
option.WithBaseURL("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/"),
option.WithAPIKey(apiKey),
)
ctx := context.Background()
text := "The attention mechanism revolutionized natural language processing"
// Make an embedding request
embedding, err := client.Embeddings.New(ctx, openai.EmbeddingNewParams{
Input: openai.EmbeddingNewParamsInputUnion{OfString: openai.String(text)},
Model: "text-embedding-3-small", // Use your deployed model name on Azure
})
if err != nil {
panic(err.Error())
}
// Print embedding information
fmt.Printf("Model: %s\n", embedding.Model)
fmt.Printf("Number of embeddings: %d\n", len(embedding.Data))
fmt.Printf("Embedding dimensions: %d\n", len(embedding.Data[0].Embedding))
fmt.Printf("Usage - Prompt tokens: %d, Total tokens: %d\n", embedding.Usage.PromptTokens, embedding.Usage.TotalTokens)
// Print first few values of the embedding vector
fmt.Printf("First 10 embedding values: %v\n", embedding.Data[0].Embedding[:10])
}
Responses
package main
import (
"context"
"github.com/Azure/azure-sdk-for-go/sdk/azidentity"
"github.com/openai/openai-go/v2"
"github.com/openai/openai-go/v2/azure"
"github.com/openai/openai-go/v2/option"
"github.com/openai/openai-go/v2/responses"
)
func main() {
// Create Azure token credential
tokenCredential, err := azidentity.NewDefaultAzureCredential(nil)
if err != nil {
panic(err)
}
// Create client with Azure endpoint and token credential
client := openai.NewClient(
option.WithBaseURL("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/"),
azure.WithTokenCredential(tokenCredential),
)
ctx := context.Background()
question := "Tell me about the attention is all you need paper"
resp, err := client.Responses.New(ctx, responses.ResponseNewParams{
Input: responses.ResponseNewParamsInputUnion{OfString: openai.String(question)},
Model: "o4-mini",
})
if err != nil {
panic(err)
}
println(resp.OutputText())
}
Source code | REST API reference documentation | Package reference documentation | Maven Central
Azure OpenAI API version support
- The v1 generally available (GA) API now provides access to both GA and preview operations. To learn more, see the API version lifecycle guide.
Installation
Gradle
implementation("com.openai:openai-java:4.0.1")
Maven
<dependency>
<groupId>com.openai</groupId>
<artifactId>openai-java</artifactId>
<version>4.0.1</version>
</dependency>
Authentication
Authenticating with Microsoft Entra ID requires some initial setup:
Add the Azure Identity package:
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.18.0</version>
</dependency>
Once set up, choose the credential type from azure.identity that you want to use. For example, DefaultAzureCredential can be used to authenticate the client: set the client ID, tenant ID, and client secret of your Microsoft Entra ID application as the environment variables AZURE_CLIENT_ID, AZURE_TENANT_ID, and AZURE_CLIENT_SECRET.
Authorization is easiest using DefaultAzureCredential. It finds the best credential to use in its running environment, although we recommend DefaultAzureCredential for testing only, not for production.
import com.azure.identity.AuthenticationUtil;
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.credential.BearerTokenCredential;
import com.openai.credential.Credential;
Credential tokenCredential = BearerTokenCredential.create(
AuthenticationUtil.getBearerTokenSupplier(
new DefaultAzureCredentialBuilder().build(),
"https://cognitiveservices.azure.com/.default"));
OpenAIClient client = OpenAIOkHttpClient.builder()
.baseUrl("https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/")
.credential(tokenCredential)
.build();
For more information about Azure OpenAI keyless authentication, see Use Azure OpenAI without keys.
Responses
package com.example;
import com.openai.client.OpenAIClient;
import com.openai.client.okhttp.OpenAIOkHttpClient;
import com.openai.models.responses.Response;
import com.openai.models.responses.ResponseCreateParams;
public class OpenAITest {
public static void main(String[] args) {
// Get API key from environment variable for security
String apiKey = System.getenv("OPENAI_API_KEY");
String resourceName = "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1";
String modelDeploymentName = "gpt-4.1"; //replace with you model deployment name
try {
OpenAIClient client = OpenAIOkHttpClient.builder()
.baseUrl(resourceName)
.apiKey(apiKey)
.build();
ResponseCreateParams params = ResponseCreateParams.builder()
.input("Tell me about the bitter lesson?")
.model(modelDeploymentName)
.build();
Response response = client.responses().create(params);
System.out.println("Response: " + response);
} catch (Exception e) {
System.err.println("Error: " + e.getMessage());
e.printStackTrace();
}
}
}
Azure OpenAI API version support
- The v1 generally available (GA) API now provides access to both GA and preview operations. To learn more, see the API version lifecycle guide.
Installation
npm install openai
Authentication
npm install @azure/identity
To authenticate the OpenAI client, however, we need the getBearerTokenProvider function from the @azure/identity package. This function creates a token provider that OpenAI uses internally to obtain a token for each request. The token provider is created as follows:
import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";
import { OpenAI } from "openai";
const tokenProvider = getBearerTokenProvider(
new DefaultAzureCredential(),
'https://cognitiveservices.azure.com/.default');
const client = new OpenAI({
baseURL: "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
apiKey: tokenProvider
});
For more information about Azure OpenAI keyless authentication, see the "Get started with the Azure OpenAI security building block" quickstart article.
Responses
responses.create
import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";
import { OpenAI } from "openai";
const tokenProvider = getBearerTokenProvider(
new DefaultAzureCredential(),
'https://cognitiveservices.azure.com/.default');
const client = new OpenAI({
baseURL: "https://YOUR-RESORCE-NAME.openai.azure.com/openai/v1/",
apiKey: tokenProvider
});
const response = await client.responses.create({
model: 'gpt-4.1-nano', //model deployment name
instructions: 'You are a helpful AI agent',
input: 'Tell me about the bitter lesson?',
});
console.log(response.output_text);
Streaming
import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";
import { OpenAI } from "openai";
const tokenProvider = getBearerTokenProvider(
new DefaultAzureCredential(),
'https://cognitiveservices.azure.com/.default');
const client = new OpenAI({
baseURL: "https://YOUR-RESORCE-NAME.openai.azure.com/openai/v1/",
apiKey: tokenProvider
});
const stream = await client.responses.create({
model: 'gpt-4.1-nano', // model deployment name
input: 'Provide a brief history of the attention is all you need paper.',
stream: true,
});
for await (const event of stream) {
if (event.type === 'response.output_text.delta' && event.delta) {
process.stdout.write(event.delta);
}
}
MCP server
import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";
import { OpenAI } from "openai";
const tokenProvider = getBearerTokenProvider(
new DefaultAzureCredential(),
'https://cognitiveservices.azure.com/.default');
const client = new OpenAI({
baseURL: "https://YOUR-RESORCE-NAME.openai.azure.com/openai/v1/",
apiKey: tokenProvider
});
const resp = await client.responses.create({
model: "gpt-5",
tools: [
{
type: "mcp",
server_label: "microsoft_learn",
server_description: "Microsoft Learn MCP server for searching and fetching Microsoft documentation.",
server_url: "https://free.blessedness.top/api/mcp",
require_approval: "never",
},
],
input: "Search for information about Azure Functions",
});
console.log(resp.output_text);
Chat
chat.completions.create
import { DefaultAzureCredential, getBearerTokenProvider } from "@azure/identity";
import { OpenAI } from "openai";
const tokenProvider = getBearerTokenProvider(
new DefaultAzureCredential(),
'https://cognitiveservices.azure.com/.default');
const client = new OpenAI({
baseURL: "https://france-central-test-001.openai.azure.com/openai/v1/",
apiKey: tokenProvider
});
const messages = [
{ role: 'system', content: 'You are a helpful assistant.' },
{ role: 'user', content: 'Tell me about the attention is all you need paper' }
];
// Make the API request with top-level await
const result = await client.chat.completions.create({
messages,
model: 'gpt-4.1-nano', // model deployment name
max_tokens: 100
});
// Print the full response
console.log('Full response:', result);
// Print just the message content from the response
console.log('Response content:', result.choices[0].message.content);
Error handling
Error codes
| Status code | Error type |
|---|---|
| 400 | Bad Request Error |
| 401 | Authentication Error |
| 403 | Permission Denied Error |
| 404 | Not Found Error |
| 422 | Unprocessable Entity Error |
| 429 | Rate Limit Error |
| 500 | Internal Server Error |
| 503 | Service Unavailable |
| 504 | Gateway Timeout |
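When a request fails with one of these status codes, the openai library throws an APIError subclass. A minimal sketch of catching it, assuming a client created as in the earlier examples:
try {
  await client.chat.completions.create({
    messages: [{ role: 'user', content: 'Tell me about the bitter lesson' }],
    model: 'gpt-4.1-nano', // model deployment name
  });
} catch (err) {
  if (err instanceof OpenAI.APIError) {
    console.log(err.status);  // e.g. 429
    console.log(err.name);    // e.g. RateLimitError
    console.log(err.headers); // response headers
  } else {
    throw err;
  }
}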
Retries
By default, the following errors are automatically retried two times with a short exponential backoff:
- Connection errors
- 408 Request Timeout
- 429 Rate Limit
- >=500 Internal errors
Use maxRetries to configure or disable the retry behavior:
// Configure the default for all requests:
const client = new OpenAI({
maxRetries: 0, // default is 2
});
// Or, configure per-request:
await client.chat.completions.create({ messages: [{ role: 'user', content: 'How can I get the name of the current day in Node.js?' }], model: 'gpt-4.1-nano' /* model deployment name */ }, {
maxRetries: 5,
});
Note
This library is maintained by OpenAI. See the release history to track the latest updates to the library.
Azure OpenAI API version support
- The v1 generally available (GA) API now provides access to both GA and preview operations. To learn more, see the API version lifecycle guide.
Installation
pip install openai
For the latest version:
pip install openai --upgrade
Authentication
You can retrieve your resource's endpoint and API key from the Azure portal or AI Foundry (a key-based client sketch follows the steps below):
- Sign in to the Azure portal > select your resource > Resource Management > Keys and Endpoint
- Sign in to the AI Foundry portal > select your resource
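If you use key-based authentication, a minimal sketch looks like the following; it assumes the key from Keys and Endpoint is stored in an AZURE_OPENAI_API_KEY environment variable:
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),  # key from Keys and Endpoint
)
For keyless authentication with Microsoft Entra ID, create a bearer token provider and pass it as the api_key, as shown next: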
from openai import OpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
token_provider = get_bearer_token_provider(
DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = OpenAI(
base_url = "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
api_key = token_provider
)
Responses API
responses.create()
from openai import OpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
token_provider = get_bearer_token_provider(
DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = OpenAI(
base_url = "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
api_key=token_provider,
)
response = client.responses.create(
model="gpt-4.1-nano",
input= "This is a test"
)
print(response.model_dump_json(indent=2))
For more examples, see the Responses API documentation.
responses.create() with the MCP server tool
from openai import OpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
token_provider = get_bearer_token_provider(
DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = OpenAI(
base_url = "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
api_key=token_provider,
)
resp = client.responses.create(
model="gpt-5",
tools=[
{
"type": "mcp",
"server_label": "microsoft_learn",
"server_description": "Microsoft Learn MCP server for searching and fetching Microsoft documentation.",
"server_url": "https://free.blessedness.top/api/mcp",
"require_approval": "never",
},
],
input="Search for information about Azure Functions",
)
print(resp.output_text)
For more examples, see the Responses API documentation.
Chat
chat.completions.create()
from openai import OpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
token_provider = get_bearer_token_provider(
DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = OpenAI(
base_url = "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
api_key=token_provider,
)
completion = client.chat.completions.create(
model="gpt-4o", # Replace with your model deployment name.
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "When was Microsoft founded?"}
]
)
#print(completion.choices[0].message)
print(completion.model_dump_json(indent=2))
chat.completions.create() - streaming
from openai import OpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
token_provider = get_bearer_token_provider(
DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = OpenAI(
base_url = "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
api_key=token_provider,
)
completion = client.chat.completions.create(
model="gpt-4o", # Replace with your model deployment name.
messages=[
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "When was Microsoft founded?"}
],
stream=True
)
for chunk in completion:
if chunk.choices and chunk.choices[0].delta.content is not None:
print(chunk.choices[0].delta.content, end='',)
chat.completions.create() - image input
from openai import OpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
token_provider = get_bearer_token_provider(
DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = OpenAI(
base_url = "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
api_key=token_provider,
)
completion = client.chat.completions.create(
model="gpt-4o",
messages=[
{
"role": "user",
"content": [
{"type": "text", "text": "What's in this image?"},
{
"type": "image_url",
"image_url": {
"url": "https://raw.githubusercontent.com/MicrosoftDocs/azure-ai-docs/main/articles/ai-foundry/openai/media/how-to/generated-seattle.png",
}
},
],
}
],
max_tokens=300,
)
print(completion.model_dump_json(indent=2))
Embeddings
embeddings.create()
Embeddings don't currently support Microsoft Entra ID with Azure OpenAI and the v1 API.
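As a sketch, embeddings can instead be generated with key-based authentication. The text-embedding-3-small deployment name below is an assumption; substitute your own deployment name:
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
)

embedding = client.embeddings.create(
    model="text-embedding-3-small",  # your embedding model deployment name
    input="This is a test",
)

print(len(embedding.data[0].embedding))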
Fine-tuning
Error handling
import os

import openai
from openai import OpenAI

# Build a client as in the earlier examples; an API key is used here for brevity.
client = OpenAI(
    base_url="https://YOUR-RESOURCE-NAME.openai.azure.com/openai/v1/",
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
)
try:
client.fine_tuning.jobs.create(
model="gpt-4o",
training_file="file-test",
)
except openai.APIConnectionError as e:
print("The server could not be reached")
print(e.__cause__) # an underlying Exception, likely raised within httpx.
except openai.RateLimitError as e:
print("A 429 status code was received; we should back off a bit.")
except openai.APIStatusError as e:
print("Another non-200-range status code was received")
print(e.status_code)
print(e.response)
Error codes
| Status code | Error type |
|---|---|
| 400 | BadRequestError |
| 401 | AuthenticationError |
| 403 | PermissionDeniedError |
| 404 | NotFoundError |
| 422 | UnprocessableEntityError |
| 429 | RateLimitError |
| >=500 | InternalServerError |
| N/A | APIConnectionError |
Request IDs
To retrieve the ID of a request, use the _request_id property, which corresponds to the x-request-id response header.
print(completion._request_id)
print(legacy_completion._request_id)
Retries
By default, the following errors are automatically retried two times with a short exponential backoff:
- Connection errors
- 408 Request Timeout
- 429 Rate Limit
- >=500 Internal errors
Use max_retries to configure or disable the retry behavior:
# For all requests
from openai import OpenAI
client = OpenAI(
max_retries=0
)
# Max retries for a specific request
client.with_options(max_retries=5).chat.completions.create(
messages=[
{
"role": "user",
"content": "When was Microsoft founded?",
}
],
model="gpt-4o",
)
Next steps
- To see which models are currently supported, check out the Azure OpenAI models page