PromptExecutionSettings.Builder Class

- java.lang.Object
- com.microsoft.semantickernel.orchestration.PromptExecutionSettings.Builder

public static class PromptExecutionSettings.Builder
Builder for PromptExecutionSettings.
Constructor Summary
| Constructor | Description |
|---|---|
| Builder() | |
Method Summary
| Modifier and Type | Method and Description | 
|---|---|
| PromptExecutionSettings | build() Build the PromptExecutionSettings. | 
| Builder | withBestOf(int bestOf) Set the best-of setting for prompt execution. | 
| Builder | withFrequencyPenalty(double frequencyPenalty) Set the frequency penalty setting for prompt execution. | 
| Builder | withJsonSchemaResponseFormat(Class<?> responseFormat) Set the response format to use a JSON schema generated for the given class. | 
| Builder | withMaxTokens(int maxTokens) Set the maximum number of tokens to generate in the output. | 
| Builder | withModelId(String modelId) Set the id of the model to use for prompt execution. | 
| Builder | withPresencePenalty(double presencePenalty) Set the presence penalty setting for prompt execution. | 
| Builder | withResponseFormat(ResponseFormat responseFormat) Set the response format to use for prompt execution. | 
| Builder | withResponseFormat(ResponseFormat.Type responseFormat) Set the response format to use for prompt execution. | 
| Builder | withResultsPerPrompt(int resultsPerPrompt) Set the number of results to generate for each prompt. | 
| Builder | withServiceId(String serviceId) Set the id of the AI service to use for prompt execution. | 
| Builder | withStopSequences(List<String> stopSequences) Set the stop sequences to use for prompt execution. | 
| Builder | withTemperature(double temperature) Set the temperature setting for prompt execution. | 
| Builder | withTokenSelectionBiases(Map<Integer,Integer> tokenSelectionBiases) Set the token selection biases to use for prompt execution. | 
| Builder | withTopP(double topP) Set the topP setting for prompt execution. | 
| Builder | withUser(String user) Set the user to associate with the prompt execution. | 
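To show how the methods above compose, here is a hedged sketch of typical builder usage. The model id, token limit, and sampling values are illustrative assumptions, not library defaults, and the clamping ranges in the comments come from the method details below.

```java
import com.microsoft.semantickernel.orchestration.PromptExecutionSettings;

import java.util.List;

public class SettingsExample {
    public static void main(String[] args) {
        // Each with* method returns the Builder, so calls can be chained;
        // build() then produces the finished PromptExecutionSettings.
        PromptExecutionSettings settings = new PromptExecutionSettings.Builder()
            .withModelId("gpt-4o")              // illustrative model id
            .withMaxTokens(512)                 // clamped to [1, Integer.MAX_VALUE]
            .withTemperature(0.7)               // clamped to [0.0, 2.0]
            .withTopP(0.95)                     // clamped to [0.0, 1.0]
            .withStopSequences(List.of("\n\n")) // generation stops at a blank line
            .build();
    }
}
```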
Methods inherited from class java.lang.Object: clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait
Constructor Details
Builder
public Builder()
Method Details
build
public PromptExecutionSettings build()
Build the PromptExecutionSettings.
Returns:
the PromptExecutionSettings built from this Builder's settings
withBestOf
public PromptExecutionSettings.Builder withBestOf(int bestOf)
Set the best-of setting for prompt execution. The value is clamped to the range [1, Integer.MAX_VALUE], and the default is 1.
Parameters:
bestOf - the best-of value to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withFrequencyPenalty
public PromptExecutionSettings.Builder withFrequencyPenalty(double frequencyPenalty)
Set the frequency penalty setting for prompt execution. The value is clamped to the range [-2.0, 2.0], and the default is 0.0.
Parameters:
frequencyPenalty - the frequency penalty to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withJsonSchemaResponseFormat
public PromptExecutionSettings.Builder withJsonSchemaResponseFormat(Class<?> responseFormat)
Set the response format to use a JSON schema generated for the given class. The name of the response format will be the name of the class.
Parameters:
responseFormat - the class from which to generate the JSON schema
Returns:
this PromptExecutionSettings.Builder
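As a hedged sketch of this method, the record below is a hypothetical result type; per the description above, the generated response format would take the class name "WeatherReport".

```java
import com.microsoft.semantickernel.orchestration.PromptExecutionSettings;

public class JsonSchemaExample {
    // Hypothetical result type; its fields shape the generated JSON schema.
    public record WeatherReport(String city, double temperatureCelsius) {}

    public static void main(String[] args) {
        // The schema is generated from WeatherReport and named after the class.
        PromptExecutionSettings settings = new PromptExecutionSettings.Builder()
            .withJsonSchemaResponseFormat(WeatherReport.class)
            .build();
    }
}
```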
withMaxTokens
public PromptExecutionSettings.Builder withMaxTokens(int maxTokens)
Set the maximum number of tokens to generate in the output. The value is clamped to the range [1, Integer.MAX_VALUE], and the default is 256.
Parameters:
maxTokens - the maximum number of tokens to generate in the output
Returns:
this PromptExecutionSettings.Builder
withModelId
public PromptExecutionSettings.Builder withModelId(String modelId)
Set the id of the model to use for prompt execution.
Parameters:
modelId - the id of the model to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withPresencePenalty
public PromptExecutionSettings.Builder withPresencePenalty(double presencePenalty)
Set the presence penalty setting for prompt execution. The value is clamped to the range [-2.0, 2.0], and the default is 0.0.
Parameters:
presencePenalty - the presence penalty to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withResponseFormat
public PromptExecutionSettings.Builder withResponseFormat(ResponseFormat responseFormat)
Set the response format to use for prompt execution.
Parameters:
responseFormat - the response format to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withResponseFormat
public PromptExecutionSettings.Builder withResponseFormat(ResponseFormat.Type responseFormat)
Set the response format to use for prompt execution.
Parameters:
responseFormat - the response format type to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withResultsPerPrompt
public PromptExecutionSettings.Builder withResultsPerPrompt(int resultsPerPrompt)
Set the number of results to generate for each prompt. The value is clamped to the range [1, Integer.MAX_VALUE], and the default is 1.
Parameters:
resultsPerPrompt - the number of results to generate for each prompt
Returns:
this PromptExecutionSettings.Builder
withServiceId
public PromptExecutionSettings.Builder withServiceId(String serviceId)
Set the id of the AI service to use for prompt execution.
Parameters:
serviceId - the id of the AI service to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withStopSequences
public PromptExecutionSettings.Builder withStopSequences(List<String> stopSequences)
Set the stop sequences to use for prompt execution.
Parameters:
stopSequences - the stop sequences to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withTemperature
public PromptExecutionSettings.Builder withTemperature(double temperature)
Set the temperature setting for prompt execution. The value is clamped to the range [0.0, 2.0], and the default is 1.0.
Parameters:
temperature - the temperature to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withTokenSelectionBiases
public PromptExecutionSettings.Builder withTokenSelectionBiases(Map<Integer,Integer> tokenSelectionBiases)
Set the token selection biases to use for prompt execution. The bias values are clamped to the range [-100, 100].
Parameters:
tokenSelectionBiases - a map from token id to bias value
Returns:
this PromptExecutionSettings.Builder
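A hedged sketch of a bias map, assuming integer token ids for the target model; the ids below are hypothetical, and per the description above any bias outside [-100, 100] would be clamped to that range.

```java
import com.microsoft.semantickernel.orchestration.PromptExecutionSettings;

import java.util.Map;

public class TokenBiasExample {
    public static void main(String[] args) {
        // Keys are token ids (hypothetical here); values are biases in [-100, 100].
        Map<Integer, Integer> biases = Map.of(
            50256, -100,  // strongly discourage this token
            1234, 25);    // mildly encourage this token
        PromptExecutionSettings settings = new PromptExecutionSettings.Builder()
            .withTokenSelectionBiases(biases)
            .build();
    }
}
```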
withTopP
public PromptExecutionSettings.Builder withTopP(double topP)
Set the topP setting for prompt execution. The value is clamped to the range [0.0, 1.0], and the default is 1.0.
Parameters:
topP - the topP value to use for prompt execution
Returns:
this PromptExecutionSettings.Builder
withUser
public PromptExecutionSettings.Builder withUser(String user)
Set the user to associate with the prompt execution.
Parameters:
user - the user to associate with the prompt execution
Returns:
this PromptExecutionSettings.Builder