ModelBatchDeploymentSettings Class   
Model Batch Deployment Settings entity.
Constructor
ModelBatchDeploymentSettings(*, mini_batch_size: int | None, instance_count: int | None = None, max_concurrency_per_instance: int | None = None, output_action: BatchDeploymentOutputAction | str | None = None, output_file_name: str | None = None, retry_settings: BatchRetrySettings | None = None, environment_variables: Dict[str, str] | None = None, error_threshold: int | None = None, logging_level: str | None = None, **kwargs: Any)
Parameters
| Name | Description |
|---|---|
| mini_batch_size | Size of the mini-batch passed to each batch invocation. Defaults to 10. |
| instance_count | Number of instances on which inferencing will run. Equivalent to resources.instance_count. |
| output_action | Indicates how the output will be organized. Possible values include: "summary_only", "append_row". Defaults to "append_row". |
| output_file_name | Customized output file name for the append_row output action. Defaults to "predictions.csv". |
| max_concurrency_per_instance | Maximum degree of parallelism per instance. Defaults to 1. |
| retry_settings | Retry settings for the batch inference operation. Defaults to None. |
| environment_variables | Environment variables that will be set in the deployment. |
| error_threshold | Error threshold: if the error count for the entire input goes above this value, the batch inference job is aborted. Range is [-1, int.MaxValue]; -1 means ignore all failures during batch inference. For FileDataset this is the count of file failures; for TabularDataset it is the count of record failures. Defaults to -1. |
| logging_level | Logging level for the batch inference operation. Defaults to "info". |
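The output_action and retry_settings parameters described above are not exercised by the sample in the Examples section. As an illustrative sketch only (the concrete values are placeholders, not recommendations), they can be supplied using the BatchDeploymentOutputAction constants and the BatchRetrySettings entity:

   from azure.ai.ml.constants import BatchDeploymentOutputAction
   from azure.ai.ml.entities import BatchRetrySettings
   from azure.ai.ml.entities._deployment.model_batch_deployment_settings import ModelBatchDeploymentSettings

   # Illustrative values only; mini_batch_size is the only required argument.
   settings = ModelBatchDeploymentSettings(
       mini_batch_size=10,
       output_action=BatchDeploymentOutputAction.APPEND_ROW,  # or "summary_only"
       output_file_name="predictions.csv",
       retry_settings=BatchRetrySettings(max_retries=3, timeout=300),
   )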
Keyword-Only Parameters
| Name | Description |
|---|---|
| mini_batch_size | Required |
| instance_count | Default value: None |
| max_concurrency_per_instance | Default value: None |
| output_action | Default value: None |
| output_file_name | Default value: None |
| retry_settings | Default value: None |
| environment_variables | Default value: None |
| error_threshold | Default value: None |
| logging_level | Default value: None |
Examples
Creating a Model Batch Deployment Settings object.
   from azure.ai.ml.entities._deployment.model_batch_deployment_settings import ModelBatchDeploymentSettings
   modelBatchDeploymentSetting = ModelBatchDeploymentSettings(
       mini_batch_size=256,
       instance_count=5,
       max_concurrency_per_instance=2,
       output_file_name="output-file-name",
       environment_variables={"env1": "value1", "env2": "value2"},
       error_threshold=2,
       logging_level="1",
   )
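As a further sketch (not part of the reference sample above), the resulting settings object would typically be attached to a batch deployment; the endpoint, model, and compute names below are hypothetical placeholders:

   from azure.ai.ml.entities import ModelBatchDeployment

   # "my-endpoint", "my-model:1", and "cpu-cluster" are placeholder names.
   deployment = ModelBatchDeployment(
       name="my-deployment",
       endpoint_name="my-endpoint",
       model="azureml:my-model:1",
       compute="cpu-cluster",
       settings=modelBatchDeploymentSetting,
   )
   # The deployment could then be submitted with an MLClient, for example:
   # ml_client.batch_deployments.begin_create_or_update(deployment)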