TestModel class

A PromptCompletionModel used for testing.

Constructors

| TestModel(handler) | Creates a new TestModel instance. |

Properties

| events | Events emitted by the model. |

Methods

| completePrompt(TurnContext, Memory, PromptFunctions, Tokenizer, PromptTemplate) |
| createTestModel(handler) |
| returnContent(string, number) |
| returnError(Error, number) |
| returnRateLimited(Error, number) |
| returnResponse(PromptResponse<string>, number) |
| streamTextChunks(string[], number) |
Constructor Details

TestModel((model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>)

Creates a new TestModel instance.

new TestModel(handler: (model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>)

Parameters

- handler
- (model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>
Handler invoked whenever the model is asked to complete a prompt.
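As a quick illustration, the sketch below constructs a TestModel whose handler always succeeds with a fixed assistant message. The import path and the exact shape of PromptResponse<string> are assumptions based on the signatures on this page, not confirmed by it.

```typescript
// Sketch: a TestModel whose handler always returns a fixed, successful reply.
// Assumes TestModel and PromptResponse are exported from '@microsoft/teams-ai'.
import { TestModel, PromptResponse } from '@microsoft/teams-ai';

const model = new TestModel(async (_model, context, memory, functions, tokenizer, template) => {
    const response: PromptResponse<string> = {
        status: 'success',
        message: { role: 'assistant', content: 'canned test reply' }
    };
    return response;
});
```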
Property Details

events

Events emitted by the model.

events: PromptCompletionModelEmitter

Property Value

An event emitter for the model.
Method Details

completePrompt(TurnContext, Memory, PromptFunctions, Tokenizer, PromptTemplate)

Completes a prompt by invoking the handler that was passed to the TestModel.

function completePrompt(context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate): Promise<PromptResponse<string>>

Parameters

- context
- TurnContext
Current turn context.
- memory
- Memory
An interface for accessing state values.
- functions
- PromptFunctions
Functions to use when rendering the prompt.
- tokenizer
- Tokenizer
Tokenizer to use when rendering the prompt.
- template
- PromptTemplate
Prompt template to complete.

Returns

Promise<PromptResponse<string>>

A PromptResponse with the status and message.
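A sketch of calling completePrompt() directly from a test follows. The fixture objects (turn context, memory, and so on) are only declared here; how you construct them depends on your test setup, and the import locations are assumptions.

```typescript
import { TurnContext } from 'botbuilder';
import { TestModel, Memory, PromptFunctions, Tokenizer, PromptTemplate } from '@microsoft/teams-ai';

// Provided by your test fixture; declared here only so the sketch type-checks.
declare const model: TestModel;
declare const context: TurnContext;
declare const memory: Memory;
declare const functions: PromptFunctions;
declare const tokenizer: Tokenizer;
declare const template: PromptTemplate;

const result = await model.completePrompt(context, memory, functions, tokenizer, template);
if (result.status === 'success') {
    console.log(result.message?.content);
} else {
    console.error(result.status, result.error);
}
```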
createTestModel((model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>)

static function createTestModel(handler: (model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>): TestModel

Parameters

- handler
- (model: TestModel, context: TurnContext, memory: Memory, functions: PromptFunctions, tokenizer: Tokenizer, template: PromptTemplate) => Promise<PromptResponse<string>>

Returns

TestModel
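The static factory appears to be a shorthand for the constructor; the sketch below assumes it simply wraps the given handler in a new TestModel, and it assumes PromptTemplate exposes a name property.

```typescript
import { TestModel } from '@microsoft/teams-ai';

// Same handler signature as the constructor, but created via the static factory.
const echoModel = TestModel.createTestModel(async (_model, _context, _memory, _functions, _tokenizer, template) => {
    return {
        status: 'success' as const,
        // template.name is assumed to hold the template's name.
        message: { role: 'assistant', content: `completing template: ${template.name}` }
    };
});
```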
returnContent(string, number)

static function returnContent(content: string, delay?: number): TestModel

Parameters

- content
- string
- delay
- number

Returns

TestModel
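Presumably the convenience for the most common case: a model that always replies with fixed text, optionally after a delay. The delay unit is not stated on this page; milliseconds is an assumption.

```typescript
import { TestModel } from '@microsoft/teams-ai';

// A model that always answers "42", after roughly 100 ms of simulated latency.
const fixedModel = TestModel.returnContent('42', 100);
```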
returnError(Error, number)

static function returnError(error: Error, delay?: number): TestModel

Parameters

- error
- Error
- delay
- number

Returns

TestModel
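A sketch of the error case, which presumably produces a model whose completions fail with the supplied error; useful for exercising error-handling paths in a planner or action.

```typescript
import { TestModel } from '@microsoft/teams-ai';

// A model whose completions fail with the given error.
const failingModel = TestModel.returnError(new Error('simulated model failure'));
```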
returnRateLimited(Error, number)

static function returnRateLimited(error: Error, delay?: number): TestModel

Parameters

- error
- Error
- delay
- number

Returns

TestModel
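Presumably the same idea, but surfaced as a rate-limited response so retry and back-off logic can be tested.

```typescript
import { TestModel } from '@microsoft/teams-ai';

// Simulates the model being throttled; the 50 ms delay is an arbitrary choice.
const throttledModel = TestModel.returnRateLimited(new Error('simulated 429'), 50);
```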
returnResponse(PromptResponse<string>, number)

static function returnResponse(response: PromptResponse<string>, delay?: number): TestModel

Parameters

- response
- PromptResponse<string>
- delay
- number

Returns

TestModel
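For full control over what the model returns, returnResponse presumably echoes back the exact PromptResponse you supply. The field layout shown below (status plus an assistant message) is an assumption based on the PromptResponse<string> type referenced above.

```typescript
import { TestModel, PromptResponse } from '@microsoft/teams-ai';

// Hand-craft the exact response the model should produce, e.g. structured JSON content.
const response: PromptResponse<string> = {
    status: 'success',
    message: { role: 'assistant', content: '{"action":"doSomething"}' }
};
const jsonModel = TestModel.returnResponse(response);
```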
streamTextChunks(string[], number)

static function streamTextChunks(chunks: string[], delay?: number): TestModel

Parameters

- chunks
- string[]
- delay
- number

Returns

TestModel
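Finally, a sketch of the streaming helper, which presumably emits the reply as a sequence of text chunks with the stated delay between chunks, so streaming consumers can be tested. The millisecond unit is an assumption.

```typescript
import { TestModel } from '@microsoft/teams-ai';

// Simulates a streamed reply delivered chunk by chunk, roughly 25 ms apart.
const streamingModel = TestModel.streamTextChunks(['Hello', ', ', 'world', '!'], 25);
```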