MistralAI
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:79
MistralAI LLM implementation
Extends
ToolCallLLM<ToolCallLLMMessageOptions>
Constructors
Constructor
new MistralAI(init?): MistralAI
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:91
Parameters
init?
Partial<MistralAI>
Returns
MistralAI
Overrides
ToolCallLLM<ToolCallLLMMessageOptions>.constructor
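The sketch below constructs the client with explicit settings. The package import path and the environment variable are assumptions, not stated in this reference; every field of the init object is optional (Partial<MistralAI>) and corresponds to the properties documented below.

```typescript
// Assumed import path for the provider package; adjust to your setup.
import { MistralAI } from "@llamaindex/mistral";

// Every field is optional (Partial<MistralAI>); omitted fields keep the
// provider defaults. Field names mirror the "Properties" section below.
const llm = new MistralAI({
  model: "mistral-small-latest",
  temperature: 0.1,
  topP: 1,
  maxTokens: 512,
  apiKey: process.env.MISTRAL_API_KEY, // assumed env variable; pass the key explicitly if preferred
});
```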
Properties
model
model: "mistral-small-latest" | "mistral-large-latest" | "codestral-latest" | "pixtral-large-latest" | "ministral-8b-latest" | "ministral-3b-latest" | "mistral-tiny" | "mistral-small" | "mistral-medium" | "mistral-saba-latest" | "mistral-embed" | "mistral-moderation-latest"
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:81
temperature
temperature: number
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:82
topP
topP: number
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:83
maxTokens?
optional maxTokens: number
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:84
apiKey?
optional apiKey: string
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:85
safeMode
safeMode: boolean
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:86
randomSeed?
optional randomSeed: number
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:87
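As a sketch of how the last two properties are typically used (their semantics are assumed to mirror Mistral's random-seed and safe-prompt request parameters, which this reference does not restate):

```typescript
// Pin a seed for more repeatable sampling and opt into Mistral's
// safe-prompt guardrails; both map onto the optional properties above.
const guardedLlm = new MistralAI({
  model: "mistral-large-latest",
  randomSeed: 42,
  safeMode: true,
});
```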
Accessors
metadata
Section titled “metadata”Get Signature
get metadata(): object
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:102
Returns
object
model
model: "mistral-small-latest" | "mistral-large-latest" | "codestral-latest" | "pixtral-large-latest" | "ministral-8b-latest" | "ministral-3b-latest" | "mistral-tiny" | "mistral-small" | "mistral-medium" | "mistral-saba-latest" | "mistral-embed" | "mistral-moderation-latest"
temperature
temperature: number
topP
topP: number
maxTokens
maxTokens: undefined | number
contextWindow
contextWindow: number
tokenizer
tokenizer: undefined = undefined
structuredOutput
structuredOutput: boolean = false
Overrides
ToolCallLLM.metadata
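A quick read of the accessor, reusing the llm instance from the constructor sketch above:

```typescript
// The returned object is a plain snapshot of the configured model settings.
const { model, contextWindow, maxTokens, structuredOutput } = llm.metadata;
console.log(`model=${model} contextWindow=${contextWindow} maxTokens=${maxTokens}`);
console.log(`structured output: ${structuredOutput}`); // false per the signature above
```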
supportToolCall
Get Signature
get supportToolCall(): boolean
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:114
Returns
boolean
Overrides
ToolCallLLM.supportToolCall
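A typical guard before attaching tools to a chat call, reusing the llm instance from above; the import path, the declared placeholders, and the tools field on the chat params are assumptions drawn from the core LLM types rather than from this page:

```typescript
import type {
  BaseTool,
  ChatMessage,
  ToolCallLLMMessageOptions,
} from "@llamaindex/core/llms"; // assumed import path

// Hypothetical inputs supplied by the caller.
declare const messages: ChatMessage<ToolCallLLMMessageOptions>[];
declare const tools: BaseTool[];

// Attach tools only when the configured model supports tool calling.
const response = llm.supportToolCall
  ? await llm.chat({ messages, tools })
  : await llm.chat({ messages });
```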
Methods
formatMessages()
formatMessages(messages): ({ role: "assistant"; content: string; toolCalls: object[]; toolCallId?: undefined; } | { toolCalls?: undefined; role: "tool"; content: string; toolCallId: string; } | { toolCalls?: undefined; toolCallId?: undefined; role: MessageType; content: string; })[]
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:118
Parameters
messages
ChatMessage<ToolCallLLMMessageOptions>[]
Returns
({ role: "assistant"; content: string; toolCalls: object[]; toolCallId?: undefined; } | { toolCalls?: undefined; role: "tool"; content: string; toolCallId: string; } | { toolCalls?: undefined; toolCallId?: undefined; role: MessageType; content: string; })[]
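A small conversion sketch, reusing the llm instance from above (the ChatMessage import path is an assumption):

```typescript
import type { ChatMessage, ToolCallLLMMessageOptions } from "@llamaindex/core/llms"; // assumed import path

const history: ChatMessage<ToolCallLLMMessageOptions>[] = [
  { role: "system", content: "You are a terse assistant." },
  { role: "user", content: "What is 2 + 2?" },
];

// Maps LlamaIndex chat messages onto the role/content/toolCalls/toolCallId
// shape shown in the return type above, ready for the Mistral SDK.
const wireMessages = llm.formatMessages(history);
```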
toTool()
static toTool(tool): Tool
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:171
Parameters
tool
BaseTool
Returns
Tool
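A hedged sketch of the static conversion; myTool stands for any existing BaseTool instance, since building one is outside the scope of this page:

```typescript
import type { BaseTool } from "@llamaindex/core/llms"; // assumed import path

// Placeholder for a tool your application already has.
declare const myTool: BaseTool;

// Translates the tool's metadata (name, description, parameter schema)
// into the Tool shape expected by the Mistral chat API.
const mistralTool = MistralAI.toTool(myTool);
```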
chat()
Call Signature
chat(params): Promise<AsyncIterable<ChatResponseChunk, any, any>>
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:186
Parameters
params
LLMChatParamsStreaming
Returns
Promise<AsyncIterable<ChatResponseChunk, any, any>>
Overrides
ToolCallLLM.chat
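A streaming sketch, reusing the llm instance from above: passing stream: true selects this call signature, and each chunk is assumed to carry its incremental text in delta, per the core ChatResponseChunk type:

```typescript
const stream = await llm.chat({
  messages: [{ role: "user", content: "Write a haiku about the sea." }],
  stream: true, // selects the streaming call signature
});

for await (const chunk of stream) {
  process.stdout.write(chunk.delta); // incremental text of the response
}
```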
Call Signature
chat(params): Promise<ChatResponse<object>>
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:189
Parameters
params
LLMChatParamsNonStreaming<ToolCallLLMMessageOptions>
Returns
Promise<ChatResponse<object>>
Overrides
ToolCallLLM.chat
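The non-streaming counterpart, again reusing the llm instance from above; the response is assumed to wrap the assistant reply in message.content, per the core ChatResponse type:

```typescript
const response = await llm.chat({
  messages: [{ role: "user", content: "Summarize the Mistral model lineup in one sentence." }],
});

console.log(response.message.content); // full assistant reply
```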
streamChat()
protected streamChat(messages, tools?): AsyncIterable<ChatResponseChunk<ToolCallLLMMessageOptions>>
Defined in: .build/typescript/packages/providers/mistral/src/llm.ts:240
Parameters
messages
ChatMessage[]
tools?
BaseTool<any>[]
Returns
AsyncIterable<ChatResponseChunk<ToolCallLLMMessageOptions>>
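streamChat is protected and is presumably the helper the streaming chat() overload delegates to, so application code should call chat({ ..., stream: true }) instead. A hypothetical subclass could still reach it, for example to trace chunks; the class and method names below are illustrative, not part of the library:

```typescript
import type { BaseTool, ChatMessage } from "@llamaindex/core/llms"; // assumed import path
import { MistralAI } from "@llamaindex/mistral"; // assumed package name

// Illustrative subclass only: logs each chunk before re-yielding it.
class TracingMistralAI extends MistralAI {
  async *tracedStream(messages: ChatMessage[], tools?: BaseTool[]) {
    for await (const chunk of this.streamChat(messages, tools)) {
      console.debug("chunk length:", chunk.delta.length);
      yield chunk;
    }
  }
}
```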