
Commit 8652085

final changes
1 parent e31ac5a commit 8652085

File tree

1 file changed

+106
-0
lines changed


integrations/llms/mistral-ai.mdx

Lines changed: 106 additions & 0 deletions
@@ -163,6 +163,112 @@ You can manage all prompts to Mistral AI in the [Prompt Library](/product/prompt

Once you're ready with your prompt, you can use the `portkey.prompts.completions.create` interface to use the prompt in your application.

### Mistral Tool Calling
The tool calling feature lets models trigger external tools based on conversation context: you define the available functions, the model decides when to invoke them, and your application executes them and returns the results.

Portkey supports Mistral tool calling and makes it interoperable across multiple providers. With Portkey Prompts, you can also templatize your prompts and tool schemas.
<Card title="Supported Mistral Models with Tool Calling" href="https://docs.mistral.ai/capabilities/function_calling/">
</Card>
<Tabs>
<Tab title="Node.js">
```javascript Get Weather Tool
let tools = [{
    type: "function",
    function: {
        name: "getWeather",
        description: "Get the current weather",
        parameters: {
            type: "object",
            properties: {
                location: { type: "string", description: "City and state" },
                unit: { type: "string", enum: ["celsius", "fahrenheit"] }
            },
            required: ["location"]
        }
    }
}];

let response = await portkey.chat.completions.create({
    model: "your_mistral_model_name",
    messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: "What's the weather like in Delhi - respond in JSON" }
    ],
    tools,
    tool_choice: "auto",
});

console.log(response.choices[0].finish_reason);
```
</Tab>
208+
<Tab title="Python">
```python Get Weather Tool
tools = [{
    "type": "function",
    "function": {
        "name": "getWeather",
        "description": "Get the current weather",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location"]
        }
    }
}]

response = portkey.chat.completions.create(
    model="your_mistral_model_name",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What's the weather like in Delhi - respond in JSON"}
    ],
    tools=tools,
    tool_choice="auto"
)

print(response.choices[0].finish_reason)
```
</Tab>
239+
<Tab title="cURL">
```sh Get Weather Tool
curl -X POST "https://api.portkey.ai/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_PORTKEY_API_KEY" \
  -d '{
    "model": "your_mistral_model_name",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What'\''s the weather like in Delhi - respond in JSON"}
    ],
    "tools": [{
      "type": "function",
      "function": {
        "name": "getWeather",
        "description": "Get the current weather",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "City and state"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
          },
          "required": ["location"]
        }
      }
    }],
    "tool_choice": "auto"
  }'
```
</Tab>
</Tabs>
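When `finish_reason` comes back as `tool_calls`, the assistant message carries the tool name and JSON-encoded arguments that your application is expected to execute before replying to the model. A minimal sketch of that hand-off in Python, under the OpenAI-compatible response shape — the `get_weather` implementation, the `TOOL_REGISTRY` helper, and the hard-coded assistant message here are all illustrative assumptions, not part of the Portkey SDK:

```python
import json

# Illustrative local implementation of the getWeather tool declared in the
# schema above (assumption: a real app would call an actual weather API).
def get_weather(location, unit="celsius"):
    return {"location": location, "temperature": 31, "unit": unit}

# Map tool names from the schema to local callables (hypothetical helper).
TOOL_REGISTRY = {"getWeather": get_weather}

def run_tool_calls(message):
    """Execute each tool call in an assistant message and build the
    role="tool" messages to send back to the model."""
    results = []
    for call in message["tool_calls"]:
        fn = TOOL_REGISTRY[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(fn(**args)),
        })
    return results

# Example assistant message, shaped like response.choices[0].message when
# finish_reason == "tool_calls":
assistant_message = {
    "role": "assistant",
    "tool_calls": [{
        "id": "call_1",
        "type": "function",
        "function": {
            "name": "getWeather",
            "arguments": '{"location": "Delhi", "unit": "celsius"}',
        },
    }],
}

tool_messages = run_tool_calls(assistant_message)
print(tool_messages[0]["content"])
```

You would then append the assistant message and these tool messages to the conversation and call the chat completions endpoint again so the model can produce its final answer from the tool results.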
## Next Steps

The complete list of features supported in the SDK are available on the link below.
