---
title: Fine-tuning function calls with Azure OpenAI Service
description: Learn how to improve tool calling performance with Azure OpenAI fine-tuning
#services: cognitive-services
manager: nitinme
ms.service: azure-ai-openai
ms.topic: how-to
ms.date: 02/20/2025
author: mrbullwinkle
ms.author: mbullwin
---
# Fine-tuning and tool calling
Models that use the chat completions API support [tool calling](../how-to/function-calling.md). Unfortunately, functions defined in your chat completion calls don't always perform as expected. Fine-tuning your model with tool calling examples can improve model output by enabling you to:
* Get similarly formatted responses even when the full function definition isn't present. (Allowing you to potentially save money on prompt tokens.)
* Get more accurate and consistent outputs.
> [!NOTE]
> `function_call` and `functions` have been deprecated in favor of `tools`.
> It is recommended to use the `tools` parameter instead.
## Tool calling (recommended)
### Constructing a training file
When constructing a training file of tool calling examples, you would take a function definition like this:
```json
{
    "messages": [
        { "role": "user", "content": "What is the weather in San Francisco?" },
        {
            "role": "assistant",
            "tool_calls": [
                { "id": "call_id", "type": "function", "function": { "name": "get_current_weather", "arguments": "{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}" } }
            ]
        }
    ],
    "tools": [
        { "type": "function", "function": { "name": "get_current_weather", "description": "Get the current weather", "parameters": { "type": "object", "properties": { "location": { "type": "string", "description": "The city and country, eg. San Francisco, USA" }, "format": { "type": "string", "enum": ["celsius", "fahrenheit"] } }, "required": ["location", "format"] } } }
    ]
}
```
And express the information as a single line within your `.jsonl` training file as below:
```jsonl
{"messages":[{"role":"user","content":"What is the weather in San Francisco?"},{"role":"assistant","tool_calls":[{"id":"call_id","type":"function","function":{"name":"get_current_weather","arguments":"{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}"}}]}],"tools":[{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and country, eg. San Francisco, USA"},"format":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location","format"]}}}]}
```
As with all fine-tuning training, your example file requires at least 10 examples.
### Optimize for cost
If you're trying to use fewer prompt tokens after fine-tuning your model on the full function definitions, OpenAI recommends experimenting with the following, as sketched after this list:
* Omit function and parameter descriptions: remove the `description` field from the function and its parameters.
* Omit parameters: remove the entire `properties` field from the `parameters` object.
* Omit the function entirely: remove the entire function object from the `tools` array.
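For example, a pared-down version of the earlier `get_current_weather` definition might keep only the function name and the parameter schema while dropping every `description` field. This is an illustrative sketch rather than an exact definition to train on:

```json
{
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": { "type": "string" },
                        "format": { "type": "string", "enum": ["celsius", "fahrenheit"] }
                    },
                    "required": ["location", "format"]
                }
            }
        }
    ]
}
```

Trimming further, down to an empty `properties` object or an empty `tools` array, saves more prompt tokens but relies increasingly on what the model learned during fine-tuning.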
### Optimize for quality
Alternatively, if you're trying to improve the quality of the tool calling output, it's recommended that the function definitions present in the fine-tuning training dataset and subsequent chat completion calls remain identical.
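In practice, that means the inference-time request should carry the same `tools` array that appeared in your training examples. A sketch of such a chat completion request body, reusing the `get_current_weather` definition from above:

```json
{
    "messages": [
        { "role": "user", "content": "What is the weather in Seattle?" }
    ],
    "tools": [
        { "type": "function", "function": { "name": "get_current_weather", "description": "Get the current weather", "parameters": { "type": "object", "properties": { "location": { "type": "string", "description": "The city and country, eg. San Francisco, USA" }, "format": { "type": "string", "enum": ["celsius", "fahrenheit"] } }, "required": ["location", "format"] } } }
    ]
}
```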
### Customize model responses to function outputs
Fine-tuning based on tool calling examples can also be used to improve the model's response to function outputs. To accomplish this, you include examples consisting of function response messages and assistant response messages where the function response is interpreted and put into context by the assistant.
```json
{
    "messages": [
        { "role": "user", "content": "What is the weather in San Francisco?" },
        {
            "role": "assistant",
            "tool_calls": [
                { "id": "call_id", "type": "function", "function": { "name": "get_current_weather", "arguments": "{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}" } }
            ]
        },
        { "role": "tool", "tool_call_id": "call_id", "content": "21.0" },
        { "role": "assistant", "content": "It is 21 degrees celsius in San Francisco, CA" }
    ],
    "tools": [] // same as before
}
```
As with the example before, this example is artificially expanded for readability. The actual entry in the `.jsonl` training file would be a single line:
```jsonl
{"messages":[{"role":"user","content":"What is the weather in San Francisco?"},{"role":"assistant","tool_calls":[{"id":"call_id","type":"function","function":{"name":"get_current_weather","arguments":"{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}"}}]},{"role":"tool","tool_call_id":"call_id","content":"21.0"},{"role":"assistant","content":"It is 21 degrees celsius in San Francisco, CA"}],"tools":[]}
```
## Function calling
### Constructing a training file
When constructing a training file of function calling examples, you would take a function definition like this:
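The legacy shape mirrors the tool calling example above, except that the assistant turn uses `function_call` and the definition sits directly in a `functions` array. The following is an illustrative sketch using the same `get_current_weather` function, not an exact definition:

```json
{
    "messages": [
        { "role": "user", "content": "What is the weather in San Francisco?" },
        { "role": "assistant", "function_call": { "name": "get_current_weather", "arguments": "{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}" } }
    ],
    "functions": [
        { "name": "get_current_weather", "description": "Get the current weather", "parameters": { "type": "object", "properties": { "location": { "type": "string", "description": "The city and country, eg. San Francisco, USA" }, "format": { "type": "string", "enum": ["celsius", "fahrenheit"] } }, "required": ["location", "format"] } }
    ]
}
```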
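And, as in the tool calling section, you would express the information as a single line within your `.jsonl` training file. Again as a sketch:

```jsonl
{"messages": [{"role": "user", "content": "What is the weather in San Francisco?"}, {"role": "assistant", "function_call": {"name": "get_current_weather", "arguments": "{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}"}}], "functions": [{"name": "get_current_weather", "description": "Get the current weather", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "The city and country, eg. San Francisco, USA"}, "format": {"type": "string", "enum": ["celsius", "fahrenheit"]}}, "required": ["location", "format"]}}]}
```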
As with all fine-tuning training, your example file requires at least 10 examples.
### Optimize for cost
If you're trying to use fewer prompt tokens after fine-tuning your model on the full function definitions, OpenAI recommends experimenting with the following:
* Omit function and parameter descriptions: remove the `description` field from the function and its parameters.
* Omit parameters: remove the entire `properties` field from the `parameters` object.
* Omit the function entirely: remove the entire function object from the `functions` array.
### Optimize for quality
Alternatively, if you're trying to improve the quality of the function calling output, it's recommended that the function definitions present in the fine-tuning training dataset and subsequent chat completion calls remain identical.
### Customize model responses to function outputs
Fine-tuning based on function calling examples can also be used to improve the model's response to function outputs. To accomplish this, you include examples consisting of function response messages and assistant response messages where the function response is interpreted and put into context by the assistant.
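A sketch of such an entry in the legacy format, using a `function` role message for the function's output (it corresponds to the single-line example shown below):

```json
{
    "messages": [
        { "role": "user", "content": "What is the weather in San Francisco?" },
        { "role": "assistant", "function_call": { "name": "get_current_weather", "arguments": "{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}" } },
        { "role": "function", "name": "get_current_weather", "content": "21.0" },
        { "role": "assistant", "content": "It is 21 degrees celsius in San Francisco, CA" }
    ],
    "functions": []
}
```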
As with the example before, this example is artificially expanded for readability. The actual entry in the `.jsonl` training file would be a single line:
```jsonl
{"messages": [{"role": "user", "content": "What is the weather in San Francisco?"}, {"role": "assistant", "function_call": {"name": "get_current_weather", "arguments": "{\"location\": \"San Francisco, USA\", \"format\": \"celsius\"}"}}, {"role": "function", "name": "get_current_weather", "content": "21.0"}, {"role": "assistant", "content": "It is 21 degrees celsius in San Francisco, CA"}], "functions": []}
```