`apps/kilocode-docs/docs/features/experimental/native-function-calling.md` (10 additions, 4 deletions)
```diff
@@ -41,12 +41,16 @@ Because of these risks and considerations, this capability is experimental, and of
 
 To enable and use native function calling, consider and perform the following:
 
-1. Ensure you are using a provider that has been enabled in Kilo Code for this experiment. As of Oct 16, 2025, they include:
+1. Ensure you are using a provider that has been enabled in Kilo Code for this experiment. As of Oct 21, 2025, they include:
 
 - OpenRouter
 - Kilo Code
 - LM Studio
 - OpenAI Compatible
+- Z.ai
+- Synthetic
+- X.ai
+- Chutes
 
 By default, native function calling is _disabled_ for most models. Should you wish to try it, open the Advanced settings for a given provider profile that is included in the testing group.
 
```
```diff
@@ -55,11 +59,13 @@ Change the Tool Calling Style to `JSON`, and save the profile.
 ## Caveats
 
 This feature is currently experimental and mostly intended for users interested in contributing to its development.
-It is so far only supported when using OpenRouter or Kilo Code providers. There are possible issues including, but not limited to:
 
-- Missing tools
+There are possible issues including, but not limited to:
+
+- ~~Missing tools~~: As of Oct 21, all tools are supported
 - Tool calls not updating the UI until they are complete
-- MCP servers not working
+- ~~MCP servers not working~~: As of Oct 21, MCPs are supported
 - Errors specific to certain inference providers
+- Not all inference providers use servers that are fully compatible with the OpenAI specification. As a result, behavior will vary, even with the same model across providers.
 
 While nearly any provider can be configured via the OpenAI Compatible profile, testers should be aware that this is enabled purely for ease of testing and should be prepared to experience unexpected responses from providers that are not prepared to handle native function calls.
```
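For context on what the `JSON` tool-calling style implies on the wire: with native function calling, the model returns structured `tool_calls` in the OpenAI-compatible response rather than embedding tool use as markup inside the message text. A minimal sketch of that shape follows; the field names come from the OpenAI chat completions specification, but the tool name and arguments are hypothetical illustrations, not Kilo Code's actual tool schema.

```typescript
// Sketch of a native (JSON) tool call as returned by an OpenAI-compatible
// chat completions endpoint. Field names follow the OpenAI spec; the tool
// name and arguments are hypothetical examples, not Kilo Code's real tools.
interface ToolCall {
	id: string
	type: "function"
	function: {
		name: string
		arguments: string // JSON-encoded argument object
	}
}

const assistantMessage: { role: "assistant"; content: null; tool_calls: ToolCall[] } = {
	role: "assistant",
	content: null,
	tool_calls: [
		{
			id: "call_1",
			type: "function",
			function: {
				name: "read_file",
				arguments: JSON.stringify({ path: "src/index.ts" }),
			},
		},
	],
}

// Providers that are only loosely OpenAI-compatible may encode or stream
// `arguments` differently, which is one source of provider-specific errors.
const parsedArgs = JSON.parse(assistantMessage.tool_calls[0].function.arguments)
console.log(parsedArgs.path)
```

Because `arguments` arrives as a JSON string, any provider that emits malformed or differently chunked JSON will surface exactly the kind of provider-specific behavior variation the caveats above describe.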
`src/core/prompts/sections/mcp-servers.ts` (8 additions, 8 deletions)
```diff
@@ -11,6 +11,11 @@ export async function getMcpServersSection(
 	if (!mcpHub) {
 		return ""
 	}
+	// kilocode_change start
+	if (toolUseStyle === "json") {
+		return ""
+	}
+	// kilocode_change end
 
 	const connectedServers =
 		mcpHub.getServers().length > 0
```
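The guard in this hunk can be read as: when the tool-calling style is `json`, the MCP servers prompt section is omitted entirely, presumably because tool definitions travel through the provider's native tools field rather than the system prompt. Below is a simplified, self-contained sketch of that control flow; the section body is reduced to a placeholder string and `McpHubLike` is a stand-in type, so this mirrors the diff's structure rather than reproducing the real function.

```typescript
// Simplified sketch of getMcpServersSection's early returns, mirroring the
// diff above. The section body is a placeholder; the real function builds a
// much larger prompt section from the hub's server metadata.
type ToolUseStyle = "xml" | "json"

interface McpHubLike {
	getServers(): { name: string }[]
}

function getMcpServersSectionSketch(mcpHub: McpHubLike | undefined, toolUseStyle: ToolUseStyle): string {
	if (!mcpHub) {
		return ""
	}
	// kilocode_change: skip the prompt section entirely under native (JSON) tool calling
	if (toolUseStyle === "json") {
		return ""
	}
	const servers = mcpHub.getServers()
	return servers.length > 0
		? `MCP SERVERS: ${servers.map((s) => s.name).join(", ")}`
		: "(No MCP servers currently connected)"
}

const hub: McpHubLike = { getServers: () => [{ name: "weather" }] }
console.log(getMcpServersSectionSketch(hub, "xml"))
console.log(getMcpServersSectionSketch(hub, "json"))
```

Returning an empty string (rather than branching later in the function) keeps the json path cheap and makes the downstream prompt-assembly code style-agnostic.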
```diff
@@ -68,19 +73,14 @@ ${connectedServers}`
 		return baseSection
 	}
 
-	let descSection =
+	return (
 		baseSection +
 		`
## Creating an MCP Server

-The user may ask you something along the lines of "add a tool" that does some function, in other words to create an MCP server that provides tools and resources that may connect to external APIs for example. If they do, you should obtain detailed instructions on this topic using the fetch_instructions tool, `
-	// kilocode_change: toolUseStyle
-	if (toolUseStyle !== "json") {
-		descSection += `like this:
+The user may ask you something along the lines of "add a tool" that does some function, in other words to create an MCP server that provides tools and resources that may connect to external APIs for example. If they do, you should obtain detailed instructions on this topic using the fetch_instructions tool, like this:
```
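The second hunk replaces the mutable `descSection` accumulator and its `toolUseStyle !== "json"` branch with a single return expression, which is safe because the json case now exits at the top of the function. A before/after sketch with shortened strings (the `...` placeholder stands in for the full prompt text and is not the real wording):

```typescript
// Before/after sketch of the second hunk (prompt strings shortened; "..."
// is a placeholder). The accumulator plus style check collapse into one
// return expression, since the json style already returned "" earlier.
function buildDescSectionBefore(baseSection: string, toolUseStyle: string): string {
	let descSection = baseSection + `\n## Creating an MCP Server\n\n...using the fetch_instructions tool, `
	if (toolUseStyle !== "json") {
		descSection += `like this:`
	}
	return descSection
}

function buildDescSectionAfter(baseSection: string): string {
	return baseSection + `\n## Creating an MCP Server\n\n...using the fetch_instructions tool, like this:`
}

// For the non-json style the refactor is output-preserving.
console.log(buildDescSectionBefore("BASE", "xml") === buildDescSectionAfter("BASE")) // true
```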