docs/tool-calling.qmd
### Motivating example
Let's take a look at an example where we really need an external tool. Chat models generally do not have access to "real-time" information, such as current events or the weather. Let's see what happens when we ask the chat model about the weather in a specific location:
```{python}
chat = ChatOpenAI(model="gpt-4o-mini")
_ = chat.chat("What's the weather like today in Duluth, MN?")
```
Fortunately, the model is smart enough to know that it doesn't have access to real-time information, and it doesn't try to make up an answer. However, we can help it out by providing a tool that can fetch the weather for a given location.
### Defining a tool function
As it turns out, LLMs are pretty good at figuring out 'structure' like latitude and longitude from 'unstructured' things like a location name. So we can write a tool function that takes a latitude and longitude and returns the current temperature at that location. Here's an example of how you might write such a function using the [Open-Meteo API](https://open-meteo.com/):
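A minimal version of such a function, using only Python's standard library (the exact Open-Meteo response fields shown here are an assumption about that API, not something chatlas requires), might look like this:

```{python}
import json
import urllib.request


def get_current_temperature(latitude: float, longitude: float) -> float:
    """
    Get the current temperature at a given location.

    Parameters
    ----------
    latitude
        The latitude of the location.
    longitude
        The longitude of the location.
    """
    url = (
        "https://api.open-meteo.com/v1/forecast"
        f"?latitude={latitude}&longitude={longitude}&current_weather=true"
    )
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    # Open-Meteo returns a "current_weather" object with a "temperature"
    # field (in degrees Celsius).
    return data["current_weather"]["temperature"]
```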
Note that we've gone through the trouble of adding the following to our function:
- Type hints for function arguments
- A docstring that explains what the function does and what arguments it expects (as well as descriptions for the arguments themselves)
**Providing these hints and documentation is very important**, as it helps the chat model understand how to use your tool correctly!
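Why does this matter? Behind the scenes, the function's signature and docstring are turned into a machine-readable tool description for the model. The exact format varies by provider, but it is roughly of this shape (an illustrative approximation, not chatlas's literal output):

```{python}
# An illustrative approximation of the tool description the model receives.
# The names come straight from the function signature; the descriptions
# come from the docstring.
tool_schema = {
    "name": "get_current_temperature",
    "description": "Get the current temperature at a given location.",
    "parameters": {
        "type": "object",
        "properties": {
            "latitude": {
                "type": "number",
                "description": "The latitude of the location.",
            },
            "longitude": {
                "type": "number",
                "description": "The longitude of the location.",
            },
        },
        "required": ["latitude", "longitude"],
    },
}
```

Without type hints and a docstring, the model would see only bare parameter names and would have to guess what to pass.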
Let's test it:
```{python}
get_current_temperature(46.7867, -92.1005)
```
### Using the tool
In order for the LLM to make use of our tool, we need to register it with the `chat` object. This is done by calling the `register_tool` method on the chat object.
```{python}
chat.register_tool(get_current_temperature)
_ = chat.chat("What's the weather like today in Duluth, MN?")
```
That's correct! Without any further guidance, the chat model decided to call our tool function and successfully used its result in formulating its response.
This tool example was extremely simple, but you can imagine doing much more interesting things from tool functions: calling APIs, reading from or writing to a database, kicking off a complex simulation, or even calling a complementary GenAI model (like an image generator). Or, if you are using chatlas in a Shiny app, you could use tools to set reactive values, setting off a chain of reactive updates. This is precisely what the [sidebot dashboard](https://github.com/jcheng5/py-sidebot) does to allow for an AI-assisted "drill-down" into the data.
### Troubleshooting
When the execution of a tool function fails, chatlas sends the exception message back to the chat model. This allows the model to handle errors gracefully, but it can also cause confusion about why a response did not come back as expected. If you encounter such a situation, you can set `echo="all"` in the `chat.chat()` method to see the full conversation, including tool calls and their results.
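For example, with a (hypothetical) tool that always fails, the exception message, rather than a full traceback, is what gets relayed to the model:

```{python}
def get_current_humidity(latitude: float, longitude: float) -> float:
    """Get the current humidity at a given location (hypothetical example)."""
    raise RuntimeError("humidity service unavailable")


# If the model calls this tool, the message "humidity service unavailable"
# is sent back to it as the tool result. Passing echo="all" to chat.chat()
# shows that exchange in full.
```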