Creating a Hello World prompt tool
Creating a new "tool" in Octopus Copilot simply means adding functionality to handle another prompt case.
In this exercise, we'll create a tool that responds to prompts like:
Hello world, from Andrew
The first step is to navigate to the project's `tools` folder. This folder contains the tools that are executed when OpenAI matches a user's prompt to one of them.
The tools are further categorized by type. For example, we have tools that are classified under:

- CLI: These tools live in the `cli` folder and are executed from the command line. See this section for an example.
- Generic: These tools live in the `generic` folder and aren't specific to any particular context (with regard to how they are called).
- GitHub Actions: These tools live in the `githubactions` folder. This is where most of the tools live today, as users originally interacted with the tools via the GitHub Copilot chat window in VS Code, JetBrains Rider, etc.
Note

You might have noticed that there is a `wrapper` folder too. To separate the implementation of a prompt handler from its definition, we add slim function definitions in this folder. In nearly all cases, they pass through to a callback containing the actual implementation, and add docstrings that help OpenAI make the right selection when choosing a tool, e.g., by providing sample prompts it should match.
First, we create the wrapper for the Hello World tool. Create a new Python file called `hello_world.py` in the `tools/wrapper` folder and include the following code:
```python
def hello_world_wrapper(query, callback, logging):
    def hello_world(
        persons_name,
        **kwargs,
    ):
        """Answers a prompt like "Hello World!". Use this function when the query is not a question, but someone
        saying Hello World to you, optionally including their own name. Queries can look like those in the following list:
        * Hello World!
        * Hello World, from Mary!

        Args:
        persons_name: The (optional) persons name
        """
        if logging:
            logging("Enter:", "hello_world")

        for key, value in kwargs.items():
            if logging:
                logging(f"Unexpected Key: {key}", f"Value: {value}")

        # This is just a passthrough to the original callback
        return callback(query, persons_name)

    return hello_world
```
The code provides a docstring at the top and a single parameter called `persons_name`. The docstring helps the LLM (OpenAI) by giving examples of the types of prompts that suit this function. There is no consistently reliable way to build these examples; it's mostly trial and error, plus executing tests to ensure the LLM selects the right function at runtime.
You can add multiple parameters as necessary. Just ensure you pass them through in the callback at the end.
The function also adds basic logging to show the call is being executed and logs any unexpected arguments passed to it. This can be beneficial for debugging since AI has been known to hallucinate.
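To see the passthrough and the unexpected-argument logging in action, you can exercise the wrapper directly with a stub callback. The snippet below inlines a condensed copy of the wrapper so it runs standalone; `print` stands in for the project's logger, and `echo_callback` is invented for this sketch:

```python
def hello_world_wrapper(query, callback, logging):
    # Condensed copy of the wrapper above, inlined so the example runs standalone
    def hello_world(persons_name, **kwargs):
        if logging:
            logging("Enter:", "hello_world")
        for key, value in kwargs.items():
            if logging:
                logging(f"Unexpected Key: {key}", f"Value: {value}")
        # Passthrough to the original callback
        return callback(query, persons_name)

    return hello_world


# A stub callback that just echoes what it received
def echo_callback(query, persons_name):
    return f"query={query!r}, name={persons_name!r}"


tool = hello_world_wrapper("Hello World, from Mary!", echo_callback, print)
print(tool("Mary"))
# The LLM may pass arguments the wrapper didn't declare; they land in **kwargs and are logged
print(tool("Mary", unexpected="value"))
```

Because `**kwargs` swallows anything extra, a hallucinated argument is logged rather than crashing the tool.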
To add the tool to the web interface (the Chrome extension) and the GitHub Copilot extension, we must create the tool implementation in the `tools/githubactions` folder.

Open the `tools/githubactions` folder and create a file called `hello_world_implementation.py`. Add the following code:
```python
from domain.response.copilot_response import CopilotResponse
from domain.tools.debug import get_params_message


def hello_world(github_user, logging):
    # The wrapper calls callback(query, persons_name), so the implementation
    # accepts the original query as its first argument
    def hello_world_implementation(original_query, persons_name):
        """Returns a response to a hello world request."""
        debug_text = get_params_message(
            github_user, True, hello_world.__name__, persons_name=persons_name
        )

        # Do any additional prompt processing here. For example, calling out to Octopus, or other APIs.
        # If you need to add a callback for post-confirmation processing, you can also save any arguments needed.

        logging(
            "hello_world",
            f"""
Persons Name: {persons_name}""",
        )

        response = ["Hello world back to you."]
        if persons_name:
            response.append(f"Nice to meet you, {persons_name}!")

        response.extend(debug_text)

        return CopilotResponse("\n\n".join(response))

    return hello_world_implementation
```
Note

The tool implementation can also include confirmation prompt handlers where necessary. Looking at other examples such as cancel_task.py, you will see a `cancel_task_confirm_callback_wrapper` function at the top of the file. This is because best practice recommends that any mutating action performed by the Copilot extension (originally via the Copilot chat window in VS Code, etc.) should first prompt the user to confirm the action that's about to take place; only once they confirm is the action performed.
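The confirm-then-execute flow itself is simple to sketch in isolation. The snippet below is a hypothetical illustration only: the function names (`ask_to_confirm`, `do_cancel`) and the in-memory `pending` store are invented for this example and are not the project's API; see cancel_task.py for the real implementation:

```python
# Hypothetical sketch of the confirm-then-execute flow; names are illustrative only.
pending = {}  # saved arguments keyed by a confirmation id


def ask_to_confirm(task_id):
    # Step 1: a mutating request does not act immediately; it saves its
    # arguments and asks the user to confirm.
    pending["confirm-1"] = {"task_id": task_id}
    return f"About to cancel task {task_id}. Do you want to continue?"


def do_cancel(confirmation_id):
    # Step 2: once the user confirms, the saved arguments are replayed
    # and the mutating action actually runs.
    args = pending.pop(confirmation_id)
    return f"Task {args['task_id']} cancelled."


print(ask_to_confirm("Tasks-123"))
print(do_cancel("confirm-1"))
```

The key design point is that the arguments to the mutating action are captured at step 1, so step 2 can run later with no further input from the LLM.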
Once you have your tool, you next need to add it to the available tools list and wire up any callbacks. For the web interface (the Chrome extension) and the GitHub Copilot extension, this is done in the copilot_request_context.py file.

Navigate to the `build_form_tools` function, where the collection of available tools is configured. Add the following code at the end of the function, after the last `FunctionDefinition`:
```python
FunctionDefinition(
    hello_world_wrapper(
        query,
        callback=hello_world(get_github_user_from_form(req), log_query),
        logging=log_query,
    )
),
```
Important

Ensure the `fallback` and `invalid` parameters come after your inserted function.
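Putting the pieces together, the wrapper and implementation can be smoke-tested outside the project. The snippet below substitutes simple stand-ins for `CopilotResponse`, `get_params_message`, and the logger (the stand-ins are invented for this sketch and are not the project's real classes), so it runs standalone:

```python
# Stand-in for domain.response.copilot_response.CopilotResponse
class CopilotResponse:
    def __init__(self, text):
        self.text = text


def get_params_message(user, enabled, name, **params):
    # Simplified stand-in for domain.tools.debug.get_params_message
    return [f"Debug: {name} called with {params}"]


def hello_world(github_user, logging):
    def hello_world_implementation(original_query, persons_name):
        debug_text = get_params_message(
            github_user, True, hello_world.__name__, persons_name=persons_name
        )
        logging("hello_world", f"Persons Name: {persons_name}")
        response = ["Hello world back to you."]
        if persons_name:
            response.append(f"Nice to meet you, {persons_name}!")
        response.extend(debug_text)
        return CopilotResponse("\n\n".join(response))

    return hello_world_implementation


def hello_world_wrapper(query, callback, logging):
    # Condensed wrapper: forwards the query and the extracted name to the callback
    def hello_world_tool(persons_name, **kwargs):
        return callback(query, persons_name)

    return hello_world_tool


# Wire the wrapper to the implementation, as build_form_tools does
tool = hello_world_wrapper(
    "Hello world, from Andrew",
    callback=hello_world("some-github-user", lambda *args: None),
    logging=lambda *args: None,
)
result = tool("Andrew")
print(result.text)
```

Running this prints the greeting, the personalized line, and the debug text joined by blank lines, which is the same shape of response the Copilot extension renders in the chat window.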