Commit 904e122

Final tweaks

1 parent a0d9944 commit 904e122

File tree

4 files changed: +32 −20 lines changed

content/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/_index.md

Lines changed: 1 addition & 0 deletions

@@ -30,6 +30,7 @@ armips:
 tools_software_languages:
 - Python
 - AWS Graviton
+- AI
 operatingsystems:
 - Linux

content/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/agent-output.md

Lines changed: 27 additions & 16 deletions

@@ -1,5 +1,5 @@
 ---
-title: Understand and Test the AI Agent
+title: Explore and Test Your AI Agent
 weight: 5
 
 ### FIXED, DO NOT MODIFY
@@ -8,11 +8,13 @@ layout: learningpathall
 
 ## AI Agent Function Calls
 
-An AI agent, powered by a LLM, selects the most appropriate function by analyzing the prompt or input it receives, identifying the relevant intent or task, and then matching the intent to the most appropriate function from a pre-defined set of available functions based on its understanding of the language and context.
+An AI agent, powered by an LLM, selects the most appropriate function by analyzing the input, identifying the relevant intent, and matching it to predefined functions based on its understanding of the language and context.
 
-Have a look at how this is implemented in the python script `agent.py`:
+You will now walk through how this is implemented in an excerpt from a Python script called `agent.py`.
 
-- This code section of `agent.py` shown below creates an instance of the quantized `llama3.1 8B` model for more efficient inference on Arm-based systems.
+#### Initialize the Quantized Model
+
+This code section of `agent.py`, shown below, creates an instance of the quantized `llama3.1 8B` model for more efficient inference on Arm-based systems:
 ```output
 llama_model = Llama(
     model_path="./models/dolphin-2.9.4-llama3.1-8b-Q4_0.gguf",
@@ -22,13 +24,17 @@ llama_model = Llama(
     n_threads_batch=64,
 )
 ```
+#### Define a Provider
 
-- Next, you define a provider that leverages the `llama.cpp` Python bindings:
+Now define a provider that leverages the `llama.cpp` Python bindings:
 ```output
 provider = LlamaCppPythonProvider(llama_model)
 ```
+#### Define Functions
+
+The LLM has access to certain tools or functions and can take a general user input and decide which functions to call. The function’s docstring guides the LLM on when and how to invoke it.
 
-- The LLM has access to certain tools or functions and can take a general user input and decide which functions to call. The function’s docstring guides the LLM on when and how to invoke it. In `agent.py` three such tools or functions are defined; `open_webpage`, `get_current_time`, and `calculator`:
+In `agent.py`, three such tools or functions are defined: `open_webpage`, `get_current_time`, and `calculator`:
 
 ```output
 def open_webpage():
@@ -86,16 +92,19 @@ def calculator(
     else:
         raise ValueError("Unknown operation.")
 ```
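The diff shows only fragments of these tool functions (the `def open_webpage():` header and the calculator's `ValueError` branch above). As a rough, hypothetical sketch, docstring-guided tools of this kind could look like the following; the signatures, the URL, and the bodies are assumptions for illustration, not the actual contents of `agent.py`:

```python
# Hypothetical stand-ins: the real bodies in agent.py are elided by the diff.
import datetime
import webbrowser


def open_webpage():
    """Opens a web page in the default browser."""
    # The target URL is an assumption for illustration.
    return webbrowser.open("https://www.arm.com")


def get_current_time():
    """Returns the current time as a string in H:MM AM/PM format."""
    return datetime.datetime.now().strftime("%I:%M %p").lstrip("0")


def calculator(number_one: float, operation: str, number_two: float) -> float:
    """Performs a basic arithmetic operation on two numbers.

    Args:
        number_one: The first operand.
        operation: One of "add", "subtract", "multiply", "divide".
        number_two: The second operand.
    """
    if operation == "add":
        return number_one + number_two
    if operation == "subtract":
        return number_one - number_two
    if operation == "multiply":
        return number_one * number_two
    if operation == "divide":
        return number_one / number_two
    # This branch matches the excerpt shown in the diff above.
    raise ValueError("Unknown operation.")
```

The docstrings matter: they are what guides the model on when and how to invoke each function, so they should state each function's purpose clearly.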
+#### Create Output Settings to Enable Function Calls
 
-- `from_functions` creates an instance of `LlmStructuredOutputSettings` by passing in a list of callable Python functions. The LLM can then decide if and when to use these functions based on user queries:
+`from_functions` creates an instance of `LlmStructuredOutputSettings` by passing in a list of callable Python functions. The LLM can then decide if and when to use these functions based on user queries:
 
 ```output
 output_settings = LlmStructuredOutputSettings.from_functions(
     [get_current_time, open_webpage, calculator], allow_parallel_function_calling=True
 )
 
 ```
-- The user's prompt is then collected and processed through `LlamaCppAgent`. The agent decides whether to call any defined functions based on the request:
+#### Collect and Process User Input
+
+The user's prompt is then collected and processed through `LlamaCppAgent`. The agent decides whether to call any defined functions based on the request:
 ```
 user = input("Please write your prompt here: ")
 
@@ -111,15 +120,15 @@ result = llama_cpp_agent.get_chat_response(
 )
 ```
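As a conceptual aside, the effect of `from_functions` can be approximated with a stdlib-only sketch: gather each callable's name, signature, and docstring so they can be advertised to the model. This is not the library's implementation; the `describe_tools` helper and its output shape are invented for illustration:

```python
# Conceptual sketch only: not how llama-cpp-agent implements
# LlmStructuredOutputSettings.from_functions internally.
import inspect


def describe_tools(functions):
    """Build a registry of tool metadata from a list of callables."""
    registry = {}
    for fn in functions:
        registry[fn.__name__] = {
            "signature": str(inspect.signature(fn)),
            "doc": inspect.getdoc(fn) or "",
            "callable": fn,
        }
    return registry


def get_current_time():
    """Returns the current time."""
    return "9:41 AM"  # fixed placeholder for the sketch


tools = describe_tools([get_current_time])
print(tools["get_current_time"]["doc"])  # prints: Returns the current time.
```

A registry like this is what lets the agent turn a free-form request into a call to a specific, named function.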
 
-## Test the AI Agent
+## Test and Run the AI Agent
 
-You are now ready to test and execute the AI agent python script. Start the application:
+You're now ready to test and run the AI agent Python script. Start the application:
 
 ```bash
 python3 agent.py
 ```
 
-You will see lots of interesting statistics being printed from `llama.cpp` about the model and the system, followed by the prompt as shown:
+You will see lots of interesting statistics being printed from `llama.cpp` about the model and the system, followed by the prompt for input, as shown:
 
 ```output
 llama_kv_cache_init: CPU KV buffer size = 1252.00 MiB
@@ -142,9 +151,11 @@ Please write your prompt here:
 
 ## Test the AI agent
 
-When you are presented with "Please write your prompt here:" test it with an input prompt. Enter "What is the current time?"
+When you are presented with `Please write your prompt here:`, test it with an input prompt.
+
+Enter `What is the current time?`
 
-- As part of the prompt, a list of executable functions is sent to the LLM, allowing the agent to select the appropriate function:
+As part of the prompt, a list of executable functions is sent to the LLM, allowing the agent to select the appropriate function:
 
 ```output
 Read and follow the instructions below:
@@ -181,7 +192,7 @@ To call a function, respond with a JSON object (to call one function) or a list
 - "arguments": Put the arguments to pass to the function here.
 ```
 
-The AI Agent then decides to invoke the appropriate function and return the result as shown:
+The AI agent then decides to invoke the appropriate function and returns the result as shown:
 
 ```output
 [
@@ -197,9 +208,9 @@ Response from AI Agent:
 ----------------------------------------------------------------
 ```
 
-You have now tested when you enter, "What is the current time?", the AI Agent will choose to call the `get_current_time()` function, and return a result in **H:MM AM/PM** format.
+You have now tested the `What is the current time?` prompt. The AI agent evaluates the query, calls the `get_current_time()` function, and returns a result in **H:MM AM/PM** format.
 
-You have successfully run an AI agent. You can ask different questions to trigger and execute other functions. You can extend your AI agent by defining custom functions so it can handle specific tasks.
+You have successfully run and tested your AI agent. Experiment with different prompts or define custom functions to expand your AI agent's capabilities.
 
 
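To make the dispatch step concrete, a JSON function-call response like the one shown above can be parsed and executed with a small helper. This is an illustrative, stdlib-only sketch, not `llama-cpp-agent`'s own code; the `"function"` key name and the `dispatch_calls` helper are assumptions (only the `"arguments"` key is confirmed by the prompt excerpt):

```python
# Illustrative sketch: the real response schema used by llama-cpp-agent
# may differ; "function" is an assumed key name here.
import json


def dispatch_calls(response_text, tools):
    """Parse a JSON function-call response and invoke the named tools."""
    calls = json.loads(response_text)
    if isinstance(calls, dict):
        calls = [calls]  # a single call arrives as one JSON object
    results = []
    for call in calls:
        fn = tools[call["function"]]
        results.append(fn(**call.get("arguments", {})))
    return results


def get_current_time():
    return "9:41 AM"  # fixed stand-in for the real tool


response = '[{"function": "get_current_time", "arguments": {}}]'
print(dispatch_calls(response, {"get_current_time": get_current_time}))
# prints: ['9:41 AM']
```

Returning a list keeps the helper compatible with parallel function calling, where the model emits several calls in one response.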

content/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/ai-agent.md

Lines changed: 3 additions & 3 deletions

@@ -13,20 +13,20 @@ An AI agent is an integrated system that extends beyond basic text generation by
 Here’s a closer look at the underlying elements:
 
 - **System**: Each AI agent functions as an interconnected ecosystem of components. Below is a list of key factors and components that affect system performance:
-  - **Environment**: The domain in which the AI Agent operates. For instance, in a system that books travel itineraries, this might include airline reservation systems and hotel booking tools.
+  - **Environment**: The domain in which the AI agent operates. For instance, in a system that books travel itineraries, this might include airline reservation systems and hotel booking tools.
   - **Sensors**: The methods the AI agent uses to observe its environment. For a travel agent, these might be APIs that inform the agent about seat availability on flights or room occupancy in hotels.
   - **Actuators**: Ways the AI agent exerts influence within the environment. In the example of a travel agent, placing a booking or modifying an existing reservation illustrates how actuators function to enact changes within the environment.
 
 - **Large Language Models**: While agents have long existed, LLMs enhance these systems with powerful language comprehension and data-processing capabilities.
   - **Action Execution**: Rather than just produce text, LLMs within an agent context interpret user instructions and interact with tools to achieve specific objectives.
   - **Tools**: The agent’s available toolkit depends on the software environment and developer-defined boundaries. In the travel agent example in this Learning Path, these tools might be limited to flight and hotel reservation APIs.
-  - **Knowledge**: Beyond immediate data sources, the agent can fetch additional details - perhaps from databases or web services - for enhanced decision making.
+  - **Knowledge**: Beyond immediate data sources, the agent can fetch additional details - perhaps from databases or web services - for enhanced decision-making.
 
 
 
 ## Types of AI Agents
 
-AI Agents come in multiple forms. The table below provides an overview of some agent types and examples of their roles in a travel booking system:
+AI agents come in multiple forms. The table below provides an overview of some agent types and examples of their roles in a travel booking system:
 
 | **Agent Category** | **Key Characteristics** | **Example Usage in a Travel Booking System** |
 |--------------------|-------------------------|----------------------------------------------|

content/learning-paths/servers-and-cloud-computing/ai-agent-on-cpu/set-up.md

Lines changed: 1 addition & 1 deletion

@@ -14,7 +14,7 @@ The instructions in this Learning Path have been designed for Arm servers running
 
 ## Overview
 
-In this Learning Path, you will learn how to build an AI agent application using `llama-cpp-python`, and `llama-cpp-agent`. `llama-cpp-python` is a Python binding for `llama.cpp` that enables efficient LLM inference on Arm CPUs and `llama-cpp-agent` provides an interface for processing text using agentic chains with tools.
+In this Learning Path, you will learn how to build an AI agent application using `llama-cpp-python` and `llama-cpp-agent`. `llama-cpp-python` is a Python binding for `llama.cpp` that enables efficient LLM inference on Arm CPUs, and `llama-cpp-agent` provides an interface for processing text using agentic chains with tools.
 
 ## Install Dependencies
0 commit comments