
Commit e18dab2

Merge branch 'agiresearch:main' into main
2 parents 196f8e2 + 098a295

File tree

3 files changed: +95 −97 lines


.github/workflows/test_ollama.yml

Lines changed: 24 additions & 25 deletions
```diff
@@ -101,35 +101,34 @@ jobs:
             sleep 1
           done
 
-      - name: Run the run-agent code
+      - name: Run all Ollama tests
         run: |
-          # Debug: Show source of code
-          echo "=== Code Source Info ==="
-          echo "Source Repo: ${{ github.event.pull_request.head.repo.full_name || github.repository }}"
-          echo "Source Branch: ${{ github.head_ref || github.ref_name }}"
-          echo "Commit: $(git rev-parse --short HEAD)"
-          echo "PR: ${{ github.event.pull_request.html_url || 'Not a PR' }}"
+          cd aios_root/Cerebrum
 
-          # Run agent and capture exit code
-          run-agent \
-            --mode remote \
-            --agent_author example \
-            --agent_name test_agent \
-            --agent_version 0.0.3 \
-            --agenthub_url https://app.aios.foundation \
-            --task "Tell me what is the capital of United States"
+          # Create test results directory
+          mkdir -p test_results
 
-          # Check for specific error patterns in the log
-          # if grep -q "Failed to initialize client: 500 Server Error" agent.log; then
-          #   echo "Error: LLM initialization failed. Please check your API key configuration."
-          #   exit 1
-          # fi
+          # Find and run all test files containing "ollama" in their name
+          echo "=== Running all Ollama tests ==="
+          find tests -name "*ollama*.py" -type f | while read -r test_file; do
+            echo "Running test: $test_file"
+            python -m $test_file | tee -a ollama_tests.log
+            echo "----------------------------------------"
+          done
+
+          # Check if any tests were found and run
+          if [ ! -s ollama_tests.log ]; then
+            echo "No Ollama tests were found or executed!"
+            exit 1
+          fi
 
-          # # Check if the agent actually completed successfully
-          # if ! grep -q "Final Result:" agent.log; then
-          #   echo "Error: Agent did not complete successfully"
-          #   exit 1
-          # fi
+          # Check for test failures
+          if grep -i "FAILED" ollama_tests.log; then
+            echo "Some tests failed. See logs for details."
+            exit 1
+          else
+            echo "All Ollama tests passed successfully!"
+          fi
 
       - name: Upload a Build Artifact
         if: always() # Upload logs even if job fails
```
README.md

Lines changed: 68 additions & 70 deletions
````diff
@@ -127,7 +127,72 @@ Please see our ongoing [documentation](https://docs.aios.foundation/) for more i
 ##### Python
 - Supported versions: **Python 3.10 - 3.11**
 
-#### Set Up API Keys
+#### Installation from source
+
+##### Step 1: Install AIOS Kernel
+Git clone AIOS kernel
+```bash
+git clone https://github.com/agiresearch/AIOS.git
+```
+Create venv environment
+```bash
+python3.x -m venv venv # Only support for Python 3.10 and 3.11
+source venv/bin/activate
+```
+or create conda environment
+```bash
+conda create -n venv python=3.x # Only support for Python 3.10 and 3.11
+conda activate venv
+```
+
+> [!TIP]
+> We strongly recommend using [uv](https://github.com/astral-sh/uv) for faster and more reliable package installation.
+> To install uv: `pip install uv`
+
+**For GPU environments:**
+```bash
+uv pip install -r requirements-cuda.txt
+```
+
+**For CPU-only environments:**
+```bash
+uv pip install -r requirements.txt
+```
+
+Alternatively, if you prefer using pip:
+
+**For GPU environments:**
+```bash
+pip install -r requirements-cuda.txt
+```
+
+**For CPU-only environments:**
+```bash
+pip install -r requirements.txt
+```
+
+##### Step 2: Install AIOS SDK (Cerebrum)
+1. Clone the Cerebrum repository:
+```bash
+git clone https://github.com/agiresearch/Cerebrum.git
+```
+
+2. Install using uv (recommended):
+```bash
+cd Cerebrum && uv pip install -e .
+```
+
+Or using pip:
+```bash
+cd Cerebrum && pip install -e .
+```
+
+**Note**: The machine where the AIOS kernel (AIOS) is installed must also have the AIOS SDK (Cerebrum) installed. Installing AIOS kernel will install the AIOS SDK automatically by default. If you are using the Local Kernel mode, i.e., you are running AIOS and agents on the same machine, then simply install both AIOS and Cerebrum on that machine. If you are using Remote Kernel mode, i.e., running AIOS on Machine 1 and running agents on Machine 2 and the agents remotely interact with the kernel, then you need to install both AIOS kernel and AIOS SDK on Machine 1, and install the AIOS SDK alone on Machine 2. Please follow the guidelines at [Cerebrum](https://github.com/agiresearch/Cerebrum) regarding how to install the SDK.
+
+### Quickstart
+Before launching AIOS, it is required to set up configurations. AIOS provides two ways of setting up configurations, one is to set up by directly modifying the configuration file, another is to set up interactively.
+
+#### Set up configuration file directly (Recommended)
 You need API keys for services like OpenAI, Anthropic, Groq and HuggingFace. The simplest way to configure them is to edit the `aios/config/config.yaml`.
 
 > [!TIP]
@@ -204,10 +269,9 @@ You can configure HuggingFace models with specific GPU memory allocation:
   eval_device: "cuda:0" # Device for model evaluation
 ```
 
-##### Detailed Setup Instructions
-For detailed instructions on setting up API keys and configuration files, see [Environment Variables Configuration](https://app.gitbook.com/o/6h6b4xbBVMu2pFXdNM0D/s/5h7XvlMFgKMtRboLGG1i/~/diff/~/changes/73/getting-started/environment-variables-configuration).
+#### Set up interactively
 
-Alternatively, you can set them as environment variables directly:
+Alternatively, you can set up aios configurations interactively by using the following command.
 
 - `aios env list`: Show current environment variables, or show available API keys if no variables are set
 - `aios env set`: Show current environment variables, or show available API keys if no variables are set
@@ -224,72 +288,6 @@ When no environment variables are set, the following API keys will be shown:
 - `HF_AUTH_TOKEN`: HuggingFace authentication token for accessing models
 - `HF_HOME`: Optional path to store HuggingFace models
 
-
-
-#### Installation from source
-
-##### Step 1: Install AIOS Kernel
-Git clone AIOS kernel
-```bash
-git clone https://github.com/agiresearch/AIOS.git
-```
-Create venv environment
-```bash
-python3.x -m venv venv # Only support for Python 3.10 and 3.11
-source venv/bin/activate
-```
-or create conda environment
-```bash
-conda create -n venv python=3.x # Only support for Python 3.10 and 3.11
-conda activate venv
-```
-
-> [!TIP]
-> We strongly recommend using [uv](https://github.com/astral-sh/uv) for faster and more reliable package installation.
-> To install uv: `pip install uv`
-
-**For GPU environments:**
-```bash
-uv pip install -r requirements-cuda.txt
-```
-
-**For CPU-only environments:**
-```bash
-uv pip install -r requirements.txt
-```
-
-Alternatively, if you prefer using pip:
-
-**For GPU environments:**
-```bash
-pip install -r requirements-cuda.txt
-```
-
-**For CPU-only environments:**
-```bash
-pip install -r requirements.txt
-```
-
-##### Step 2: Install AIOS SDK (Cerebrum)
-1. Clone the Cerebrum repository:
-```bash
-git clone https://github.com/agiresearch/Cerebrum.git
-```
-
-2. Install using uv (recommended):
-```bash
-cd Cerebrum && uv pip install -e .
-```
-
-Or using pip:
-```bash
-cd Cerebrum && pip install -e .
-```
-
-**Note**: The machine where the AIOS kernel (AIOS) is installed must also have the AIOS SDK (Cerebrum) installed. Installing AIOS kernel will install the AIOS SDK automatically by default. If you are using the Local Kernel mode, i.e., you are running AIOS and agents on the same machine, then simply install both AIOS and Cerebrum on that machine. If you are using Remote Kernel mode, i.e., running AIOS on Machine 1 and running agents on Machine 2 and the agents remotely interact with the kernel, then you need to install both AIOS kernel and AIOS SDK on Machine 1, and install the AIOS SDK alone on Machine 2. Please follow the guidelines at [Cerebrum](https://github.com/agiresearch/Cerebrum) regarding how to install the SDK.
-
-### Quickstart
-
 #### Launch AIOS
 After you setup your keys or environment parameters, then you can follow the instructions below to start.
````

aios/syscall/syscall.py

Lines changed: 3 additions & 2 deletions
````diff
@@ -649,11 +649,12 @@ def execute_request(self, agent_name: str, query: Any) -> Dict[str, Any]:
         ```
         """
         if isinstance(query, LLMQuery):
-            if query.action_type == "chat" or query.action_type == "chat_with_json_response" or query.action_type == "chat_with_tool_use_response":
+            # breakpoint()
+            if query.action_type == "chat" or query.action_type == "chat_with_json_output" or query.action_type == "chat_with_tool_call_output":
                 llm_response = self.execute_llm_syscall(agent_name, query)
                 return llm_response
 
-            elif query.action_type == "tool_use":
+            elif query.action_type == "call_tool":
                 llm_response = self.execute_llm_syscall(agent_name, query)["response"]
                 # breakpoint()
                 tool_query = ToolQuery(
````
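This change renames the dispatched action types from `chat_with_json_response`, `chat_with_tool_use_response`, and `tool_use` to `chat_with_json_output`, `chat_with_tool_call_output`, and `call_tool`. A self-contained sketch of the updated branch logic (the `LLMQuery` stand-in and the return values here are illustrative, not the real aios classes):

```python
from dataclasses import dataclass

# Hypothetical stand-in for aios's LLMQuery; only the action_type
# field exercised by the diff is modeled here.
@dataclass
class LLMQuery:
    action_type: str

# The three chat-style action types after the rename.
CHAT_ACTIONS = {"chat", "chat_with_json_output", "chat_with_tool_call_output"}

def route(query: LLMQuery) -> str:
    """Mirror the updated dispatch: chat-style actions return the LLM
    response directly, while "call_tool" goes on to build a ToolQuery."""
    if query.action_type in CHAT_ACTIONS:
        return "execute_llm_syscall"   # returned to the caller as llm_response
    elif query.action_type == "call_tool":
        return "build_tool_query"      # feeds ToolQuery(...) in the real code
    # The real method has no else branch, so unmatched action types silently
    # fall through; raising here just makes that failure mode visible.
    raise ValueError(f"unknown action_type: {query.action_type!r}")

print(route(LLMQuery("chat")))       # → execute_llm_syscall
print(route(LLMQuery("call_tool")))  # → build_tool_query
```

As the sketch's fall-through case suggests, any caller still sending the pre-rename strings (e.g. `tool_use`) will no longer match either branch, so a rename like this has to land together with the matching SDK-side change.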
