Commit 93db6f8

add web search agent (#3)

1 parent 22d7edc

File tree: 7 files changed, +6399 additions, 0 deletions

7 files changed

+6399
-0
lines changed
Lines changed: 3 additions & 0 deletions
TAVILY_API_KEY=
OPENAI_API_KEY=
AGENTOPS_API_KEY=
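These keys are read at startup via python-dotenv (see main.py below). For illustration only, here is a stdlib-only sketch of how `KEY=VALUE` lines in a file like this map to a dict; the parser and the example values are invented for the sketch and are not part of the project:

```python
# Minimal stand-in for what a dotenv-style loader does with the file above.
# Parses KEY=VALUE lines, skipping blanks and comments. Values are invented.
import io


def load_env(stream) -> dict:
    values = {}
    for line in stream:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values


example = io.StringIO("TAVILY_API_KEY=tvly-xxxx\nOPENAI_API_KEY=sk-xxxx\nAGENTOPS_API_KEY=ao-xxxx\n")
keys = load_env(example)
print(sorted(keys))  # ['AGENTOPS_API_KEY', 'OPENAI_API_KEY', 'TAVILY_API_KEY']
```

In the real project, `load_dotenv()` additionally exports these values into `os.environ`, which is where `os.getenv` picks them up.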

agents/web_search_agent/.gitignore

Lines changed: 10 additions & 0 deletions
.env
.venv
.idea
.vscode
.DS_Store
**/__pycache__
swarmzero-data
*.db
*.log
*.log.*

agents/web_search_agent/README.md

Lines changed: 71 additions & 0 deletions
# Web Search Agent

A powerful web search agent built with the SwarmZero framework that enables intelligent web searching.

## Description

This agent uses the Tavily API to perform web searches and is built on top of the SwarmZero framework, providing AI-powered processing of search results.

## Prerequisites

- Python 3.11 or higher
- Poetry package manager
- Tavily API key
- AgentOps API key
- OpenAI API key

## Installation

1. Clone the repository:
```bash
git clone https://github.com/swarmzero/examples.git
cd examples/agents/web_search_agent
```

2. Install dependencies using Poetry:
```bash
poetry install --no-root
```

3. Set up environment variables:
Create a `.env` file in the root directory and add your API keys:
```
TAVILY_API_KEY=your_tavily_api_key_here
AGENTOPS_API_KEY=your_agentops_api_key_here
OPENAI_API_KEY=your_openai_api_key_here
```

## Usage

1. Activate the Poetry shell:
```bash
poetry shell
```

2. Run the agent:
```bash
poetry run python main.py
```

3. Send a message to the agent:
```bash
curl -X 'POST' \
  'http://localhost:8000/api/v1/chat' \
  -H 'accept: application/json' \
  -H 'Content-Type: multipart/form-data' \
  -F 'user_id=test_user' \
  -F 'session_id=test_web_search_agent' \
  -F 'chat_data={"messages":[{"role":"user","content":"what is swarmzero.ai about?"}]}'
```
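The same request can be sent from Python using only the standard library. This is a hedged sketch: the endpoint, field names, and payload mirror the curl call above, but the hand-rolled multipart encoder is an illustration, not part of the project, and it assumes the agent is already running locally:

```python
# Build the same multipart/form-data request as the curl example above.
# The encode_multipart helper is a minimal illustration for text-only fields.
import json
import urllib.request
import uuid


def encode_multipart(fields: dict) -> tuple[bytes, str]:
    # Join each field as a form-data part separated by a random boundary.
    boundary = uuid.uuid4().hex
    lines = []
    for name, value in fields.items():
        lines.append(f"--{boundary}")
        lines.append(f'Content-Disposition: form-data; name="{name}"')
        lines.append("")
        lines.append(value)
    lines.append(f"--{boundary}--")
    return "\r\n".join(lines).encode(), f"multipart/form-data; boundary={boundary}"


fields = {
    "user_id": "test_user",
    "session_id": "test_web_search_agent",
    "chat_data": json.dumps({"messages": [{"role": "user", "content": "what is swarmzero.ai about?"}]}),
}

body, content_type = encode_multipart(fields)
req = urllib.request.Request(
    "http://localhost:8000/api/v1/chat",
    data=body,
    headers={"accept": "application/json", "Content-Type": content_type},
)
# Uncomment once the agent is running:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```

In practice a library such as `requests` handles the multipart encoding for you; the manual version here just makes the wire format of the curl call explicit.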
59+
60+
4. AgentOps will automatically capture the session:
61+
- View the [agentops.log](agentops.log) file
62+
- See the[AgentOps Dashboard](https://app.agentops.ai/drilldown)
63+
64+
## Dependencies
65+
66+
- `swarmzero`: Main framework for agent development
67+
- `agentops`: Agent operations and monitoring
68+
- `tavily-python`: Web search API client
69+
70+
## Learn more
71+
Visit [SwarmZero](https://swarmzero.ai) to learn more about the SwarmZero framework.

agents/web_search_agent/main.py

Lines changed: 45 additions & 0 deletions
import os

import agentops
from dotenv import load_dotenv
from swarmzero import Agent
from tavily import TavilyClient

load_dotenv()
agentops.init(os.getenv("AGENTOPS_API_KEY"))
tavily_client = TavilyClient(api_key=os.getenv("TAVILY_API_KEY"))


async def web_search(query: str) -> list[dict]:
    """Search the web via Tavily and return the top three results."""
    response = tavily_client.search(query)
    results = []
    for result in response["results"][:3]:
        results.append({"title": result["title"], "url": result["url"], "content": result["content"]})
    return results


async def extract_from_urls(urls: list[str]) -> list[dict]:
    """Extract raw page content from the given URLs via Tavily."""
    response = tavily_client.extract(urls=urls)

    if response["failed_results"]:
        print(f"Failed to extract from {response['failed_results']}")

    results = []
    for result in response["results"]:
        results.append({"url": result["url"], "raw_content": result["raw_content"]})

    return results


if __name__ == "__main__":
    my_agent = Agent(
        name="workflow-assistant",
        functions=[
            web_search,
            extract_from_urls,
        ],
        config_path="./swarmzero_config.toml",
        instruction="You are a helpful assistant that can search the web and extract information from a given URL.",
    )

    my_agent.run()  # see agent API at localhost:8000/docs
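The result trimming inside `web_search` can be exercised without a Tavily key by substituting a stubbed response. A small sketch, assuming only that the response carries a `results` list of dicts with `title`, `url`, and `content` keys as used above; the stub values are invented:

```python
# Reproduce web_search's result shaping against a stubbed Tavily-style
# response (no API key or network needed). Stub values are invented.

def shape_results(response: dict, limit: int = 3) -> list[dict]:
    # Keep only title, url, and content from the first `limit` results,
    # mirroring the loop in web_search.
    return [
        {"title": r["title"], "url": r["url"], "content": r["content"]}
        for r in response["results"][:limit]
    ]


stub = {
    "results": [
        {"title": f"Result {i}", "url": f"https://example.com/{i}", "content": f"snippet {i}", "score": 0.9}
        for i in range(5)
    ]
}

shaped = shape_results(stub)
print(len(shaped))       # 3
print(shaped[0]["url"])  # https://example.com/0
```

Isolating the shaping logic like this also makes it easy to unit-test the agent's tools without hitting the Tavily API.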
