Commit f4e48a6

Merge pull request #2226 from oracle-devrel/lsa_crewai01
added crewai integration, rel 1
2 parents a1438a0 + 8c3d654 commit f4e48a6

File tree: 8 files changed, +626 −0 lines changed
Lines changed: 21 additions & 0 deletions
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2025 Luigi Saetta

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Lines changed: 107 additions & 0 deletions
@@ -0,0 +1,107 @@
# CrewAI ↔ OCI Generative AI Integration

This repository provides examples and configuration guidelines for integrating **[CrewAI](https://github.com/joaomdmoura/crewAI)** with **Oracle Cloud Infrastructure (OCI) Generative AI** services.
The goal is to demonstrate how CrewAI agents can seamlessly leverage OCI-hosted models through the **LiteLLM gateway**.

Reviewed: 31.10.2025

---

## 🔐 Security Configuration

Before running the demos, you must configure access credentials for OCI.

In these examples, we use a **locally stored key pair** for authentication.
Ensure your local OCI configuration (`~/.oci/config` and private key) is correctly set up and accessible to the Python SDK.

To start the **LiteLLM gateway**, you need to create and correctly configure a **config.yml** file. Use the [template](./config_template.yml) as a starting point.

In addition, you must be **enabled** to use the OCI Generative AI service in your tenancy. If you haven't used OCI GenAI yet, ask your tenancy administrator to set up the **needed policies**.

---
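Before launching anything, it can help to sanity-check that your local OCI configuration contains the entries the SDK expects. A minimal sketch, assuming the standard OCI CLI config layout (`[DEFAULT]` profile with `user`, `fingerprint`, `tenancy`, `region`, `key_file`):

```python
import configparser
import os

# Keys the OCI Python SDK expects in a config profile (standard CLI layout)
REQUIRED_KEYS = {"user", "fingerprint", "tenancy", "region", "key_file"}


def missing_oci_keys(config_path="~/.oci/config", profile="DEFAULT"):
    """Return the set of required keys missing from the given OCI config profile."""
    parser = configparser.ConfigParser()
    parser.read(os.path.expanduser(config_path))  # silently ignores a missing file
    if profile not in parser:
        return set(REQUIRED_KEYS)  # profile absent: everything is missing
    return REQUIRED_KEYS - set(parser[profile])
```

If `missing_oci_keys()` returns a non-empty set, fix those entries before starting the gateway.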
## 🧩 Demos Included

- [Simple CrewAI Agent](./simple_test_crewai_agent.py) — basic CrewAI agent interacting with an LLM through OCI
- [OCI Consumption Report](./crew_agent_mcp02.py) — agent that generates a tenant consumption report via an MCP server
- *(More demos to be added soon)*

---

## 📦 Dependencies

The project relies on the following main packages:

| Dependency | Purpose |
|-------------|----------|
| **CrewAI** | Framework for creating multi-agent workflows |
| **OCI Python SDK** | Access OCI services programmatically |
| **LiteLLM (Gateway)** | OpenAI-compatible proxy for accessing OCI Generative AI models |

To connect CrewAI to OCI models, we use a **LiteLLM gateway**, which exposes OCI GenAI via an **OpenAI-compatible** REST API.

---
## ⚙️ Environment Setup

1. **Create a Conda environment**
   ```bash
   conda create -n crewai python=3.11
   ```

2. **Activate** the environment
   ```bash
   conda activate crewai
   ```

3. **Install** the required **packages**
   ```bash
   pip install -U oci litellm "litellm[proxy]" crewai
   ```

4. **Run** the LiteLLM **gateway**

   Start the LiteLLM gateway using your configuration file (config.yml):
   ```bash
   ./start_gateway.sh
   ```

   Make sure the gateway starts successfully and is listening on the configured port (e.g., http://localhost:4000/v1).
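A quick way to confirm the gateway is reachable is a plain OpenAI-compatible chat request against it. The sketch below only builds the request object (actually sending it requires the gateway to be up); the model alias and the `key4321` API key are taken from the config template in this repo:

```python
import json
import urllib.request


def build_chat_request(base_url, model, prompt, api_key="key4321"):
    """Build an OpenAI-compatible chat-completion request for the LiteLLM gateway."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


req = build_chat_request("http://localhost:4000/v1", "grok4-oci", "Say hello")
# With the gateway running, urllib.request.urlopen(req) would send it
```

The same endpoint and model alias are what CrewAI's `LLM(base_url=..., model=...)` uses under the hood via LiteLLM.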
## 🧠 Test the Integration

Run the sample CrewAI agent to verify that CrewAI can connect to OCI through LiteLLM:

```bash
python simple_test_crewai_agent.py
```

If the setup is correct, you should see the agent’s output using an OCI model.
## 🔌 Integrate Agents with MCP Servers

Install this additional package:

```bash
pip install 'crewai-tools[mcp]'
```

You can test the **MCP** integration with the [OCI Consumption Report](./crew_agent_mcp02.py) demo, which generates a report of the consumption in your tenancy (top 5 compartments, over 4 weeks).

To get this demo up and running:

* download the code for the MCP server from [here](https://github.com/oracle-devrel/technology-engineering/blob/main/ai/gen-ai-agents/mcp-oci-integration/mcp_consumption.py)
* start the MCP server on a free port (for example, 9500)
* register the URL in the [source](./crew_agent_mcp02.py), in this section:

```python
server_params = {
    "url": "http://localhost:9500/mcp",
    "transport": "streamable-http"
}
```

If you don't want to secure the communication with the MCP server with JWT, set

```python
ENABLE_JWT_TOKEN = False
```

in the config.py file.
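For reference, the JWT toggle lives in a small `config.py` alongside the MCP server code. A hypothetical minimal sketch — only the `ENABLE_JWT_TOKEN` name comes from this README; the layout is illustrative:

```python
# config.py — hypothetical minimal sketch; only ENABLE_JWT_TOKEN is named in the README
ENABLE_JWT_TOKEN = False  # set True to secure MCP communication with JWT
```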
Lines changed: 36 additions & 0 deletions
@@ -0,0 +1,36 @@
# config.yaml for litellm with OCI Grok models
litellm_settings:
  drop_params: true  # drop unsupported params instead of 500 errors
  additional_drop_params: ["max_retries"]

# Common OCI connection parameters
common_oci: &common_oci
  provider: oci
  oci_region: us-chicago-1
  oci_serving_mode: ON_DEMAND
  supports_tool_calls: true
  oci_user: your-oci-user-ocid
  oci_fingerprint: your-oci-api-key-fingerprint
  oci_tenancy: your-oci-tenancy-ocid
  oci_compartment_id: your-oci-compartment-ocid
  oci_key_file: /path/to/your/oci_api_key.pem
  api_key: key4321

# List of models
model_list:
  - model_name: grok4-oci
    litellm_params:
      <<: *common_oci  # merge common OCI params
      model: oci/xai.grok-4

  - model_name: grok4-fast-oci
    litellm_params:
      <<: *common_oci
      model: oci/xai.grok-4-fast-reasoning

general_settings:
  telemetry: false
  proxy_logging: false
  allow_model_alias: true
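The `&common_oci` anchor and `<<: *common_oci` merge key in the template simply splice the shared block into each model's `litellm_params`, with any keys written after the merge overriding the anchor's. In Python terms the merge behaves like a dict union (the abbreviated `common_oci` dict here is illustrative):

```python
# Abbreviated version of the shared OCI block from the template, as a plain dict
common_oci = {
    "provider": "oci",
    "oci_region": "us-chicago-1",
    "oci_serving_mode": "ON_DEMAND",
}

# `<<: *common_oci` followed by `model: ...` behaves like this merge:
# later keys override the anchor's keys if they collide
litellm_params = {**common_oci, "model": "oci/xai.grok-4"}
```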
Lines changed: 58 additions & 0 deletions
@@ -0,0 +1,58 @@
"""
CrewAI agent with MCP.

This one does deep research using internet search tools exposed by an MCP server.

see:
https://docs.crewai.com/en/mcp/overview
https://docs.crewai.com/en/mcp/multiple-servers
"""
import os

from crewai import Agent, Task, Crew, LLM
from crewai_tools import MCPServerAdapter

# Disable telemetry, tracing, and logging
os.environ["CREWAI_LOGGING_ENABLED"] = "false"
os.environ["CREWAI_TELEMETRY_ENABLED"] = "false"
os.environ["CREWAI_TRACING_ENABLED"] = "false"

llm = LLM(
    model="grok4-fast-oci",
    # LiteLLM proxy endpoint
    base_url="http://localhost:4000/v1",
    api_key="sk-local-any",
    temperature=0.2,
    max_tokens=4000,
)

server_params = {
    "url": "http://localhost:8500/mcp",
    "transport": "streamable-http",
}

# Create the agent with MCP tools
with MCPServerAdapter(server_params, connect_timeout=60) as mcp_tools:
    print(f"Available tools: {[tool.name for tool in mcp_tools]}")

    research_agent = Agent(
        role="Research Analyst",
        goal="Find and analyze information using advanced search tools",
        backstory="Expert researcher with access to multiple data sources",
        llm=llm,
        tools=mcp_tools,
        verbose=True,
    )

    # Create the task
    research_task = Task(
        description="Research the latest developments in AI agent frameworks",
        expected_output="Comprehensive research report with citations",
        agent=research_agent,
    )

    # Create and run the crew
    crew = Crew(agents=[research_agent], tasks=[research_task])

    result = crew.kickoff()

    print(result)
Lines changed: 79 additions & 0 deletions
@@ -0,0 +1,79 @@
"""
CrewAI agent with MCP.

This one analyzes tenant consumption via an MCP server.

see:
https://docs.crewai.com/en/mcp/overview
https://docs.crewai.com/en/mcp/multiple-servers
"""
import os
from datetime import datetime

from crewai import Agent, Task, Crew, LLM
from crewai_tools import MCPServerAdapter

# Disable telemetry, tracing, and logging
os.environ["CREWAI_LOGGING_ENABLED"] = "false"
os.environ["CREWAI_TELEMETRY_ENABLED"] = "false"
os.environ["CREWAI_TRACING_ENABLED"] = "false"

llm = LLM(
    model="grok4-oci",
    # LiteLLM proxy endpoint
    base_url="http://localhost:4000/v1",
    api_key="sk-local-any",
    temperature=0.0,
    max_tokens=4000,
)

# OCI consumption MCP server
server_params = {
    "url": "http://localhost:9500/mcp",
    "transport": "streamable-http",
}

# Create the agent with MCP tools
with MCPServerAdapter(server_params, connect_timeout=60) as mcp_tools:
    print(f"Available tools: {[tool.name for tool in mcp_tools]}")

    research_agent = Agent(
        role="OCI Consumption Analyst",
        goal="Find and analyze information about OCI tenant consumption.",
        backstory="Expert analyst with access to multiple data sources",
        llm=llm,
        tools=mcp_tools,
        max_iter=30,
        max_retry_limit=5,
        verbose=True,
    )

    # Create the task
    research_task = Task(
        description="Identify the top 5 compartments by consumption (amount) for the OCI tenant "
        "in the weeks of September 2025, analyze the trends, and provide insights on usage patterns. "
        "Analyze fully the top 5 compartments. Use only the amount, not the quantity.",
        expected_output="Comprehensive report with data-backed insights.",
        agent=research_agent,
    )

    # Create and run the crew
    crew = Crew(agents=[research_agent], tasks=[research_task])

    result = crew.kickoff()

print(result)

# --- Save the result to a Markdown file ---
# Create an output directory if it doesn't exist
output_dir = "reports"
os.makedirs(output_dir, exist_ok=True)

# Use a timestamped filename for clarity
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
output_path = os.path.join(output_dir, f"oci_consumption_report_{timestamp}.md")

# Write the result
with open(output_path, "w", encoding="utf-8") as f:
    f.write(str(result))

print(f"\n✅ Report saved successfully to: {output_path}")
