-Create a Portkey API key with optional budget/rate limits from the [Portkey dashboard](https://app.portkey.ai/). You can attach configurations for reliability, caching, and more to this key.
+Create a Portkey API key with optional budget/rate limits from the [Portkey dashboard](https://app.portkey.ai/). You can also attach configurations for reliability, caching, and more to this key. More on this later.
 </Step>
 
 <Step title="Configure CrewAI with Portkey">
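Note (not part of the diff): the added paragraph mentions attaching reliability and caching configurations to the Portkey API key. As a rough, non-authoritative sketch of what such a config contains, assuming the Python SDK's `createHeaders` also accepts an inline config dict (check the current Portkey docs for exact field names and whether a saved config ID is preferred):

```python
# Sketch only: illustrative Portkey config with retries and simple caching.
# A config like this can be saved in the Portkey dashboard and attached to the
# API key, or (assumed here) passed inline via createHeaders(config=...).
from portkey_ai import createHeaders

portkey_config = {
    "retry": {"attempts": 3},     # retry transient provider failures
    "cache": {"mode": "simple"},  # cache identical requests at the gateway
}

headers = createHeaders(
    api_key="YOUR_PORTKEY_API_KEY",
    virtual_key="YOUR_LLM_VIRTUAL_KEY",
    config=portkey_config,        # or config="pc-..." to reference a saved config
)
```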
@@ -42,20 +42,27 @@ from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL
 
 # Create an LLM instance with Portkey integration
 gpt_llm = LLM(
-    model="gpt-3.5-turbo",
-    max_tokens=100,
+    model="gpt-4o",
     base_url=PORTKEY_GATEWAY_URL,
     api_key="dummy",  # We are using a Virtual key, so this is a placeholder
     extra_headers=createHeaders(
         api_key="YOUR_PORTKEY_API_KEY",
         virtual_key="YOUR_LLM_VIRTUAL_KEY",
         trace_id="unique-trace-id",  # Optional, for request tracing
-        metadata={  # Optional, for request segmentation
-            "app_env": "production",
-            "_user": "user_123"  # Special _user field for user analytics
-        }
     )
 )
+
+# Use them in your Crew Agents like this:
+
+@agent
+def lead_market_analyst(self) -> Agent:
+    return Agent(
+        config=self.agents_config['lead_market_analyst'],
+        verbose=True,
+        memory=False,
+        llm=gpt_llm
+    )
+
 ```
 
 <Info>
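For orientation only (this is not part of the diff): the `@agent` method added above uses `self.agents_config`, which implies CrewAI's `@CrewBase` project layout. A hedged sketch of the surrounding class, with placeholder YAML paths, task name, and class name, might look like this:

```python
# Illustrative sketch of the CrewBase class the @agent method above would sit in.
# YAML paths, the task name, and MarketResearchCrew are placeholders; gpt_llm is
# the Portkey-routed LLM constructed in the snippet above.
from crewai import Agent, Crew, Process, Task
from crewai.project import CrewBase, agent, crew, task

@CrewBase
class MarketResearchCrew:
    agents_config = "config/agents.yaml"  # must define 'lead_market_analyst'
    tasks_config = "config/tasks.yaml"    # must define 'market_analysis_task'

    @agent
    def lead_market_analyst(self) -> Agent:
        return Agent(
            config=self.agents_config["lead_market_analyst"],
            verbose=True,
            memory=False,
            llm=gpt_llm,  # Portkey-routed LLM from the diff above
        )

    @task
    def market_analysis_task(self) -> Task:
        return Task(config=self.tasks_config["market_analysis_task"])

    @crew
    def crew(self) -> Crew:
        return Crew(agents=self.agents, tasks=self.tasks, process=Process.sequential)
```

Running it would then be `MarketResearchCrew().crew().kickoff()`, with every agent call routed through the Portkey gateway.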
@@ -82,7 +89,6 @@ Traces provide a hierarchical view of your crew's execution, showing the sequenc
 # Add trace_id to enable hierarchical tracing in Portkey
 portkey_llm = LLM(
     model="gpt-4o",
-    max_tokens=1000,
     base_url=PORTKEY_GATEWAY_URL,
     api_key="dummy",
     extra_headers=createHeaders(
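The hunk above only drops `max_tokens` from the tracing example; the `trace_id` is what groups requests into a trace. One hedged pattern (assumed, not taken from the diff) is to generate a single trace id per crew run and reuse it across every LLM instance so all agent calls appear under one trace in Portkey:

```python
# Sketch: share one trace_id across all LLMs in a crew run so Portkey groups
# every agent's requests under a single trace. Model names are illustrative.
import uuid

from crewai import LLM
from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL

crew_trace_id = f"crew-run-{uuid.uuid4()}"  # one id per crew execution

def portkey_llm(model: str) -> LLM:
    """Build a Portkey-routed LLM tagged with the shared trace id."""
    return LLM(
        model=model,
        base_url=PORTKEY_GATEWAY_URL,
        api_key="dummy",  # placeholder; the Virtual Key holds the real credential
        extra_headers=createHeaders(
            api_key="YOUR_PORTKEY_API_KEY",
            virtual_key="YOUR_LLM_VIRTUAL_KEY",
            trace_id=crew_trace_id,
        ),
    )

analyst_llm = portkey_llm("gpt-4o")
writer_llm = portkey_llm("gpt-4o-mini")
```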
@@ -134,7 +140,6 @@ Add custom metadata to your CrewAI LLM configuration to enable powerful filterin
 ```python
 portkey_llm = LLM(
     model="gpt-4o",
-    max_tokens=1000,
     base_url=PORTKEY_GATEWAY_URL,
     api_key="dummy",
     extra_headers=createHeaders(
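This hunk likewise removes `max_tokens` from the metadata example. For reference, the metadata shape matches what the first snippet in this diff originally carried (`app_env` plus the special `_user` field); a small sketch, with one extra illustrative key:

```python
# Sketch of request metadata for filtering/segmentation in Portkey analytics.
# "app_env" and "_user" come from the snippet earlier in the diff; "crew_name"
# is an illustrative custom key.
from crewai import LLM
from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL

portkey_llm = LLM(
    model="gpt-4o",
    base_url=PORTKEY_GATEWAY_URL,
    api_key="dummy",
    extra_headers=createHeaders(
        api_key="YOUR_PORTKEY_API_KEY",
        virtual_key="YOUR_LLM_VIRTUAL_KEY",
        metadata={
            "app_env": "production",
            "_user": "user_123",                  # special field for user-level analytics
            "crew_name": "market_research_crew",  # illustrative custom key
        },
    ),
)
```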
@@ -246,18 +251,19 @@ from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL, Portkey