To produce this table, we ran the pip-installed TextGrad package on 2024-10-30, and we also include the numbers reported in the TextGrad paper.

You can also easily implement your own optimizer that works directly with `TraceGraph` (more tutorials on how to work
with `TraceGraph` coming soon).

## LLM API Setup

Currently we rely on AutoGen for LLM caching and API key management.
AutoGen relies on `OAI_CONFIG_LIST`, a file that you put in your working directory with the following format:

```json
[
    {
        "model": "gpt-4",
        "api_key": "<your OpenAI API key here>"
    },
    {
        "model": "claude-3-5-sonnet-latest",
        "api_key": "<your Anthropic API key here>"
    }
]
```
You can switch between different LLM models by changing the `model` field in this configuration file.

You can also set the environment variable `OAI_CONFIG_LIST` to point to the location of this file, or set the JSON string itself directly as its value.
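As a minimal sketch of the environment-variable route (the path and config values below are placeholders, and the variable should be set before the configuration is read):

```python
import json
import os

# Option 1: point the variable at the config file's location (placeholder path).
os.environ["OAI_CONFIG_LIST"] = "./OAI_CONFIG_LIST"

# Option 2: set the JSON string itself as the variable's value.
os.environ["OAI_CONFIG_LIST"] = json.dumps([
    {"model": "gpt-4", "api_key": "<your OpenAI API key here>"}
])
```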
For convenience, we also provide a method that grabs the API key directly from the environment variable `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`.
However, when you use this route, the model version is fixed for you: `gpt-4o` for OpenAI and `claude-3-5-sonnet-latest` for Anthropic.
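In behavior, this fallback amounts to something like the hypothetical sketch below; `config_from_env` is a made-up name used only for illustration, not the library's actual API:

```python
import os

def config_from_env():
    """Hypothetical illustration of the environment-variable fallback:
    pick up an API key and pin the default model version described above."""
    if "OPENAI_API_KEY" in os.environ:
        return [{"model": "gpt-4o", "api_key": os.environ["OPENAI_API_KEY"]}]
    if "ANTHROPIC_API_KEY" in os.environ:
        return [{"model": "claude-3-5-sonnet-latest", "api_key": os.environ["ANTHROPIC_API_KEY"]}]
    raise RuntimeError("Set OPENAI_API_KEY or ANTHROPIC_API_KEY, or provide an OAI_CONFIG_LIST file.")
```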
## Citation
If you use this code in your research, please cite the following [publication](https://arxiv.org/abs/2406.16218):