
Commit 9e17ad4

Attempt at fixing Bedrock support
1 parent 792d8e8 commit 9e17ad4

File tree

2 files changed (+59 −30 lines changed)

README.md

Lines changed: 29 additions & 26 deletions
@@ -15,25 +15,25 @@ Go, Java, LaTeX, PHP, Python, Ruby, Rust, Swift, and TypeScript.
 
 ## Installation
 
-> [!NOTE]
->
-> CWhy needs to be connected to an [OpenAI account](https://openai.com/api/).
-> _Your account will need to have a positive balance for this to work_
-> ([check your OpenAI balance](https://platform.openai.com/usage)).
-> [Get an OpenAI key here](https://platform.openai.com/api-keys).
->
-> You may need to purchase $0.50 - $1 in OpenAI credits depending on when your API account was created.
->
-> Once you have an API key, set it as an environment variable called `OPENAI_API_KEY`.
->
-> ```bash
-> # On Linux/MacOS:
-> export OPENAI_API_KEY=<your-api-key>
->
-> # On Windows:
-> $env:OPENAI_API_KEY=<your-api-key>
-> ```
-
+> [!NOTE]
+>
+> CWhy needs to be connected to an [OpenAI account](https://openai.com/api/).
+> _Your account will need to have a positive balance for this to work_
+> ([check your OpenAI balance](https://platform.openai.com/usage)).
+> [Get an OpenAI key here](https://platform.openai.com/api-keys).
+>
+> You may need to purchase $0.50 - $1 in OpenAI credits depending on when your API account was created.
+>
+> Once you have an API key, set it as an environment variable called `OPENAI_API_KEY`.
+>
+> ```bash
+> # On Linux/MacOS:
+> export OPENAI_API_KEY=<your-api-key>
+>
+> # On Windows:
+> $env:OPENAI_API_KEY=<your-api-key>
+> ```
+
 ```bash
 python3 -m pip install cwhy
 ```
@@ -60,16 +60,19 @@ If your provider does not support OpenAI style API calls, such as AWS Bedrock wh
 using the [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy).
 
 ```bash
-pip install 'litellm[proxy]'
+# In a separate terminal:
 # Set AWS_ACCESS_KEY_ID, AWS_REGION_NAME, and AWS_SECRET_ACCESS_KEY.
-litellm --model bedrock/anthropic.claude-v2
+pip install --upgrade 'litellm[proxy]'
+litellm --model bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0 --port 4000
+
+# Back in the debugging terminal:
 export OPENAI_BASE_URL=http://0.0.0.0:4000
 cwhy --- clang++ tests/c++/missing-hash.cpp
 ```
 
 Note that when using the LiteLLM Proxy, CWhy's `--llm` argument will be ignored completely.
 
-## Usage
+## Usage
 
 ### Linux/MacOS
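The proxy redirection above works because CWhy only looks at `OPENAI_BASE_URL`; the accompanying Python change treats any base URL that is not `api.openai.com` as a custom endpoint. A minimal, dependency-free sketch of that check (the function name `uses_custom_endpoint` is invented here for illustration, using a plain dict in place of `os.environ`):

```python
def uses_custom_endpoint(environ: dict) -> bool:
    # Mirrors the commit's test: a base URL is "custom" when it is set
    # and does not point at api.openai.com.
    base = environ.get("OPENAI_BASE_URL")
    return base is not None and "api.openai.com" not in base

# The proxy setup ends with: export OPENAI_BASE_URL=http://0.0.0.0:4000
print(uses_custom_endpoint({"OPENAI_BASE_URL": "http://0.0.0.0:4000"}))       # True
print(uses_custom_endpoint({"OPENAI_BASE_URL": "https://api.openai.com/v1"}))  # False
print(uses_custom_endpoint({}))                                                # False
```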

@@ -125,9 +128,9 @@ An example action YAML file covering all three platforms
 
 These options can be displayed with `cwhy --help`.
 
-- `--llm`: pick a specific OpenAI LLM. CWhy has been tested with `gpt-3.5-turbo` and `gpt-4`.
-- `--timeout`: pick a different timeout than the default for API calls.
-- `--show-prompt` (debug): print prompts before calling the API.
+- `--llm`: pick a specific OpenAI LLM. CWhy has been tested with `gpt-3.5-turbo` and `gpt-4`.
+- `--timeout`: pick a different timeout than the default for API calls.
+- `--show-prompt` (debug): print prompts before calling the API.
 
 ## Examples
 
@@ -207,6 +210,7 @@ In file included from /usr/lib/gcc/x86_64-linux-gnu/10/../../../../include/c++/1
         ^~~~~~~
 3 errors generated.
 ```
+
 </details>
 
 And here's the English-language explanation from `cwhy`:
@@ -243,7 +247,6 @@ std::unordered_set<std::pair<int, int>, PairHash> visited;
 With this change, the code should now compile and work as expected.
 ````
 
-
 ### Rust
 
 ```

src/cwhy/explain.py

Lines changed: 30 additions & 4 deletions

@@ -1,4 +1,5 @@
 import argparse
+import os
 import sys
 
 import llm_utils
@@ -69,12 +70,37 @@ def evaluate(client: openai.OpenAI, args: argparse.Namespace, stdin: str) -> Non
 
 
 def explain(args: argparse.Namespace, stdin: str) -> None:
-    try:
-        client = openai.OpenAI()
-    except openai.OpenAIError as e:
-        print("Please set the OPENAI_API_KEY environment variable.")
+    if (
+        "OPENAI_BASE_URL" in os.environ
+        and "api.openai.com" not in os.environ["OPENAI_BASE_URL"]
+    ):
+        # Pass a dummy API key on purpose:
+        # None would make the OpenAI client throw an error.
+        # A blank string will cause an invalid HTTP header error.
+        client = openai.OpenAI(
+            api_key=os.environ.get("OPENAI_API_KEY", "OPENAI_API_KEY")
+        )
+    elif "OPENAI_API_KEY" not in os.environ:
+        print("The OPENAI_API_KEY environment variable is not set.")
         print("You can get an API key at https://platform.openai.com/account/api-keys.")
+        if all(
+            k in os.environ
+            for k in ("AWS_ACCESS_KEY_ID", "AWS_REGION_NAME", "AWS_SECRET_ACCESS_KEY")
+        ):
+            print()
+            print(
+                "Found AWS credentials. To use Amazon Bedrock, run the following commands:"
+            )
+            print("```")
+            print("pip install --upgrade 'litellm[proxy]'")
+            print(
+                "litellm --model bedrock/anthropic.claude-opus-4-20250514-v1:0 --port 4000"
+            )
+            print("export OPENAI_BASE_URL=http://0.0.0.0:4000")
+            print("```")
         return
+    else:
+        client = openai.OpenAI()
 
     evaluate(client, args, stdin)
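Restated outside the diff, the new `explain` logic has three branches: custom endpoint (dummy key is fine), missing key (print a hint, with Bedrock instructions when AWS credentials are present), or the default OpenAI client. A hedged, dependency-free sketch of that decision (the helper name `pick_client_mode` and its return strings are invented here, not CWhy's API):

```python
# Illustrative restatement of the commit's branch logic, not CWhy's API.
AWS_KEYS = ("AWS_ACCESS_KEY_ID", "AWS_REGION_NAME", "AWS_SECRET_ACCESS_KEY")

def pick_client_mode(environ: dict) -> str:
    base = environ.get("OPENAI_BASE_URL", "")
    if base and "api.openai.com" not in base:
        # Custom endpoint (e.g. the LiteLLM proxy): a dummy key suffices.
        return "custom-endpoint"
    if "OPENAI_API_KEY" not in environ:
        # No key: the real code prints a hint and returns; it adds Bedrock
        # setup instructions when a full set of AWS credentials is present.
        if all(k in environ for k in AWS_KEYS):
            return "error-with-bedrock-hint"
        return "error"
    # Default: regular OpenAI client with the real key.
    return "openai"

print(pick_client_mode({"OPENAI_BASE_URL": "http://0.0.0.0:4000"}))  # custom-endpoint
print(pick_client_mode({"OPENAI_API_KEY": "sk-test"}))               # openai
print(pick_client_mode({k: "x" for k in AWS_KEYS}))                  # error-with-bedrock-hint
print(pick_client_mode({}))                                          # error
```

Keeping the endpoint check ahead of the key check is what lets the proxy path work with no OpenAI key at all.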
