Go, Java, LaTeX, PHP, Python, Ruby, Rust, Swift, and TypeScript.

## Installation

> [!NOTE]
>
> CWhy needs to be connected to an [OpenAI account](https://openai.com/api/).
> _Your account will need to have a positive balance for this to work_
> ([check your OpenAI balance](https://platform.openai.com/usage)).
> [Get an OpenAI key here](https://platform.openai.com/api-keys).
>
> You may need to purchase $0.50 - $1 in OpenAI credits, depending on when your API account was created.
>
> Once you have an API key, set it as an environment variable called `OPENAI_API_KEY`.
>
> ```bash
> # On Linux/MacOS:
> export OPENAI_API_KEY=<your-api-key>
>
> # On Windows:
> $env:OPENAI_API_KEY=<your-api-key>
> ```

```bash
python3 -m pip install cwhy
```
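Before running CWhy, it can save a round-trip to verify that the key is actually visible to the process. The following is a minimal sketch, not part of CWhy itself; the helper name `openai_key_configured` is hypothetical:

```python
import os

def openai_key_configured() -> bool:
    """Return True if OPENAI_API_KEY is set to a non-empty value."""
    return bool(os.environ.get("OPENAI_API_KEY", "").strip())

if __name__ == "__main__":
    if openai_key_configured():
        print("OPENAI_API_KEY is set; CWhy can reach the API.")
    else:
        print("OPENAI_API_KEY is missing; set it before running cwhy.")
```

A check like this catches the common case where the key was exported in one shell but `cwhy` is run from another.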
If your provider does not support OpenAI-style API calls, such as AWS Bedrock, you can work around this
using the [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy).

```bash
# In a separate terminal:
# Set AWS_ACCESS_KEY_ID, AWS_REGION_NAME, and AWS_SECRET_ACCESS_KEY.
pip install --upgrade 'litellm[proxy]'
litellm --model bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0 --port 4000

# Back in the debugging terminal:
export OPENAI_BASE_URL=http://0.0.0.0:4000
cwhy --- clang++ tests/c++/missing-hash.cpp
```

Note that when using the LiteLLM Proxy, CWhy's `--llm` argument will be ignored completely.
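The redirection above works because OpenAI-style clients resolve their endpoint from `OPENAI_BASE_URL` when it is set. A minimal sketch of that resolution logic, under the assumption of a standard OpenAI-compatible layout (the helper name `effective_endpoint` is hypothetical):

```python
import os

def effective_endpoint(path: str = "/chat/completions") -> str:
    """Resolve the endpoint an OpenAI-style client would hit:
    OPENAI_BASE_URL (e.g. the local LiteLLM proxy) when set,
    otherwise the default OpenAI base URL."""
    base = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
    return base.rstrip("/") + path

if __name__ == "__main__":
    os.environ["OPENAI_BASE_URL"] = "http://0.0.0.0:4000"
    print(effective_endpoint())  # http://0.0.0.0:4000/chat/completions
```

With the proxy address in place, every request is routed to LiteLLM, which is why the model is selected by the proxy's `--model` flag rather than by CWhy's `--llm`.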

## Usage

### Linux/MacOS

An example action YAML file covering all three platforms

These options can be displayed with `cwhy --help`.

- `--llm`: pick a specific OpenAI LLM. CWhy has been tested with `gpt-3.5-turbo` and `gpt-4`.
- `--timeout`: pick a different timeout than the default for API calls.
- `--show-prompt` (debug): print prompts before calling the API.
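The options above combine with the `---` separator shown in the earlier examples: everything before `---` configures CWhy, everything after it is the compiler command. A small sketch of how such an invocation could be assembled programmatically; the helper `build_cwhy_argv` and the flag values are illustrative, not part of CWhy:

```python
def build_cwhy_argv(compile_cmd, llm=None, timeout=None, show_prompt=False):
    """Assemble a cwhy command line mirroring the options above."""
    argv = ["cwhy"]
    if llm is not None:
        argv += ["--llm", llm]
    if timeout is not None:
        argv += ["--timeout", str(timeout)]
    if show_prompt:
        argv.append("--show-prompt")
    argv.append("---")  # separates cwhy's options from the compiler command
    argv += list(compile_cmd)
    return argv

if __name__ == "__main__":
    print(build_cwhy_argv(["clang++", "missing-hash.cpp"], llm="gpt-4", timeout=60))
```

A list-of-arguments form like this can be passed directly to `subprocess.run`, avoiding shell-quoting issues.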

## Examples

In file included from /usr/lib/gcc/x86_64-linux-gnu/10/../../../../include/c++/1
      ^~~~~~~
3 errors generated.
```

</details>
And here's the English-language explanation from `cwhy`:
std::unordered_set<std::pair<int, int>, PairHash> visited;

With this change, the code should now compile and work as expected.
````

### Rust

```