
Commit 7638fbc

fix: typos (#130)
* Update README.md
* Update facade.ts
* Update openrouter-chat-settings.ts
1 parent 716e418 · commit 7638fbc

3 files changed: +4 -4 lines changed

README.md

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
 # OpenRouter Provider for Vercel AI SDK

-The [OpenRouter](https://openrouter.ai/) provider for the [Vercel AI SDK](https://sdk.vercel.ai/docs) gives access to over 300 large language model on the OpenRouter chat and completion APIs.
+The [OpenRouter](https://openrouter.ai/) provider for the [Vercel AI SDK](https://sdk.vercel.ai/docs) gives access to over 300 large language models on the OpenRouter chat and completion APIs.

 ## Setup

@@ -126,7 +126,7 @@ await streamText({
     {
       role: 'system',
       content:
-        'You are a podcast summary assistant. You are detail oriented and critical about the content.',
+        'You are a podcast summary assistant. You are detail-oriented and critical about the content.',
     },
     {
       role: 'user',
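
For context on the hunk above, the README example pairs the provider with the AI SDK's `streamText`. A minimal sketch of that setup, assuming the package's `createOpenRouter` factory export and using a placeholder model slug and prompt, might look like:

```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

// Assumed factory export; the key falls back to the
// OPENROUTER_API_KEY environment variable if omitted.
const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

// Placeholder model slug and user prompt, mirroring the README's
// podcast-summary system message shown in the diff above.
const result = await streamText({
  model: openrouter('anthropic/claude-3.5-sonnet'),
  messages: [
    {
      role: 'system',
      content:
        'You are a podcast summary assistant. You are detail-oriented and critical about the content.',
    },
    { role: 'user', content: 'Summarize this episode transcript: ...' },
  ],
});

// Stream the generated summary to stdout as it arrives.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```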

src/facade.ts

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ The default prefix is `https://openrouter.ai/api/v1`.
   readonly baseURL: string;

   /**
-  API key that is being send using the `Authorization` header.
+  API key that is being sent using the `Authorization` header.
   It defaults to the `OPENROUTER_API_KEY` environment variable.
   */
   readonly apiKey?: string;
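
The doc comment above describes the key being sent in the `Authorization` header against the default `https://openrouter.ai/api/v1` prefix. As an illustration only (not the provider's internals), the equivalent raw HTTP request would look roughly like:

```ts
// Illustrative sketch of the underlying request shape; the endpoint path
// and body fields follow the standard chat-completions convention and are
// not taken from facade.ts itself.
const apiKey = process.env.OPENROUTER_API_KEY;

const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${apiKey}`, // the header the comment refers to
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'openai/gpt-4o-mini', // placeholder model slug
    messages: [{ role: 'user', content: 'Hello' }],
  }),
});

console.log(await response.json());
```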

src/types/openrouter-chat-settings.ts

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ token from being generated.
   /**
   Return the log probabilities of the tokens. Including logprobs will increase
   the response size and can slow down response times. However, it can
-  be useful to better understand how the model is behaving.
+  be useful to understand better how the model is behaving.

   Setting to true will return the log probabilities of the tokens that
   were generated.
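
The setting documented above is toggled per model. A small sketch, assuming chat settings are passed as the second argument of the provider function:

```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { generateText } from 'ai';

const openrouter = createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });

const { text } = await generateText({
  // `logprobs: true` requests log probabilities for the generated tokens;
  // as the comment notes, this increases response size and can slow responses.
  model: openrouter('openai/gpt-4o-mini', { logprobs: true }),
  prompt: 'Name three podcast genres.',
});

console.log(text);
```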
