azure/bedrock api integration #1048
Conversation
Greptile Summary
This PR refactors the LLM client architecture to support Azure OpenAI and AWS Bedrock API integrations. The changes focus on standardizing client option handling across different AI providers while maintaining backwards compatibility.
The core changes include:
- Type System Refactoring: Modified the `ClientOptions` type to include a generic `Record<string, string>` union alongside the existing `OpenAIClientOptions` and `AnthropicClientOptions`, enabling support for diverse provider-specific configurations
- Client Architecture Updates: Made the `clientOptions` property optional across all LLM client implementations (`OpenAIClient`, `AnthropicClient`, `CerebrasClient`) and removed it from the abstract `LLMClient` base class
- Provider System Enhancement: Updated `LLMProvider.getAISDKLanguageModel()` to accept a unified `clientOptions` parameter instead of separate `apiKey` and `baseURL` parameters, allowing more flexible provider configuration
- Import Organization: Standardized import ordering across LLM client files, moving error type imports to the top for consistency
- API Key Handling: Simplified API key resolution in the main Stagehand constructor to rely exclusively on environment variables for legacy providers (OpenAI, Anthropic, Google)
These changes integrate with the existing AI SDK framework that supports multiple providers through factory functions, enabling the codebase to accommodate cloud-based AI services that require additional configuration beyond simple API keys (such as regions, resource names, and custom endpoints).
PR Description Notes:
- The PR description is empty and lacks the required "Why", "What Changed", and "Test Plan" sections as outlined in the repository's pull request template
Confidence score: 3/5
- This PR introduces significant architectural changes that could impact existing integrations and require careful testing
- Score reflects the complexity of the type system changes and potential for runtime issues with the loosened type constraints
- Pay close attention to `types/model.ts`, `lib/llm/LLMProvider.ts`, and `lib/index.ts` as they contain the most critical changes
Context used:
Context - We enforce linting and prettier at the CI level, so no code style comments that aren't obvious.
8 files reviewed, 4 comments
lib/llm/CerebrasClient.ts (Outdated)

```diff
@@ -1,5 +1,6 @@
 import OpenAI from "openai";
 import { CreateChatCompletionResponseError } from "@/types/stagehandErrors";
+import type { ClientOptions } from "openai";
```
style: Imports `ClientOptions` from openai but uses `OpenAI.ClientOptions` in the constructor parameter - consider using a consistent type reference

Suggested change:

```typescript
import type { ClientOptions } from "openai";
clientOptions?: ClientOptions;
```
lib/llm/LLMProvider.ts (Outdated)

```diff
 }
-const provider = creator(providerConfig);
+// Create the provider instance with the custom configuration options
+const provider = creator(modelClientOptions);
```
logic: Type safety concern: `ClientOptions` is a union type that may not match what the creator function expects, potentially causing runtime errors
lib/api.ts (Outdated)

```typescript
// Add modelClientOptions as a header if provided
if (modelClientOptions) {
  defaultHeaders["x-model-client-options"] =
    JSON.stringify(modelClientOptions);
```
Why send it as a header? `modelClientOptions` already gets sent in the payload. A stringified JSON in a header is probably not the move.
Force-pushed from fc0c9e5 to 8eccd56
lib/api.ts (Outdated)

```typescript
this.logger({
  category: "execute",
  message: `Executing ${method} with args: ${JSON.stringify(args)}`,
  level: 1,
```
maybe level 2
```typescript
const result = await this.api.act({
  ...observeResult,
  frameId: this.rootFrameId,
  modelClientOptions: this.stagehand["modelClientOptions"],
```
is this required?
We will need this once we add self-healing
```diff
-result = await this.api.extract<T>({ frameId: this.rootFrameId });
+result = await this.api.extract<T>({
+  frameId: this.rootFrameId,
+  modelClientOptions: this.stagehand["modelClientOptions"],
```
is this required?
Yes, because otherwise it doesn't get sent to the API. We need this param on all API calls now.
```typescript
message:
  "No Amazon Bedrock authentication credentials found. Please provide credentials via modelClientOptions (accessKeyId/secretAccessKey or bearerToken) or environment variables (AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY or AWS_BEARER_TOKEN_BEDROCK)",
level: 0,
});
```
Should we throw here?
It would throw once they make an LLM call. We let people use Stagehand without an LLM, so I figured we just print a warning for now. Down to change it though.
# why

# what changed

# test plan
# why
Anthropic released a new SOTA computer use model
# what changed
Added claude-sonnet-4-5-20250929 as a model to the list
# test plan
Ran evals
# why
Custom AI SDK tools and MCP integrations weren't working properly with Anthropic CUA - parameters were empty `{}` and tools weren't tracked.
# what changed
- Convert Zod schemas to JSON Schema before sending to Anthropic (using `zodToJsonSchema`)
- Track custom tool calls in the actions array
- Silence "Unknown tool name" warnings for custom tools
# test plan
Tested with examples file. Parameters passed correctly (`{"city":"San Francisco"}` instead of `{}`). Custom tools execute and appear in the actions array. No warnings.
# why
To improve context
# what changed
Added current page and URL to the system prompt
# test plan
# why
To inform the user throughout the agent execution process
# what changed
Added logs to tool calls, and on the Stagehand agent handler
# test plan
- [x] tested locally
# why
Need to support Azure/Bedrock and all other AI SDK providers that have more than just an `apiKey` parameter.
# what changed
Allowed free input of `modelClientOptions` and added logic for authenticating with AWS for Bedrock usage.
# test plan