Commit d45bb45

docs: update README and instructions to include AWS Bedrock as an AI provider (#96)
1 parent ce4993e commit d45bb45

2 files changed: +23 -5 lines

.github/copilot-instructions.md

Lines changed: 5 additions & 4 deletions
@@ -2,15 +2,15 @@
 
 ## Project Overview
 
-The Explain Error Plugin is a Jenkins plugin that provides AI-powered explanations for build failures and pipeline errors. It integrates with multiple AI providers (OpenAI, Google Gemini, Ollama) to analyze error logs and provide human-readable insights to help developers understand and resolve build issues.
+The Explain Error Plugin is a Jenkins plugin that provides AI-powered explanations for build failures and pipeline errors. It integrates with multiple AI providers (OpenAI, Google Gemini, AWS Bedrock, Ollama) to analyze error logs and provide human-readable insights to help developers understand and resolve build issues.
 
 ## Architecture
 
 ### Key Components
 
 - **GlobalConfigurationImpl**: Main plugin configuration class with `@Symbol("explainError")` for Configuration as Code support, handles migration from legacy enum-based configuration
 - **BaseAIProvider**: Abstract base class for AI provider implementations with nested `Assistant` interface and `BaseProviderDescriptor` for extensibility
-- **OpenAIProvider** / **GeminiProvider** / **OllamaProvider**: LangChain4j-based AI service implementations with provider-specific configurations
+- **OpenAIProvider** / **GeminiProvider** / **BedrockProvider** / **OllamaProvider**: LangChain4j-based AI service implementations with provider-specific configurations
 - **ExplainErrorStep**: Pipeline step implementation for `explainError()` function
 - **ConsoleExplainErrorAction**: Adds "Explain Error" button to console output for manual triggering
 - **ConsoleExplainErrorActionFactory**: TransientActionFactory that dynamically injects ConsoleExplainErrorAction into all runs (new and existing)
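To make the extension point above concrete, a rough sketch of the shape a provider class might take follows. `BaseAIProvider`, `BaseProviderDescriptor`, and the nested `Assistant` interface are the plugin's own classes named in this diff; the method bodies and the `DescriptorImpl` name are illustrative assumptions, so the sketch will not compile standalone.

```java
// Hypothetical provider shape, inferred from the component list above.
// BaseAIProvider / BaseProviderDescriptor are the plugin's classes, not shown here,
// so actual visibility and signatures may differ.
import hudson.Extension;
import org.jenkinsci.Symbol;

public class BedrockProvider extends BaseAIProvider {

    @Override
    public Assistant createAssistant() {
        // Wire a LangChain4j assistant here (see the separate sketch below).
        throw new UnsupportedOperationException("sketch only");
    }

    // @Symbol makes the provider addressable as `bedrock:` in CasC YAML.
    @Extension
    @Symbol("bedrock")
    public static class DescriptorImpl extends BaseProviderDescriptor {
        @Override
        public String getDisplayName() {
            return "AWS Bedrock";
        }
    }
}
```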
@@ -39,6 +39,7 @@ src/main/java/io/jenkins/plugins/explain_error/
 ├── BaseAIProvider.java    # Abstract AI service with Assistant interface
 ├── OpenAIProvider.java    # OpenAI/LangChain4j implementation
 ├── GeminiProvider.java    # Google Gemini/LangChain4j implementation
+├── BedrockProvider.java   # AWS Bedrock/LangChain4j implementation
 └── OllamaProvider.java    # Ollama/LangChain4j implementation
 ```
 
@@ -60,7 +61,7 @@ src/main/java/io/jenkins/plugins/explain_error/
 
 ### AI Service Integration
 - All AI services extend `BaseAIProvider` and implement `ExtensionPoint`
-- LangChain4j integration (v1.9.1) for OpenAI, Gemini, and Ollama providers
+- LangChain4j integration (v1.9.1) for OpenAI, Gemini, AWS Bedrock, and Ollama providers
 - Structured output parsing using `JenkinsLogAnalysis` record with `@Description` annotations
 - Each provider implements `createAssistant()` to build LangChain4j assistants
 - Provider descriptors extend `BaseProviderDescriptor` with `@Symbol` annotations for CasC
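As an illustration of the `createAssistant()` pattern described in this hunk, here is a minimal self-contained sketch using LangChain4j's Bedrock module. The `Assistant` interface is a stand-in for the plugin's nested `BaseAIProvider.Assistant`, the class and method names are invented for the example, and the builder options assume the LangChain4j 1.x Bedrock API rather than the plugin's actual code.

```java
// Minimal sketch of LangChain4j + Bedrock wiring, assuming langchain4j and
// langchain4j-bedrock 1.9.1 on the classpath. The plugin's real Assistant
// interface and structured JenkinsLogAnalysis output are not reproduced here.
import dev.langchain4j.model.bedrock.BedrockChatModel;
import dev.langchain4j.service.AiServices;
import software.amazon.awssdk.regions.Region;

public class BedrockAssistantSketch {

    // Stand-in for the plugin's nested Assistant interface.
    interface Assistant {
        String explain(String errorLog);
    }

    static Assistant createAssistant(String modelId, String region) {
        BedrockChatModel model = BedrockChatModel.builder()
                .modelId(modelId)           // e.g. "anthropic.claude-3-5-sonnet-20240620-v1:0"
                .region(Region.of(region))  // credentials come from the AWS default chain
                .build();
        return AiServices.builder(Assistant.class)
                .chatModel(model)
                .build();
    }
}
```

Note that no API key appears anywhere: credential resolution is delegated to the AWS default provider chain, which matches the "no API key required" behavior described in the README section of this diff.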
@@ -87,7 +88,7 @@ src/main/java/io/jenkins/plugins/explain_error/
 ### Maven Configuration
 - Jenkins baseline: 2.479.3
 - Java 17+ required
-- LangChain4j: v1.9.1 (langchain4j, langchain4j-open-ai, langchain4j-google-ai-gemini, langchain4j-ollama)
+- LangChain4j: v1.9.1 (langchain4j, langchain4j-open-ai, langchain4j-google-ai-gemini, langchain4j-bedrock, langchain4j-ollama)
 - Key Jenkins dependencies: `jackson2-api`, `workflow-step-api`, `commons-lang3-api`
 - SLF4J and Jackson exclusions to avoid conflicts with Jenkins core
 - Test dependencies: `workflow-cps`, `workflow-job`, `workflow-durable-task-step`, `workflow-basic-steps`, `test-harness`
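Given the artifact list in this hunk, the new dependency plausibly enters `pom.xml` as shown below; the exact declaration (version property, BOM usage, exclusions) is an assumption, since the POM itself is not part of this diff.

```xml
<!-- Illustrative sketch only; the plugin's actual POM may use a version property or BOM. -->
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-bedrock</artifactId>
    <version>1.9.1</version>
</dependency>
```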

README.md

Lines changed: 18 additions & 1 deletion
@@ -74,7 +74,7 @@ Whether it’s a compilation error, test failure, or deployment hiccup, this plu
 | Setting | Description | Default |
 |---------|-------------|---------|
 | **Enable AI Error Explanation** | Toggle plugin functionality | ✅ Enabled |
-| **AI Provider** | Choose between OpenAI, Google Gemini, or Ollama | `OpenAI` |
+| **AI Provider** | Choose between OpenAI, Google Gemini, AWS Bedrock, or Ollama | `OpenAI` |
 | **API Key** | Your AI provider API key | Get from [OpenAI](https://platform.openai.com/settings) or [Google AI Studio](https://aistudio.google.com/app/apikey) |
 | **API URL** | AI service endpoint | **Leave empty** for official APIs (OpenAI, Gemini). **Specify custom URL** for OpenAI-compatible services and air-gapped environments. |
 | **AI Model** | Model to use for analysis | *Required*. Specify the model name offered by your selected AI provider |
@@ -143,6 +143,17 @@ unclassified:
     enableExplanation: true
 ```
 
+**AWS Bedrock Configuration:**
+```yaml
+unclassified:
+  explainError:
+    aiProvider:
+      bedrock:
+        model: "anthropic.claude-3-5-sonnet-20240620-v1:0"
+        region: "us-east-1" # Optional, uses AWS SDK default if not specified
+    enableExplanation: true
+```
+
 This allows you to manage the plugin configuration alongside your other Jenkins settings in version control.
 
 ## Supported AI Providers
@@ -159,6 +170,12 @@ This allows you to manage the plugin configuration alongside your other Jenkins
 - **Endpoint**: Leave empty for official Google AI API, or specify custom URL for Gemini-compatible services
 - **Best for**: Fast, efficient analysis with competitive quality
 
+### AWS Bedrock
+- **Models**: `anthropic.claude-3-5-sonnet-20240620-v1:0`, `eu.anthropic.claude-3-5-sonnet-20240620-v1:0` (EU cross-region), `meta.llama3-8b-instruct-v1:0`, `us.amazon.nova-lite-v1:0`, etc.
+- **API Key**: Not required — uses AWS credential chain (instance profiles, environment variables, etc.)
+- **Region**: AWS region (e.g., `us-east-1`, `eu-west-1`). Optional — defaults to AWS SDK region resolution
+- **Best for**: Enterprise AWS environments, data residency compliance, using Claude models with AWS infrastructure
+
 ### Ollama (Local/Private LLM)
 - **Models**: `gemma3:1b`, `gpt-oss`, `deepseek-r1`, and any model available in your Ollama instance
 - **API Key**: Not required by default (unless your Ollama server is secured)

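For context, once a provider is configured (via the UI or the CasC examples above), the `explainError()` step named in the instructions can be invoked from a pipeline. A minimal hypothetical Jenkinsfile follows; the `post { failure { ... } }` placement and the absence of parameters are assumptions, not usage confirmed by this commit.

```groovy
// Hypothetical usage sketch; explainError() is the plugin's documented step,
// but its parameters and recommended placement are not shown in this diff.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B verify'
            }
        }
    }
    post {
        failure {
            explainError() // ask the configured AI provider to analyze the failing log
        }
    }
}
```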