9 changes: 5 additions & 4 deletions .github/copilot-instructions.md
@@ -2,15 +2,15 @@

## Project Overview

-The Explain Error Plugin is a Jenkins plugin that provides AI-powered explanations for build failures and pipeline errors. It integrates with multiple AI providers (OpenAI, Google Gemini, Ollama) to analyze error logs and provide human-readable insights to help developers understand and resolve build issues.
+The Explain Error Plugin is a Jenkins plugin that provides AI-powered explanations for build failures and pipeline errors. It integrates with multiple AI providers (OpenAI, Google Gemini, AWS Bedrock, Ollama) to analyze error logs and provide human-readable insights to help developers understand and resolve build issues.

## Architecture

### Key Components

- **GlobalConfigurationImpl**: Main plugin configuration class with `@Symbol("explainError")` for Configuration as Code support, handles migration from legacy enum-based configuration
- **BaseAIProvider**: Abstract base class for AI provider implementations with nested `Assistant` interface and `BaseProviderDescriptor` for extensibility
-- **OpenAIProvider** / **GeminiProvider** / **OllamaProvider**: LangChain4j-based AI service implementations with provider-specific configurations
+- **OpenAIProvider** / **GeminiProvider** / **BedrockProvider** / **OllamaProvider**: LangChain4j-based AI service implementations with provider-specific configurations
- **ExplainErrorStep**: Pipeline step implementation for `explainError()` function
- **ConsoleExplainErrorAction**: Adds "Explain Error" button to console output for manual triggering
- **ConsoleExplainErrorActionFactory**: TransientActionFactory that dynamically injects ConsoleExplainErrorAction into all runs (new and existing)
@@ -39,6 +39,7 @@
```
src/main/java/io/jenkins/plugins/explain_error/
├── BaseAIProvider.java # Abstract AI service with Assistant interface
├── OpenAIProvider.java # OpenAI/LangChain4j implementation
├── GeminiProvider.java # Google Gemini/LangChain4j implementation
├── BedrockProvider.java # AWS Bedrock/LangChain4j implementation
└── OllamaProvider.java # Ollama/LangChain4j implementation
```
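The `ExplainErrorStep` listed above exposes an `explainError()` Pipeline step. A minimal Declarative Pipeline sketch of how it might be invoked (the no-argument call is the only form named in this document; the `post { failure { ... } }` placement is an assumption about typical usage, not the plugin's documented pattern):

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B verify'
            }
        }
    }
    post {
        failure {
            // Ask the configured AI provider to analyze the failed build log
            explainError()
        }
    }
}
```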

@@ -60,7 +61,7 @@ src/main/java/io/jenkins/plugins/explain_error/

### AI Service Integration
- All AI services extend `BaseAIProvider` and implement `ExtensionPoint`
-- LangChain4j integration (v1.9.1) for OpenAI, Gemini, and Ollama providers
+- LangChain4j integration (v1.9.1) for OpenAI, Gemini, AWS Bedrock, and Ollama providers
- Structured output parsing using `JenkinsLogAnalysis` record with `@Description` annotations
- Each provider implements `createAssistant()` to build LangChain4j assistants
- Provider descriptors extend `BaseProviderDescriptor` with `@Symbol` annotations for CasC
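The structured-output convention can be sketched in plain Java. The `@Description` annotation below is a self-contained stand-in for LangChain4j's `dev.langchain4j.model.output.structured.Description` (so the sketch compiles without the LangChain4j jar), and the record's field names are assumptions, not the plugin's actual definition:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Stand-in for LangChain4j's @Description; the real record imports
// dev.langchain4j.model.output.structured.Description instead.
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.RECORD_COMPONENT, ElementType.FIELD})
@interface Description {
    String value();
}

// Hypothetical shape of the JenkinsLogAnalysis record: each annotated
// component tells the LLM what the corresponding structured-output
// field should contain when it parses the build log.
record JenkinsLogAnalysis(
        @Description("One-sentence summary of why the build failed") String summary,
        @Description("Most likely root cause extracted from the log") String rootCause,
        @Description("Concrete steps a developer can take to fix it") String suggestedFix) {
}
```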
@@ -87,7 +88,7 @@ src/main/java/io/jenkins/plugins/explain_error/
### Maven Configuration
- Jenkins baseline: 2.479.3
- Java 17+ required
-- LangChain4j: v1.9.1 (langchain4j, langchain4j-open-ai, langchain4j-google-ai-gemini, langchain4j-ollama)
+- LangChain4j: v1.9.1 (langchain4j, langchain4j-open-ai, langchain4j-google-ai-gemini, langchain4j-bedrock, langchain4j-ollama)
- Key Jenkins dependencies: `jackson2-api`, `workflow-step-api`, `commons-lang3-api`
- SLF4J and Jackson exclusions to avoid conflicts with Jenkins core
- Test dependencies: `workflow-cps`, `workflow-job`, `workflow-durable-task-step`, `workflow-basic-steps`, `test-harness`
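Assuming the Bedrock module follows LangChain4j's standard Maven coordinates (`dev.langchain4j` group, version per the list above), the new dependency would look something like:

```xml
<!-- Sketch only: coordinates assume the standard dev.langchain4j
     naming and the v1.9.1 line stated above. -->
<dependency>
  <groupId>dev.langchain4j</groupId>
  <artifactId>langchain4j-bedrock</artifactId>
  <version>1.9.1</version>
</dependency>
```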
19 changes: 18 additions & 1 deletion README.md
@@ -74,7 +74,7 @@ Whether it’s a compilation error, test failure, or deployment hiccup, this plu
| Setting | Description | Default |
|---------|-------------|---------|
| **Enable AI Error Explanation** | Toggle plugin functionality | ✅ Enabled |
-| **AI Provider** | Choose between OpenAI, Google Gemini, or Ollama | `OpenAI` |
+| **AI Provider** | Choose between OpenAI, Google Gemini, AWS Bedrock, or Ollama | `OpenAI` |
| **API Key** | Your AI provider API key | Get from [OpenAI](https://platform.openai.com/settings) or [Google AI Studio](https://aistudio.google.com/app/apikey) |
| **API URL** | AI service endpoint | **Leave empty** for official APIs (OpenAI, Gemini). **Specify custom URL** for OpenAI-compatible services and air-gapped environments. |
| **AI Model** | Model to use for analysis | *Required*. Specify the model name offered by your selected AI provider |
@@ -143,6 +143,17 @@
```yaml
unclassified:
enableExplanation: true
```

**AWS Bedrock Configuration:**
```yaml
unclassified:
explainError:
aiProvider:
bedrock:
model: "anthropic.claude-3-5-sonnet-20240620-v1:0"
region: "us-east-1" # Optional, uses AWS SDK default if not specified
enableExplanation: true
```

This allows you to manage the plugin configuration alongside your other Jenkins settings in version control.

## Supported AI Providers
@@ -159,6 +170,12 @@ This allows you to manage the plugin configuration alongside your other Jenkins
- **Endpoint**: Leave empty for official Google AI API, or specify custom URL for Gemini-compatible services
- **Best for**: Fast, efficient analysis with competitive quality

### AWS Bedrock
- **Models**: `anthropic.claude-3-5-sonnet-20240620-v1:0`, `eu.anthropic.claude-3-5-sonnet-20240620-v1:0` (EU cross-region), `meta.llama3-8b-instruct-v1:0`, `us.amazon.nova-lite-v1:0`, etc.
- **API Key**: Not required — uses AWS credential chain (instance profiles, environment variables, etc.)
- **Region**: AWS region (e.g., `us-east-1`, `eu-west-1`). Optional — defaults to AWS SDK region resolution
- **Best for**: Enterprise AWS environments, data residency compliance, using Claude models with AWS infrastructure
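Because Bedrock authenticates through the AWS credential chain rather than the plugin's API Key field, one way to supply credentials on the Jenkins controller is the standard AWS SDK environment variables (all values below are placeholders; instance profiles or IAM roles are preferable to static keys):

```shell
# Standard AWS SDK environment variables; values are placeholders.
export AWS_ACCESS_KEY_ID="AKIA_PLACEHOLDER"
export AWS_SECRET_ACCESS_KEY="secret-placeholder"
# Picked up by the SDK's default region resolution when the plugin's
# `region` setting is left unset.
export AWS_REGION="us-east-1"
```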

### Ollama (Local/Private LLM)
- **Models**: `gemma3:1b`, `gpt-oss`, `deepseek-r1`, and any model available in your Ollama instance
- **API Key**: Not required by default (unless your Ollama server is secured)