
Conversation

@KariHall619
Contributor

Depends on #825 (file: pkg/server/ai_interface.go); please review that PR first.

What this PR does / why we need it:
This PR implements the AI plugin calling logic that enables the main project to communicate with AI plugins using the existing unified
plugin architecture. AI plugins are treated like other plugins but identified by categories: ["ai"].

  Key implementations:
  - Added CallAI and GetAICapabilities RPC methods in server.proto with HTTP endpoints
  - Implemented AI plugin calling logic in remote_server.go using existing Query interface
  - AI plugins use standard map[string]string parameters: method, model, prompt, config
  - Response follows DataResult.Pairs format: content, meta, success, error
  - HTTP endpoints: POST /api/v1/ai/generate, GET /api/v1/ai/capabilities/{plugin_name}
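As a usage note for the endpoints just listed, here is a sketch of how a client could call the generate endpoint over HTTP. The JSON field names, host, and port are assumptions for illustration (the field names simply mirror the plugin parameters listed above); the exact request shape is defined by the generated server.proto gateway bindings.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Assumed request shape: field names mirror the plugin parameters
	// (model, prompt, config); verify against the generated gateway code.
	body, _ := json.Marshal(map[string]string{
		"model":  "example-model",
		"prompt": "Summarize the latest test report.",
		"config": "", // an empty string is a valid config value
	})

	// Host and port are placeholders for a locally running server.
	resp, err := http.Post("http://localhost:8080/api/v1/ai/generate",
		"application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var result map[string]any
	if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
		panic(err)
	}
	fmt.Println(result)
}
```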

  Technical approach:
  - Reuses existing plugin discovery via GetStores(ctx, &SimpleQuery{Kind: "ai"})
  - Calls AI plugins through the loader.Query(map[string]string{...}) interface (see the sketch after this list)
  - Maintains full compatibility with existing plugin architecture
  - No new dependencies or special AI-specific code paths
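A minimal sketch of that calling flow, using simplified stand-in types (Pair, DataResult, Loader) rather than the project's generated proto types; the real CallAI implementation in remote_server.go obtains the loader through the existing plugin discovery (GetStores with Kind: "ai").

```go
package server

import (
	"context"
	"fmt"
)

// Simplified stand-ins for the generated proto types used by the real code.
type Pair struct{ Key, Value string }

type DataResult struct{ Pairs []Pair }

// Loader abstracts an AI plugin reachable through the unified plugin layer.
type Loader interface {
	Query(ctx context.Context, params map[string]string) (*DataResult, error)
}

// callAI builds the standard parameter map, forwards it to the plugin through
// the generic Query interface, then reads the DataResult.Pairs keys.
func callAI(ctx context.Context, loader Loader, model, prompt, config string) (string, error) {
	result, err := loader.Query(ctx, map[string]string{
		"method": "ai.generate", // plugins also handle "ai.capabilities"
		"model":  model,
		"prompt": prompt,
		"config": config, // always forwarded; empty string is a valid value
	})
	if err != nil {
		return "", err
	}

	pairs := map[string]string{}
	for _, p := range result.Pairs {
		pairs[p.Key] = p.Value
	}
	// Assumes the plugin reports success as the string "true"/"false".
	if pairs["success"] != "true" {
		return "", fmt.Errorf("ai plugin error: %s", pairs["error"])
	}
	return pairs["content"], nil
}
```

The `meta` pair is ignored here for brevity; its format is left to the plugin.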

  AI plugin developers need to:
  1. Configure categories: ["ai"] in extension.yaml
  2. Implement the Query method to handle "ai.generate" and "ai.capabilities" (see the plugin-side sketch below)
  3. Return standard DataResult format with keys: content, meta, success, error
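On the plugin side, a sketch of what such a Query handler might look like, again with simplified stand-in types. A real AI plugin implements the project's loader interface, declares categories: ["ai"] in its extension.yaml, and calls an actual AI backend where `generate` is stubbed below.

```go
package aiplugin

import (
	"context"
	"fmt"
)

// Simplified stand-ins for the project's DataResult type.
type Pair struct{ Key, Value string }

type DataResult struct{ Pairs []Pair }

type Plugin struct{}

// Query dispatches on the "method" parameter and answers with the standard
// DataResult keys: content, meta, success, error.
func (p *Plugin) Query(ctx context.Context, params map[string]string) (*DataResult, error) {
	switch params["method"] {
	case "ai.generate":
		content, err := p.generate(ctx, params["model"], params["prompt"], params["config"])
		if err != nil {
			return &DataResult{Pairs: []Pair{
				{Key: "success", Value: "false"},
				{Key: "error", Value: err.Error()},
			}}, nil
		}
		return &DataResult{Pairs: []Pair{
			{Key: "content", Value: content},
			{Key: "meta", Value: params["model"]}, // meta format is plugin-defined
			{Key: "success", Value: "true"},
		}}, nil
	case "ai.capabilities":
		// Capability payload format is plugin-defined; a JSON string works well.
		return &DataResult{Pairs: []Pair{
			{Key: "content", Value: `{"models":["example-model"]}`},
			{Key: "success", Value: "true"},
		}}, nil
	default:
		return nil, fmt.Errorf("unsupported method %q", params["method"])
	}
}

// generate stands in for the plugin's real call to an AI backend.
func (p *Plugin) generate(_ context.Context, model, prompt, config string) (string, error) {
	_ = config // interpreted by the real backend; empty means defaults
	return "[" + model + "] response to: " + prompt, nil
}
```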

  The implementation maintains architectural consistency while providing standardized AI communication patterns.

  **Which issue(s) this PR fixes**:
  Fixes #

This commit implements the AI plugin calling logic that enables the main
project to communicate with AI plugins using the existing unified plugin
architecture. AI plugins are treated like other plugins but identified
by categories: ["ai"].

Key implementations:
- Added CallAI and GetAICapabilities RPC methods in server.proto
- Implemented AI plugin calling logic in remote_server.go using existing Query interface
- AI plugins use standard map[string]string parameters: method, model, prompt, config
- Response follows DataResult.Pairs format: content, meta, success, error
- HTTP endpoints: POST /api/v1/ai/generate, GET /api/v1/ai/capabilities/{plugin_name}

AI plugin developers need to:
1. Configure categories: ["ai"] in extension.yaml
2. Implement Query method to handle "ai.generate" and "ai.capabilities"
3. Return standard DataResult format

The implementation maintains full compatibility with existing plugin
architecture while providing AI-specific communication standards.
@KariHall619 KariHall619 force-pushed the ai-plugin-call-logic branch 2 times, most recently from fb4dde7 to 0cb08b0 on September 12, 2025 13:17
@LinuxSuRen LinuxSuRen added the enhancement (New feature or request) and ospp 开源之夏 (https://summer-ospp.ac.cn/) labels on Sep 14, 2025

- Rollback protoc-gen-go from v1.36.9 to v1.28.1 to reduce compatibility issues
- Keep all AI plugin functionality intact
- Protoc version remains at v5.29.3
- AI interface and endpoints maintained as implemented

Complete protobuf toolchain rollback:
- protoc-gen-go: v1.36.9 → v1.28.1
- protoc: v5.29.3 → v4.22.2

This ensures maximum compatibility while preserving all AI plugin functionality:
- CallAI and GetAICapabilities RPC methods maintained
- All existing endpoints and interfaces preserved
- Reduces potential version conflicts and integration issues

Matches original commit ce1f79f protobuf generation versions.
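Not something this PR adds, but for context: one common Go pattern for keeping generator versions consistent across contributors is a tools.go file whose blank imports tie protoc-gen-go and protoc-gen-go-grpc to the versions in go.mod (protoc itself is distributed separately and is pinned outside the module). A sketch under that assumption; the project may pin its toolchain differently (Makefile, CI, etc.).

```go
//go:build tools

// Package tools pins code-generator dependencies in go.mod so that
// `go install` builds the same protoc-gen-go / protoc-gen-go-grpc versions
// for everyone. The build tag keeps this file out of normal builds.
package tools

import (
	_ "google.golang.org/grpc/cmd/protoc-gen-go-grpc"
	_ "google.golang.org/protobuf/cmd/protoc-gen-go"
)
```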
Resolved merge conflicts between ai-plugin-call-logic and master branches:

- ai_interface.go: Preserved comprehensive AI plugin communication standards
  and interface documentation from HEAD, providing complete protocol specs
  for AI plugin developers

- remote_server.go: Kept complete implementation of CallAI and GetAICapabilities
  methods that properly integrate with existing plugin architecture using
  loader.Query interface

Both resolutions maintain full AI plugin functionality while ensuring
compatibility with the existing codebase architecture.
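For orientation, the protocol surface described above comes down to a small set of string keys. Purely as an illustration (not the actual contents of ai_interface.go), they could be grouped like this:

```go
package server

// Illustrative grouping of the string keys used by the AI plugin protocol in
// this PR; the real definitions live in pkg/server/ai_interface.go and may
// differ in naming and structure.
const (
	// Methods dispatched through the generic Query interface.
	AIMethodGenerate     = "ai.generate"
	AIMethodCapabilities = "ai.capabilities"

	// Request parameter keys (map[string]string).
	AIParamMethod = "method"
	AIParamModel  = "model"
	AIParamPrompt = "prompt"
	AIParamConfig = "config"

	// Response keys in DataResult.Pairs.
	AIResultContent = "content"
	AIResultMeta    = "meta"
	AIResultSuccess = "success"
	AIResultError   = "error"
)
```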
@KariHall619
Contributor Author

KariHall619 commented Sep 15, 2025

The conflict between #826 and #825 exists because both branches originated from the same commit. The AI plugin calling logic in #825 was a placeholder implementation, while #826 contains the complete implementation (full CallAI and GetAICapabilities methods that communicate through the existing plugin manager).

Complete protoc toolchain rollback by downgrading protoc-gen-go-grpc from v1.5.1 to v1.2.0.
This resolves compilation errors and restores the concrete struct implementations needed for tests.

Remove the unnecessary empty-string check for the config parameter in the CallAI method.
Always include config in query parameters for consistency and simplicity.
Empty string is a valid configuration value that plugins can handle appropriately.
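A minimal sketch of what that simplification amounts to; the helper name is hypothetical, since the real CallAI method builds its parameter map inline.

```go
package server

// buildAIParams always forwards config, with no empty-string check; plugins
// are free to treat an empty config as "use defaults".
// (Hypothetical helper for illustration only.)
func buildAIParams(model, prompt, config string) map[string]string {
	return map[string]string{
		"method": "ai.generate",
		"model":  model,
		"prompt": prompt,
		"config": config,
	}
}
```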

@LinuxSuRen LinuxSuRen merged commit 1d95342 into LinuxSuRen:master Sep 15, 2025
14 checks passed
@KariHall619 KariHall619 deleted the ai-plugin-call-logic branch September 22, 2025 12:04