
Commit f107a71

feat(compute): Extract OpenCL compute infrastructure from ART (#3)
* feat: add PM infrastructure and beads for OpenCL extraction

  - Add .pm/ project management infrastructure
    - CONTINUATION.md for session resumption
    - METHODOLOGY.md for TDD workflow
    - CONTEXT_PROTOCOL.md for agent handoffs
  - Add .beads/ for issue tracking with beads
  - Add AGENTS.md with bd commands reference
  - Create bead hierarchy for 4-phase OpenCL extraction:
    - Phase 1: Core interfaces (GPUBuffer, ComputeKernel)
    - Phase 2: OpenCL implementation
    - Phase 3: Utilities and stubs
    - Phase 4: ART migration

  Plan v2 audited and approved (92% confidence GO).
  See ChromaDB: plan::gpu-support::art-opencl-extraction::v2

* feat(compute): Phase 1 - extract core compute interfaces from ART

  Extract the portable compute API layer (Layer 2 per the architecture decision):
  - GPUBuffer: Host-device memory transfer interface
  - ComputeKernel: Unified kernel compilation/execution interface
  - BufferAccess enum for kernel argument modes
  - KernelCompilationException, KernelExecutionException
  - GPUBackend: Enum for METAL, OPENCL, CPU_FALLBACK with priorities
  - GPUErrorClassifier: Programming vs. recoverable error classification
    - OpenCL error code extraction from exception messages
    - Fixed self-referencing cause infinite loop (improvement over ART)

  All interfaces are backend-agnostic. OpenCL implementations (Layer 3) will wrap the existing CLKernelHandle/CLBufferHandle (Layer 1) in Phase 2.

  Beads closed: e63, ad2, kdp, ipz, 6e9
  See: plan::gpu-support::art-opencl-extraction::v2

* Phase 2 progress: OpenCLContext, OpenCLBuffer, GPUBackend.isAvailable()

  - Extract OpenCLContext singleton with reference counting and testReset()
  - Extract OpenCLBuffer implementing the GPUBuffer interface
  - Add GPUBackend.isAvailable() with cached Metal/OpenCL detection
  - Remove CL.create() calls to avoid macOS SIGSEGV in forked JVMs
  - Add dual property name support (gpu.disable, luciferase.gpu.disable)

  Beads: gij, 9go closed; ilr in progress
  Note: OpenCLBufferTest has a macOS driver crash - needs debugging

* Fix OpenCLBufferTest SIGSEGV crash on macOS

  Replace the JUnit Assumptions.assumeTrue() pattern with simple early-return checks in each test method. The Assumptions pattern caused OpenCL driver state issues in Maven Surefire forked JVM processes. (A minimal sketch of this guard pattern appears after the commit message.)
  - Simplified @BeforeAll to just detect OpenCL availability
  - Use `if (!openCLAvailable) return;` instead of @BeforeEach assumptions
  - Fix IndexOutOfBounds in the FloatBuffer test (remove an unnecessary flip())
  - All 10 OpenCLBuffer tests pass

  Closes: gpu-support-ilr
* feat(compute): Extract OpenCLKernel from ART

  Implement OpenCLKernel as the ComputeKernel interface for GPU compute:
  - Kernel compilation with build log on failure
  - Buffer, float, int, and local memory argument binding
  - 1D/2D/3D execution with optional local work sizes
  - Async execution with event-based synchronization
  - Uses the OpenCLContext singleton pattern

  16 tests covering:
  - Compilation lifecycle (compile, double-compile, invalid source)
  - Argument setting (buffer, scalar, before compile)
  - Execution (vectorAdd, scale, 2D/3D work sizes)
  - Resource lifecycle (close, double-close, ops after close)

  Closes: gpu-support-6pw

* feat(compute): Extract BackendSelector with dual env var support

  Automatic GPU backend selection with priority-based fallback:
  - Metal (priority 100, macOS only)
  - OpenCL (priority 90, cross-platform)
  - CPU fallback (priority 10, always available)

  Environment variable support:
  - GPU_BACKEND / GPU_DISABLE (new generic names)
  - ART_GPU_BACKEND / ART_GPU_DISABLE (legacy, deprecated)

  CI environment auto-detection (GitHub Actions, Jenkins, etc.).
  17 tests covering selection logic, caching, and environment info.

  Closes: gpu-support-cbr

* test(compute): Add Phase 2 integration tests

  Full compute workflow tests:
  - vectorAdd: context → buffers → kernel → execute → read
  - SAXPY: scalar float arguments (result = a*x + y)
  - 2D execution: proper 2D kernel indexing
  - Large data: 64K elements
  - Multiple executions: iterative kernel runs
  - Resource cleanup: try-with-resources pattern

  9 integration tests verifying the complete OpenCL compute pipeline.

  Closes: gpu-support-97u

* Close Phase 2 feature bead (5wc)

* feat(compute): Add KernelLoader with path conventions

  Kernel loading utility with caching and convention support:
  - loadOpenCLKernel(name) → kernels/opencl/{name}.cl
  - loadMetalKernel(name) → kernels/metal/{name}.metal
  - loadTestKernel(name) → kernels/{name}.cl (flat structure)
  - ConcurrentHashMap caching for repeated loads
  - kernelExists() for resource checking

  Package documentation with usage examples and conventions.
  12 tests covering loading, caching, and error handling.

  Closes: gpu-support-0y1

* Close extraction epic (bsy) - gpu-support extraction complete

* fix: Add .pm/ to gitignore and remove from tracking

* feat(compute): add high-level ComputeService API with example kernels

  Add a ComputeService facade providing simplified GPU compute with automatic CPU fallback. Includes built-in operations for vector math (vectorAdd, saxpy, scale) and reductions (sum, min, max), plus custom operation support via createOperation(). (A usage sketch appears after the commit message.)

  New resources:
  - kernels/opencl/vector_add.cl - element-wise vector addition
  - kernels/opencl/saxpy.cl - SAXPY operations
  - kernels/opencl/reduce.cl - parallel sum/min/max reductions
  - kernels/opencl/transform.cl - scale, clamp, abs, square, sqrt

  Tests:
  - ComputeServiceTest: 16 tests demonstrating API usage
  - ComputeServiceStressTest: 23 tests for edge cases, large arrays, concurrent access, and memory pressure

  Total: 302 tests pass in the resource module

* docs(compute): add usage guide and runnable examples

  COMPUTE.md covers:
  - Basic operations (vectorAdd, saxpy, scale, sum, min, max)
  - Custom kernel writing
  - Low-level API usage
  - Configuration (env vars, backend selection)
  - Error handling
  - Performance notes
  - Thread safety

  Examples in the examples/ package:
  - VectorMathExample: built-in operations
  - CustomKernelExample: writing custom kernels
  - PerformanceExample: GPU vs CPU timing
  - LowLevelExample: direct buffer/kernel control

* docs: add GPU compute section to root README

* fix(gpu-test): correct kernel argument passing in SimpleMatrixMultiplyTest

  Bug: stack.ints(N) allocates N zero-filled ints, not an int containing N. The kernel received size=0, producing all zeros.
  Fix: Use clSetKernelArg1i/clSetKernelArg1p for scalar and pointer args.

* docs: add GPU compute API reference to AGENTS.md for AI discovery
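The SIGSEGV fix above replaces JUnit assumptions with a per-test early return. Below is a minimal JUnit 5 sketch of that guard pattern as described in the commit; the detection helper and the test body are illustrative assumptions (the extracted module's own check is GPUBackend.isAvailable(), whose exact call site is not shown here), but the structure - detect once in @BeforeAll, then `if (!openCLAvailable) return;` at the top of each test - follows the description.

```java
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;

class OpenCLBufferGuardSketch {

    private static boolean openCLAvailable;

    @BeforeAll
    static void detectOpenCL() {
        // Detect availability once per class. Per the commit, calling
        // Assumptions.assumeTrue() here destabilized OpenCL driver state in
        // Maven Surefire forked JVMs on macOS, so detection stays side-effect free.
        openCLAvailable = isOpenCLPresent();
    }

    // Hypothetical stand-in for the module's real availability check
    // (the commit adds GPUBackend.isAvailable() for this purpose).
    private static boolean isOpenCLPresent() {
        return !Boolean.getBoolean("gpu.disable");
    }

    @Test
    void writeThenReadRoundTrip() {
        if (!openCLAvailable) return;  // early return instead of @BeforeEach assumptions
        // ... allocate an OpenCLBuffer, write floats, read them back, assert equality ...
    }
}
```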
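The ComputeService facade is easiest to understand from the caller's side. The following is a minimal usage sketch, not the commit's verified API: the operation names (vectorAdd, saxpy, sum) and the environment variables come from the commit message, while the create() factory, the array-based signatures, and the AutoCloseable usage are assumptions for illustration.

```java
// Usage sketch only - ComputeService's real factory and signatures may differ.
public class ComputeServiceSketch {
    public static void main(String[] args) {
        float[] a = new float[64 * 1024];
        float[] b = new float[64 * 1024];
        for (int i = 0; i < a.length; i++) { a[i] = i; b[i] = 2f * i; }

        // Backend selection honors GPU_BACKEND / GPU_DISABLE (and the legacy
        // ART_* names); with no usable GPU, the service falls back to the CPU path.
        try (ComputeService compute = ComputeService.create()) {   // assumed factory method
            float[] sum = compute.vectorAdd(a, b);                 // built-in vector math op
            float[] axpy = compute.saxpy(2.0f, a, b);              // result = a*x + y
            float total = compute.sum(sum);                        // built-in reduction
            System.out.printf("sum[0]=%.1f axpy[0]=%.1f total=%.1f%n", sum[0], axpy[0], total);
        }
    }
}
```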
1 parent 359282c commit f107a71


45 files changed · +7319 −4 lines changed

.beads/.gitignore

Lines changed: 32 additions & 0 deletions
@@ -0,0 +1,32 @@
# SQLite databases
*.db
*.db?*
*.db-journal
*.db-wal
*.db-shm

# Daemon runtime files
daemon.lock
daemon.log
daemon.pid
bd.sock

# Local version tracking (prevents upgrade notification spam after git ops)
.local_version

# Legacy database files
db.sqlite
bd.db

# Merge artifacts (temporary files from 3-way merge)
beads.base.jsonl
beads.base.meta.json
beads.left.jsonl
beads.left.meta.json
beads.right.jsonl
beads.right.meta.json

# Keep JSONL exports and config (source of truth for git)
!issues.jsonl
!metadata.json
!config.json

.beads/README.md

Lines changed: 81 additions & 0 deletions
@@ -0,0 +1,81 @@
# Beads - AI-Native Issue Tracking

Welcome to Beads! This repository uses **Beads** for issue tracking - a modern, AI-native tool designed to live directly in your codebase alongside your code.

## What is Beads?

Beads is issue tracking that lives in your repo, making it perfect for AI coding agents and developers who want their issues close to their code. No web UI required - everything works through the CLI and integrates seamlessly with git.

**Learn more:** [github.com/steveyegge/beads](https://github.com/steveyegge/beads)

## Quick Start

### Essential Commands

```bash
# Create new issues
bd create "Add user authentication"

# View all issues
bd list

# View issue details
bd show <issue-id>

# Update issue status
bd update <issue-id> --status in_progress
bd update <issue-id> --status done

# Sync with git remote
bd sync
```

### Working with Issues

Issues in Beads are:
- **Git-native**: Stored in `.beads/issues.jsonl` and synced like code
- **AI-friendly**: CLI-first design works perfectly with AI coding agents
- **Branch-aware**: Issues can follow your branch workflow
- **Always in sync**: Auto-syncs with your commits

## Why Beads?

**AI-Native Design**
- Built specifically for AI-assisted development workflows
- CLI-first interface works seamlessly with AI coding agents
- No context switching to web UIs

🚀 **Developer Focused**
- Issues live in your repo, right next to your code
- Works offline, syncs when you push
- Fast, lightweight, and stays out of your way

🔧 **Git Integration**
- Automatic sync with git commits
- Branch-aware issue tracking
- Intelligent JSONL merge resolution

## Get Started with Beads

Try Beads in your own projects:

```bash
# Install Beads
curl -sSL https://raw.githubusercontent.com/steveyegge/beads/main/scripts/install.sh | bash

# Initialize in your repo
bd init

# Create your first issue
bd create "Try out Beads"
```

## Learn More

- **Documentation**: [github.com/steveyegge/beads/docs](https://github.com/steveyegge/beads/tree/main/docs)
- **Quick Start Guide**: Run `bd quickstart`
- **Examples**: [github.com/steveyegge/beads/examples](https://github.com/steveyegge/beads/tree/main/examples)

---

*Beads: Issue tracking that moves at the speed of thought*

.beads/config.yaml

Lines changed: 62 additions & 0 deletions
@@ -0,0 +1,62 @@
# Beads Configuration File
# This file configures default behavior for all bd commands in this repository
# All settings can also be set via environment variables (BD_* prefix)
# or overridden with command-line flags

# Issue prefix for this repository (used by bd init)
# If not set, bd init will auto-detect from directory name
# Example: issue-prefix: "myproject" creates issues like "myproject-1", "myproject-2", etc.
# issue-prefix: ""

# Use no-db mode: load from JSONL, no SQLite, write back after each command
# When true, bd will use .beads/issues.jsonl as the source of truth
# instead of SQLite database
# no-db: false

# Disable daemon for RPC communication (forces direct database access)
# no-daemon: false

# Disable auto-flush of database to JSONL after mutations
# no-auto-flush: false

# Disable auto-import from JSONL when it's newer than database
# no-auto-import: false

# Enable JSON output by default
# json: false

# Default actor for audit trails (overridden by BD_ACTOR or --actor)
# actor: ""

# Path to database (overridden by BEADS_DB or --db)
# db: ""

# Auto-start daemon if not running (can also use BEADS_AUTO_START_DAEMON)
# auto-start-daemon: true

# Debounce interval for auto-flush (can also use BEADS_FLUSH_DEBOUNCE)
# flush-debounce: "5s"

# Git branch for beads commits (bd sync will commit to this branch)
# IMPORTANT: Set this for team projects so all clones use the same sync branch.
# This setting persists across clones (unlike database config which is gitignored).
# Can also use BEADS_SYNC_BRANCH env var for local override.
# If not set, bd sync will require you to run 'bd config set sync.branch <branch>'.
# sync-branch: "beads-sync"

# Multi-repo configuration (experimental - bd-307)
# Allows hydrating from multiple repositories and routing writes to the correct JSONL
# repos:
#   primary: "."           # Primary repo (where this database lives)
#   additional:            # Additional repos to hydrate from (read-only)
#     - ~/beads-planning   # Personal planning repo
#     - ~/work-planning    # Work planning repo

# Integration settings (access with 'bd config get/set')
# These are stored in the database, not in this file:
# - jira.url
# - jira.project
# - linear.url
# - linear.api-key
# - github.org
# - github.repo
