A Tackle2 addon that uses Goose to perform AI-assisted code migrations. The addon clones an application's source repo, fetches migration rules, runs a Goose recipe, and uploads the report back to the hub.
The following is a guide to setting up and testing the addon locally.
- Auth disabled on the hub (for now)
- `oc port-forward` to the hub service
- The Addon CR applied to the cluster
The hub API isn't directly accessible through the OpenShift route. Port-forward to the service instead:
```shell
oc port-forward svc/tackle-hub 8080:8080 -n konveyor-tackle
```

Apply the Addon CR:

```shell
oc apply -f ai-migrator-addon-cr.yaml -n konveyor-tackle
```

Verify:

```shell
oc get addon ai-migrator -n konveyor-tackle -o yaml
```

List applications to find the target application ID:

```shell
curl -sS "http://localhost:8080/applications?limit=50" \
  -H "Accept: application/json" \
  | python3 -c "
import sys, json
for app in json.load(sys.stdin):
    print(f' id={app[\"id\"]} name={app[\"name\"]}')
"
```

Task instances are created via the hub REST API (not Kubernetes CRs). Leave `state` blank (it defaults to `Created`) so the task manager won't try to launch the container; for local debugging we run the addon directly.
```shell
curl -sS -X POST http://localhost:8080/tasks \
  -H "Content-Type: application/json" \
  -H "Accept: application/json" \
  -d '{
    "name": "ai-migrator-test",
    "addon": "ai-migrator",
    "application": {"id": 231},
    "data": {
      "profile": {"id": 1},
      "sourceTech": "PatternFly 5",
      "targetTech": "PatternFly 6"
    }
  }'
```

Save the `id` from the response (e.g. `1145`).
| Field | Description |
|---|---|
| `profile` | Analysis profile reference (by ID). Populates rules from the profile's repository/files. |
| `sourceTech` | Source technology (e.g. `"PatternFly 5"`) |
| `targetTech` | Target technology (e.g. `"PatternFly 6"`) |
| `rules` | Optional. Rules specified directly (repository, rulesets, files). If both `profile` and `rules` are set, the profile populates first, then direct rules override. |
Update `.env` with the task ID from step 2:

```shell
TASK=1145
SHARED_PATH=<path_to>/tackle2-addon-ai-migrator/shared
HUB_BASE_URL=http://localhost:8080
GOOSE_BIN=<path_to>/tackle2-addon-ai-migrator/bin/fake-goose
RECIPE_PATH=<path_to>/tackle2-addon-ai-migrator/recipes/goose/recipes/migration.yaml
```
> **Note**
> Set `GOOSE_BIN` to `bin/fake-goose` for fast iteration. Build it with `make fake-goose`.
Then either hit F5 in VS Code (uses `.vscode/launch.json`) or:

```shell
source .env && go run ./cmd
```

Task status:

```shell
curl -sS http://localhost:8080/tasks/1145 \
  -H "Accept: application/json" | python3 -m json.tool
```

Task attachments (report file):
```shell
curl -sS http://localhost:8080/tasks/1145 \
  -H "Accept: application/json" \
  | python3 -c "
import sys, json
task = json.load(sys.stdin)
for a in (task.get('attached') or []):
    print(f' id={a[\"id\"]} name={a.get(\"name\",\"\")}')
"
```

Download the report:

```shell
curl -sS http://localhost:8080/files/<fileId> -o report.html
```

Query facts:
```shell
curl -sS http://localhost:8080/applications/231/facts/ai-migrator: \
  -H "Accept: application/json" \
  | python3 -c "
import sys, json
facts = json.load(sys.stdin)
for k, v in facts.items():
    if isinstance(v, list):
        print(f'{k}: {bytes(v).decode()}')
    else:
        print(f'{k}: {v}')
"
```

> **Note**
> Fact values come back as byte arrays due to a hub serialization issue; the Python snippet above decodes them. The primary report access path is the task attachment (`addon.Attach`), not facts.
Goose output log:

The hub SDK's `command.New()` automatically captures goose stdout/stderr and uploads it as a task attachment. Find the goose output file in the attachments list (it will be named `goose.output` or `fake-goose.output`), then download it by file ID:

```shell
curl -sS http://localhost:8080/files/<fileId>
```

Build targets:

```shell
make cmd          # build the addon binary
make fake-goose   # build the fake goose binary for testing
make image-podman # build the container image
```

Addon execution flow:

1. `addon.DataWith(&data)` -- parse task JSON into the `Data` struct
2. `applyProfile(data)` -- if a profile ID is set, fetch it and populate `Rules`
3. `FetchRepository(application)` -- clone the app's git repo
4. `Rules.Build()` -- fetch rules from the hub (files, rulesets, git repos)
5. `Goose.Run()` -- run `goose run --recipe ... --params ...`
6. `uploadReport()` -- `addon.File.Post()` + `addon.Attach()` + store facts