Releases · mixa3607/ML-gfx906
20260309231655
Changelog
- rocm
- add 7.1.1 image
- add 7.2.0 image
- rocm-tensile files
- add 6.3.3
- add 6.4.4
- add 7.0.0
- add 7.0.2
- add 7.1.1
- add 7.2.0
- comfyui
- update image to v0.16.4
- llama.cpp
- update image to b8248
- add rocm 7.2.0 target
- vllm
- update v0.11.0 (nlzy fork)
- add v0.11.2 (nlzy fork)
- add v0.12.0 (nlzy fork)
  - add more recipes / move old ones to https://arkprojects.space/wiki/AMD_GFX906/vllm/recipes
| Project | | Image |
|---|---|---|
| ROCm | ╦═ | docker.io/mixa3607/rocm-gfx906:7.2.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.1.1-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.1.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.2-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:6.4.4-complete |
| | ╚═ | docker.io/mixa3607/rocm-gfx906:6.3.3-complete |
| PyTorch | ╦═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-7.0.2 |
| ComfyUI | ╦═ | docker.io/mixa3607/comfyui-gfx906:v0.16.4-torch-v2.9.0-rocm-7.0.2 |
| | ╚═ | docker.io/mixa3607/comfyui-gfx906:v0.16.4-torch-v2.9.0-rocm-6.3.3 |
| vLLM | ╦═ | docker.io/mixa3607/vllm-gfx906:0.12.0-rocm-6.3.3-nlzy |
| | ╠═ | docker.io/mixa3607/vllm-gfx906:0.11.2-rocm-6.3.3-nlzy |
| | ╚═ | docker.io/mixa3607/vllm-gfx906:0.11.0-rocm-6.3.3-nlzy |
| llama.cpp | ╦═ | docker.io/mixa3607/llama.cpp-gfx906:full-b8248-rocm-7.2.0 |
| | ╚═ | docker.io/mixa3607/llama.cpp-gfx906:full-b8248-rocm-6.3.3 |
Full Changelog: 2026030...2026030
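Any of the images above can be pulled and run with the standard ROCm-on-Docker device passthrough. A minimal sketch using the ROCm base image from the table (the `/dev/kfd`/`/dev/dri` devices and `video`/`render` groups are the usual ROCm container requirements, not something specific to this repo; `rocminfo` is assumed to be present in the `-complete` image):

```shell
# Pull a ROCm base image listed in the table above
docker pull docker.io/mixa3607/rocm-gfx906:7.2.0-complete

# Run with the GPU passed through:
#   /dev/kfd  - ROCm compute device
#   /dev/dri  - render nodes
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video --group-add render \
  --security-opt seccomp=unconfined \
  docker.io/mixa3607/rocm-gfx906:7.2.0-complete \
  rocminfo
```

If `rocminfo` reports the gfx906 agent (Radeon VII / MI50 / MI60), the same flags carry over to the PyTorch, ComfyUI, vLLM, and llama.cpp images.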
20260309165414
vLLM update
Changelog
- add vllm 0.12.0
- add vllm 0.11.2
- update vllm 0.11.0
- move some docs to https://arkprojects.space/wiki/AMD_GFX906/vllm
| Project | | Image |
|---|---|---|
| ROCm | ╦═ | docker.io/mixa3607/rocm-gfx906:7.1.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.2-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:6.4.4-complete |
| | ╚═ | docker.io/mixa3607/rocm-gfx906:6.3.3-complete |
| PyTorch | ╦═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-7.0.2 |
| ComfyUI | ╦═ | docker.io/mixa3607/comfyui-gfx906:v0.3.69-torch-v2.9.0-rocm-7.0.2 |
| | ╚═ | docker.io/mixa3607/comfyui-gfx906:v0.3.69-torch-v2.9.0-rocm-6.3.3 |
| vLLM | ╦═ | docker.io/mixa3607/vllm-gfx906:0.12.0-rocm-6.3.3-nlzy |
| | ╠═ | docker.io/mixa3607/vllm-gfx906:0.11.2-rocm-6.3.3-nlzy |
| | ╠═ | docker.io/mixa3607/vllm-gfx906:0.11.0-rocm-6.3.3-nlzy |
| | ╠═ | docker.io/mixa3607/vllm-gfx906:0.10.2-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/vllm-gfx906:0.8.5-rocm-6.3.3 |
| llama.cpp | ╦═ | docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-7.1.0 |
| | ╚═ | docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-6.3.3 |
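A sketch of serving a model with one of the vLLM images above. This assumes the image exposes the `vllm` CLI as its command and that the standard ROCm device flags apply; the model name here is a placeholder — see the recipes wiki linked above for tested invocations:

```shell
# Serve an OpenAI-compatible API on port 8000.
# "Qwen/Qwen3-8B" is a hypothetical example model; substitute your own.
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video --group-add render \
  -p 8000:8000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  docker.io/mixa3607/vllm-gfx906:0.12.0-rocm-6.3.3-nlzy \
  vllm serve Qwen/Qwen3-8B
```

The host cache mount avoids re-downloading model weights on every container start.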
20251118163804
- comfyui: bump ver v0.3.69
- llama.cpp: bump ver b7091
- llama.cpp: fix #5
| Project | | Image |
|---|---|---|
| ROCm | ╦═ | docker.io/mixa3607/rocm-gfx906:7.1.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.2-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:6.4.4-complete |
| | ╚═ | docker.io/mixa3607/rocm-gfx906:6.3.3-complete |
| PyTorch | ╦═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-7.0.2 |
| ComfyUI | ╦═ | docker.io/mixa3607/comfyui-gfx906:v0.3.69-torch-v2.9.0-rocm-7.0.2 |
| | ╚═ | docker.io/mixa3607/comfyui-gfx906:v0.3.69-torch-v2.9.0-rocm-6.3.3 |
| vLLM | ╦═ | docker.io/mixa3607/vllm-gfx906:0.11.0-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/vllm-gfx906:0.10.2-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/vllm-gfx906:0.8.5-rocm-6.3.3 |
| llama.cpp | ╦═ | docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-7.1.0 |
| | ╚═ | docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-6.3.3 |
Full Changelog: 2025110...2025111
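A sketch of running the llama.cpp server from the `full` image above. This assumes the image follows upstream llama.cpp's `full` image convention, where the entrypoint dispatches on flags like `--server`; the GGUF path is a placeholder — verify the entrypoint for this build before relying on it:

```shell
# Serve a local GGUF model on port 8080; -ngl 99 offloads all layers to the GPU.
# /models/model.gguf is a hypothetical path; mount your own model directory.
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video --group-add render \
  -p 8080:8080 \
  -v /models:/models \
  docker.io/mixa3607/llama.cpp-gfx906:full-b7091-rocm-6.3.3 \
  --server -m /models/model.gguf --host 0.0.0.0 --port 8080 -ngl 99
```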
20251102214219
A little bit of everything
- add more docs
- rebuild all torch images with vision and audio
- upd vllm
- upd llama.cpp
- upd comfyui
- add rocm 7.1.0
- add qwen3 vllm benchmarks
- add vllm recipes
- add modelscope to vllm image
| Project | | Image |
|---|---|---|
| ROCm | ╦═ | docker.io/mixa3607/rocm-gfx906:7.1.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.2-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:7.0.0-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:6.4.4-complete |
| | ╚═ | docker.io/mixa3607/rocm-gfx906:6.3.3-complete |
| PyTorch | ╦═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.4.4 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/pytorch-gfx906:v2.9.0-rocm-7.0.2 |
| ComfyUI | ╦═ | docker.io/mixa3607/comfyui-gfx906:v0.3.67-torch-v2.9.0-rocm-7.0.2 |
| | ╠═ | docker.io/mixa3607/comfyui-gfx906:v0.3.67-torch-v2.9.0-rocm-6.4.4 |
| | ╚═ | docker.io/mixa3607/comfyui-gfx906:v0.3.67-torch-v2.9.0-rocm-6.3.3 |
| vLLM | ╦═ | docker.io/mixa3607/vllm-gfx906:0.11.0-rocm-6.3.3 |
| | ╠═ | docker.io/mixa3607/vllm-gfx906:0.10.2-rocm-6.3.3 |
| | ╚═ | docker.io/mixa3607/vllm-gfx906:0.8.5-rocm-6.3.3 |
| llama.cpp | ╦═ | docker.io/mixa3607/llama.cpp-gfx906:full-b6924-rocm-7.1.0 |
| | ╚═ | docker.io/mixa3607/llama.cpp-gfx906:full-b6924-rocm-6.3.3 |
20251014234333
Add docs / Upd deps
- vLLM
- llama.cpp
- ComfyUI
| Solution | | Image |
|---|---|---|
| ROCm | ╦═ | docker.io/mixa3607/rocm-gfx906:7.0.0-20251005035204-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:6.4.4-20251005035204-complete |
| | ╚═ | docker.io/mixa3607/rocm-gfx906:6.3.3-20251005035204-complete |
| PyTorch | ╦═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.4.4-20251010004720 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.3.3-20251010004720 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.4.4-20251010004720 |
| | ╚═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.3.3-20251010004720 |
| ComfyUI | ╦═ | docker.io/mixa3607/comfyui-gfx906:v0.3.65-torch-v2.7.1-rocm-6.4.4-patch-20251014234333 |
| | ╚═ | docker.io/mixa3607/comfyui-gfx906:v0.3.65-torch-v2.7.1-rocm-6.3.3-patch-20251014234333 |
| vLLM | ╦═ | docker.io/mixa3607/vllm-gfx906:0.11.0-rocm-6.3.3-20251014234333 |
| | ╠═ | docker.io/mixa3607/vllm-gfx906:0.10.2-rocm-6.4.4-20251014234333 |
| | ╚═ | docker.io/mixa3607/vllm-gfx906:0.10.2-rocm-6.3.3-20251014234333 |
| llama.cpp | ╦═ | docker.io/mixa3607/llama.cpp-gfx906:full-b6765-rocm-7.0.0-patch-20251014234333 |
| | ╠═ | docker.io/mixa3607/llama.cpp-gfx906:full-b6765-rocm-6.4.4-patch-20251014234333 |
| | ╚═ | docker.io/mixa3607/llama.cpp-gfx906:full-b6765-rocm-6.3.3-patch-20251014234333 |
20251010004720
Add PyTorch image. Upd deps
| Solution | | Image |
|---|---|---|
| ROCm | ╦═ | docker.io/mixa3607/rocm-gfx906:7.0.0-20251005035204-complete |
| | ╠═ | docker.io/mixa3607/rocm-gfx906:6.4.4-20251005035204-complete |
| | ╚═ | docker.io/mixa3607/rocm-gfx906:6.3.3-20251005035204-complete |
| PyTorch | ╦═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.4.4-20251010004720 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.7.1-rocm-6.3.3-20251010004720 |
| | ╠═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.4.4-20251010004720 |
| | ╚═ | docker.io/mixa3607/pytorch-gfx906:v2.8.0-rocm-6.3.3-20251010004720 |
| ComfyUI | ╦═ | docker.io/mixa3607/comfyui-gfx906:v0.3.63-torch-v2.7.1-rocm-6.4.4-patch-20251010004720 |
| | ╚═ | docker.io/mixa3607/comfyui-gfx906:v0.3.63-torch-v2.7.1-rocm-6.3.3-patch-20251010004720 |
| vLLM | ╦═ | docker.io/mixa3607/vllm-gfx906:ella-20251010004720 |
| | ╚═ | docker.io/mixa3607/vllm-gfx906:jena-20251010004720 |
| llama.cpp | ╦═ | docker.io/mixa3607/llama.cpp-gfx906:full-b6730-rocm-7.0.0-patch-20251010004720 |
| | ╠═ | docker.io/mixa3607/llama.cpp-gfx906:full-b6730-rocm-6.4.4-patch-20251010004720 |
| | ╚═ | docker.io/mixa3607/llama.cpp-gfx906:full-b6730-rocm-6.3.3-patch-20251010004720 |
20251001162807
| Name | Source | OCI registry | App ver | ROCm ver | Status |
|---|---|---|---|---|---|
| ROCm | ROCm, rocBLAS | docker.io/mixa3607/rocm-gfx906 | 7.0 | 7.0 | OK |
| llama.cpp | llama.cpp | docker.io/mixa3607/llama.cpp-gfx906 | b6653 | 7.0.0 | OK |
| ComfyUI | ComfyUI | docker.io/mixa3607/comfyui-gfx906 | v0.3.62 | 6.4.4 | OK |
| VLLM | VLLM, triton | docker.io/mixa3607/vllm-gfx906 | 3c5caec9e + v3.3.0gfx906 | 6.3.3 | OK |
20250925000028
Checkpoints:
- ROCm: 6.4.3
- ComfyUI: v0.3.60
- llama.cpp: b6569
20250828201221
Checkpoints:
- ROCm: 6.4.3
- ComfyUI: v0.3.54
- llama.cpp: b6307
20250824165014
Checkpoints:
- ROCm: 6.4.3
- ComfyUI: v0.3.52
- llama.cpp: b6262