This repository provides a unified environment for deploying and testing AMD ROCm on Windows x64 systems. It consolidates Python-based workflows with Conda environments and a standalone MSI installer build system.
For most users, download the one-click installer:
ROCm_windows_x64_1.7.msi - One-click installer (coming soon via GitHub Actions)
Note: The MSI will be automatically built and released when you push the v1.7 tag. Check the Actions tab.
- docs/ — Documentation
- revision_documentation/ — Archived and deduplicated revision documents
- python_with_conda_env/ — Python-specific workflows with Conda environment
- installer/ — MSI installer build system
- claude_skills_for_AI_workflows/ — Claude skills prompts for AI workflow automation
- testing/ — Testing environments (VM, Sandbox)
Status as of End of Day (v1.3) - 2025-10-27:
Branches consolidated, documentation cleaned, repo renamed, Claude skills folder added, and Conda environment synchronized.
Tomorrow: Testing begins on Gillsystems_main with Ryzen 5900X + Radeon 7900XTX. VM and Windows Sandbox are already prepared.
This installer is currently signed with a self-signed certificate for community testing and distribution.
What this means:
- Windows SmartScreen will display a warning when you run the installer
- This is EXPECTED and SAFE for self-signed applications
- The installer is built from open-source code you can inspect
To install:
- Download the MSI from Releases
- Right-click and select "Run as Administrator"
- When SmartScreen appears, click "More info"
- Click "Run anyway"
- Follow the installation wizard
Why self-signed?
- Commercial code signing certificates cost $300-500/year
- This is a community project without corporate funding
- All source code is available for review on GitHub
Future: We plan to obtain a commercial certificate once the project demonstrates community adoption and funding becomes available.
This project provides a rock-solid, production-ready automated installer for AMD's ROCm (Radeon Open Compute) platform on Windows 10 Pro and Windows 11 systems using WSL2. It fills a critical gap left by AMD by providing a streamlined, user-friendly solution for AI enthusiasts, data scientists, and machine learning engineers who want to leverage AMD Radeon RX 7000 Series GPUs for their work.
The installer addresses the complex, error-prone manual installation process by providing:
- Automated hardware detection and compatibility verification
- One-click WSL2 setup with Ubuntu 22.04
- Complete ROCm 6.1.3 installation with all dependencies
- PyTorch 2.1.2 + ROCm integration with proper HSA runtime configuration
- Professional GUI for monitoring and control with comprehensive logging
- Validation and testing to ensure everything works before you start coding
Problem Statement: AMD provides excellent GPU hardware but lacks an easy, automated installation path for Windows users wanting to run ROCm-powered AI applications. Manual installation involves dozens of steps across multiple systems (Windows and Linux) with numerous potential failure points.
Solution: This installer automates the entire process from hardware detection through validation, making ROCm accessible to everyone with compatible hardware.
The ROCm Windows 11 Installer orchestrates a complete installation workflow:
1. **System Compatibility Check**
   - Verifies Windows 10 Pro/Enterprise/Education (Build 19041+) or Windows 11
   - Detects AMD Radeon RX 7000 Series GPU or Ryzen AI APU
   - Checks AMD driver version (25.9.2+)
   - Confirms WSL2 capability and Hyper-V support
   - Validates Windows edition (Pro/Enterprise/Education required)

2. **WSL2 Environment Setup**
   - Enables Windows Subsystem for Linux features
   - Installs Ubuntu 22.04 (tested and verified version)
   - Configures WSL2 as the default version
   - Sets up networking and GPU passthrough

3. **ROCm Installation**
   - Downloads the AMD GPU installer package (6.1.3)
   - Installs AMDGPU drivers for WSL
   - Installs the ROCm runtime and toolchain
   - Configures environment variables

4. **PyTorch Integration**
   - Installs Python 3.10 and dependencies
   - Downloads PyTorch 2.1.2 + ROCm 6.1.3 wheels
   - Installs TorchVision and TorchAudio
   - Fixes HSA runtime library linking
   - Validates GPU detection

5. **Validation & Testing**
   - Runs `rocminfo` to verify GPU detection
   - Tests PyTorch CUDA/ROCm availability
   - Creates a test tensor on the GPU
   - Generates an installation report (a command-level sketch of these checks follows this list)
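For reference, the validation step can also be reproduced by hand from PowerShell through the WSL2 bridge. This is a minimal sketch, assuming the distribution is named `Ubuntu-22.04` and the installation has completed; it is not the installer's own validation code. If your shell mangles the quoting, run the `rocminfo` and `python3` lines directly inside the WSL terminal instead.

```powershell
# Confirm the GPU agent is visible to ROCm inside WSL2
wsl -d Ubuntu-22.04 -- rocminfo

# Confirm PyTorch sees the GPU through its ROCm (HIP) backend
wsl -d Ubuntu-22.04 -- python3 -c "import torch; print(torch.cuda.is_available())"

# Allocate a small tensor on the GPU as a quick end-to-end check
wsl -d Ubuntu-22.04 -- python3 -c "import torch; print(torch.ones(3, 3, device='cuda'))"
```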
V1.0 Delivered:
- Complete automated installation pipeline
- Professional Streamlit web interface
- Comprehensive PowerShell automation scripts
- Bash scripts for WSL2-side installation
- Real-time progress tracking
- Detailed logging system
- Error handling and recovery guidance
- System compatibility checking
- Installation validation and testing
- Requirements template for ML packages
Future Enhancements (V2.0+):
- Automatic rollback on failures
- Modular component installation
- Digital signing for distribution
- Pre-configured ML environment templates
- Stable Diffusion quick-setup wizard
- LLM framework installers (LM Studio, Ollama integration)
- Performance benchmarking tools
- Troubleshooting diagnostic wizard
The installation stack spans the following layers:

- **Windows 10/11 Host**
  - **Streamlit Web Interface (Python)**: Progress Tracking, User Controls, Log Visualization
  - **PowerShell Automation Layer**: Hardware Detection, Driver Verification, WSL2 Setup, Master Orchestrator
- **WSL2 Bridge**
- **WSL2 Ubuntu 22.04**
  - **Bash Installation Scripts**: ROCm Installation, PyTorch Setup
  - **ROCm Runtime Environment**: `/opt/rocm/`, PyTorch 2.1.2 + ROCm, CUDA/ROCm API Layer
- **GPU Passthrough**
- **AMD Radeon RX 7000 Series GPU**: Hardware Acceleration, ROCm 6.1.3 Compatible Drivers
### Component Interaction Flow
```mermaid
graph TD
A[User] --> B[Streamlit GUI]
B --> C[Hardware Check]
C --> D[Driver Verification]
D --> E[WSL2 Setup]
E --> F[Ubuntu 22.04 Installation]
F --> G[ROCm Installation]
G --> H[PyTorch Installation]
H --> I[Validation Tests]
I --> J[Ready for AI/ML Development]
```
---
## ROCm AI Platform — Local LLMs & Docker management
The project now includes an expanded Streamlit UI (`src/gui/streamlit_app.py`) that provides a lightweight "ROCm AI Platform" for managing local AI tooling after ROCm is installed. Key additions:
- A **Docker & Containers** tab to build/manage local images (vLLM server and a PyTorch/JupyterLab developer image).
- A **Models & Chat** tab that displays model configuration (`src/config/llm_config.yaml`) and provides a local chat UI (client) that can connect to a running vLLM service.
- Helper scripts:
- `src/scripts/prepare_wsl_env.ps1` — prepares the WSL environment and copies installation scripts into `/tmp/ROCm_install` inside `Ubuntu-22.04`.
- `src/scripts/run_docker_build.ps1` — wrapper used by the GUI to build Docker images from Windows/PowerShell.
How to run the GUI (PowerShell):
```powershell
# (Optional) Activate the project's Python environment
# conda activate ROCm_installer_env
# Launch Streamlit UI
streamlit run src/gui/streamlit_app.py
```

Manual Docker build example (PowerShell):

```powershell
cd src\docker\vllm
docker build -t rocm-vllm:latest .

# Or use the provided helper script (example)
.\src\scripts\run_docker_build.ps1 -Path "$(Resolve-Path src\docker\vllm)" -Tag rocm-vllm:latest
```

Notes:
- Docker features require Docker Desktop to be installed and running on Windows.
- WSL operations assume an Ubuntu distribution named `Ubuntu-22.04`.
- The Chat UI is a client; to use it you must run a vLLM server/container or point the config to an existing endpoint.
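As a quick sanity check once a vLLM container is running, you can query its HTTP API from PowerShell. This is a hedged sketch: it assumes the container exposes vLLM's OpenAI-compatible server on `localhost:8000`, which depends on how the image in `src/docker/vllm` is built and how you publish ports and GPU devices; the model name below is a placeholder.

```powershell
# List the models the server reports; the Chat UI (or any OpenAI-compatible client)
# can then be pointed at http://localhost:8000
Invoke-RestMethod -Uri http://localhost:8000/v1/models

# Minimal chat completion request against the OpenAI-compatible API
$body = @{
    model    = "your-model-name"   # placeholder: use a model listed by /v1/models
    messages = @(@{ role = "user"; content = "Hello from the ROCm AI Platform" })
} | ConvertTo-Json -Depth 5
Invoke-RestMethod -Uri http://localhost:8000/v1/chat/completions -Method Post -ContentType "application/json" -Body $body
```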
```
ROCm_Win11_installer/
├── src/
│   ├── gui/
│   │   └── streamlit_app.py              # Main web interface
│   ├── scripts/
│   │   ├── detect_hardware.ps1           # Windows hardware detection
│   │   ├── verify_amd_compatibility.ps1  # AMD GPU/driver check
│   │   ├── wsl2_setup.ps1                # WSL2 installation
│   │   ├── master_installer.ps1          # Master orchestrator
│   │   ├── install_ROCm.sh               # ROCm installation (WSL)
│   │   └── install_pytorch.sh            # PyTorch installation (WSL)
│   └── utils/
│       └── logging_utils.py              # Logging utilities
├── logs/                                 # Installation logs
├── docs/                                 # Documentation
├── environment.yml                       # Conda environment
├── requirements.txt                      # Python dependencies
└── README.md                             # This file
```
### Key Components
#### 1. **Streamlit Web Interface** (`src/gui/streamlit_app.py`)
- Modern, responsive UI with progress tracking
- Real-time log streaming
- System information display
- Installation control panel
- Validation testing interface
#### 2. **PowerShell Scripts** (`src/scripts/*.ps1`)
- **detect_hardware.ps1**: Detects AMD GPU and Windows version
- **verify_amd_compatibility.ps1**: Validates driver versions and GPU compatibility
- **wsl2_setup.ps1**: Automates WSL2 installation and configuration (a command-level sketch follows this list)
- **master_installer.ps1**: Orchestrates the entire installation process
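The commands below are a minimal sketch of the kind of Windows-side steps `wsl2_setup.ps1` automates. They are standard WSL tooling rather than an excerpt from the script, and an elevated PowerShell session is assumed.

```powershell
# Enable the required Windows features and install the Ubuntu 22.04 distribution
wsl --install -d Ubuntu-22.04

# Make WSL2 the default version for new distributions
wsl --set-default-version 2

# Confirm the distribution is registered and running under WSL version 2
wsl --list --verbose
```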
#### 3. **Bash Scripts** (`src/scripts/*.sh`)
- **install_ROCm.sh**: Installs ROCm 6.1.3 in WSL2 Ubuntu
- **install_pytorch.sh**: Installs PyTorch with ROCm support and fixes HSA runtime
#### 4. **Logging System** (`src/utils/logging_utils.py`)
- Timestamped logs
- Multiple log levels (INFO, WARNING, ERROR, SUCCESS)
- File and console output
- System information capture
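While an installation step runs, the files written under `logs/` can be followed live from PowerShell. A small sketch (the exact log filenames produced by `logging_utils.py` are illustrative; check the `logs/` directory for the real names, and note the snippet assumes at least one `.log` file exists):

```powershell
# Follow the most recently written installer log
$latest = Get-ChildItem logs\*.log | Sort-Object LastWriteTime -Descending | Select-Object -First 1
Get-Content $latest.FullName -Wait -Tail 20
```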
---
## Actual Working Product Code
### External Dependencies
#### Python Packages (requirements.txt)
```
streamlit>=1.30.0              # Web interface framework
psutil>=5.9.0                  # System and process utilities
requests>=2.31.0               # HTTP library
PyYAML>=6.0                    # YAML parser
tqdm>=4.65.0                   # Progress bars
watchdog>=3.0.0                # File system monitoring
streamlit-option-menu>=0.3.6   # UI components
```
#### Conda Environment (environment.yml)
```yaml
name: ROCm_installer_env
channels:
- conda-forge
- defaults
dependencies:
- python=3.10
- pip
- streamlit
- psutil
- pywin32
- pyyaml
- rich
- typer
- requests
- pytest
- black
- flake8
```
All custom code is thoroughly documented and includes:
- Hardware detection algorithms
- WSL2 automation logic
- ROCm installation orchestration
- PyTorch integration with HSA runtime fixes
- Comprehensive error handling
- Progress tracking system
- Operating System: Windows 10 Pro/Enterprise/Education (Build 19041+) OR Windows 11
- Hardware: AMD Radeon RX 7000 Series GPU
- RX 7900 XTX, XT, GRE
- RX 7800 XT
- RX 7700 XT
- RX 7600 XT, 7600
- Ryzen AI APUs (Radeon 890M, 880M, 780M)
- RAM: 16GB+ recommended (32GB for large models)
- Storage: 50GB+ free disk space
- Network: Internet connection for downloads
- Privileges: Administrator access required
- Additional Requirements:
- Hyper-V capable system (not available on Windows Home editions)
- Virtualization enabled in BIOS/UEFI
- AMD Adrenalin drivers 25.9.2 or later
Note: Windows 10 Home and Windows 11 Home editions are NOT supported because they lack Hyper-V functionality required for WSL2.
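If you want to pre-check some of these requirements by hand before launching the installer, the following PowerShell sketch covers the basics. It mirrors the kind of checks `detect_hardware.ps1` performs, but it is an illustration, not the script itself.

```powershell
# Windows build number (19041+ required for WSL2 GPU support on Windows 10)
(Get-CimInstance Win32_OperatingSystem).BuildNumber

# Windows edition (Pro/Enterprise/Education required)
(Get-CimInstance Win32_OperatingSystem).Caption

# Installed GPUs and their reported driver versions
# (WMI shows the Windows driver-store version, which may differ from the
#  Adrenalin version string displayed in AMD Software)
Get-CimInstance Win32_VideoController | Select-Object Name, DriverVersion

# Whether a hypervisor is active; False usually means Hyper-V/virtualization
# is not enabled in Windows features or BIOS/UEFI
(Get-CimInstance Win32_ComputerSystem).HypervisorPresent
```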
1. **Clone the Repository**

   ```powershell
   git clone https://github.com/OCNGill/ROCm_Installer_Win11.git
   cd ROCm_Installer_Win11
   ```

2. **Set Up Python Environment**

   ```powershell
   # Using Conda (recommended)
   conda env create -f environment.yml
   conda activate ROCm_installer_env

   # OR using pip
   pip install -r requirements.txt
   ```

3. **Launch the Installer**

   ```powershell
   streamlit run src/gui/streamlit_app.py
   ```

4. **Follow the Web Interface**
   - Open your browser (automatically opens to http://localhost:8501)
   - Navigate through the tabs:
     - Home: Overview and requirements
     - Compatibility: Run system checks
     - Installation: Execute installation steps
     - Documentation: Access help resources

5. **Complete Installation**
   - Click through each installation step
   - Monitor progress in real-time
   - Review logs for any issues
   - Run validation tests
1. **Run as Administrator**

   ```powershell
   # Open PowerShell as Administrator
   cd ROCm_Installer_Win11\src\scripts
   ```

2. **Execute Master Installer**

   ```powershell
   .\master_installer.ps1
   ```

3. **Follow Prompts**
   - Confirm each installation step
   - Wait for completion
   - Review final summary

After installation completes (with either method):

1. **Verify Installation**

   ```powershell
   # Open WSL2
   wsl -d Ubuntu-22.04

   # Inside WSL2:
   rocminfo
   python3 -c "import torch; print(torch.cuda.is_available())"
   ```
2. **Install AI Frameworks** (Optional)

   ```bash
   # Inside WSL2
   pip3 install -r ~/ROCm_requirements.txt
   ```

3. **Start Development**
   - Your system is now ready for AI/ML projects!
   - Compatible with Stable Diffusion, LLMs, and more
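As an optional first exercise, a tiny GPU workload run through the WSL2 bridge confirms end-to-end acceleration. A minimal sketch, assuming the `Ubuntu-22.04` distribution and a working PyTorch + ROCm install (run the `python3` line inside WSL directly if the quoting is mangled by your shell):

```powershell
# Multiply two random matrices on the GPU and print a checksum
wsl -d Ubuntu-22.04 -- python3 -c "import torch; x = torch.rand(1024, 1024, device='cuda'); print((x @ x).sum().item())"
```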
- DO NOT overwrite PyTorch when installing other packages
- ALWAYS comment out `torch` lines in requirements.txt files
- KEEP logs for troubleshooting (stored in the `logs/` directory)
- UPDATE AMD drivers regularly for best performance
- BACKUP your work before major system changes
- Automated Rollback: Undo installation if errors occur
- Component Selection: Choose which parts to install
- Profile Manager: Save/load installation profiles
- Update Checker: Automatically check for ROCm updates
- Pre-configured Environments: One-click setups for:
- Stable Diffusion (ComfyUI, Automatic1111)
- LLM Development (LM Studio, Ollama)
- General ML (TensorFlow, JAX)
- Performance Tuning: Optimize ROCm for specific GPUs
- Cloud Integration: Backup and sync configurations
- Community Repository: Share working configurations
- Multi-GPU Support: Manage multiple AMD GPUs
- Remote Installation: Set up systems remotely
- CI/CD Integration: Automate testing and deployment
- Enterprise Features: Bulk deployment, licensing
| Issue | Solution |
|---|---|
| WSL2 not installing | Enable virtualization in BIOS |
| GPU not detected | Update AMD drivers to 25.9.2+ |
| PyTorch can't find GPU | Verify HSA runtime fix applied |
| Installation hangs | Check internet connection, run individually |
| Permission denied | Run PowerShell as Administrator |
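For quick triage of the most common failures, a few standard commands gather the relevant state. This is a hedged sketch using stock Windows tooling, not the project's own diagnostic scripts:

```powershell
# WSL install state, default version, and kernel version
wsl --status

# Registered distributions and the WSL version each one uses
wsl --list --verbose

# Whether a hypervisor is active (False usually means virtualization/Hyper-V is not enabled)
(Get-CimInstance Win32_ComputerSystem).HypervisorPresent
```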
- Check Logs: Review installation logs in the `logs/` directory
- AMD Documentation: ROCm for WSL2
- Community Forums: AMD Community
- GitHub Issues: Create an issue
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
MIT License - see LICENSE file for details
- AMD for ROCm platform and documentation
- Microsoft for WSL2 technology
- Streamlit for the excellent GUI framework
- Community Contributors for testing and feedback
- GitHub: OCNGill/ROCm_Installer_Win11
- Issues: Report bugs or request features
If you find this project helpful, you can support ongoing work — thank you!
Donate:


