
Releases

This page describes how to install and use our release artifacts for ROCm and external builds like PyTorch and JAX. We produce build artifacts as part of our Continuous Integration (CI) build/test workflows, as well as release artifacts as part of Continuous Delivery (CD) nightly releases. For the development status of GPU architecture support in TheRock, see the SUPPORTED_GPUS.md document, which tracks readiness and onboarding progress for each AMD GPU architecture.

See also the Roadmap for support and Build artifacts overview pages.

Important

These instructions assume familiarity with how to use ROCm. Please see https://rocm.docs.amd.com/ for general information about the ROCm software platform.


Installing releases using pip

We recommend installing ROCm and projects like PyTorch and JAX via pip, the Python package installer.

We currently support Python 3.10, 3.11, 3.12, 3.13, and 3.14 (PyTorch 2.9+ only).

Tip

We highly recommend working within a Python virtual environment:

python -m venv .venv
source .venv/bin/activate

Multiple virtual environments can be present on a system at a time, allowing you to switch between them at will.

Warning

If you really want a system-wide install, you can pass --break-system-packages to pip outside a virtual environment. In this case, command-line shims for executables are installed to /usr/local/bin, which normally takes precedence over /usr/bin and might therefore conflict with a previous installation of ROCm.

Python packages release status

Important

Known issues with the Python wheels are tracked at #808.

⚠️ Windows packages are new and may be unstable! ⚠️

| Platform | ROCm Python packages | PyTorch Python packages | JAX Python packages |
| --- | --- | --- | --- |
| Linux | Release portable Linux packages | Release Linux PyTorch Wheels | Release Linux JAX Wheels |
| Windows | Release Windows packages | Release Windows PyTorch Wheels | |

Index page listing

For now, rocm, torch, and jax packages are published to GPU-architecture-specific index pages and must be installed by passing an appropriate --index-url argument to pip. They may later be pushed to the Python Package Index (PyPI) or other channels using a process like https://wheelnext.dev/. Please check back regularly, as these instructions will change as we migrate to official indexes and adjust project layouts.

| Product Name | GFX Target | GFX Family | Install instructions |
| --- | --- | --- | --- |
| MI300A/MI300X | gfx942 | gfx94X-dcgpu | rocm // torch // jax |
| MI350X/MI355X | gfx950 | gfx950-dcgpu | rocm // torch // jax |
| AMD RX 7900 XTX | gfx1100 | gfx110X-all | rocm // torch // jax |
| AMD RX 7800 XT | gfx1101 | gfx110X-all | rocm // torch // jax |
| AMD RX 7700S / Framework Laptop 16 | gfx1102 | gfx110X-all | rocm // torch // jax |
| AMD Radeon 780M Laptop iGPU | gfx1103 | gfx110X-all | rocm // torch // jax |
| AMD Strix Halo iGPU | gfx1151 | gfx1151 | rocm // torch // jax |
| AMD RX 9060 / XT | gfx1200 | gfx120X-all | rocm // torch // jax |
| AMD RX 9070 / XT | gfx1201 | gfx120X-all | rocm // torch // jax |

Installing ROCm Python packages

We provide several Python packages which together form the complete ROCm SDK.

| Package name | Description |
| --- | --- |
| rocm | Primary sdist meta package that dynamically determines other deps |
| rocm-sdk-core | OS-specific core of the ROCm SDK (e.g. compiler and utility tools) |
| rocm-sdk-devel | OS-specific development tools |
| rocm-sdk-libraries | OS-specific libraries |

rocm for gfx94X-dcgpu

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| MI300A/MI300X | gfx942 |

Install instructions:

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx94X-dcgpu/ "rocm[libraries,devel]"

rocm for gfx950-dcgpu

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| MI350X/MI355X | gfx950 |

Install instructions:

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx950-dcgpu/ "rocm[libraries,devel]"

rocm for gfx110X-all

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD RX 7900 XTX | gfx1100 |
| AMD RX 7800 XT | gfx1101 |
| AMD RX 7700S / Framework Laptop 16 | gfx1102 |
| AMD Radeon 780M Laptop iGPU | gfx1103 |

Install instructions:

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx110X-all/ "rocm[libraries,devel]"

rocm for gfx1151

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD Strix Halo iGPU | gfx1151 |

Install instructions:

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx1151/ "rocm[libraries,devel]"

rocm for gfx120X-all

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD RX 9060 / XT | gfx1200 |
| AMD RX 9070 / XT | gfx1201 |

Install instructions:

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx120X-all/ "rocm[libraries,devel]"

Using ROCm Python packages

After installing the ROCm Python packages, you should see them in your environment:

pip freeze | grep rocm
# rocm==6.5.0rc20250610
# rocm-sdk-core==6.5.0rc20250610
# rocm-sdk-devel==6.5.0rc20250610
# rocm-sdk-libraries-gfx110X-all==6.5.0rc20250610

You should also see various tools on your PATH and in the bin directory:

which rocm-sdk
# .../.venv/bin/rocm-sdk

ls .venv/bin
# activate       amdclang++    hipcc      python                 rocm-sdk
# activate.csh   amdclang-cl   hipconfig  python3                rocm-smi
# activate.fish  amdclang-cpp  pip        python3.12             roc-obj
# Activate.ps1   amdflang      pip3       rocm_agent_enumerator  roc-obj-extract
# amdclang       amdlld        pip3.12    rocminfo               roc-obj-ls

The rocm-sdk tool can be used to inspect and test the installation:

$ rocm-sdk --help
usage: rocm-sdk {command} ...

ROCm SDK Python CLI

positional arguments:
  {path,test,version,targets,init}
    path                Print various paths to ROCm installation
    test                Run installation tests to verify integrity
    version             Print version information
    targets             Print information about the GPU targets that are supported
    init                Expand devel contents to initialize rocm[devel]

$ rocm-sdk test
...
Ran 22 tests in 8.284s
OK

$ rocm-sdk targets
gfx1100;gfx1101;gfx1102
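The semicolon-separated list printed by rocm-sdk targets is convenient to feed into build tooling. A minimal sketch (the fallback list and the GPU_TARGETS variable name are illustrative assumptions, not part of the SDK):

```shell
# Capture the supported offload targets from the installed SDK; fall back
# to an example list if rocm-sdk is not on PATH (illustrative only).
GPU_TARGETS="$(rocm-sdk targets 2>/dev/null || echo 'gfx1100;gfx1101;gfx1102')"
echo "Building for: ${GPU_TARGETS}"
# The list can then be passed to a build, e.g. as a CMake cache variable:
# cmake -DGPU_TARGETS="${GPU_TARGETS}" ...
```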

To initialize the rocm[devel] package, use the rocm-sdk tool to eagerly expand development contents:

$ rocm-sdk init
Devel contents expanded to '.venv/lib/python3.12/site-packages/_rocm_sdk_devel'

These contents are useful when using the package outside of Python; when used from Python, they are expanded lazily on first use.

Once you have verified your installation, you can continue to use it for standard ROCm development or install PyTorch, JAX, or another supported Python ML framework.

Installing PyTorch Python packages

Using the index pages listed above, you can also install torch, torchaudio, torchvision, and apex.

Note

By default, pip will install the latest stable versions of each package.

Warning

The torch packages depend on rocm[libraries], so compatible ROCm packages are installed automatically; you do not need to install ROCm explicitly first. If ROCm is already installed, installing torch may downgrade it when the torch wheel requires a different version.

Tip

If you previously installed PyTorch with the pytorch-triton-rocm package, please uninstall it before installing the new packages:

pip uninstall pytorch-triton-rocm

The Triton package for ROCm is now simply named triton.

torch for gfx94X-dcgpu

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| MI300A/MI300X | gfx942 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx94X-dcgpu/ torch torchaudio torchvision
# Optional additional packages on Linux:
#   apex

torch for gfx950-dcgpu

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| MI350X/MI355X | gfx950 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx950-dcgpu/ torch torchaudio torchvision
# Optional additional packages on Linux:
#   apex

torch for gfx110X-all

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD RX 7900 XTX | gfx1100 |
| AMD RX 7800 XT | gfx1101 |
| AMD RX 7700S / Framework Laptop 16 | gfx1102 |
| AMD Radeon 780M Laptop iGPU | gfx1103 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx110X-all/ torch torchaudio torchvision
# Optional additional packages on Linux:
#   apex

torch for gfx1151

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD Strix Halo iGPU | gfx1151 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx1151/ torch torchaudio torchvision
# Optional additional packages on Linux:
#   apex

torch for gfx120X-all

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD RX 9060 / XT | gfx1200 |
| AMD RX 9070 / XT | gfx1201 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx120X-all/ torch torchaudio torchvision
# Optional additional packages on Linux:
#   apex

Using PyTorch Python packages

After installing the torch package with ROCm support, PyTorch can be used normally:

import torch

print(torch.cuda.is_available())
# True
print(torch.cuda.get_device_name(0))
# e.g. AMD Radeon Pro W7900 Dual Slot

See also the Testing the PyTorch installation instructions in the AMD ROCm documentation.
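Beyond checking device availability, a small end-to-end computation makes a useful smoke test. A minimal sketch (it falls back to CPU so it still runs on machines without a visible ROCm device; note that PyTorch exposes ROCm GPUs through the torch.cuda API):

```python
import torch

# Pick the GPU if the ROCm stack exposes one, otherwise fall back to CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Run a small matrix multiply on the chosen device.
a = torch.randn(128, 128, device=device)
b = torch.randn(128, 128, device=device)
c = a @ b
print(c.shape, c.device)
```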

Installing JAX Python packages

Using the index pages listed above, you can also install jaxlib, jax_rocm7_plugin, and jax_rocm7_pjrt.

Note

By default, pip will install the latest stable versions of each package.

  • If you want to install other versions, the currently supported versions are:

    | jax version | jaxlib version |
    | --- | --- |
    | 0.8.2 | 0.8.2 |
    | 0.8.0 | 0.8.0 |

Warning

Unlike PyTorch, the JAX wheels do not automatically install rocm[libraries] as a dependency. You must have ROCm installed separately via a tarball installation.

Important

The jax package itself is not published to TheRock's index pages. After installing jaxlib, jax_rocm7_plugin, and jax_rocm7_pjrt from the GPU-family index, install jax from PyPI:

pip install jax

jax for gfx94X-dcgpu

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| MI300A/MI300X | gfx942 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx94X-dcgpu/ jaxlib jax_rocm7_plugin jax_rocm7_pjrt
# Install jax from PyPI
pip install jax

jax for gfx950-dcgpu

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| MI350X/MI355X | gfx950 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx950-dcgpu/ jaxlib jax_rocm7_plugin jax_rocm7_pjrt
# Install jax from PyPI
pip install jax

jax for gfx110X-all

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD RX 7900 XTX | gfx1100 |
| AMD RX 7800 XT | gfx1101 |
| AMD RX 7700S / Framework Laptop 16 | gfx1102 |
| AMD Radeon 780M Laptop iGPU | gfx1103 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx110X-all/ jaxlib jax_rocm7_plugin jax_rocm7_pjrt
# Install jax from PyPI
pip install jax

jax for gfx1151

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD Strix Halo iGPU | gfx1151 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx1151/ jaxlib jax_rocm7_plugin jax_rocm7_pjrt
# Install jax from PyPI
pip install jax

jax for gfx120X-all

Supported devices in this family:

| Product Name | GFX Target |
| --- | --- |
| AMD RX 9060 / XT | gfx1200 |
| AMD RX 9070 / XT | gfx1201 |

pip install --index-url https://rocm.nightlies.amd.com/v2/gfx120X-all/ jaxlib jax_rocm7_plugin jax_rocm7_pjrt
# Install jax from PyPI
pip install jax

Using JAX Python packages

After installing the JAX packages with ROCm support, JAX can be used normally:

import jax

print(jax.devices())
# [RocmDevice(id=0)]
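Beyond listing devices, a small jit-compiled computation exercises the compiler path end to end. A minimal sketch (it runs on whichever backend JAX selected, GPU or CPU):

```python
import jax
import jax.numpy as jnp

# Jit-compile a tiny reduction and run it on the default backend.
@jax.jit
def sum_of_squares(x):
    return (x * x).sum()

x = jnp.arange(8.0)
print(jax.default_backend())      # e.g. "rocm" or "cpu"
print(sum_of_squares(x))
```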

For building JAX from source or running the full JAX test suite, see the external-builds/jax README.

Installing from tarballs

Standalone "ROCm SDK tarballs" are a flattened view of ROCm artifacts matching the familiar folder structure of system installs (/opt/rocm/ on Linux, or the HIP SDK on Windows):

install/  # Extracted tarball location, file path of your choosing
  .info/
  bin/
  clients/
  include/
  lib/
  libexec/
  share/

Tarballs are just these raw files. They do not come with "install" steps such as setting environment variables.

Warning

Tarballs and per-commit CI artifacts are primarily intended for developers and CI workflows.

For most users, we recommend installing via package managers instead, such as the pip packages described above or the native packages described below.

Browsing release tarballs

Release tarballs are uploaded to the following locations:

| Tarball index | S3 bucket | Description |
| --- | --- | --- |
| https://repo.amd.com/rocm/tarball/ | (not publicly accessible) | Stable releases |
| https://rocm.nightlies.amd.com/tarball/ | therock-nightly-tarball | Nightly builds from the default development branch |
| https://rocm.prereleases.amd.com/tarball/ | (not publicly accessible) | ⚠️ Prerelease builds for QA testing ⚠️ |
| https://rocm.devreleases.amd.com/tarball/ | therock-dev-tarball | ⚠️ Development builds from project maintainers ⚠️ |

Manual tarball extraction

To download a tarball and extract it into place manually:

mkdir therock-tarball && cd therock-tarball
# For example...
wget https://rocm.nightlies.amd.com/tarball/therock-dist-linux-gfx110X-all-7.12.0a20260202.tar.gz
mkdir install && tar -xf *.tar.gz -C install

Automated tarball extraction

For more control over artifact installation (including per-commit CI builds, specific release versions, the latest nightly release, and component selection), see the Installing Artifacts developer documentation. The install_rocm_from_artifacts.py script can install artifacts from a variety of sources.

Using installed tarballs

After installing (downloading and extracting) a tarball, you can test it by running programs from the bin/ directory:

ls install
# bin  include  lib  libexec  llvm  share

# Now test some of the installed tools:
./install/bin/rocminfo
./install/bin/test_hip_api

Tip

You may also want to add parts of the install directory to your PATH or set other environment variables like ROCM_HOME.

See also this issue discussing relevant environment variables.
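As a concrete sketch, assuming the tarball was extracted to ./install (the exact set of variables your tools need may vary; ROCM_HOME is mentioned above, while PATH and LD_LIBRARY_PATH are common additional candidates):

```shell
# Assumed layout: tarball extracted to ./install (adjust to your path).
export ROCM_HOME="$PWD/install"
export PATH="$ROCM_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$ROCM_HOME/lib:${LD_LIBRARY_PATH:-}"
```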

Tip

After extracting a tarball, metadata about which commits were used to build TheRock can be found in the share/therock/therock_manifest.json file:

cat install/share/therock/therock_manifest.json
# {
#   "the_rock_commit": "567dd890a3bc3261ffb26ae38b582378df298374",
#   "submodules": [
#     {
#       "submodule_name": "half",
#       "submodule_path": "base/half",
#       "submodule_url": "https://github.com/ROCm/half.git",
#       "pin_sha": "207ee58595a64b5c4a70df221f1e6e704b807811",
#       "patches": []
#     },
#     ...
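For scripted checks, the manifest is plain JSON and can be parsed with the standard library. The snippet below is a sketch that operates on an inline sample following the schema shown above; in practice you would read install/share/therock/therock_manifest.json:

```python
import json

# Inline sample following the documented manifest schema, so the snippet
# is self-contained; normally you would open the file from the install dir.
manifest_text = """
{
  "the_rock_commit": "567dd890a3bc3261ffb26ae38b582378df298374",
  "submodules": [
    {
      "submodule_name": "half",
      "submodule_path": "base/half",
      "submodule_url": "https://github.com/ROCm/half.git",
      "pin_sha": "207ee58595a64b5c4a70df221f1e6e704b807811",
      "patches": []
    }
  ]
}
"""

manifest = json.loads(manifest_text)
print("TheRock commit:", manifest["the_rock_commit"])
for sub in manifest["submodules"]:
    print(f'{sub["submodule_name"]}: {sub["pin_sha"]}')
```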

Installing from native packages

In addition to Python wheels and tarballs, ROCm native Linux packages are published for Debian-based and RPM-based distributions.

Warning

These builds are primarily intended for development and testing and are currently unsigned.

Native packages release status

| Platform | Native packages |
| --- | --- |
| Linux | Build Native Linux Packages |
| Windows | (Coming soon) |

GPU family and package mapping

| Product Name | GFX Target | GFX Family | Runtime Package | Development Package |
| --- | --- | --- | --- | --- |
| MI300A/MI300X | gfx942 | gfx94X | amdrocm-gfx94x | amdrocm-core-sdk-gfx94x |
| MI350X/MI355X | gfx950 | gfx950 | amdrocm-gfx950 | amdrocm-core-sdk-gfx950 |
| AMD RX 7900 XTX | gfx1100 | gfx110x | amdrocm-gfx110x | amdrocm-core-sdk-gfx110x |
| AMD RX 7800 XT | gfx1101 | gfx110x | amdrocm-gfx110x | amdrocm-core-sdk-gfx110x |
| AMD RX 7700S / Framework Laptop 16 | gfx1102 | gfx110x | amdrocm-gfx110x | amdrocm-core-sdk-gfx110x |
| AMD Radeon 780M Laptop iGPU | gfx1103 | gfx110x | amdrocm-gfx110x | amdrocm-core-sdk-gfx110x |
| AMD Strix Point iGPU | gfx1150 | gfx1150 | amdrocm-gfx1150 | amdrocm-core-sdk-gfx1150 |
| AMD Strix Halo iGPU | gfx1151 | gfx1151 | amdrocm-gfx1151 | amdrocm-core-sdk-gfx1151 |
| AMD Fire Range iGPU | gfx1152 | gfx1152 | amdrocm-gfx1152 | amdrocm-core-sdk-gfx1152 |
| AMD Strix Halo XT | gfx1153 | gfx1153 | amdrocm-gfx1153 | amdrocm-core-sdk-gfx1153 |
| AMD RX 9060 / XT | gfx1200 | gfx120X | amdrocm-gfx120x | amdrocm-core-sdk-gfx120x |
| AMD RX 9070 / XT | gfx1201 | gfx120X | amdrocm-gfx120x | amdrocm-core-sdk-gfx120x |
| Radeon VII | gfx906 | gfx906 | amdrocm-gfx906 | amdrocm-core-sdk-gfx906 |
| MI100 | gfx908 | gfx908 | amdrocm-gfx908 | amdrocm-core-sdk-gfx908 |
| MI200 series | gfx90a | gfx90a | amdrocm-gfx90a | amdrocm-core-sdk-gfx90a |
| AMD RX 5700 XT | gfx1010 | gfx101x | amdrocm-gfx101x | amdrocm-core-sdk-gfx101x |
| AMD RX 6900 XT | gfx1030 | gfx103x | amdrocm-gfx103x | amdrocm-core-sdk-gfx103x |
| AMD RX 6800 XT | gfx1031 | gfx103x | amdrocm-gfx103x | amdrocm-core-sdk-gfx103x |

Tip

To find the latest available release, browse the repository listings referenced in Step 1 of the instructions below.

Installing on Debian-based systems (Ubuntu, Debian, etc.)

# Step 1: Find the latest release from https://rocm.nightlies.amd.com/deb/
#         Look for directories like "20260310-12345678"
# Step 2: Look at the "GPU family and package mapping" table above to find
#         the GFX Family for your GPU (e.g., gfx94x, gfx110x, gfx1151)
# Step 3: Set the variables below

export RELEASE_ID=20260310-12345678  # Replace with actual date-runid
export GFX_ARCH=gfx110x              # Replace with GFX Family from the mapping table

# Step 4: Add repository and install
sudo apt update
sudo apt install -y ca-certificates
echo "deb [trusted=yes] https://rocm.nightlies.amd.com/deb/${RELEASE_ID} stable main" \
  | sudo tee /etc/apt/sources.list.d/rocm-nightly.list
sudo apt update
sudo apt install amdrocm-core-sdk-${GFX_ARCH}
# If only runtime is needed, install amdrocm-${GFX_ARCH} instead

Installing on RPM-based systems (RHEL, SLES, AlmaLinux etc.)

Note

The following instructions are for RHEL-based operating systems.

# Step 1: Find the latest release from https://rocm.nightlies.amd.com/rpm/
#         Look for directories like "20260310-12345678"
# Step 2: Look at the "GPU family and package mapping" table above to find
#         the GFX Family for your GPU (e.g., gfx94x, gfx110x, gfx1151)
# Step 3: Set the variables below

export RELEASE_ID=20260310-12345678  # Replace with actual date-runid
export GFX_ARCH=gfx110x              # Replace with GFX Family from the mapping table

# Step 4: Add repository and install
sudo dnf install -y ca-certificates
sudo tee /etc/yum.repos.d/rocm-nightly.repo <<EOF
[rocm-nightly]
name=ROCm Nightly Repository
baseurl=https://rocm.nightlies.amd.com/rpm/${RELEASE_ID}/x86_64
enabled=1
gpgcheck=0
priority=50
EOF
sudo dnf install amdrocm-core-sdk-${GFX_ARCH}
# If only runtime is needed, install amdrocm-${GFX_ARCH} instead

Verifying your installation

After installing ROCm via pip packages, tarballs, or native packages, you can verify that your GPU is properly recognized.

Linux

Run one of the following commands to verify that your GPU is detected and properly initialized by the ROCm stack:

rocminfo
# or
amd-smi

Windows

Run the following command to verify GPU detection:

hipInfo.exe

Additional troubleshooting

If your GPU is not recognized or you encounter issues:
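A few common first checks on Linux, as a sketch (device node paths and group names vary by distribution and driver version):

```shell
# Check that the kernel driver's device nodes exist and are accessible.
ls -l /dev/kfd /dev/dri 2>/dev/null || echo "amdgpu/KFD device nodes not found"

# ROCm typically requires membership in the video and render groups.
groups | tr ' ' '\n' | grep -E '^(video|render)$' \
  || echo "consider adding your user to the video and render groups"
```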