18 changes: 18 additions & 0 deletions README.md
@@ -63,6 +63,24 @@ The install script installs system dependencies along with torchforge. Note that

Optional: By default, package installation uses conda. To install system packages on the target machine instead of conda, pass the `--use-sudo` flag to the installation script: `./scripts/install.sh --use-sudo`.

### XPU Installation

XPU (Intel GPU) users can install with the dedicated script:

```bash
conda create -n forge python=3.12
conda activate forge
./scripts/install_xpu.sh
```

Notes:
- Requires Intel oneAPI toolkit installed at `$ONEAPI_ROOT`, `/opt/intel/oneapi`, or loadable via `module load intel/oneapi`.
- Python version must match `XPU_PYTHON_VERSION` in `assets/versions.sh`.
- XPU build installs Monarch with `USE_TENSOR_ENGINE=0`, so RDMA and distributed tensor features are disabled for now.
- Optional flag: `--use-sudo` (system packages via `apt`/`dnf` instead of conda).
- Re-activate your conda environment after install to pick up the oneAPI activation hook.
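The Python-version requirement in the notes above can be sanity-checked before running the installer. This is an illustrative sketch, not part of `install_xpu.sh`: the `check_xpu_python` helper is hypothetical, and the pin is hard-coded here where a real script would source `assets/versions.sh`.

```shell
# Illustrative pre-flight check; check_xpu_python is a hypothetical helper,
# not something install_xpu.sh provides.
XPU_PYTHON_VERSION="3.12"   # normally sourced from assets/versions.sh

check_xpu_python() {
  # $1 = active interpreter "major.minor", $2 = pinned version
  [ "$1" = "$2" ]
}

check_xpu_python "3.12" "$XPU_PYTHON_VERSION" && echo "python version OK"
check_xpu_python "3.11" "$XPU_PYTHON_VERSION" || \
  echo "mismatch: recreate the env with: conda create -n forge python=$XPU_PYTHON_VERSION"
```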


### Pixi

Pixi combines the benefits of uv with access to conda-forge for system dependencies. [pixi.toml](./pixi.toml) provides a manifest of build tasks, with `install` as the combined install-all task.
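A thin wrapper around that task might look like this; a sketch only — `pixi run install` comes from this repo's pixi.toml, but the availability guard itself is illustrative:

```shell
# Sketch: guard the combined task behind a pixi availability check.
pixi_install() {
  if ! command -v pixi >/dev/null 2>&1; then
    echo "pixi not found; install it first (see pixi.sh)" >&2
    return 1
  fi
  pixi run install   # combined install-all task defined in pixi.toml
}
```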
9 changes: 7 additions & 2 deletions assets/versions.sh
@@ -9,9 +9,14 @@

# Stable versions of upstream libraries for OSS repo
PYTORCH_VERSION="2.9.0"
# ROCm builds vLLM from source (no prebuilt ROCm wheels available)
# ROCm/XPU builds vLLM from source (no prebuilt ROCm/XPU wheels available)
VLLM_ROCM_VERSION="v0.10.0"
VLLM_XPU_VERSION="v0.17.0"
# PyTorch XPU version (vLLM v0.16+ dropped IPEX in favour of native XPU support)
PYTORCH_XPU_VERSION="2.10.0"
# vllm-xpu-kernels wheels only ship for Python 3.12
XPU_PYTHON_VERSION="3.12"
TORCHSTORE_BRANCH="no-monarch-2026.01.05"
# ROCm install builds these from source (no ROCm wheels); CUDA uses pyproject pins.
# ROCm/XPU builds these from source (no ROCm/XPU wheels); CUDA uses pyproject pins.
TORCHTITAN_VERSION="v0.2.0"
MONARCH_VERSION="v0.2.0"
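A consumer of these pins might check out the tagged vLLM source and build it, since the comments above note that no XPU wheels exist. A sketch only: the clone/build commands are assumptions about what an installer would do, and the pins are hard-coded here instead of sourced.

```shell
# Sketch of a consumer of the pins above; values are hard-coded here,
# but a real script would `source assets/versions.sh`.
VLLM_XPU_VERSION="v0.17.0"
PYTORCH_XPU_VERSION="2.10.0"

# No prebuilt XPU wheels, so check out the pinned tag and build from source.
echo "git clone --branch ${VLLM_XPU_VERSION} --depth 1 https://github.com/vllm-project/vllm.git"
echo "pip install torch==${PYTORCH_XPU_VERSION}"
```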