README: replace PTDB with DLE (#2708)
Fixes #2690.
pbchekin authored Nov 14, 2024
1 parent 876ce90 commit 1da08d7
Showing 1 changed file with 6 additions and 8 deletions.
14 changes: 6 additions & 8 deletions .github/README.md
@@ -13,12 +13,12 @@ Intel® XPU Backend for Triton\* is an out-of-tree backend module for [Triton](ht
* GPU Cards:
* [Intel® Data Center GPU Max Series](https://www.intel.com/content/www/us/en/products/details/discrete-gpus/data-center-gpu/max-series.html)
* [Intel® Data Center Flex Series](https://www.intel.com/content/www/us/en/products/details/discrete-gpus/data-center-gpu/flex-series.html)
-* [Intel Arc A770](https://www.intel.com/content/www/us/en/products/sku/229151/intel-arc-a770-graphics-16gb/specifications.html)
+* [Intel® Arc A770](https://www.intel.com/content/www/us/en/products/sku/229151/intel-arc-a770-graphics-16gb/specifications.html)
* GPU Drivers:
* Latest [Long Term Support (LTS) Release](https://dgpu-docs.intel.com/driver/installation.html)
* Latest [Rolling Release](https://dgpu-docs.intel.com/driver/installation-rolling.html)
* Toolchain:
-* Latest [PyTorch Prerequisites for Intel GPUs](https://www.intel.com/content/www/us/en/developer/articles/tool/pytorch-prerequisites-for-intel-gpus.html)
+* Latest [Intel® Deep Learning Essentials](https://www.intel.com/content/www/us/en/developer/tools/oneapi/base-toolkit-download.html?packages=dl-essentials&dl-lin=offline&dl-essentials-os=linux)

Note that Intel® XPU Backend for Triton\* is not compatible with Intel® Extension for PyTorch\* and Intel® oneAPI Base Toolkit\*.

@@ -27,8 +27,7 @@ Note that Intel® XPU Backend for Triton\* is not compatible with Intel® Extens
## Prerequisites

1. Latest [Rolling Release](https://dgpu-docs.intel.com/driver/installation-rolling.html) or [Long Term Support Release](https://dgpu-docs.intel.com/driver/installation.html) of GPU driver
-2. Latest release of [PyTorch Prerequisites for Intel GPUs](https://www.intel.com/content/www/us/en/developer/articles/tool/pytorch-prerequisites-for-intel-gpus.html)
-3. Latest release of [Profiling Tools Interfaces for Intel GPU (PTI for GPU)](https://www.intel.com/content/www/us/en/developer/articles/tool/pytorch-prerequisites-for-intel-gpus.html)
+2. Latest release of [Intel® Deep Learning Essentials](https://www.intel.com/content/www/us/en/developer/tools/oneapi/base-toolkit-download.html?packages=dl-essentials&dl-lin=offline&dl-essentials-os=linux)
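
With the driver and Intel® Deep Learning Essentials installed, it is worth confirming that the GPU is actually visible before going further. The sketch below is illustrative only and assumes the oneAPI environment has already been initialized (see the `setvars.sh` step further down) so that `sycl-ls` is on `PATH`.

```shell
# List the SYCL devices visible to the runtime; an Intel GPU should appear
# as a level_zero:gpu entry. Assumes the oneAPI environment is initialized
# so that sycl-ls is on PATH.
sycl-ls
```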

## Install PyTorch and Triton from nightly wheels

@@ -45,7 +44,7 @@ Before using Intel® XPU Backend for Triton\* you need to initialize the toolcha
The default location is `/opt/intel/oneapi` (if installed as a `root` user) or `~/intel/oneapi` (if installed as a regular user).

```shell
-# replace /opt/intel/oneapi with the actual location of PyTorch Prerequisites for Intel GPUs
+# replace /opt/intel/oneapi with the actual location of Intel® Deep Learning Essentials
source /opt/intel/oneapi/setvars.sh
```
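
A quick way to confirm that the environment took effect is sketched below. Treat it as an assumption about a typical install rather than an official check: `setvars.sh` normally exports `ONEAPI_ROOT`, and `icpx` is the oneAPI DPC++/C++ compiler that Intel® Deep Learning Essentials typically provides; adjust the commands if your bundle differs.

```shell
# Illustrative sanity check after sourcing setvars.sh.
echo "$ONEAPI_ROOT"   # should point at the oneAPI installation directory
icpx --version        # the DPC++/C++ compiler should now be on PATH
```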

@@ -54,8 +53,7 @@ source /opt/intel/oneapi/setvars.sh
## Prerequisites

1. Latest [Rolling Release](https://dgpu-docs.intel.com/driver/installation-rolling.html) or [Long Term Support Release](https://dgpu-docs.intel.com/driver/installation.html) of GPU driver
-2. Latest release of [PyTorch Prerequisites for Intel GPUs](https://www.intel.com/content/www/us/en/developer/articles/tool/pytorch-prerequisites-for-intel-gpus.html)
-3. Latest release of [Profiling Tools Interfaces for Intel GPU (PTI for GPU)](https://www.intel.com/content/www/us/en/developer/articles/tool/pytorch-prerequisites-for-intel-gpus.html)
+2. Latest release of [Intel® Deep Learning Essentials](https://www.intel.com/content/www/us/en/developer/tools/oneapi/base-toolkit-download.html?packages=dl-essentials&dl-lin=offline&dl-essentials-os=linux)

## Compile PyTorch and Triton from source

@@ -65,7 +63,7 @@ Before compiling PyTorch and Intel® XPU Backend for Triton\* you need to initia
The default location is `/opt/intel/oneapi` (if installed as a `root` user) or `~/intel/oneapi` (if installed as a regular user).

```shell
-# replace /opt/intel/oneapi with the actual location of PyTorch Prerequisites for Intel GPUs
+# replace /opt/intel/oneapi with the actual location of Intel® Deep Learning Essentials
source /opt/intel/oneapi/setvars.sh
```
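
For the build itself, the repository's full instructions (not shown in this diff) are authoritative; the sketch below is only a rough outline under the assumption that the project follows the standard Triton layout with a `python/setup.py` and is installed in editable mode after the toolchain has been initialized.

```shell
# Rough outline of a source build; assumes the standard Triton layout
# (python/setup.py) and that setvars.sh has already been sourced.
git clone https://github.com/intel/intel-xpu-backend-for-triton.git
cd intel-xpu-backend-for-triton
pip install ninja cmake wheel   # common build-time dependencies
pip install -e python           # editable install of the Triton package
```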
