vLLM HPC Installer

Automated installation script for vLLM on HPC systems with ROCm support. This installer specifically targets AMD MI300X GPUs and similar architectures in HPC environments.

Features

  • Automated installation of vLLM and all dependencies
  • ROCm support with PyTorch nightly builds
  • Flash Attention integration
  • Anaconda environment management
  • Comprehensive logging and error handling
  • HPC module management
  • Support for AMD MI300X GPUs (customizable for other architectures)

Prerequisites

  • Access to an HPC system with ROCm support
  • AMD GPU (default configuration for MI300X)
  • Git
  • Anaconda/Miniconda
  • Module environment system

Quick Start

  1. Clone the repository:
     git clone https://github.com/AI-DarwinLabs/vllm-hpc-installer.git
     cd vllm-hpc-installer
  2. Make the script executable:
     chmod +x install.sh
  3. Run the installer:
     ./install.sh
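Once the installer finishes, it is worth confirming that the key packages import cleanly inside the new environment. The helper below is a minimal sketch (it is not part of the repository), assuming the default conda environment name vllm from config/default_config.sh:

```shell
# Hypothetical verification helper (not shipped with the installer):
# returns 0 if the given Python module imports cleanly, non-zero otherwise.
check_import() {
  python -c "import $1" >/dev/null 2>&1
}
```

Usage after activating the environment created by install.sh:

  conda activate vllm
  check_import vllm  && echo "vLLM import OK"
  check_import torch && echo "PyTorch import OK"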

Configuration

You can customize the installation by editing config/default_config.sh:

  • PYTHON_VERSION: Python version (default: 3.11)
  • PYTORCH_ROCM_VERSION: ROCm version for PyTorch (default: 6.2)
  • GPU_ARCH: GPU architecture (default: gfx942 for MI300X)
  • CONDA_ENV_NAME: Name of the conda environment (default: vllm)

Directory Structure

vllm-hpc-installer/
β”œβ”€β”€ install.sh           # Main installation script
β”œβ”€β”€ config/             # Configuration files
β”œβ”€β”€ modules/            # Installation modules
└── docs/              # Documentation

Documentation

Detailed documentation is available in the docs/ directory.

Support

For issues and feature requests, please use the GitHub Issues page.

Contributing

Contributions are welcome! Please read our Contributing Guidelines before submitting pull requests.
