This repository has been archived by the owner on Mar 3, 2024. It is now read-only.

CyberZHG/torch-multi-head-attention


PyTorch Multi-Head Attention


Install

pip install torch-multi-head-attention

Usage

from torch_multi_head_attention import MultiHeadAttention

attention = MultiHeadAttention(in_features=768, head_num=12)
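To illustrate what such a layer computes, here is a minimal NumPy sketch of scaled dot-product attention split across heads. This is an illustration only, not the package's implementation: the real module also applies learned linear projections to the queries, keys, values, and the concatenated output, and the function name and shapes below are assumptions for the example.

```python
import numpy as np

def multi_head_attention(q, k, v, head_num):
    """Scaled dot-product attention across `head_num` heads (sketch).

    Inputs are (batch, seq_len, features); features must be divisible
    by head_num. Learned projections are omitted for brevity.
    """
    batch, seq_len, features = q.shape
    assert features % head_num == 0
    depth = features // head_num

    def split_heads(x):
        # (batch, seq, features) -> (batch, head, seq, depth)
        return x.reshape(batch, seq_len, head_num, depth).transpose(0, 2, 1, 3)

    q, k, v = map(split_heads, (q, k, v))
    # Attention scores, scaled by sqrt of the per-head depth
    scores = q @ k.transpose(0, 1, 3, 2) / np.sqrt(depth)
    # Numerically stable softmax over the last axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    out = weights @ v  # (batch, head, seq, depth)
    # Merge heads back into a single feature dimension
    return out.transpose(0, 2, 1, 3).reshape(batch, seq_len, features)

x = np.random.rand(2, 5, 768)
y = multi_head_attention(x, x, x, head_num=12)
print(y.shape)  # (2, 5, 768)
```

With in_features=768 and head_num=12, each head attends over a 64-dimensional slice, matching the configuration in the Usage snippet above.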