

# PyTorch Multi-Head Attention


## Install

```bash
pip install torch-multi-head-attention
```

## Usage

```python
from torch_multi_head_attention import MultiHeadAttention

# Create an attention layer whose 768 input features are split across 12 heads
# (in_features must be divisible by head_num: 768 / 12 = 64 features per head).
MultiHeadAttention(in_features=768, head_num=12)
```
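
A minimal self-attention sketch, assuming the layer is a standard PyTorch `nn.Module` whose forward pass takes query, key, and value tensors of shape `(batch, seq_len, in_features)`; the tensor shapes and variable names here are illustrative, not part of the library's documented API:

```python
import torch
from torch_multi_head_attention import MultiHeadAttention

# Hypothetical input: a batch of 2 sequences, 5 tokens each, 768 features per token.
layer = MultiHeadAttention(in_features=768, head_num=12)
x = torch.rand(2, 5, 768)

# Self-attention: pass the same tensor as query, key, and value
# (assumes a forward(q, k, v) signature, as in other PyTorch attention layers).
output = layer(x, x, x)
print(output.shape)  # expected: torch.Size([2, 5, 768])
```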