diff --git a/README.md b/README.md
index d80c832c..5fd26611 100644
--- a/README.md
+++ b/README.md
@@ -559,7 +559,6 @@ Attention 18% faster with sigmoid instead of attention
 ```python
 import torch
 from zeta import SigmoidAttention
-from loguru import logger
 
 batch_size = 32
 seq_len = 128
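For context, the removed `loguru` import was unused by the README example, and the hunk cuts off before the attention module is actually constructed. As a rough illustration of the technique the README section describes (replacing the softmax normalization of attention weights with an elementwise sigmoid), here is a minimal plain-PyTorch sketch. It deliberately does not use zeta's `SigmoidAttention` class, whose constructor and forward signature are not visible in this hunk; shapes and the scaling factor are assumptions for illustration only.

```python
import torch

# Minimal sigmoid-attention sketch in plain PyTorch (NOT zeta's SigmoidAttention API).
# The only change from standard scaled dot-product attention is that the score
# matrix is squashed elementwise with sigmoid instead of row-normalized with softmax.
batch_size, seq_len, dim = 32, 128, 512  # dim is an assumed example value

q = torch.randn(batch_size, seq_len, dim)
k = torch.randn(batch_size, seq_len, dim)
v = torch.randn(batch_size, seq_len, dim)

scores = q @ k.transpose(-2, -1) / dim**0.5   # (batch, seq_len, seq_len)
weights = torch.sigmoid(scores)               # elementwise; rows no longer sum to 1
out = weights @ v                             # (batch, seq_len, dim)

print(out.shape)  # torch.Size([32, 128, 512])
```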