[README]
Kye committed Jan 13, 2024
1 parent 3ec88fe commit 0c83313
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion README.md
@@ -1,7 +1,9 @@
[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)

# Mamba Transformer

![Mamba Transformer](https://www.figma.com/file/YZxnPMjtj5XrEA4XC2ANIE/Mamba-Transformer?type=whiteboard&node-id=1%3A2&t=ybC6tnz8xiYie1hb-1?raw=true)

Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling.

This is a 100% novel architecture that I have designed to combine the strengths of SSMs and Attention while offsetting their individual weaknesses, creating an all-new advanced architecture with the purpose of surpassing our old limits: faster processing speed, longer context lengths, lower perplexity over long sequences, and enhanced, superior reasoning, all while remaining small and compact.
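
The commit itself only touches the README image link, but since the paragraph above describes the core idea of interleaving Mamba/SSM blocks with attention blocks, here is a minimal PyTorch sketch of that layering. The class names (`SimpleSSMBlock`, `AttentionBlock`, `MambaTransformerSketch`) are illustrative placeholders, and the gated depthwise convolution is only a stand-in for a real selective-scan Mamba layer; this is not the repository's actual implementation.

```python
# Minimal, illustrative sketch (PyTorch). All module names below are
# placeholders, not the repository's actual API.
import torch
import torch.nn.functional as F
from torch import nn


class SimpleSSMBlock(nn.Module):
    """Toy gated sequence-mixing block standing in for a Mamba/SSM layer."""

    def __init__(self, dim: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.in_proj = nn.Linear(dim, 2 * dim)
        # Causal depthwise conv as a cheap stand-in for the selective scan.
        self.mix = nn.Conv1d(dim, dim, kernel_size=4, padding=3, groups=dim)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        residual = x
        x, gate = self.in_proj(self.norm(x)).chunk(2, dim=-1)
        # Trim the conv output back to seq_len to keep it causal.
        x = self.mix(x.transpose(1, 2))[..., : residual.size(1)].transpose(1, 2)
        return residual + self.out_proj(F.silu(gate) * x)


class AttentionBlock(nn.Module):
    """Standard pre-norm self-attention block (causal masking omitted for brevity)."""

    def __init__(self, dim: int, heads: int = 8):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm(x)
        out, _ = self.attn(h, h, h, need_weights=False)
        return x + out


class MambaTransformerSketch(nn.Module):
    """Interleaves SSM-style blocks with attention blocks."""

    def __init__(self, vocab_size: int, dim: int, depth: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.layers = nn.ModuleList()
        for _ in range(depth):
            self.layers.append(SimpleSSMBlock(dim))
            self.layers.append(AttentionBlock(dim))
        self.to_logits = nn.Linear(dim, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        x = self.embed(tokens)
        for layer in self.layers:
            x = layer(x)
        return self.to_logits(x)


if __name__ == "__main__":
    model = MambaTransformerSketch(vocab_size=1000, dim=64, depth=2)
    tokens = torch.randint(0, 1000, (1, 16))
    print(model(tokens).shape)  # torch.Size([1, 16, 1000])
```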
