Become a sponsor to Frank Odom
Your sponsorship helps support open-source artificial intelligence research. My recent work focuses on efficient Transformer-based architectures, especially for natural language processing, computer vision, and self-driving. All of it comes from my personal time and is released under permissive licenses (e.g. MIT).
If you have benefited from my projects, please consider becoming a sponsor.
1 sponsor has funded fkodom’s work.
Featured work

- fkodom/fft-conv-pytorch (Python, 476 stars): Implementation of 1D, 2D, and 3D FFT convolutions in PyTorch. Much faster than direct convolution for large kernel sizes.
- fkodom/transformer-from-scratch (Python, 90 stars): Code implementation from my blog post: https://fkodom.substack.com/p/transformers-from-scratch-in-pytorch
- fkodom/yet-another-retnet (Python, 100 stars): A simple but robust PyTorch implementation of RetNet from "Retentive Network: A Successor to Transformer for Large Language Models" (https://arxiv.org/pdf/2307.08621.pdf)
- fkodom/clip-text-decoder (Python, 101 stars): Generate text captions for images from their embeddings.
- fkodom/dilated-attention-pytorch (Python, 50 stars): (Unofficial) implementation of dilated attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens" (https://arxiv.org/abs/2307.02486)
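The speedup behind fft-conv-pytorch comes from the convolution theorem: multiplying spectra costs O(n log n), versus O(n·m) for a direct convolution with an m-tap kernel. A minimal NumPy sketch of the idea (an illustration only, not the repo's PyTorch implementation; `fft_conv1d` is a name chosen here for clarity):

```python
import numpy as np

def fft_conv1d(signal: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Linear 1D convolution via FFT (the convolution theorem)."""
    # Zero-pad both inputs to the full output length so the FFT's
    # circular convolution matches a linear ("full") convolution.
    n = len(signal) + len(kernel) - 1
    spectrum = np.fft.rfft(signal, n) * np.fft.rfft(kernel, n)
    return np.fft.irfft(spectrum, n)

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
k = rng.standard_normal(129)   # a "large" kernel, where FFT wins

direct = np.convolve(x, k)     # O(n * m) direct convolution
fast = fft_conv1d(x, k)        # O(n log n) FFT convolution
assert np.allclose(direct, fast)
```

The repo applies the same trick per-channel in 1D, 2D, and 3D with PyTorch tensors, where the crossover point in kernel size depends on hardware and input shape.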
0% towards 10 monthly sponsors goal
Be the first to sponsor this goal!