Sequence-to-sequence framework based on PyTorch, with a focus on Neural Machine Translation
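A minimal sketch of what a PyTorch seq2seq translation model can look like, built on `torch.nn.Transformer`. Vocabulary sizes, dimensions, and the usage line are illustrative assumptions, not this framework's actual API.

```python
import torch
import torch.nn as nn

class Seq2SeqNMT(nn.Module):
    """Toy encoder-decoder translation model (hedged sketch)."""
    def __init__(self, src_vocab, tgt_vocab, d_model=256, nhead=4, layers=3):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d_model)
        self.tgt_emb = nn.Embedding(tgt_vocab, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=layers, num_decoder_layers=layers,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, tgt_vocab)

    def forward(self, src, tgt):
        # Causal mask so each target position only attends to earlier ones.
        mask = self.transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.src_emb(src), self.tgt_emb(tgt), tgt_mask=mask)
        return self.out(h)  # (B, T_tgt, tgt_vocab) logits

model = Seq2SeqNMT(src_vocab=8000, tgt_vocab=8000)
logits = model(torch.randint(0, 8000, (2, 7)), torch.randint(0, 8000, (2, 5)))
```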
A list of efficient attention modules
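For context, a hedged sketch of one common "efficient attention" variant: linear attention with an elu(x)+1 feature map (Katharopoulos et al., 2020), which replaces the O(T²) softmax with an O(T) kernelized product. This is one example of the family, not a module from the list itself.

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v):
    # q, k, v: (B, T, D); non-causal linear attention.
    q = F.elu(q) + 1                          # positive feature map phi(q)
    k = F.elu(k) + 1                          # positive feature map phi(k)
    kv = torch.einsum("btd,bte->bde", k, v)   # sum_t phi(k_t) v_t^T, O(T)
    # Normalizer: 1 / (phi(q_t) . sum_s phi(k_s)) per position.
    z = 1.0 / (q * k.sum(dim=1, keepdim=True)).sum(-1, keepdim=True)
    return torch.einsum("btd,bde->bte", q, kv) * z

out = linear_attention(*(torch.randn(2, 16, 32) for _ in range(3)))
```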
[TPAMI 2023 ESI Highly Cited Paper] SePiCo: Semantic-Guided Pixel Contrast for Domain Adaptive Semantic Segmentation https://arxiv.org/abs/2204.08808
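A toy illustration of pixel-to-prototype contrast in the spirit of SePiCo: pull each pixel embedding toward its class prototype and away from other classes via an InfoNCE-style loss. This is a hedged sketch, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def pixel_prototype_contrast(feats, labels, num_classes, tau=0.1):
    # feats: (N, D) pixel embeddings; labels: (N,) class ids.
    feats = F.normalize(feats, dim=1)
    protos = feats.new_zeros(num_classes, feats.size(1))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():  # class prototype = normalized mean embedding
            protos[c] = F.normalize(feats[mask].mean(0), dim=0)
    logits = feats @ protos.t() / tau  # (N, C) pixel-prototype similarities
    return F.cross_entropy(logits, labels)

loss = pixel_prototype_contrast(torch.randn(100, 32), torch.randint(0, 19, (100,)), 19)
```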
This repository contains my research work on building state-of-the-art next-basket recommendations using techniques such as autoencoders, TF-IDF, attention-based BiLSTMs, and Transformer networks
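As a sketch of one of the named techniques, here is a minimal attention-based BiLSTM scorer over a user's purchase history. Names and sizes are illustrative assumptions, not the repo's actual model.

```python
import torch
import torch.nn as nn

class AttnBiLSTMRecommender(nn.Module):
    """Attention-pooled BiLSTM producing next-basket item scores (sketch)."""
    def __init__(self, num_items, emb_dim=64, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(num_items, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(2 * hidden, num_items)  # one score per item

    def forward(self, item_seq):                      # (B, T) past purchases
        h, _ = self.lstm(self.emb(item_seq))          # (B, T, 2H)
        w = torch.softmax(self.attn(h).squeeze(-1), dim=1)   # attention weights
        ctx = (w.unsqueeze(-1) * h).sum(1)            # attention-pooled context
        return self.out(ctx)                          # logits over next-basket items

scores = AttnBiLSTMRecommender(num_items=500)(torch.randint(0, 500, (4, 12)))
```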
Implementation of Transformer Pointer-Critic Deep Reinforcement Learning Algorithm
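The "pointer" part of such an agent can be illustrated with a minimal pointer-attention head: the policy scores each input position so it can select (point at) an element. A hedged sketch under assumed shapes, not the repo's agent.

```python
import torch
import torch.nn as nn

class PointerHead(nn.Module):
    """Additive pointer attention over encoder states (sketch)."""
    def __init__(self, dim):
        super().__init__()
        self.wq = nn.Linear(dim, dim)
        self.wk = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, 1)

    def forward(self, query, enc):               # query: (B, D), enc: (B, T, D)
        q = self.wq(query).unsqueeze(1)          # (B, 1, D)
        scores = self.v(torch.tanh(q + self.wk(enc))).squeeze(-1)  # (B, T)
        return torch.log_softmax(scores, dim=-1) # log-probs over input positions

log_probs = PointerHead(32)(torch.randn(2, 32), torch.randn(2, 10, 32))
```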
Implementation of a basic conversational agent (a.k.a. chatbot) using the PyTorch Transformer module
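At inference time, such a chatbot typically generates a reply token by token. A hedged sketch of greedy decoding; the `step` function is a stub standing in for the trained model, and the BOS/EOS ids are illustrative.

```python
import torch

BOS, EOS, VOCAB = 1, 2, 1000

def step(src, tgt):
    # Stand-in for model(src, tgt): returns next-token logits (B, VOCAB).
    return torch.randn(tgt.size(0), VOCAB)

def greedy_reply(src, max_len=20):
    tgt = torch.full((src.size(0), 1), BOS, dtype=torch.long)
    for _ in range(max_len):
        next_tok = step(src, tgt).argmax(-1, keepdim=True)  # (B, 1)
        tgt = torch.cat([tgt, next_tok], dim=1)             # append and repeat
        if (next_tok == EOS).all():
            break
    return tgt

reply = greedy_reply(torch.randint(0, VOCAB, (1, 8)))
```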
Using Bayesian optimization via the Ax platform with the SAASBO model to simultaneously optimize 23 hyperparameters in 100 iterations (setting a new Matbench benchmark).
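A hedged sketch of how such a loop can look with Ax's Service API. The generation strategy, `Models.SAASBO`, `objective_name`, and the placeholder objective are assumptions; the exact API varies across ax-platform versions and this may not match the repo's setup.

```python
from ax.service.ax_client import AxClient
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models

# Sobol warmup, then SAASBO (a sparse fully-Bayesian GP suited to
# high-dimensional spaces like 23 hyperparameters).
gs = GenerationStrategy(steps=[
    GenerationStep(model=Models.SOBOL, num_trials=10),
    GenerationStep(model=Models.SAASBO, num_trials=-1),
])
ax = AxClient(generation_strategy=gs)
ax.create_experiment(
    name="matbench_tuning",
    parameters=[{"name": f"x{i}", "type": "range", "bounds": [0.0, 1.0]}
                for i in range(23)],              # 23 hyperparameters
    objective_name="val_mae",
    minimize=True,
)

def evaluate(params):
    # Placeholder objective; the real one would train and validate a model.
    return {"val_mae": sum(v ** 2 for v in params.values())}

for _ in range(100):                              # 100 optimization iterations
    params, trial_index = ax.get_next_trial()
    ax.complete_trial(trial_index=trial_index, raw_data=evaluate(params))
```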
A PyTorch implementation of a transformer network trained using back-translation
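The back-translation idea in brief: a reverse (target-to-source) model translates monolingual target-side text into synthetic source sentences, which are paired with the real targets to enlarge the parallel training set. A hedged sketch; `reverse_translate` is a stub standing in for a trained reverse model.

```python
def reverse_translate(tgt_sentence):
    # Stub for a trained target->source translation model.
    return "<synthetic source for: %s>" % tgt_sentence

monolingual_tgt = ["ein Beispielsatz", "noch ein Satz"]   # target-side corpus
synthetic_pairs = [(reverse_translate(t), t) for t in monolingual_tgt]
# The forward (source->target) model is then trained on real + synthetic pairs.
training_data = synthetic_pairs  # + real_parallel_pairs
```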
Implementation of Transformer, BERT, and GPT models in both TensorFlow 2.0 and PyTorch.
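The key architectural difference between the two model families can be shown in a few lines: GPT-style decoders apply a causal mask so each position only attends to earlier ones, while BERT-style encoders attend bidirectionally and use no such mask. A small PyTorch illustration:

```python
import torch

T = 5
# Upper-triangular mask marks "future" positions to be hidden (GPT-style).
causal = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
scores = torch.randn(T, T).masked_fill(causal, float("-inf"))
attn = torch.softmax(scores, dim=-1)  # row t attends only to positions <= t
```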
Code and write-up for the Red Dragon AI Advanced NLP Course.
The objective of this project is to generate an abstractive summary of a longer article. The pipeline covers all preprocessing steps and summarizes the whole article, which helps capture the important context of a long text.
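One common way to produce such a summary (an assumption for illustration, not necessarily this project's approach) is a pretrained seq2seq model via the Hugging Face `transformers` summarization pipeline:

```python
from transformers import pipeline

summarizer = pipeline("summarization")  # downloads a default pretrained model
article = "..."                         # the full article text goes here
summary = summarizer(article, max_length=120, min_length=30)[0]["summary_text"]
```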