pytorch-lifestream

pytorch-lifestream (ptls) is a library built on top of PyTorch for building embeddings of discrete event sequences using self-supervision. It can process terabyte-scale volumes of raw events such as game history events, clickstream data, purchase history, or card transactions.

It supports various methods of self-supervised training, adapted for event sequences:

  • Contrastive Learning for Event Sequences (CoLES)
  • Contrastive Predictive Coding (CPC)
  • Replaced Token Detection (RTD) from ELECTRA
  • Next Sequence Prediction (NSP) from BERT
  • Sequence Order Prediction (SOP) from ALBERT
  • Masked Language Model (MLM) from RoBERTa

It supports several types of encoders, including Transformer and RNN. It also supports many types of self-supervised losses.

Several variants of contrastive losses are available as well; the sketch below illustrates the general idea behind contrastive pretraining on event sequences.
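
To make the idea concrete, here is a minimal, self-contained sketch of CoLES-style contrastive pretraining in plain PyTorch. It is not the pytorch-lifestream API: the class and function names below (SeqEncoder, random_subsequences, info_nce) are hypothetical, and the loss is a generic InfoNCE-style contrastive loss rather than any specific loss shipped with the library. It only illustrates the core idea described above: sub-sequences cut from the same client's event sequence are treated as positives and pulled together in embedding space.

# Schematic sketch of CoLES-style contrastive pretraining (hypothetical names,
# not the pytorch-lifestream API).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SeqEncoder(nn.Module):
    """Toy sequence encoder: embed categorical event codes, run a GRU."""

    def __init__(self, n_codes=200, emb_dim=16, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(n_codes, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden, batch_first=True)

    def forward(self, x):               # x: (batch, seq_len) of event codes
        _, h = self.rnn(self.emb(x))    # h: (1, batch, hidden)
        return h.squeeze(0)             # (batch, hidden) sequence embedding


def random_subsequences(seq, n_splits=2, min_len=8):
    """Cut random contiguous sub-sequences from one client's event sequence."""
    subs = []
    for _ in range(n_splits):
        length = torch.randint(min_len, seq.numel() + 1, (1,)).item()
        start = torch.randint(0, seq.numel() - length + 1, (1,)).item()
        subs.append(seq[start:start + length])
    return subs


def info_nce(z, labels, temperature=0.1):
    """Contrastive loss: embeddings sharing a label attract, others repel."""
    z = F.normalize(z, dim=1)
    sim = z @ z.T / temperature
    eye = torch.eye(len(z), dtype=torch.bool)
    sim = sim.masked_fill(eye, float('-inf'))        # ignore self-similarity
    pos = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    return -log_prob[pos].mean()


encoder = SeqEncoder()
# Two toy "clients", each with a variable-length sequence of event codes.
sequences = [torch.randint(0, 200, (50,)), torch.randint(0, 200, (80,))]

views, labels = [], []
for client_id, seq in enumerate(sequences):
    for sub in random_subsequences(seq):
        views.append(sub)
        labels.append(client_id)

batch = nn.utils.rnn.pad_sequence(views, batch_first=True)   # pad to equal length
loss = info_nce(encoder(batch), torch.tensor(labels))
loss.backward()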

Install from PyPI

pip install pytorch-lifestream
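
A quick way to check the install from Python, assuming the package imports under the short name ptls used throughout this README:

import ptls  # top-level package installed by pytorch-lifestream (assumed import name)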

Install from source

# Ubuntu 20.04

sudo apt install python3.8 python3-venv
pip3 install pipenv

git clone https://github.com/dllllb/pytorch-lifestream.git  # clone the repository
cd pytorch-lifestream

pipenv sync --dev  # install packages exactly as specified in Pipfile.lock
pipenv shell
pytest

Demo notebooks

We have demo notebooks here; some of them:

  • Supervised model training notebook (Colab)
  • Self-supervised training and embeddings for a downstream task notebook (Colab)
  • Self-supervised embeddings in CatBoost notebook
  • Self-supervised training and fine-tuning notebook (Colab)
  • Self-supervised TrxEncoder-only training with the Masked Language Model task and fine-tuning notebook
  • Pandas data preprocessing options notebook (Colab); a rough, library-independent sketch of this kind of preprocessing follows this list
  • PySpark and Parquet for data preprocessing notebook
  • Fast inference on a large dataset notebook
  • Supervised multilabel classification notebook (Colab)
  • CoLES multimodal notebook (Colab)

We also have tutorials here.
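
As a rough, library-independent sketch of the preprocessing step the Pandas notebook covers, the snippet below turns a flat event table into one record per client, with each feature stored as an array aligned over that client's events. The column names (client_id, event_time, mcc_code, amount) are made up for illustration; the actual preprocessors come from the notebooks above.

# Library-independent sketch of event-sequence preprocessing: turn a flat
# event table into one record per client with array-valued features.
import numpy as np
import pandas as pd

events = pd.DataFrame({
    "client_id":  [1, 1, 1, 2, 2],
    "event_time": [10, 11, 15, 7, 9],
    "mcc_code":   [5411, 5812, 5411, 4111, 5411],
    "amount":     [12.5, 7.0, 30.1, 2.4, 99.0],
})

records = [
    {
        "client_id": client_id,
        "event_time": np.asarray(grp["event_time"]),
        "mcc_code": np.asarray(grp["mcc_code"]),
        "amount": np.asarray(grp["amount"]),
    }
    for client_id, grp in events.sort_values("event_time").groupby("client_id")
]
# `records` is a list of per-client dicts of aligned feature arrays, roughly the
# kind of sequential representation the demo notebooks build with ptls preprocessors.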

Docs

  • Documentation
  • Library description index

Experiments on public datasets

Experiments with pytorch-lifestream on several public event datasets are available in a separate repo.

PyTorch-LifeStream in ML Competitions

How to contribute

  1. Make your changes via fork and pull request.
  2. Write unit tests for the new code in ptls_tests (a minimal sketch is shown after this list).
  3. Check the unit tests via pytest: Example.
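
For step 2, here is a minimal sketch of a pytest-style unit test of the kind that lives in ptls_tests. The file name and the behaviour it checks are hypothetical; it only shows the shape of a test that pytest will pick up.

# ptls_tests/test_example.py (hypothetical file name)
import torch


def test_pad_sequence_keeps_batch_size():
    # Toy check: padding two variable-length sequences preserves the batch size
    # and pads to the length of the longest sequence.
    seqs = [torch.arange(5), torch.arange(8)]
    padded = torch.nn.utils.rnn.pad_sequence(seqs, batch_first=True)
    assert padded.shape == (2, 8)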

Citation

We have a paper; you can cite it:

@inproceedings{Babaev_2022,
   title={CoLES: Contrastive Learning for Event Sequences with Self-Supervision},
   author={Babaev, Dmitrii and Ovsov, Nikita and Kireev, Ivan and Ivanova, Maria and Gusev, Gleb and Nazarov, Ivan and Tuzhilin, Alexander},
   booktitle={Proceedings of the 2022 International Conference on Management of Data},
   series={SIGMOD/PODS ’22},
   collection={SIGMOD/PODS ’22},
   publisher={ACM},
   year={2022},
   month=jun,
   url={http://dx.doi.org/10.1145/3514221.3526129},
   doi={10.1145/3514221.3526129}
}
