
DeepRec Logo


Introduction

DeepRec is a high-performance recommendation deep learning framework based on TensorFlow 1.15, Intel-TensorFlow, and NVIDIA-TensorFlow. It is hosted in incubation by the LF AI & Data Foundation.

Background

Recommendation models have huge commercial value in areas such as retail, media, advertising, social networks, and search engines. Unlike other kinds of models, recommendation models contain a large number of non-numeric features such as IDs, tags, and text, which lead to a huge number of parameters.

DeepRec has been in development since 2016 and supports Alibaba's core businesses such as Taobao search, recommendation, and advertising. It has accumulated a rich set of features on top of the base framework and delivers excellent performance for recommendation model training and inference. Beyond Alibaba Group, dozens of companies have adopted DeepRec in their business scenarios.

Key Features

DeepRec provides super-large-scale distributed training capability, supporting recommendation model training with trillions of samples and over ten trillion parameters. In-depth performance optimizations have been carried out for recommendation models on both CPU and GPU platforms, and a rich set of features improves usability and performance in super-scale scenarios.

Embedding & Optimizer

  • Embedding Variable.
  • Dynamic Dimension Embedding Variable.
  • Adaptive Embedding Variable.
  • Multiple Hash Embedding Variable.
  • Multi-tier Hybrid Embedding Storage.
  • Group Embedding.
  • AdamAsync Optimizer.
  • AdagradDecay Optimizer.

Training

  • Asynchronous Distributed Training Framework (Parameter Server), such as grpc+seastar, FuseRecv, StarServer, etc.
  • Synchronous Distributed Training Framework (Collective), such as HybridBackend, Sparse Operation Kit (SOK), etc.
  • Runtime Optimization, such as Graph-Aware Memory Allocator (GAMMA), critical-path-based Executor, etc.
  • Runtime Optimization (GPU): GPU Multi-Stream Engine, which supports multiple CUDA compute streams and CUDA Graphs.
  • Operator-level optimization, such as BF16 mixed-precision optimization, embedding operator optimization, EmbeddingVariable on PMEM and GPU, new hardware feature enablement, etc.
  • Graph-level optimization, such as AutoGraphFusion, SmartStage, AutoPipeline, Graph Template Engine, Sample-aware Graph Compression, MicroBatch, etc.
  • Compilation optimization, supporting BladeDISC, XLA, etc.

Deploy and Serving

  • Delta checkpoint loading and exporting.
  • Super-scale recommendation model distributed serving.
  • Multi-tier hybrid storage with multiple serving backends supported.
  • Online deep learning with low latency.
  • High-performance inference framework SessionGroup (share-nothing), with multiple thread pools and multiple CUDA streams supported.
  • Model Quantization.

Installation

Prepare for installation

CPU Platform

alideeprec/deeprec-build:deeprec-dev-cpu-py38-ubuntu20.04

GPU Platform

alideeprec/deeprec-build:deeprec-dev-gpu-py38-cu116-ubuntu20.04
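
The images above are development images intended for building DeepRec from source. A minimal sketch of pulling the CPU image and starting a build container (container name and mount path are illustrative; for the GPU image, add --gpus all, which requires the NVIDIA Container Toolkit):

$ docker pull alideeprec/deeprec-build:deeprec-dev-cpu-py38-ubuntu20.04
$ docker run -it --name deeprec-dev -v $PWD:/workspace -w /workspace \
    alideeprec/deeprec-build:deeprec-dev-cpu-py38-ubuntu20.04 /bin/bash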

How to Build

Configure

$ ./configure
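
The configure script is interactive. For scripted builds, one possible non-interactive pattern is to pre-answer questions through environment variables and accept the remaining defaults; the variable names below follow upstream TensorFlow 1.15 conventions and are shown as an assumption, not a DeepRec-specific interface:

$ export PYTHON_BIN_PATH=$(which python3)
$ export TF_NEED_CUDA=0            # set to 1 for a GPU build
$ yes "" | ./configure             # accept the remaining defaults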

Compile for CPU and GPU (default)

$ bazel build -c opt --config=opt //tensorflow/tools/pip_package:build_pip_package

Compile for CPU and GPU: ABI=0

$ bazel build --cxxopt="-D_GLIBCXX_USE_CXX11_ABI=0" --host_cxxopt="-D_GLIBCXX_USE_CXX11_ABI=0" -c opt --config=opt //tensorflow/tools/pip_package:build_pip_package

Compile for CPU optimization: oneDNN + Unified Eigen Thread pool

$ bazel build -c opt --config=opt --config=mkl_threadpool //tensorflow/tools/pip_package:build_pip_package

Compile for CPU optimization and ABI=0

$ bazel build --cxxopt="-D_GLIBCXX_USE_CXX11_ABI=0" --host_cxxopt="-D_GLIBCXX_USE_CXX11_ABI=0" -c opt --config=opt --config=mkl_threadpool //tensorflow/tools/pip_package:build_pip_package

Create whl package

$ ./bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg

Install whl package

$ pip3 install /tmp/tensorflow_pkg/tensorflow-1.15.5+${version}-cp38-cp38m-linux_x86_64.whl
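
DeepRec installs as the tensorflow Python package, so a quick sanity check after installation is to import it and print the version:

$ python3 -c "import tensorflow as tf; print(tf.__version__)"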

Latest Release Images

Image for CPU

alideeprec/deeprec-release:deeprec2304-cpu-py38-ubuntu20.04

Image for GPU CUDA11.6

alideeprec/deeprec-release:deeprec2304-gpu-py38-cu116-ubuntu20.04
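
Unlike the development images, the release images ship with DeepRec pre-installed, so no source build is required. A sketch of running them directly (container options are illustrative; --gpus all requires the NVIDIA Container Toolkit):

$ docker run -it alideeprec/deeprec-release:deeprec2304-cpu-py38-ubuntu20.04 /bin/bash
$ docker run -it --gpus all alideeprec/deeprec-release:deeprec2304-gpu-py38-cu116-ubuntu20.04 /bin/bash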

Continuous Build Status

Official Build

Build Type         | Status
Linux CPU          | CPU Build
Linux GPU          | GPU Build
Linux CPU Serving  | CPU Serving Build
Linux GPU Serving  | GPU Serving Build

Official Unit Tests

Unit Test Type            | Status
Linux CPU C               | CPU C Unit Tests
Linux CPU CC              | CPU CC Unit Tests
Linux CPU Contrib         | CPU Contrib Unit Tests
Linux CPU Core            | CPU Core Unit Tests
Linux CPU Examples        | CPU Examples Unit Tests
Linux CPU Java            | CPU Java Unit Tests
Linux CPU JS              | CPU JS Unit Tests
Linux CPU Python          | CPU Python Unit Tests
Linux CPU Stream Executor | CPU Stream Executor Unit Tests
Linux GPU C               | GPU C Unit Tests
Linux GPU CC              | GPU CC Unit Tests
Linux GPU Contrib         | GPU Contrib Unit Tests
Linux GPU Core            | GPU Core Unit Tests
Linux GPU Examples        | GPU Examples Unit Tests
Linux GPU Java            | GPU Java Unit Tests
Linux GPU JS              | GPU JS Unit Tests
Linux GPU Python          | GPU Python Unit Tests
Linux GPU Stream Executor | GPU Stream Executor Unit Tests
Linux CPU Serving UT      | CPU Serving Unit Tests
Linux GPU Serving UT      | GPU Serving Unit Tests

User Document

Chinese: https://deeprec.readthedocs.io/zh/latest/

English: https://deeprec.readthedocs.io/en/latest/

Contact Us

Join the Official Discussion Group on DingTalk

Join the Official Discussion Group on WeChat

License

Apache License 2.0
