diff --git a/docs/general/CPU_DEVCATALOG.md b/docs/general/CPU_DEVCATALOG.md
index 03a772e02..666e1e5a3 100644
--- a/docs/general/CPU_DEVCATALOG.md
+++ b/docs/general/CPU_DEVCATALOG.md
@@ -25,13 +25,6 @@ The tables below link to documentation on how to run each use case using docker
 | PyTorch | [ResNet 50](../../models_v2/pytorch/resnet50/training/cpu/CONTAINER.md) | FP32,BF32,BF16 | Training | ImageNet 2012 |
 | PyTorch | [ResNet 50](../../models_v2/pytorch/resnet50/inference/cpu/CONTAINER.md) | FP32,BF32,BF16,INT8 | Inference | ImageNet 2012 |
 | PyTorch | [Vision Transformer](../../models_v2/pytorch/vit/inference/cpu/CONTAINER.md) | FP32,BF32,BF16,INT8-FP32,INT8-BF16 | Inference | ImageNet 2012 |
-| TensorFlow | [MobileNet V1*](../../quickstart/image_recognition/tensorflow/mobilenet_v1/inference/cpu/README_DEV_CAT.md) | FP32,BF32,FP16,INT8 | Inference | ImageNet 2012 |
-
-## Image Segmentation
-
-| Framework | Model | Precisions | Mode | Dataset |
-| --------| ------------------------------------------------------ | ---------- | ------| --------------------- |
-| TensorFlow | [3D U-Net MLPerf*](../../quickstart/image_segmentation/tensorflow/3d_unet_mlperf/inference/cpu/README_DEV_CAT.md) | FP32,BF16,INT8 | Inference | BRATS 2019 |
 
 ### Object Detection
 
@@ -42,9 +35,6 @@ The tables below link to documentation on how to run each use case using docker
 | PyTorch |[SSD-ResNet34](../../models_v2/pytorch/ssd-resnet34/training/cpu/CONTAINER.md) | FP32,BF32,BF16 | Training | COCO 2017 |
 | PyTorch |[SSD-ResNet34](../../models_v2/pytorch/ssd-resnet34/inference/cpu/CONTAINER.md) | FP32,BF32,BF16,INT8 | Inference | COCO 2017 |
 | PyTorch |[YOLO v7](../../models_v2/pytorch/yolov7/inference/cpu/CONTAINER.md) | FP32,BF32,BF16,FP16,INT8 | Inference | COCO 2017 |
-| TensorFlow | [SSD-ResNet34](../../quickstart/object_detection/tensorflow/ssd-resnet34/training/cpu/README_DEV_CAT.md) | FP32,BF32,BF16 |Training | COCO 2017 |
-| TensorFlow | [SSD-ResNet34](../../quickstart/object_detection/tensorflow/ssd-resnet34/inference/cpu/README_DEV_CAT.md) | FP32,BF16,INT8 |Inference | COCO 2017 |
-| TensorFlow | [SSD-MobileNet*](../../quickstart/object_detection/tensorflow/ssd-mobilenet/inference/cpu/README_DEV_CAT.md) | FP32,BF32,BF16,INT8 | Inference | COCO 2017 |
 
 ### Language Modeling
 
@@ -57,13 +47,6 @@ The tables below link to documentation on how to run each use case using docker
 | PyTorch |[DistilBERT base](../../models_v2/pytorch/distilbert/inference/cpu/CONTAINER.md) | FP32,BF32,BF16,INT8-BF16,INT8-BF32 | Inference | SST-2 |
 | TensorFlow | [BERT large](../../quickstart/language_modeling/tensorflow/bert_large/training/cpu/README_DEV_CAT.md) | FP32,BF16 | Training | SQuAD and MRPC |
 | TensorFlow | [BERT large](../../quickstart/language_modeling/tensorflow/bert_large/inference/cpu/README_DEV_CAT.md) | FP32,BF32,BF16,INT8 |Inference | SQuAD |
-| TensorFlow | [DistilBERT Base](../../quickstart/language_modeling/tensorflow/distilbert_base/inference/cpu/README_DEV_CAT.md) | FP32,BF16,INT8 | Inference | SST-2 |
-
-## Language Translation
-| Framework | Model | Precisions | Mode | Dataset |
-| --------| ------------------------------------------------------ | ---------- | ------| --------------------- |
-| TensorFlow | [Transformer_LT_mlperf*](../../quickstart/language_translation/tensorflow/transformer_mlperf/training/cpu/README_DEV_CAT.md) | FP32,BF16 | Training | WMT English-German dataset |
-| TensorFlow | [Transformer_LT_mlperf*](../../quickstart/language_translation/tensorflow/transformer_mlperf/inference/cpu/README_DEV_CAT.md) | FP32,BF32,BF16,INT8 | Inference | WMT English-German dataset |
 
 ### Recommendation
 
@@ -72,4 +55,3 @@ The tables below link to documentation on how to run each use case using docker
 | PyTorch | [DLRM](../../models_v2/pytorch/dlrm/training/cpu/CONTAINER.md) | FP32,BF32,BF16 | Training | Criteo Terabyte |
 | PyTorch | [DLRM](../../models_v2/pytorch/dlrm/inference/cpu/CONTAINER.md) | FP32,BF32,BF16,INT8 | Inference | Criteo Terabyte |
 | PyTorch | [DLRM v2](../../models_v2/pytorch/torchrec_dlrm/inference/cpu/CONTAINER.md) | FP32,BF16,FP16,INT8 | Inference | Criteo Terabyte |
-| TensorFlow | [DIEN](../../quickstart/recommendation/tensorflow/dien/inference/cpu/README_DEV_CAT.md) | FP32,BF32,BF16 | Inference | DIEN dataset |