Exporting the Mask RCNN

Currently, this repo contains the code necessary to export Matterport's Mask RCNN into a TF-Serving deployment-ready format. I've also added the tools to serve the exported model locally in TF-Serving (REST and gRPC interfaces).

The repo will be expanded in the near future to include code for deploying and scaling the Mask RCNN on Google Cloud, so don't forget to check back in!

This repository includes the code to:

  • Freeze the Keras/TensorFlow Mask RCNN model
  • Apply optional graph optimizations such as weight quantization
  • Export the model into the SavedModel format used by TF-Serving (a rough sketch of these export steps follows this list)
  • Build a Docker image on top of the TF-Serving base image
  • Test the deployment from a Jupyter notebook
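For orientation, the sketch below shows what a freeze → optimize → export pipeline of this kind typically looks like in TensorFlow 1.x (the version Matterport's Mask RCNN targets). The tensor names, transform list, and export path are placeholders rather than the repo's actual values — see make_tf_serving.py and the export config for those.

```python
import tensorflow as tf
from tensorflow.python.framework import graph_util
from tensorflow.tools.graph_transforms import TransformGraph

# Hypothetical tensor names — the real ones come from the loaded Matterport
# model (keras_model.inputs/outputs) and from the repo's export config.
INPUT_NODES = ["input_image", "input_image_meta", "input_anchors"]
OUTPUT_NODES = ["mrcnn_detection/Reshape_1", "mrcnn_mask/Reshape_1"]
EXPORT_DIR = "./serving_model/1"

# Session that holds the loaded Mask RCNN weights (model built beforehand).
sess = tf.keras.backend.get_session()

# 1. Freeze: fold the trained variables into graph constants.
frozen_graph = graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), OUTPUT_NODES)

# 2. Optional graph optimizations, e.g. weight quantization.
frozen_graph = TransformGraph(
    frozen_graph, INPUT_NODES, OUTPUT_NODES, ["quantize_weights"])

# 3. Export the frozen graph as a SavedModel with a serving signature.
with tf.Graph().as_default() as graph, tf.Session(graph=graph) as export_sess:
    tf.import_graph_def(frozen_graph, name="")
    inputs = {n: graph.get_tensor_by_name(n + ":0") for n in INPUT_NODES}
    outputs = {n: graph.get_tensor_by_name(n + ":0") for n in OUTPUT_NODES}
    signature = tf.saved_model.signature_def_utils.predict_signature_def(inputs, outputs)

    builder = tf.saved_model.builder.SavedModelBuilder(EXPORT_DIR)
    builder.add_meta_graph_and_variables(
        export_sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature
        },
    )
    builder.save()
```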

Getting Started

  • Make sure that your Python 3 environment satisfies all of the requirements.
  • Edit the export config file. You can add or remove optional graph optimizations there.
  • Go to the tf_serving folder and run python3 make_tf_serving.py. This will apply any optional graph optimizations and export the Keras/TensorFlow Mask RCNN into the SavedModel format.
  • (Optional): In the same folder, run build_image.sh to build and run a Docker image serving the exported model. If you don't have a GPU, remove _gpu on the first line of the Dockerfile.
  • (Optional): Play around with the Jupyter notebook to call the served model and visualize the results (minimal REST and gRPC client sketches follow this list).
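Once the model is being served (for example via the Docker image above), a minimal REST call against TF-Serving's standard HTTP API looks roughly like this. The model name and input key are assumptions — use the name baked into the serving image and the input names from the exported signature; the notebook also performs the Mask RCNN-specific preprocessing (resizing, image metadata, anchors) that a real request needs.

```python
import json

import numpy as np
import requests

# "mask_rcnn" and "input_image" are hypothetical names.
MODEL_NAME = "mask_rcnn"
REST_URL = "http://localhost:8501/v1/models/{}:predict".format(MODEL_NAME)

# Dummy input with a batch dimension; real images need the full preprocessing.
image = np.zeros((1024, 1024, 3), dtype=np.float32)

payload = {"inputs": {"input_image": [image.tolist()]}}
response = requests.post(REST_URL, data=json.dumps(payload))
response.raise_for_status()
print(response.json()["outputs"].keys())  # detections, masks, etc.
```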
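The gRPC interface can be exercised in a similar way with the tensorflow-serving-api package; again, the model and tensor names below are placeholders to be matched against the exported signature.

```python
import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# 8500 is TF-Serving's default gRPC port; 8501 is the REST port.
channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "mask_rcnn"          # hypothetical model name
request.model_spec.signature_name = "serving_default"

image = np.zeros((1, 1024, 1024, 3), dtype=np.float32)
request.inputs["input_image"].CopyFrom(tf.make_tensor_proto(image))

result = stub.Predict(request, 30.0)  # 30 second timeout
print(result.outputs.keys())
```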