This is a legacy project of SAF. There is no plan for maintenance or feature updates. The original README is here. This README describes the customized implementation and deployment.
Streamer is also used as the ML inference application in Couper.
Before building and running Streamer, please prepare your pre-trained TensorFlow model in Protocol Buffer format and put it under the ./models directory.
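A minimal sketch of the expected layout; the model file name is a placeholder, use your own model's name:

```shell
# Create the models directory at the repository root.
mkdir -p models

# Copy your frozen TensorFlow graph (Protocol Buffer, .pb) into it.
# Hypothetical path and file name -- replace with your own:
# cp /path/to/your_frozen_graph.pb models/

# Verify the model is in place.
ls models
```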
First, build the builder image under ./docker/tf_env, choosing the Dockerfile based on whether you have a GPU: for a CPU-only host, use Dockerfile_cpu; if you have an Nvidia GPU, use Dockerfile.
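The builder-image step above might look like the following; the image tags are assumptions, not names mandated by the repo:

```shell
cd docker/tf_env

# CPU-only host (hypothetical tag "streamer-builder:cpu"):
docker build -f Dockerfile_cpu -t streamer-builder:cpu .

# Host with an Nvidia GPU (hypothetical tag "streamer-builder:gpu"):
docker build -f Dockerfile -t streamer-builder:gpu .
```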
Next, build the Streamer image with the Dockerfile under the root directory. Be aware that you need to change the builder image name in that file to the one you just built.
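A sketch of that final build, run from the repository root; the tag "streamer:latest" is an assumption:

```shell
# Before running this, edit the FROM line in the root Dockerfile so it
# references the builder image built in the previous step.
docker build -t streamer:latest .
```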
Please refer to this one in the Couper repo.
You are welcome to update and discuss use cases through issue tickets and pull requests. For any other questions, please contact Carol Hsu.