This repository has been archived by the owner on Feb 25, 2024. It is now read-only.

bentoml/bentoctl


โš ๏ธ BentoCTL project has been deprecated

Please see the latest BentoML documentation on the OCI-container based deployment workflow: https://docs.bentoml.com/

🚀 Fast model deployment on any cloud


bentoctl helps deploy any machine learning model as a production-ready API endpoint on the cloud, supporting AWS SageMaker, AWS Lambda, EC2, Google Compute Engine, Azure, Heroku, and more.

👉 Join our Slack community today!

✨ Looking to deploy your ML service quickly? Check out BentoML Cloud for the easiest and fastest way to deploy your bento. It's a full-featured, serverless environment with a model repository and built-in monitoring and logging.

Highlights

  • Framework-agnostic model deployment for TensorFlow, PyTorch, XGBoost, Scikit-Learn, ONNX, and many more via BentoML, the unified model serving framework.
  • Simplify the deployment lifecycle: deploy, update, delete, and roll back.
  • Take full advantage of BentoML's performance optimizations and cloud platform features out-of-the-box.
  • Tailor bentoctl to your DevOps needs by customizing deployment operator and Terraform templates.
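
For illustration, a minimal `deployment_config.yaml` for a hypothetical AWS Lambda deployment might look like the sketch below. The deployment name and the fields under `spec` are assumptions for this example; the available fields are operator-specific, not a fixed schema.

```yaml
api_version: v1
name: my-bentoctl-deployment   # hypothetical deployment name
operator:
  name: aws-lambda             # operator previously installed via bentoctl
template: terraform            # generate Terraform templates for this deployment
spec:                          # operator-specific options (assumed example values)
  region: us-west-2
  timeout: 30
  memory_size: 512
```

Running `bentoctl generate` against a config like this produces the Terraform files that you can then customize to your DevOps needs.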

Getting Started
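
As a rough sketch of the workflow (assuming the AWS Lambda operator and a bento tagged `iris_classifier:latest`; exact flags and generated file names may differ by version and operator):

```shell
# Install bentoctl (Terraform must also be available on your PATH)
pip install bentoctl

# Install the operator for your target cloud platform
bentoctl operator install aws-lambda

# Interactively generate a deployment_config.yaml for the deployment
bentoctl init

# Build and push the deployable image for the chosen bento
bentoctl build -b iris_classifier:latest -f deployment_config.yaml

# Apply the generated Terraform templates to create the endpoint
terraform init
terraform apply -var-file=bentoctl.tfvars -auto-approve

# Tear the deployment down when it is no longer needed
bentoctl destroy -f deployment_config.yaml
```

Updating a deployment follows the same build-then-apply loop with a new bento tag.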

Supported Platforms:

Community

Contributing

There are many ways to contribute to the project:

  • Create and share new operators. Use the deployment operator template to get started.
  • If you have any feedback on the project, share it with the community in GitHub Discussions under the BentoML repo.
  • Report issues you're facing and "thumbs up" issues and feature requests that are relevant to you.
  • Investigate bugs and review other developers' pull requests.

License

Elastic License 2.0 (ELv2)