A series of Terraform-based recipes to provision popular MLOps stacks on the cloud.

aai-institute/mlstacks

 
 


MLStacks: Deploy your MLOps infrastructure in minutes

🌰 In a nutshell: What is MLStacks?

MLStacks is a Python package that lets you quickly spin up MLOps infrastructure using Terraform. It is designed for use with ZenML, but works with any MLOps tool or platform.

Simply write stack and component YAML specification files and deploy them using the MLStacks CLI; MLStacks takes care of the rest. We currently support modular MLOps stacks on AWS, GCP, and K3D (for local use).
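As an illustration, a stack specification might look like the sketch below. The exact schema (field names such as `spec_version`, `provider`, and how component files are referenced) should be checked against the MLStacks documentation; everything here is illustrative, not authoritative:

```yaml
# stack.yaml -- hypothetical stack specification (field names are
# illustrative; consult the MLStacks docs for the real schema).
spec_version: 1
spec_type: stack
name: my-aws-stack
provider: aws
default_region: eu-west-1
components:
  - artifact_store.yaml   # path to a component specification file

# artifact_store.yaml -- a matching component specification (sketch):
#   spec_version: 1
#   spec_type: component
#   component_type: artifact_store
#   name: my-artifact-store
#   provider: aws
```

A stack defined this way would then be deployed through the CLI, e.g. with a command along the lines of `mlstacks deploy -f stack.yaml` (command shape assumed from the CLI workflow described above).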

👷 Why We Built MLStacks


When we first created ZenML as an extensible MLOps framework for creating portable, production-ready MLOps pipelines, we saw many of our users having to deal with the pain of deploying infrastructure from scratch to run these pipelines. The community consistently asked questions like:

  • How do I deploy tool X with tool Y?
  • Does a combination of tool X with Y make sense?
  • Isn't there an easy way to just try these stacks out to make an informed decision?

To address these questions, the ZenML team presents a series of Terraform-based stacks for quickly provisioning popular combinations of MLOps tools. These stacks will be useful if:

  • You are at the start of your MLOps journey and would like to explore different tools.
  • You are looking for guidelines for production-grade deployments.
  • You would like to run your MLOps pipelines on your chosen ZenML Stack.

🔥 Do you use these tools or do you want to add one to your MLOps stack? At ZenML, we are looking for design partnerships and collaboration to implement and develop these MLOps stacks in a real-world setting.

If you'd like to learn more, please join our Slack and leave us a message!


🙏🏻 Acknowledgements

Thank you to the folks over at Fuzzy Labs for their support and contributions to this repository. Also many thanks to Ali Abbas Jaffri for several stimulating discussions around the architecture of this project.


Languages

  • HCL 83.1%
  • Python 15.5%
  • Shell 1.4%