NVIDIA NIM LLM Hosting Pattern #560
Comments
This will be a great addition.

Step 1: Leverage the existing Terraform blueprint.
Step 2: Add a new folder.
Step 3: Create documentation.
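The three steps above could look something like the sketch below; the folder and file names are hypothetical placeholders, not an agreed layout, and would follow whatever conventions the Data on EKS repository already uses:

```text
data-on-eks/
└── ai-ml/
    └── nvidia-nim/            # Step 2: hypothetical new folder for this pattern
        ├── main.tf            # Step 1: reuses modules from an existing Terraform blueprint
        ├── variables.tf
        ├── outputs.tf
        └── README.md          # Step 3: documentation for deploying NIM LLMs
```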
Thank you @vara-bonthu. I am going to implement it; please feel free to assign this to me. I'll raise a PR once it is finished.
Community Note
What is the outcome that you are trying to reach?
NVIDIA NIM provides an easy way to self-host LLMs with containers. AWS can offer a series of solutions to help customers deploy NIM-provided LLMs in a performant and cost-optimized way with EKS, Karpenter, Spot, etc.
I would like a pattern that lets customers easily deploy NIM LLMs on AWS.
Describe the solution you would like
Provide an IaC pattern in Data on EKS to showcase deploying NVIDIA NIM-provided LLMs. It will use Karpenter, Spot, and Bottlerocket with the image-caching techniques I introduced in this AWS blog.
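A minimal sketch of how Karpenter, Spot, and Bottlerocket image caching might fit together for this pattern, assuming the Karpenter v1 `NodePool`/`EC2NodeClass` APIs; the resource names, instance families, and the snapshot ID are placeholders, and the cached-image snapshot would be built separately (e.g. following the Bottlerocket data-volume approach from the referenced blog):

```yaml
apiVersion: karpenter.sh/v1
kind: NodePool
metadata:
  name: nim-gpu                       # hypothetical name
spec:
  template:
    spec:
      requirements:
        # Prefer Spot capacity, falling back to On-Demand
        - key: karpenter.sh/capacity-type
          operator: In
          values: ["spot", "on-demand"]
        # Example GPU instance family; adjust per NIM model requirements
        - key: karpenter.k8s.aws/instance-family
          operator: In
          values: ["g5"]
      nodeClassRef:
        group: karpenter.k8s.aws
        kind: EC2NodeClass
        name: nim-bottlerocket
---
apiVersion: karpenter.k8s.aws/v1
kind: EC2NodeClass
metadata:
  name: nim-bottlerocket              # hypothetical name
spec:
  amiFamily: Bottlerocket
  blockDeviceMappings:
    # Bottlerocket's second volume holds container images; pointing it at a
    # pre-built EBS snapshot gives nodes a warm image cache at boot
    - deviceName: /dev/xvdb
      ebs:
        volumeSize: 300Gi
        volumeType: gp3
        snapshotID: snap-0123456789abcdef0   # placeholder snapshot ID
```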
Describe alternatives you have considered
Additional context
NVIDIA NIM: https://www.nvidia.com/en-us/ai/
Features and architecture: https://docs.nvidia.com/nim/large-language-models/latest/introduction.html