
v0.0.14

@rounak610 rounak610 released this 16 Jan 14:21
· 8 commits to main since this release
7411a01

✨SuperAGI v0.0.14✨

🚀 Enhanced Local LLM Support with Multi-GPU 🎉

New Feature Highlights 🌟

⚙️ Local Large Language Model (LLM) Integration:

  • SuperAGI now supports the use of local large language models, allowing users to leverage their own models seamlessly within the SuperAGI framework.
  • Easily configure and integrate your preferred LLMs for enhanced customization and control over your AI agents.

⚡️ Multi-GPU Support:

  • SuperAGI now provides multi-GPU support for improved performance and scalability.

How to Use

To enable a local Large Language Model (LLM) with multi-GPU support, follow these steps:

  1. LLM Integration:
    • Add your model path to the celery and backend volumes in the docker-compose-gpu.yml file.
    • Run the command:
      docker compose -f docker-compose-gpu.yml up --build
    • Open http://localhost:3000 in your browser.
    • Add your local LLM model from the model section.
    • Use the added model to run your agents.
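As a rough sketch, step 1 amounts to adding a bind mount to the two services mentioned above in docker-compose-gpu.yml. The host path and the container-side mount point below are placeholders, and the exact service definitions may differ in your copy of the file:

```yaml
# Hypothetical excerpt from docker-compose-gpu.yml.
# /path/to/your/model is a placeholder for your local model's host path;
# the container-side path is an assumption, not the project's actual layout.
services:
  backend:
    volumes:
      - /path/to/your/model:/app/local_model_path
  celery:
    volumes:
      - /path/to/your/model:/app/local_model_path
```

Mounting the same path into both services keeps the backend and the celery workers pointed at the same model files.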

What’s Changed