Chat Vertex AI Provider plugin for LLM Workflow Engine
Access to Google Vertex AI chat models.
You must configure access to the Vertex AI API in Google Cloud by either of the following (see the example commands after this list):
- Having credentials configured for your environment (gcloud, workload identity, etc.)
- Storing the path to a service account JSON file in the GOOGLE_APPLICATION_CREDENTIALS environment variable
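For example, either of the following shell commands covers one of the options above; the service account path is only a placeholder, so substitute the path to your own key file:
gcloud auth application-default login
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json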
Install the latest version of this software directly from GitHub with pip:
pip install git+https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-vertexai
Install the latest version of this software directly from git:
git clone https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-vertexai.git
Install the development package:
cd lwe-plugin-provider-chat-vertexai
pip install -e .
Add the following to config.yaml in your profile:
plugins:
  enabled:
    - provider_chat_vertexai
    # Any other plugins you want enabled...
From a running LWE shell:
/provider chat_vertexai
/model model_name chat-bison
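Once the provider and model are set, subsequent messages in the shell should be sent to the selected Vertex AI chat model.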