This project provides a working chatbot example built on Theta EdgeCloud, implemented with React and Vite.
- Configure your chatbot by editing the variables in the `.local.env` file (a sample file is sketched below the list). See the Prerequisites section if you don't have an API URL yet.
  - `VITE_CHATBOT_API_URL`: the inference endpoint generated from your Theta EdgeCloud dashboard.
  - `VITE_CHATBOT_INSTRUCTIONS`: describes your chatbot's intended functionality.
  - `VITE_CHATBOT_FIRST_QUESTION`: the first question to display to your users.
  - `VITE_CHATBOT_FIRST_ANSWER`: the first answer to display to your users.
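  For reference, a minimal `.local.env` might look like the sketch below. The values are placeholders only; replace them with your own deployment's endpoint and text.

  ```
  VITE_CHATBOT_API_URL=<inference endpoint from your Theta EdgeCloud dashboard>
  VITE_CHATBOT_INSTRUCTIONS="You are a helpful assistant for this demo site."
  VITE_CHATBOT_FIRST_QUESTION="What can you help me with?"
  VITE_CHATBOT_FIRST_ANSWER="I can answer questions about this demo."
  ```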
- Run the following commands to install the project dependencies and start it locally:

  ```bash
  npm install
  npm run dev
  ```
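Once the dev server is running, Vite prints the local URL it serves (typically http://localhost:5173). Inside the React code, Vite exposes the `VITE_`-prefixed variables through `import.meta.env`; the sketch below shows one way the configuration could be read, using a hypothetical file name that is not part of this project's actual source.

```ts
// chatbotConfig.ts -- illustrative sketch, not the project's actual source file.
/// <reference types="vite/client" />

// Vite only exposes env variables prefixed with VITE_ to client-side code,
// which is why every setting in the env file carries that prefix.
export const chatbotConfig = {
  apiUrl: import.meta.env.VITE_CHATBOT_API_URL as string,
  instructions: import.meta.env.VITE_CHATBOT_INSTRUCTIONS as string,
  firstQuestion: import.meta.env.VITE_CHATBOT_FIRST_QUESTION as string,
  firstAnswer: import.meta.env.VITE_CHATBOT_FIRST_ANSWER as string,
};
```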
To run this project, you'll need an API URL. Here is how to get one:
1. Navigate to Hugging Face Tokens and generate a new API key. Save this key securely, as you will need it when deploying the Llama-3 model.
2. Visit the Meta-Llama-3-8B-Instruct license page on Hugging Face and agree to the terms of use.
3. Deploy the Llama-3 model on Theta EdgeCloud by going to the Model Explorer, using the API key obtained in Step 1 during the deployment process. Once deployed, the dashboard provides the inference endpoint URL to put in `VITE_CHATBOT_API_URL`; an illustrative example of calling it is sketched below.
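As a rough illustration of what the API URL is used for, the sketch below sends a user question to the deployed endpoint. It assumes the deployment exposes an OpenAI-compatible chat completions route; the path, request payload, and response shape here are assumptions, so consult the API details shown for your deployment in the Theta EdgeCloud dashboard before relying on them.

```ts
// askChatbot.ts -- a minimal sketch, not the project's actual client code.
/// <reference types="vite/client" />

// Assumes an OpenAI-style chat completions API; your deployment may expect a
// different path or payload (some deployments also require a "model" field).
export async function askChatbot(question: string): Promise<string> {
  const response = await fetch(
    `${import.meta.env.VITE_CHATBOT_API_URL}/v1/chat/completions`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        messages: [
          { role: "system", content: import.meta.env.VITE_CHATBOT_INSTRUCTIONS },
          { role: "user", content: question },
        ],
      }),
    }
  );
  if (!response.ok) {
    throw new Error(`Inference request failed with status ${response.status}`);
  }
  const data = await response.json();
  // OpenAI-style responses carry the generated text here; adjust if your
  // deployment returns a different shape.
  return data.choices[0].message.content;
}
```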