Streaming is now officially supported by the OpenAI Node.js library: https://github.com/openai/openai-node
Lambda-OpenAI-Stream lets you stream OpenAI responses through an AWS Lambda Function URL. It is a simple implementation in vanilla JavaScript; the only (optional) dependency is dotenv.
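Under the hood, OpenAI streams completions as Server-Sent Events, so a vanilla JS Lambda has to split each chunk into `data:` lines and pull out the text deltas before forwarding them. A minimal sketch of that parsing step (hypothetical, not the repository's actual code; the payload shape follows OpenAI's chat completions stream format):

```javascript
// Hypothetical sketch, not the repository's actual code.
// OpenAI streams completions as Server-Sent Events: each chunk holds
// one or more "data: {json}" lines, terminated by "data: [DONE]".
function parseSseChunk(chunk) {
  const tokens = [];
  for (const line of chunk.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue; // skip blank lines
    const payload = trimmed.slice('data:'.length).trim();
    if (payload === '[DONE]') break; // end-of-stream sentinel
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (delta) tokens.push(delta); // collect the text fragment
  }
  return tokens;
}

// parseSseChunk('data: {"choices":[{"delta":{"content":"Hi"}}]}\n'); // → ["Hi"]
```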
- General AWS knowledge is helpful.
- You need to have Docker installed locally.
- You need to have the AWS SAM CLI installed locally and configured with your AWS account.
- Clone the repository:
  ```shell
  git clone https://github.com/maxsagt/lambda-openai-stream.git
  ```
- Create a .env file in ./src with your OpenAI API key:
  ```
  OPENAI_API_KEY=abc123
  ```
- Install dotenv:
  ```shell
  npm init
  npm install dotenv
  ```
- Build and test the Lambda function locally:
  ```shell
  sam build
  sam local invoke -e event.json
  ```
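The exact contents of event.json depend on the repository, but a Lambda Function URL event wraps the client's JSON payload as a string in its `body` field, so a minimal test event might look roughly like this (the `prompt` field is an assumption, not the repository's actual contract):

```json
{
  "body": "{\"prompt\": \"Say hello\"}"
}
```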
- Deploy to AWS. Note that your AWS user or role needs (temporary) IAM permissions for AWS CloudFormation, S3, Lambda, and IAM:
  ```shell
  sam build --cached --parallel
  sam deploy
  # Use sam deploy --guided to control the AWS region.
  ```
- Done. Your Lambda Function URL is displayed in the terminal, and you can also find it in the AWS console.
- Add configuration for Amazon CloudFront and AWS WAF to introduce authentication.
- Add an index.html to show how the frontend could work.
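To sketch how such a frontend could work: the browser would POST to the Function URL with `fetch` and read `response.body` incrementally, rendering tokens as they arrive. The reader loop below is a hypothetical example (the Function URL, request shape, and `onToken` callback are placeholders, not the repository's contract); the same code runs in Node 18+ and the browser:

```javascript
// Hypothetical frontend sketch; the Function URL and request body used
// to obtain the stream are placeholders, not the repository's contract.
async function readStream(stream, onToken) {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const token = decoder.decode(value, { stream: true });
    text += token;
    onToken(token); // e.g. append the token to the DOM as it arrives
  }
  return text;
}

// In an index.html this could be driven by something like:
// const res = await fetch(functionUrl, { method: 'POST', body: JSON.stringify({ prompt }) });
// await readStream(res.body, token => { output.textContent += token; });
```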
Feedback and contributions are welcome!