Merge pull request #1967 from windrichie/bugfix/update-websocket-lambda-bedrock

Fix: modify Serverlessland URL path and update diagram label to WebSocket API
Ben Smith authored Dec 19, 2023
2 parents 05b0a6f + 0ae268a commit f022c10
Showing 2 changed files with 3 additions and 3 deletions.
@@ -59,7 +59,7 @@
       "x": 20,
       "y": 40,
       "service": "apigw",
-      "label": "API Gateway REST API"
+      "label": "API Gateway WebSocket API"
     },
     "icon2": {
       "x": 50,
apigw-websocket-api-bedrock-streaming/example-pattern.json (2 additions, 2 deletions)
@@ -1,14 +1,14 @@
 {
   "title": "WebSocket API to Lambda to Bedrock with streaming response",
-  "description": "Creates a WebSocket API and Lambda functions that provides a streaming response from the LLMs in Amazon Bedrock.",
+  "description": "Creates an API Gateway WebSocket API and Lambda functions that provides a streaming response from the LLMs in Amazon Bedrock.",
   "language": "Python",
   "level": "200",
   "framework": "SAM",
   "introBox": {
     "headline": "How it works",
     "text": [
       "This sample project demonstrates how to use WebSocket API as a front door to Lambda functions that perform inference on Amazon Bedrock LLMs using the [InvokeModelWithResponseStream](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModelWithResponseStream.html) API, where the LLM responses are returned in a stream.",
-      "With WebSockets API, developers of interactive LLM chatbot interfaces can provide a better user experience by displaying the LLM responses as they are generated (which at times can be long), rather than relying on the synchronous [InvokeModel](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html) API request.",
+      "With WebSocket API, developers of interactive LLM chatbot interfaces can provide a better user experience by displaying the LLM responses as they are generated (which at times can be long), rather than relying on the synchronous [InvokeModel](https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html) API request.",
       "This pattern deploys one API Gateway WebSocket API, four Lambda functions, and one DynamoDB table."
     ]
   },
