Bedrock, output of Llama 3.1 405B model just stops #4130
Unanswered
dirkpetersen asked this question in Troubleshooting
Replies: 2 comments
-
Ah, that makes sense. For my testing with Python I had already globally increased that length.
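
  A minimal sketch of what globally raising that length could look like in a Python test harness, assuming the limit in question is the maximum number of output tokens (maxTokens in the Bedrock Converse API, max_gen_len in the native Llama request body). The model ID, region, and token value are illustrative assumptions, not values taken from this thread.

  ```python
  # Sketch of a test helper that applies one raised output-token limit to every
  # call, assuming boto3 credentials are already configured. Values are placeholders.
  import boto3

  MAX_OUTPUT_TOKENS = 4096  # assumed "global" limit, raised from the small default

  client = boto3.client("bedrock-runtime", region_name="us-west-2")

  def ask(model_id: str, prompt: str) -> str:
      """Send a single-turn prompt and return the full completion text."""
      response = client.converse(
          modelId=model_id,
          messages=[{"role": "user", "content": [{"text": prompt}]}],
          inferenceConfig={"maxTokens": MAX_OUTPUT_TOKENS},
      )
      return response["output"]["message"]["content"][0]["text"]

  if __name__ == "__main__":
      print(ask("meta.llama3-1-405b-instruct-v1:0", "Return the full page as plain text: ..."))
  ```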
What happened?
When I ask the Llama 3.1 405B model on Bedrock to return a page, the output just stops after a while without completing. This does not happen with the Llama 3.1 70B model or any other model. It also does not happen when I query Bedrock from the CLI or via Python. My current streamRate is 75, but I slowed it down to as much as 250 with no effect.
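
For comparison, here is a minimal reproduction sketch that streams the same request straight from Bedrock with boto3 (outside LibreChat) and prints the model's reported stop reason. The model ID, region, prompt, and max_gen_len value are assumptions for illustration; with the native Llama request schema, a stop_reason of "length" would indicate the output hit the token cap rather than the stream being dropped mid-way.

```python
# Hedged reproduction sketch: stream the same request directly from Bedrock and
# inspect the stop reason. Model ID, region, prompt, and max_gen_len are
# placeholder assumptions, not values from the original report.
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")

body = json.dumps({
    "prompt": "Return the full page as plain text: ...",  # same prompt used in LibreChat
    "max_gen_len": 2048,   # assumed cap; the model's default is much lower
    "temperature": 0.5,
})

response = client.invoke_model_with_response_stream(
    modelId="meta.llama3-1-405b-instruct-v1:0",
    body=body,
)

parts = []
for event in response["body"]:
    chunk = json.loads(event["chunk"]["bytes"])
    parts.append(chunk.get("generation", ""))
    if chunk.get("stop_reason"):
        # "length" => the output token limit was reached; "stop" => normal finish
        print("stop_reason:", chunk["stop_reason"])

print("".join(parts))
```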
Steps to Reproduce
What browsers are you seeing the problem on?
Firefox
Relevant log output
Screenshots
Code of Conduct