Response live streaming isn't working at all #4247
rcdailey started this conversation in Help Wanted
I have LibreChat set up on a home server and I use it from three separate machines:
Machines 1 and 2 both behave correctly: when I send a message to Anthropic or OpenAI models, the response is streamed back to me in real time.
On machine 3, however, the response is never streamed, regardless of model. Instead I see a waiting indicator until the complete response appears all at once.
I can only assume some workplace policy or firewall setting is preventing streaming from working, but I have no real way of knowing what the issue is. I checked the MS Edge dev tools console for errors but didn't see any.
Could you help me figure out why the 3rd scenario behaves differently?
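A common cause of "everything arrives at once" is a corporate proxy or middlebox that buffers the server-sent-events stream before releasing it. As a quick sanity check (this is a hypothetical diagnostic sketch, not part of LibreChat), you can record the arrival time of each chunk while reading a response and see whether the chunks trickle in over time or land in a single burst; the helper name and threshold below are my own assumptions:

```python
# Hypothetical helper to tell a streamed response apart from a buffered one.
# While reading a response chunk-by-chunk, record each chunk's arrival time
# (seconds since the request was sent), then check whether everything landed
# in one burst (suggesting a buffering proxy) or was spread out over the
# generation time (true streaming).

def looks_buffered(arrival_times, burst_window=0.5):
    """Return True if all chunks arrived within `burst_window` seconds of the
    first one, i.e. the response was delivered in a single burst."""
    if len(arrival_times) < 2:
        return True  # a single chunk is indistinguishable from buffering
    return (arrival_times[-1] - arrival_times[0]) <= burst_window

# Simulated timings (seconds since request start):
streamed = [0.4, 0.9, 1.5, 2.2, 3.0]   # chunks trickle in -> streaming works
buffered = [6.01, 6.02, 6.03, 6.04]    # one burst after a long wait -> buffered

print(looks_buffered(streamed))  # False
print(looks_buffered(buffered))  # True
```

On the affected machine you could also try `curl -N` (curl's `--no-buffer` flag) against the LibreChat endpoint from a terminal; if chunks arrive incrementally there but not in the browser, the problem is more likely a browser extension or policy than the network path.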