v1/chat/completions 接口报 502 #4084
-
In server_config.py, the fastchat openai_api server is configured under FSCHAT_OPENAI_API = { ... }. The LLM models are accessed mainly through remote APIs, with LLM_MODELS = ["zhipu-api", "openai-api"]. Both models have their API keys configured, and openai-api also has its request URL set. The same configuration works fine in a Linux environment.
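For reference, a minimal sketch of what this configuration block typically looks like; the host, port, and exact file layout here are assumptions and should be adjusted to match your own deployment:

```python
# configs/server_config.py (sketch; values are assumptions, adjust to your deployment)
DEFAULT_BIND_HOST = "0.0.0.0"

# fastchat openai_api server -- serves the OpenAI-compatible /v1/chat/completions endpoint
FSCHAT_OPENAI_API = {
    "host": DEFAULT_BIND_HOST,
    "port": 20000,  # assumed default port for the openai_api server
}

# Models accessed through remote APIs (usually defined in model_config.py)
LLM_MODELS = ["zhipu-api", "openai-api"]
```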
Answered by zRzRzRzRzRzRzR, Jun 23, 2024
-
There are quite a few possible causes for this; most likely the model side simply isn't reachable. In the current 0.3.x release the model service has been decoupled, so you can check directly whether the model side returns any content.
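As a concrete way to check, you can call the OpenAI-compatible endpoint directly, bypassing the chat service, and see whether the model side responds. A minimal sketch, assuming the default local openai_api address and the model names from the question:

```python
# Quick check: hit the OpenAI-compatible endpoint directly to see whether the
# model service itself returns content. The URL, port, and model name below
# are assumptions -- substitute your own deployment values.
import requests

BASE_URL = "http://127.0.0.1:20000/v1"  # assumed fastchat openai_api address

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "zhipu-api",  # or "openai-api"
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=30,
)
print(resp.status_code)
print(resp.text)
# If this request itself fails or returns 502, the model worker is unreachable,
# and the problem is below the chat service layer.
```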