LibreChat responds with nonsense when calling LiteLLM as Proxy for Ollama #2215

Closed Answered by bsu3338
K-J-VV asked this question in Troubleshooting
Instead of ollama/llama2, try openai/llama2. I'm curious whether using Ollama's OpenAI API compatibility makes a difference. You may also need to append /v1 to your base URL.
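For reference, here is a minimal sketch of what that suggestion could look like in a LiteLLM proxy `config.yaml`, assuming Ollama is running locally on its default port (11434) and exposing its OpenAI-compatible endpoint; the model alias `llama2` is illustrative:

```yaml
model_list:
  - model_name: llama2          # alias LibreChat will request
    litellm_params:
      model: openai/llama2      # openai/ prefix routes via OpenAI-compatible API
      api_base: http://localhost:11434/v1  # note the /v1 suffix
```

With this config, LiteLLM forwards requests in OpenAI chat format to Ollama's `/v1` endpoint rather than using the native Ollama provider, which is what the suggestion above is testing.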

Replies: 5 comments 5 replies

Answer selected by K-J-VV
3 participants
This discussion was converted from issue #2214 on March 26, 2024 14:19.