Titles are not generated for Phi-3 #2544
Unanswered · newptcai asked this question in Troubleshooting · Replies: 1 comment, 1 reply
This is a new feature introduced today: for both `titleModel` and `summaryModel`, you can now set the value to `"current_model"`, which will use the currently selected model for title generation. You may be hitting an issue where Ollama attempts to load llama3 for the title and times out because that model was not already loaded. I use phi3 without issues this way. Also, try commenting or removing …

My config:

```yaml
- name: "Ollama"
  apiKey: "ollama"
  baseURL: "http://localhost:11434/v1/"
  models:
    default: [
      "llama3:latest",
      "command-r",
      "mixtral",
      "phi3",
    ]
    fetch: false # fetching list of models is not supported
  titleConvo: true
  titleModel: "current_model"
  modelDisplayLabel: "Ollama"
```

You will have to update to today's latest commit; let me know if you need help updating.
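If the timeout theory above is right, one way to check is to see what Ollama currently has in memory and to preload the model before a title is requested. This is a hedged sketch, not part of the original answer: it assumes Ollama's default port 11434 and a container named `ollama` (adjust the name to match your Docker setup).

```shell
# List the models Ollama currently has loaded into memory:
docker exec ollama ollama ps

# Preload phi3: per Ollama's API docs, a generate request with only a
# model name loads that model into memory without generating anything.
curl http://localhost:11434/api/generate -d '{"model": "phi3"}'
```

If the first title after a cold start works once the model is preloaded, the failure was most likely the load timeout rather than the LibreChat config.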
What happened?
I run phi-3 with Ollama. I can have conversations, but the titles are not created.
Steps to Reproduce
I run Ollama through Docker and LibreChat via Docker Compose. Here's the relevant section in librechat.yaml:
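For readers reproducing this setup, a minimal Docker Compose service for Ollama could look like the following. This is a sketch, not the poster's actual file; the `ollama/ollama` image and port 11434 are Ollama's published defaults, and the volume name is an assumption.

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # Ollama's default API port
    volumes:
      - ollama_data:/root/.ollama  # model storage; Ollama's default path in the container

volumes:
  ollama_data:
```

With this running, the `baseURL` in librechat.yaml would point at `http://localhost:11434/v1/` (or the service name `ollama` if LibreChat runs on the same Compose network).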
What browsers are you seeing the problem on?
No response
Relevant log output
No response
Screenshots