Self hosted OpenAI compatible model does not call tools #2446
magaton started this conversation in Discussions
Hello, I am trying to migrate a LangChain agent to LangGraph and am hitting a problem with a locally hosted Qwen-2.5 model (Tabby wrapper through ChatOpenAI): the tools are not called!
The tools are executed when I use either Ollama through ChatOllama or an OpenAI model through ChatOpenAI.
I have tried three options:
None of the above works. I would conclude that this particular model, or Tabby itself, cannot call tools, but that proves not to be the case: the old LangChain agents just work, meaning the tool is called properly.
The question I use in all of the scenarios is: "Who won US Open 2024?"
What should I do in LangGraph (both with create_react_agent and with a classic graph built from nodes and ToolNodes) to get tool calls working?