Fix issue of ignoring multiple tool calls #432
Conversation
Sorry for the delay here! A couple of things:
```ts
    })),
  }
} else {
  output['function-call' as PortId] = {
```
The `parallelFunctionCalling` option switches between the `function-call` and `function-calls` outputs, and it looks like that's lost here. Unfortunately, this will break existing graphs; could you bring that behavior back?
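A minimal sketch of the branching being asked for, assuming the port names from the diff; `setFunctionCallOutput`, the `DataValue` shape, and the `FunctionCall` type here are hypothetical stand-ins, not the project's actual API:

```typescript
type PortId = string;
type DataValue = { type: string; value: unknown };
interface FunctionCall {
  name: string;
  arguments: string;
}

// Hypothetical helper: the parallelFunctionCalling option decides which
// port receives the calls, so graphs wired to either port keep working.
function setFunctionCallOutput(
  output: Record<PortId, DataValue>,
  functionCalls: FunctionCall[],
  parallelFunctionCalling: boolean,
): void {
  if (parallelFunctionCalling) {
    // Parallel mode: every call goes out on the plural port as an array.
    output['function-calls' as PortId] = {
      type: 'object[]',
      value: functionCalls,
    };
  } else {
    // Legacy mode: only the singular port is populated, preserving the
    // behavior existing graphs depend on.
    output['function-call' as PortId] = {
      type: 'object',
      value: functionCalls[0],
    };
  }
}
```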
Sorry for the delay, will do
```ts
if (isMultiResponse) {
  output['function-call' as PortId] = {
    type: 'object[]',
    value: functionCalls.flat().map((functionCall) => ({
```
Are you sure that `.flat()` is right here? If there are multiple LLM calls that each make multiple function calls, then the output should be `object[][]` - an array per LLM call.
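The reviewer's point can be illustrated with a small example; the sample call names and shapes below are made up for demonstration and are not taken from the project:

```typescript
type FunctionCall = { name: string; arguments: string };

// functionCalls[i] holds the calls made by LLM response i:
// response 0 made two calls, response 1 made one.
const functionCalls: FunctionCall[][] = [
  [{ name: 'a', arguments: '{}' }, { name: 'b', arguments: '{}' }],
  [{ name: 'c', arguments: '{}' }],
];

// `.flat()` collapses everything into a single object[], losing which
// call came from which response.
const flattened = functionCalls.flat();

// Keeping the nesting yields object[][] - one array per LLM call.
const perResponse = functionCalls.map((calls) =>
  calls.map((call) => ({ ...call })),
);
```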
IMHO that would have the same issue as above: it changes expectations. I'll leave that up to you to decide, and I can make the change.
You're right, but setting the LLM to return multiple responses already arrayifies the string output, so it would make sense to arrayify the function call output as well. I consider it a bug, and I doubt enough people are using multi-response with parallel function calling for it to be a big enough problem.
This issue is described here: #431 (comment)