
Fix issue of ignoring multiple tool calls #432

Closed

Conversation

castortech
This issue is described here: #431 (comment)

@abrenneke (Collaborator) left a comment

Sorry for the delay here! A couple of things:

})),
}
} else {
output['function-call' as PortId] = {
@abrenneke (Collaborator):

The parallelFunctionCalling option switches between the function-call and function-calls outputs, and it looks like that's lost here. This will break existing graphs, unfortunately; could you bring that behavior back?
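For context, a minimal sketch of the behavior being asked for. The type names and the `buildOutputs` helper are hypothetical (this is not the actual Rivet source); it only illustrates the option selecting between the singular and plural output ports:

```typescript
// Hypothetical sketch: how parallelFunctionCalling is assumed to choose
// the output port, so graphs wired to 'function-call' keep working.
type PortId = string;
type DataValue = { type: string; value: unknown };

interface FunctionCall {
  name: string;
  arguments: string;
}

function buildOutputs(
  functionCalls: FunctionCall[],
  parallelFunctionCalling: boolean,
): Record<PortId, DataValue> {
  const output: Record<PortId, DataValue> = {};
  if (parallelFunctionCalling) {
    // Parallel mode: every call goes out on the plural 'function-calls' port.
    output['function-calls' as PortId] = {
      type: 'object[]',
      value: functionCalls,
    };
  } else {
    // Legacy mode: only the first call, on the singular 'function-call' port.
    output['function-call' as PortId] = {
      type: 'object',
      value: functionCalls[0],
    };
  }
  return output;
}
```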

castortech (Author):

Sorry for the delay, will do

if (isMultiResponse) {
output['function-call' as PortId] = {
type: 'object[]',
value: functionCalls.flat().map((functionCall) => ({
@abrenneke (Collaborator):

Are you sure that .flat() is right here? If there are multiple LLM calls that all make multiple function calls, then the output should be object[][] - an array per LLM call
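The reviewer's point can be shown with example data (illustrative only, not the code under review): with multiple LLM responses that each make multiple function calls, `.flat()` collapses the per-response grouping.

```typescript
// Example: two LLM responses, the first making two function calls,
// the second making one.
const functionCalls: { name: string }[][] = [
  [{ name: 'lookup' }, { name: 'summarize' }], // calls from response 1
  [{ name: 'translate' }],                     // calls from response 2
];

// .flat() produces object[]: one combined list, grouping lost.
const flattened = functionCalls.flat();

// Keeping the outer array produces object[][]: one array per LLM call.
const nested = functionCalls.map((calls) => calls.map((c) => ({ ...c })));
```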

castortech (Author):

IMHO that would have the same issue as above: changing expectations. I'll leave that up to you to decide, and I can make the change.

@abrenneke (Collaborator):

You're right, but setting the LLM to return multiple responses already arrayifies the string output, so it would make sense to arrayify the function call output as well. I consider it a bug, and I doubt enough people are using multi-response with parallel function calling for it to be a big enough problem.
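A sketch of the analogy being drawn here, with illustrative shapes rather than the actual node code: multi-response already turns the text output into one entry per response, so arrayifying the function-call output the same way yields object[][].

```typescript
// Illustrative only: how multi-response outputs might be arrayified,
// one entry per LLM response.
type LlmResponse = { text: string; calls: { name: string }[] };

function arrayifyOutputs(responses: LlmResponse[]) {
  return {
    // 'string[]': one text per response (already the existing behavior).
    text: responses.map((r) => r.text),
    // 'object[][]': one array of function calls per response (the proposed fix).
    functionCalls: responses.map((r) => r.calls),
  };
}
```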
