
LLM reasoning based on multiple text passages #650

Open
cristi-constantin opened this issue Oct 20, 2023 · 2 comments · May be fixed by Lsquare-LIU/aws-ai-qna-bot#98

Comments

@cristi-constantin

How can I get the QnABot to pass more than one text passage item to the LLM engine?

For example, I have two text passages: whenever I delete one, the other is picked up as a suitable answer and passed to the LLM, so both are relevant and above the threshold. But when both are present, only one is selected by OpenSearch and passed to the LLM as context. I don't use Kendra.
I would like both (or several) text passages to be passed to the LLM, so it can answer questions that require more than one passage to infer the proper answer.
Thank you.

@rstrahan
Contributor

Currently QnABot passes only one text passage to the LLM - whichever one has the highest relevance score.

This is so that if the matched text item has additional behavior - such as response card images or buttons, session attribute settings, Lambda hooks, etc. - QnABot will honor and process it. That is possible only when a single text item match is used in the answer.

If multiple text passages were allowed, I think the tradeoff would be to disable all of that additional capability and use the text items simply as context for the LLM. I do expect this is a good tradeoff to offer in return for supporting top-N text item matches for LLM input.

@marcburnie - new feature request?
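To illustrate the top-N behavior being requested, here is a minimal sketch of how multiple OpenSearch text-passage hits above a relevance threshold could be concatenated into a single LLM context string. This is a hypothetical illustration only: the function name, the `_score` and `passage` field names, and the threshold value are assumptions, not QnABot's actual schema or API.

```python
def build_llm_context(hits, top_n=3, score_threshold=0.5):
    """Select up to top_n passages above score_threshold, highest score first.

    `hits` is assumed to be a list of dicts with "_score" and "passage" keys,
    mimicking (but not matching) an OpenSearch result shape.
    """
    # Keep only passages whose relevance score clears the threshold.
    relevant = [h for h in hits if h["_score"] >= score_threshold]
    # Order by relevance, most relevant first.
    relevant.sort(key=lambda h: h["_score"], reverse=True)
    passages = [h["passage"] for h in relevant[:top_n]]
    # Join with a separator so the LLM can distinguish the passages.
    return "\n\n---\n\n".join(passages)

hits = [
    {"_score": 0.9, "passage": "Passage A"},
    {"_score": 0.7, "passage": "Passage B"},
    {"_score": 0.2, "passage": "Passage C"},
]
print(build_llm_context(hits))  # Passage A and B only; C is below threshold
```

As noted above, the passages would then be used purely as LLM context, with response cards, session attributes, and Lambda hooks disabled for multi-passage matches.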

@kroeter

kroeter commented Dec 8, 2023

We've added this request to our backlog, but it is not currently planned for development in the next 6 months. I'll provide updates on this thread if that changes.
