-
As I recall, ComfyUI cannot form a closed loop, let alone two nested closed loops. The idea is good, but it should not be achievable in ComfyUI, at least not in its current version.
-
Hi,
Thank you for the nice set of LLM Tools.
I'd like to suggest a workflow that may allow a longer chat experience and/or help maintain a more consistent conversation.
Below is a screenshot of the workflow:
The diagram illustrates a chat workflow designed to maintain a long conversation with a large language model (LLM) by summarizing and compressing the chat history. Here's how the process works:
Chat Interface:
This is the entry and exit point of the workflow, representing the user interface where the conversation takes place. It interacts with the LLM and displays responses to the user, ensuring the conversation flows smoothly.
1. Initial Interaction: the user sends a message through the chat interface.
2. First LLM Response: the LLM replies, and the exchange is shown to the user.
3. Chat History Accumulation: each user/assistant turn is appended to the running chat history, which is fed back into subsequent LLM calls.
Summary Loop:
4. Token Count and Summarization Trigger: the token count of the accumulated history is checked; once it exceeds a threshold, summarization is triggered.
5. Summarization Process: the LLM condenses the accumulated history into a summary.
6. New Compressed Chat History: the summary replaces the old history, and the conversation continues from this compressed context.
In short, the suggested workflow summarizes and compresses the chat history whenever its token count grows too large. This ensures that long conversations can be maintained without running into token limits, improving the efficiency of using LLMs for extended chats.
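The summary loop described above can be sketched in plain Python. This is a minimal illustration only, not ComfyUI node code: the `llm()` helper, the character-based token estimate, and the `max_tokens` threshold are all hypothetical stand-ins for whatever LLM backend and tokenizer the workflow actually uses.

```python
# Sketch of a token-triggered summary loop (all names hypothetical).

def count_tokens(text: str) -> int:
    # Crude approximation: roughly 4 characters per token.
    # A real workflow would use the model's own tokenizer.
    return len(text) // 4

def llm(prompt: str) -> str:
    # Placeholder for a real LLM call; truncation stands in
    # for a generated (summarized) response.
    return prompt[:200]

def chat_turn(history: list[str], user_msg: str,
              max_tokens: int = 1000) -> list[str]:
    # 1-3: interact, respond, and accumulate history.
    history = history + [f"User: {user_msg}"]
    reply = llm("\n".join(history))
    history.append(f"Assistant: {reply}")
    # 4: token count and summarization trigger.
    if count_tokens("\n".join(history)) > max_tokens:
        # 5: summarization process.
        summary = llm("Summarize this conversation:\n" + "\n".join(history))
        # 6: the summary replaces the old history.
        history = [f"Summary: {summary}"]
    return history
```

Each call to `chat_turn` appends one exchange; once the history exceeds the token budget, it collapses into a single summary entry that seeds the next turns.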
I hope this helps.
Please keep up the good work.