Debounce, LlamaCpp support, expose prompt as setup option, fix passing parameters to model (ollama) #11
base: main
Does this bug apply to the `openai` or `bard` backends as well? They similarly use `params` instead of `o`.
I do not know, since I have not used Bard or openai. I can only assume that if they work fine, this line is not needed for them.
I am not comfortable with this entire debounce concept.
First, I don't think it is needed here; cmp already has a debounce implementation.
Second, I think this implementation is wrong: there should not be a global autocommand handling the debounce.
The built-in debounce in cmp has an issue where it does not work as it should: I tested it with a 2-second delay, and cmp would not wait for the last key press.
My implementation waits until the last key is pressed, so it won't spin my GPU fans as much. I'm not sure the implementation is correct; I just copied someone else's code. I know that the copilot-cmp extension also uses its own debounce code.
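The behavior described above is a trailing-edge debounce: every keypress restarts a countdown, and the expensive call (the model request) fires only once the countdown expires with no further input. A minimal sketch of the idea in Python (the plugin itself is not Python, and all names here are illustrative, not taken from the PR):

```python
import threading
import time

def debounce(wait):
    """Trailing-edge debounce: the wrapped function runs only after
    `wait` seconds have passed with no new call in between."""
    def decorator(fn):
        timer = None
        lock = threading.Lock()

        def wrapper(*args, **kwargs):
            nonlocal timer
            with lock:
                if timer is not None:
                    timer.cancel()  # a new keypress restarts the countdown
                timer = threading.Timer(wait, fn, args, kwargs)
                timer.start()
        return wrapper
    return decorator

calls = []

@debounce(0.1)
def request_completion(text):
    # stands in for the expensive model request
    calls.append(text)

# simulate rapid keypresses: only the final state should trigger a request
for state in ("f", "fo", "foo"):
    request_completion(state)

time.sleep(0.3)
print(calls)  # only the last call survives the debounce window
```

The key design point matching the comment above is that intermediate calls are cancelled rather than delayed, so a burst of typing produces exactly one request instead of one per keystroke.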
OK, looking at the cmp sources, I can only agree.