Codestral not working in pre-release version #2421
Also, there is the following in the developer logs; it isn't present in the logs from the release version (v0.8.52).
@martincerven thanks for calling this out. It's an unintended consequence of 6e5c475, and definitely pretty high priority. Will get this in for the next pre-release.
A duplicate for reference: #2430
More references:
- A forward-looking improvement we hope may be accepted into Ollama: ollama/ollama#6968
- Documentation about the suffix property: https://github.com/ollama/ollama/blob/main/docs/api.md#parameters
- An example of a model whose template includes Suffix: https://ollama.com/library/starcoder2/blobs/3b190e68fefe (note that this model was shown as broken in the linked issue, but this is because the user would need to repeat ...)

From the Ollama documentation and from the initial PR, I don't believe there's any native way of getting model capabilities (e.g. whether insertion is supported), but we can tell from the template. The solution probably is to check whether the model's template includes Suffix, and then Continue will be able to automatically fall back to manually constructing the prompt.
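The fallback described above could be sketched roughly as follows. This is a hypothetical illustration, not Continue's actual implementation: the function name `buildFimPrompt` and the choice of sentinel tokens are mine (the tokens shown are the StarCoder-family FIM tokens; Codestral and other models use different sentinels).

```typescript
// Hypothetical sketch: if the model's template supports Suffix, the
// server can assemble the FIM prompt itself; otherwise the client
// falls back to constructing the fill-in-the-middle prompt manually.
function templateSupportsSuffix(template: string): boolean {
  // Ollama templates reference the suffix via the .Suffix variable.
  return template.includes(".Suffix");
}

function buildFimPrompt(prefix: string, suffix: string): string {
  // StarCoder-family sentinel tokens; other model families differ.
  return `<fim_prefix>${prefix}<fim_suffix>${suffix}<fim_middle>`;
}
```

A client would call `buildFimPrompt` only when `templateSupportsSuffix` returns false for the model's template.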
I tried to use Codestral with my API key from Mistral, but autocomplete does not work: no error message, it just does nothing. My config starts with "tabAutocompleteModel": {
Solution: add apiBase.
Retrieve the API key from the Codestral menu, not the API key menu. Good luck.
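For reference, a config along these lines should apply the advice above. This is a sketch, not official documentation; the exact model name and apiBase URL are assumptions based on Mistral's separate Codestral endpoint, so double-check them against the Continue docs.

```json
{
  "tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "mistral",
    "model": "codestral-latest",
    "apiKey": "YOUR_CODESTRAL_API_KEY",
    "apiBase": "https://codestral.mistral.ai/v1"
  }
}
```

Note that the key must come from the Codestral section of the Mistral console, since Codestral keys are distinct from regular Mistral API keys.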
I added the apiBase to my "tabAutocompleteModel" config, but it still does not work; the tool does not generate any code. Yes, I am using the Codestral API key from the picture above.
Please check whether tab autocomplete is enabled in your IDE settings:
This Continue change should work; however, there is an Ollama bug related to capability checking that the Continue change uncovers. Ideally you would be able to request generation with a custom template, overriding the default from the Ollama Modelfile that does not include Suffix. I raised a PR on ollama/ollama here: ollama/ollama#7026, which is awaiting approval. Once that merges, you can configure Continue to use a custom Ollama template to enable FIM completions. For example, attempting to use Qwen 2.5 Coder 7B-base as the Continue tab-autocomplete model fails, since the default Ollama template does not include Suffix; after overriding it with a FIM-capable template, tab completion immediately works as expected.
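As an illustration of such an override, a custom Modelfile might look like the sketch below. This is an assumption-laden example: the model tag and the `<|fim_prefix|>`/`<|fim_suffix|>`/`<|fim_middle|>` sentinel tokens are what Qwen 2.5 Coder is documented to use, but verify them against the model card before relying on this.

```
# Hypothetical Modelfile sketch: give a base model a FIM-capable template.
FROM qwen2.5-coder:7b-base
TEMPLATE """<|fim_prefix|>{{ .Prompt }}<|fim_suffix|>{{ .Suffix }}<|fim_middle|>"""
```

Building this with `ollama create` would produce a model whose template includes Suffix, which is what Continue checks for.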
@Altroo
@maximilian22x I'm sorry man, I don't know what else I can do to help.
Also not working for DeepSeek with Ollama.
@tomasz-stefaniak has solved the problem here: https://github.com/continuedev/continue/pull/2452. This will be available in the next release! We'll wait for confirmation before closing this issue.
Before submitting your bug report
Relevant environment info
Description
While re
To reproduce
No response
Log output
No response