
Codestral not working in pre-release version #2421

Open
3 tasks done
martincerven opened this issue Sep 29, 2024 · 13 comments
Assignees
Labels
area:autocomplete (Relates to the auto complete feature), kind:bug (Indicates an unexpected problem or unintended behavior), needs-triage (Waiting to be triaged), priority:high (Indicates high priority)

Comments

@martincerven

Before submitting your bug report

Relevant environment info

Continue: v0.9.213
Ollama: 0.3.12

Description

"tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "ollama",
    "model": "codestral"
  },
[Screenshot 2024-09-29 at 14:46:37]

While re

To reproduce

No response

Log output

No response

@sestinj sestinj self-assigned this Sep 29, 2024
@dosubot dosubot bot added the area:autocomplete, kind:bug, and priority:medium labels Sep 29, 2024
@martincerven
Author

Also, the following appears in the developer logs; it isn't present in the release version (v0.8.52):

[2024-09-29T13:40:07] [ERROR] Error processing folder: {}
[2024-09-29T13:40:07] [ERROR] Error details: {"stack":"Error: MiniSearch: duplicate ID
...

@sestinj
Contributor

sestinj commented Sep 29, 2024

@martincerven thanks for calling this out. It's an unintended consequence of 6e5c475 and definitely pretty high priority. Will get this in for the next pre-release.

@sestinj sestinj added the priority:high label and removed the priority:medium label Sep 29, 2024
@sestinj sestinj mentioned this issue Oct 1, 2024
3 tasks
@sestinj
Contributor

sestinj commented Oct 1, 2024

A duplicate for reference: #2430

@sestinj
Contributor

sestinj commented Oct 1, 2024

More references:

A forward-looking improvement we hope may be accepted into Ollama: ollama/ollama#6968

Documentation about the suffix property: https://github.com/ollama/ollama/blob/main/docs/api.md#parameters

An example of a model whose template includes Suffix is here: https://ollama.com/library/starcoder2/blobs/3b190e68fefe (note that this model was shown as broken in the linked issue, but only because the user would need to repeat ollama pull to get the version published after insertion support was added in Ollama).

From the Ollama documentation and from the initial PR, I don't believe there's any native way of getting model capabilities (e.g. whether insertion is supported), but we can tell from the template.
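
For context, a native fill-in-the-middle request against the documented /api/generate endpoint looks roughly like the sketch below (the model name, host, and prompt text are only illustrative):

// Minimal sketch: a native FIM request using Ollama's documented `suffix` parameter.
// This only works when the model's template actually handles .Suffix.
async function fimCompletion(prefix: string, suffix: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codestral", // illustrative; any model whose template handles .Suffix
      prompt: prefix,     // text before the cursor
      suffix: suffix,     // text after the cursor
      stream: false,
    }),
  });
  const data = await res.json();
  return data.response;   // the generated middle
}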

The solution is probably:

  1. Where we call the /api/show endpoint in Ollama.ts, check body.template for inclusion of ".Suffix"
  2. Use this to set whether supportsFim returns true

Continue will then be able to automatically fall back to manually constructing the prompt. A rough sketch of the check follows below.
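
The sketch below is not the actual Ollama.ts code; the function name and request shape are assumptions:

// Sketch: infer FIM support from /api/show by checking whether the
// model's template references .Suffix.
async function templateSupportsFim(apiBase: string, model: string): Promise<boolean> {
  const res = await fetch(new URL("/api/show", apiBase), {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: model }),
  });
  const body = await res.json();
  // If the template never references .Suffix, supportsFim should return false
  // and Continue can fall back to manually constructing the FIM prompt.
  return typeof body.template === "string" && body.template.includes(".Suffix");
}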

@TyDunn TyDunn assigned tomasz-stefaniak and unassigned sestinj Oct 1, 2024
@maximilian22x

I tried to use Codestral with my API key from Mistral, but autocomplete does not work: no error message, it just does nothing.

"tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "mistral",
    "model": "codestral-latest",
    "apiKey": "my api key"
},

@Altroo

Altroo commented Oct 1, 2024

@maximilian22x @martincerven

Solution: add apiBase:

"tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "mistral",
    "model": "codestral-latest",
    "apiKey": "xx",
    "apiBase": "https://codestral.mistral.ai/v1/"
},

Retrieve the API key from the Codestral menu, not the API key menu.

[Screenshot 2024-09-30 23:45:09]

Good luck.

@maximilian22x

maximilian22x commented Oct 1, 2024

"tabAutocompleteModel": {
"title": "Codestral",
"provider": "mistral",
"model": "codestral-latest",
"apiBase": "https://codestral.mistral.ai/v1/",
"apiKey": "mein api"
},

I add the apiBase but it still does not work. Tool do not generate any code. Yes i use the codestral api from the picture above

@Altroo

Altroo commented Oct 1, 2024

@maximilian22x

Please check that tab autocomplete is enabled in your IDE settings:

[screenshot]

@kyRobot

kyRobot commented Oct 1, 2024

This Continue change should work; however, there is an Ollama bug related to capability checking that the Continue change uncovers. Ideally you would be able to request generation with a custom template to override an Ollama Modelfile default that does not include {{ .Suffix }}, but that fails since Ollama only considers the default template.

I raised a PR on ollama/ollama here: ollama/ollama#7026, which is awaiting approval.

Once that merges, you can configure Continue to use a custom Ollama template to enable FIM completions.

For example, attempting to use Qwen 2.5 Coder 7B-base as the Continue tab autocomplete model fails since the default Ollama template is {{ .Prompt }}, and therefore Ollama thinks the model doesn't support FIM, but it definitely does if you prompt it directly. You can override the Ollama template to enable FIM like so:

"tabAutocompleteModel": {
    "title": "Tab Completion",
    "provider": "ollama",
    "model": "qwen2.5-coder:7b-base",
    "apiKey": "unused",
    "requestOptions": {
      "extraBodyProperties": {
        "raw": false,
        "template":"{{- if .Suffix }}<|fim_prefix|>{{ .Prompt }}<|fim_suffix|>{{ .Suffix }}<|fim_middle|>{{- else }}{{ .Prompt }}{{- end }}"
      }
    }

and tab completion immediately works as expected.

@maximilian22x

maximilian22x commented Oct 1, 2024

@Altroo
Yes, it's already on. I don't understand why it doesn't work; it should. Everything else works, only autocomplete doesn't. Why is it so complex? It should be easy to use. Here is my complete config file:

{
  "models": [
    {
      "model": "meta-llama/Llama-3.2-3B-Instruct",
      "contextLength": 4096,
      "apiBase": "https://api.hyperbolic.xyz/v1/",
      "title": "Llama 3.2 3B",
      "apiKey": "meinAPI",
      "provider": "openai"
    },
    {
      "model": "Qwen/Qwen2.5-72B-Instruct",
      "contextLength": 4096,
      "apiBase": "https://api.hyperbolic.xyz/v1/",
      "title": "Qwen2.5-72B-Instruct",
      "apiKey": "meinAPI",
      "provider": "openai"
    },
    {
      "model": "meta-llama/Meta-Llama-3.1-405B-Instruct",
      "contextLength": 4096,
      "apiBase": "https://api.hyperbolic.xyz/v1/",
      "title": "Meta-Llama-3.1-405B-Instruct",
      "apiKey": "meinAPI",
      "provider": "openai"
    },
    {
      "model": "deepseek-ai/DeepSeek-V2.5",
      "contextLength": 4096,
      "apiBase": "https://api.hyperbolic.xyz/v1/",
      "title": "DeepSeek-V2.5",
      "apiKey": "meinAPI",
      "provider": "openai"
    },
    {
      "title": "Gemini 1.5 Flash",
      "model": "gemini-1.5-flash-latest",
      "contextLength": 1000000,
      "apiKey": "meinAPI",
      "provider": "gemini"
    },
    {
      "model": "claude-3-sonnet-20240229",
      "contextLength": 200000,
      "title": "Claude 3 Sonnet",
      "apiKey": "meinAPI",
      "provider": "anthropic"
    },
    {
      "title": "Gemini 1.5 Pro",
      "model": "gemini-1.5-pro-latest",
      "contextLength": 2000000,
      "apiKey": "meinAPI",
      "provider": "gemini"
    },
    {
      "apiKey": "meinAPI",
      "title": "Codestral",
      "model": "codestral-latest",
      "provider": "mistral"
    },
    {
      "apiKey": "meinAPI",
      "title": "Codestral Mamba",
      "model": "codestral-mamba-latest",
      "provider": "mistral"
    },
    {
      "model": "claude-3-5-sonnet-20240620",
      "provider": "anthropic",
      "apiKey": "meinAPI",
      "title": "Claude 3.5 Sonnet"
    },
    {
      "model": "claude-3-haiku-20240307",
      "provider": "anthropic",
      "apiKey": "meinAPI",
      "title": "Claude 3 Haiku"
    }
  ],
  "customCommands": [
    {
      "name": "test",
      "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Codestral",
    "provider": "mistral",
    "model": "codestral-latest",
    "apiBase": "https://codestral.mistral.ai/v1/", 
    "apiKey": "meinAPI"
  },
  "contextProviders": [
    {
      "name": "code",
      "params": {}
    },
    {
      "name": "docs",
      "params": {}
    },
    {
      "name": "diff",
      "params": {}
    },
    {
      "name": "terminal",
      "params": {}
    },
    {
      "name": "problems",
      "params": {}
    },
    {
      "name": "folder",
      "params": {}
    },
    {
      "name": "codebase",
      "params": {}
    }
  ],
  "slashCommands": [
    {
      "name": "edit",
      "description": "Edit selected code"
    },
    {
      "name": "comment",
      "description": "Write comments for the selected code"
    },
    {
      "name": "share",
      "description": "Export the current chat session to markdown"
    },
    {
      "name": "cmd",
      "description": "Generate a shell command"
    },
    {
      "name": "commit",
      "description": "Generate a git commit message"
    }
  ],
  "embeddingsProvider": {
    "provider": "free-trial"
  },
  "reranker": {
    "name": "free-trial"
  },
  "experimental": {
    "modelRoles": {
      "repoMapFileSelection": "Claude 3 Haiku"
    }
  }
}

@Altroo

Altroo commented Oct 2, 2024

@maximilian22x I'm sorry man, I don't know what else I can do to help.

@brunocasarotti

Autocomplete is also not working for DeepSeek with Ollama:

"tabAutocompleteModel": {
    "title": "Deepseek",
    "provider": "ollama",
    "model": "deepseek-coder:6.7b-base",
    "apiBase": "http://0.0.0.0:7869/"
  },

@sestinj
Contributor

sestinj commented Oct 3, 2024

@tomasz-stefaniak has solved the problem here (https://github.com/continuedev/continue/pull/2452). This will be available in the next release!

We'll wait for confirmation before closing this issue.

@RomneyDa RomneyDa added the needs-triage label Oct 31, 2024