Make Codestral more configurable (#32)
Similar to Ollama, allow configuration of `prompt` and `suffix` for the Codestral backend.
mrloop authored Sep 19, 2024
1 parent bd2fda1 commit cb320bd
Showing 2 changed files with 32 additions and 5 deletions.
29 changes: 27 additions & 2 deletions README.md
@@ -121,6 +121,31 @@ cmp_ai:setup({
You will also need to make sure you have the Codestral API key in your
environment, `CODESTRAL_API_KEY`.
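
If you manage environment variables from your Neovim config rather than your shell, one option is to set the key there (a sketch; the placeholder value is an assumption, and hard-coding secrets in dotfiles has obvious tradeoffs):

```lua
-- Sketch: make the key visible to the Codestral backend.
-- Prefer loading the real value from a file or secret manager;
-- the literal below is only a placeholder.
vim.env.CODESTRAL_API_KEY = vim.env.CODESTRAL_API_KEY or 'your-key-here'
```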

You can also use the `suffix` and `prompt` parameters, see [Codestral](https://github.com/codestral/codestral) for more details.

```lua
local cmp_ai = require('cmp_ai.config')

cmp_ai:setup({
max_lines = 1000,
provider = 'Codestral',
provider_options = {
model = 'codestral-latest',
prompt = function(lines_before, lines_after)
return lines_before
end,
suffix = function(lines_after)
return lines_after
end
},
notify = true,
notify_callback = function(msg)
vim.notify(msg)
end,
run_on_every_keystroke = true,
})
```
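
The `prompt` callback above returns the leading context unchanged; it can also be used to shape what is sent, for example to cap the amount of context (a sketch; the 30-line limit is an arbitrary choice, not a plugin default):

```lua
prompt = function(lines_before, lines_after)
  -- Keep only the last 30 lines of leading context (illustrative limit).
  local t = vim.split(lines_before, '\n')
  local first = math.max(1, #t - 30 + 1)
  return table.concat(t, '\n', first)
end,
```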

To use Google Bard:

```lua
@@ -170,7 +195,7 @@ cmp_ai:setup({
})
```

- With Ollama you can also use the `suffix` parameter, typically when you want to use cmp-ai for codecompletion and you want to use the default plugin/prompt.
+ With Ollama you can also use the `suffix` parameter, typically when you want to use cmp-ai for code completion and you want to use the default plugin/prompt.

If the model you're using has the following template:
@@ -191,7 +216,7 @@ cmp_ai:setup({
provider_options = {
model = 'codegemma:2b-code',
prompt = function(lines_before, lines_after)
-        return lines_before
+      return lines_before
end,
suffix = function(lines_after)
return lines_after
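
For reference, codegemma-style fill-in-the-middle models surround the prompt and suffix with special tokens along these lines (a sketch; exact token names vary by model, so check your model's own template):

```
<|fim_prefix|>{prompt}<|fim_suffix|>{suffix}<|fim_middle|>
```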
8 changes: 5 additions & 3 deletions lua/cmp_ai/backends/codestral.lua
@@ -3,11 +3,11 @@ local requests = require('cmp_ai.requests')
Codestral = requests:new(nil)
BASE_URL = 'https://codestral.mistral.ai/v1/fim/completions'

-function Codestral:new(o, params)
+function Codestral:new(o)
o = o or {}
setmetatable(o, self)
self.__index = self
-  self.params = vim.tbl_deep_extend('keep', params or {}, {
+  self.params = vim.tbl_deep_extend('keep', o or {}, {
model = 'codestral-latest',
temperature = 0.1,
n = 1,
@@ -35,8 +35,10 @@ function Codestral:complete(lines_before, lines_after, cb)
return
end
local data = {
-    prompt = lines_before,
+    prompt = self.params.prompt and self.params.prompt(lines_before, lines_after) or lines_before,
+    suffix = self.params.suffix and self.params.suffix(lines_after) or '',
}

data = vim.tbl_deep_extend('keep', data, self.params)
self:Get(BASE_URL, self.headers, data, function(answer)
local new_data = {}
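
On the `vim.tbl_deep_extend('keep', data, self.params)` line above: with `'keep'`, keys already present in the first table win, so the computed `prompt` and `suffix` are preserved while defaults such as `model` and `temperature` are filled in from `self.params`. A sketch of the merge (values are illustrative, not actual request data):

```lua
local data = { prompt = 'local x = ', suffix = '' }
local params = { model = 'codestral-latest', temperature = 0.1 }
-- 'keep': existing keys in `data` are kept; missing ones come from `params`.
data = vim.tbl_deep_extend('keep', data, params)
-- data.prompt is still 'local x = '; data.model is now 'codestral-latest'
```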
