
[Feature Request] Ollama integration #156

Closed
mtompkins opened this issue Feb 4, 2024 · 6 comments
Labels: enhancement (New feature or request)

Comments


mtompkins commented Feb 4, 2024

💭 Describe the feature

In addition to OpenAI, with its associated cost, it would be great if we could also use Ollama.

💡 Proposed Solution

Extend the AI integration to support Ollama, which allows running various LLMs locally at no cost.

@mtompkins mtompkins added the enhancement New feature or request label Feb 4, 2024
@Zhengqbbb Zhengqbbb added the PR welcome Pull request welcome label Feb 5, 2024

gandli commented Mar 8, 2024

Recently, Ollama added compatibility with the OpenAI API. One feasible way for czg to support Ollama is to use the `ollama cp` command to copy an existing model under a temporary name. On the command line, run something like the following:

ollama cp gemma gpt-3.5-turbo

Next, modify the .czrc file so it looks like this:

{
  "openAIToken": " ",
  "apiEndpoint": "http://localhost:11434/v1"
}
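
A quick way to verify this setup (a minimal sketch; it assumes the default Ollama port 11434 and the `gpt-3.5-turbo` alias created by the `ollama cp` command above) is to call the OpenAI-compatible endpoint directly:

# Send a test chat request to Ollama's OpenAI-compatible endpoint
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Say hello"}]}'

If this returns a JSON completion, czg should be able to talk to the same endpoint.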

Owner

Zhengqbbb commented Mar 11, 2024

> Recently, Ollama added compatibility with the OpenAI API. One feasible way for czg to support Ollama is to use the `ollama cp` command to copy an existing model under a temporary name.

Thanks! I will start working on this soon 🫠


# Copy a local model under the model name czg expects
ollama pull gemma
ollama ls
ollama cp <target-model> gpt-3.5-turbo
ollama ls   # confirm the copy succeeded

(screenshot)

It is recommended to configure this with the command-line flags below, because the AI configuration is loaded from a different path than the other configuration options.

npx czg --api-key=" " --api-endpoint="http://localhost:11434/v1"
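
Putting the two steps together, an end-to-end local setup might look roughly like this (a sketch only; it assumes the default Ollama endpoint and the `gemma` model, and the blank API key is just a placeholder since Ollama does not check it):

# 1. Make a local model available under the name czg sends to the API
ollama pull gemma
ollama cp gemma gpt-3.5-turbo

# 2. Point czg at the local OpenAI-compatible endpoint (no real key needed)
npx czg --api-key=" " --api-endpoint="http://localhost:11434/v1"

# 3. Generate a commit message with AI
npx czg ai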


terranc commented Jul 15, 2024

But every first request fails

@Zhengqbbb
Owner

I have released a next pre-release version 🌟(v1.10.0-beta.1)🌟 for testing.
I plan to make the official release over the weekend 🚀.

Welcome to try it out! 👀 Let me know if there are still any issues. 🎉

npm install -g czg@next
czg -v # ensure version is v1.10.0-beta.1
czg -h # show help

# Usage

### 1. Use the default `gpt-4o-mini` model
czg ai

### 2. Specify a model for the current session
czg ai -M=gpt-3.5-turbo

### 3. Set up the default model
czg --api-model="gpt-4"

See more setup details in https://deploy-preview-185--cz-git.netlify.app/recipes/openai#setup-openai-token
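
For a local Ollama setup with this pre-release, the new flags can be combined with the endpoint configuration shown earlier in this thread (a sketch only; the `llama3` model name is illustrative and not taken from the release notes):

# Point czg at a local Ollama endpoint, then pick a local model for the session
npx czg --api-key=" " --api-endpoint="http://localhost:11434/v1"
npx czg ai -M=llama3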

@mtompkins
Author

Thanks for your work. I just don't see how this relates to using local LLMs.
I do appreciate the work you do!

@Zhengqbbb Zhengqbbb removed the PR welcome Pull request welcome label Sep 26, 2024
Owner

Zhengqbbb commented Sep 26, 2024

Thank you for testing!

Currently, cz-git and czg only support models compatible with the OpenAI /chat/completions API endpoint,
and in my testing, running Ollama models to generate commit messages did not work well.

If anyone has a better suggestion, please feel free to share.

(screenshot)
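
For context, "compatible with /chat/completions" means the model has to answer the standard OpenAI-style request shape that czg sends; a minimal illustration against a local Ollama server (the model name and the prompt are only examples, not the exact prompt czg uses):

# Example of the request shape a /chat/completions-compatible server must accept
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [
        {"role": "system", "content": "Write a conventional commit message for the given diff."},
        {"role": "user", "content": "diff --git a/README.md b/README.md ..."}]}'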
