
Ollama instructions are really unclear #19

Open
JonnyDeates opened this issue Jun 10, 2024 · 3 comments

Comments

@JonnyDeates

I'm having issues integrating my locally hosted llama3 API. I followed the instructions, yet it keeps prompting me for a ChatGPT API key. I'm not sure what else needs to be modified for Ollama to work, but I can see that the requests aren't even leaving the chat menu, so an if check somewhere seems to be blocking them. It looks like either an oversight, or the feature is mostly in but not finished.

Any help would be appreciated.

@perfectra1n
Collaborator

perfectra1n commented Jun 20, 2024

Hey @JonnyDeates, here's what my Chat Options looks like:

{
	"viewWidth": 364,
	"engine": "ChatGpt",
	"apiKey": "asdfasdfasdfasdfasdf",
	"requestUrls": {
		"completion": "https://ollama.internal.network/api/chat"
	},
	"engineOptions": {
		"model": "llama3",
		"max_tokens": 2500,
		"temperature": 0.3,
		"top_p": 1,
		"presence_penalty": 0.5,
		"frequency_penalty": 0.5,
		"stream": false,
		"n": 1
	},
	"shortcut": {
		"toggle": "Alt+Q",
		"hide": "Esc"
	},
	"faces": [
		"bx-smile",
		"bx-wink-smile",
		"bx-face",
		"bx-happy-alt",
		"bx-cool",
		"bx-laugh",
		"bx-upside-down"
	],
	"colors": [
		"var(--muted-text-color)"
	],
	"autoSave": true,
	"systemPrompt": "",
	"checkUpdates": true
}

What does yours look like currently? What does the Network tab in your browser's dev tools ("Inspect") show? Are you able to share any example requests?
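For comparison, here's roughly what a request to Ollama's `/api/chat` endpoint should look like with the options above (the hostname is the one from my config, the message text is just an example, and I'm only guessing at what your setup sends):

```shell
# Build the JSON body the plugin should be POSTing to Ollama's /api/chat
# ("model" and "stream" taken from the Chat Options above; the message is a placeholder).
cat > payload.json <<'EOF'
{
  "model": "llama3",
  "messages": [{"role": "user", "content": "Hello from the chat widget"}],
  "stream": false
}
EOF
cat payload.json

# Then send it (swap in your own Ollama hostname):
# curl -sS -X POST https://ollama.internal.network/api/chat \
#      -H 'Content-Type: application/json' -d @payload.json
```

If nothing like that shows up in the Network tab, the request is being blocked client-side before it's ever sent.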

But yes, this is mainly just an (ab)use of the existing code to make it think it's talking to ChatGPT via OpenAI's API, since the existing codebase is very tightly coupled to it. If someone wants to integrate a proper Ollama engine, a PR would certainly be accepted!

@perfectra1n perfectra1n changed the title Ollahm instructions are really unclear Ollama instructions are really unclear Jun 20, 2024
@perfectra1n
Collaborator

I've also updated the README with the above information, and an Nginx configuration block that may be required so that Ollama accepts the "Authorization" header.
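For anyone who doesn't want to dig through the README, the proxy block is along these lines (the hostname is from my network, the upstream is Ollama's default port 11434, and the exact CORS headers you need may differ; treat this as a sketch, not the canonical config):

```nginx
# Sketch of a reverse proxy in front of Ollama (hypothetical hostname; adjust to your network).
# The important parts are stripping the Authorization header that the ChatGPT-style
# client sends (Ollama itself doesn't use it) and answering the CORS preflight
# so the browser allows that header in the first place.
server {
    listen 443 ssl;
    server_name ollama.internal.network;

    location / {
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;

        # Drop the Authorization header before it reaches Ollama
        proxy_set_header Authorization "";

        # Allow the browser to send Authorization/Content-Type cross-origin
        add_header Access-Control-Allow-Origin * always;
        add_header Access-Control-Allow-Headers "Authorization, Content-Type" always;

        # Short-circuit the CORS preflight request
        if ($request_method = OPTIONS) {
            return 204;
        }
    }
}
```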

@perfectra1n
Collaborator

Created #20 to talk about integrating Ollama better.
