
[Docs] Include all provider variables in config.toml #315

Open
vRobM opened this issue Dec 10, 2024 · 5 comments

Comments

@vRobM commented Dec 10, 2024

Since I started with providers (xAI & Groq) that weren't covered in the example config, it was painful to figure out.
Overloading the OPENAI vars with other providers' keys didn't work.

I only found the CLI export examples after a few hours ;-/

Also, please include examples of calling providers from the CLI, as well as provider/model combinations.

Note that some providers don't have a default fallback model and need more specificity.
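
For example, something like this in ~/.config/gptme/config.toml would have saved me hours (the variable names here are my reading of the providers doc, so treat them as assumptions):

```toml
# Sketch of a more complete [env] section -- key names assumed, not verified
[env]
# Uncomment and fill in the provider(s) you use:
# OPENAI_API_KEY = "sk-..."
# ANTHROPIC_API_KEY = "sk-ant-..."
# GROQ_API_KEY = "gsk_..."
# XAI_API_KEY = "xai-..."
```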

@ErikBjare (Owner) commented Dec 10, 2024

> Include all provider variables in config.toml

I'd rather not duplicate a bunch of examples; it's just env vars, after all.

Do you think it would be enough if we just linked to the doc on providers?

> Overloading the OPENAI vars with other keys didn't work.

What do you mean?

> Also, please include examples of calling providers from CLI as well as providers/model combo.

That exists at https://gptme.org/docs/providers.html. Which doc pages did you read? What exactly would you want changed?

Easiest would be if you submit a PR to https://github.com/ErikBjare/gptme/blob/master/docs/providers.rst
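
For reference, CLI usage looks roughly like this (the model names below are illustrative assumptions, not guaranteed current; check the providers doc):

```shell
# Select the provider explicitly with the provider/model syntax.
# Model names are examples only -- see https://gptme.org/docs/providers.html.
export GROQ_API_KEY="gsk_your_key_here"
gptme -m groq/llama-3.3-70b-versatile "hello"

# Providers without a default model need the full provider/model combo:
export XAI_API_KEY="xai-your-key-here"
gptme -m xai/grok-beta "hello"
```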

@vRobM (Author) commented Dec 10, 2024

> I'd rather not duplicate a bunch of examples, it's just env vars after all.

I hear you; however, for first-time setup, if your provider isn't one of the ones in the example, you fail to get started.

> You think it's enough if we just link to the doc on providers?

That might help too, but for me it would have been best to have all the examples as comments in the config file to begin with.

> Overloading the OPENAI vars with other keys didn't work.
> What do you mean?

Many other projects support only OpenAI, and if a provider exposes an OpenAI-compatible API, one can simply replace the OpenAI API key with the other service's key and adjust the base URL to get things working.

> Also, please include examples of calling providers from CLI as well as providers/model combo.
> That exists at https://gptme.org/docs/providers.html. Which doc pages did you read? What exactly would you want changed?
> Easiest would be if you submit a PR to https://github.com/ErikBjare/gptme/blob/master/docs/providers.rst

Yes, it does exist there, just not explicitly; hence the ask for clarity.

I don't have things working yet (gptme-server crashes), so I hesitate to add examples before they've actually worked for me.

@ErikBjare (Owner) commented:

> Many other projects had only OpenAI support, and if using a compatible API, one can simply replace the OAI API key with another service, adjusting the base URL to get things working.

Ah! I see how that is confusing.

I'll try to make the example in config.toml more complete.

> gptme-server crashes

Logs? Is it just missing the server extras (i.e., does it fail to import flask)?

@ErikBjare (Owner) commented:

I updated the docs for config.toml: https://gptme.org/docs/config.html and simplified the page on providers.

@vRobM (Author) commented Dec 11, 2024

> I updated the docs for config.toml: https://gptme.org/docs/config.html and simplified the page on providers.

config.toml looks good (nice addition for DeepSeek), but the providers page has a few issues:

  1. The top example could be updated to echo the config changes.
  2. The reference to the config file mentions gptme.toml, which doesn't exist (I think you mean ~/.config/gptme/config.toml).
  3. The API key list could use DeepSeek too.
  4. When the repo is cloned, it would help if config.toml already had these changes.
  5. It would help to have a few examples of launching gptme with each provider, using a known-working model, so users can validate their config file or launch parameters (Groq has so many models compared to Grok, etc.).

As to gptme-server, it doesn't seem to pick up XAI_API_KEY from either the config or a shell export ;-/
