[Bug]: Unable to use Ollama models: Error in validate_skeleton: `api_key` is not a valid character scalar. It is a `<character>`. #230
Comments
Thank you for the report. We'll take a look.
@indykpol I was able to reproduce this problem. You needed to save your settings after changing your API service or model; this is why OpenAI was still being used after you selected Ollama. This wasn't obvious, so I've added some info tooltips to be extra safe.
Thanks - I can confirm it solved the problem. It really needs saving before it is usable.
Another related question. Let's suppose I have Ollama running at a certain IP and I have configured the Chat interface to use it. Now when I want to use "Chat in source" or the comment-code functionality, I encounter strange behaviour: it tries to use a local instance of Ollama out of the box, without giving me an opportunity to configure it (`Error in …`). I actually got it to work by saving the config in the Chat as the default and then restarting everything in a new project. I think some variables don't get updated properly during runtime.
I think that is the expected behavior. The gptstudio chat runs in a different R process than the one RStudio uses, which is why it doesn't block your R console. This is also why we have "Save for this session" (updates the current R process running gptstudio) and "Save as default" (saves to disk and updates the current process). However, RStudio's own R process is already loaded with whatever options and environment variables it found at startup, so your changes in the gptstudio configuration will not affect it until you restart your R session. Let me know if you were referring to something different.
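Since every new R process reads `~/.Renviron` at startup, one way to make a configuration visible to both the gptstudio background process and RStudio's own R session is to persist it there. This is only a sketch: `OPENAI_API_KEY` is the variable gptstudio documents for the OpenAI service, while `OLLAMA_HOST` and the host address are illustrative assumptions to be checked against your gptstudio version's documentation.

```
# ~/.Renviron -- read by every new R process at startup
# Documented by gptstudio for the OpenAI service:
OPENAI_API_KEY=sk-...
# Assumed/illustrative name for a remote Ollama instance:
OLLAMA_HOST=http://192.168.1.10:11434
```

After editing `.Renviron`, restart the R session so that RStudio's process (and any new gptstudio process) picks up the values at startup.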
Confirm setup

- I have installed the latest version of {gptstudio} (`pak::pak("MichelNivard/gptstudio")`) and tested if the problem remains.
- I have installed the {reprex} and {sessioninfo} packages to be able to run this issue's code snippet: `pak::pak(c("reprex", "sessioninfo"))`.

What happened?
This issue looks a bit different in the GitHub dev version than in the CRAN version.
When I try to use the chat with a local Ollama host, I get the following upon sending anything to the chat (after configuring the app to use local Ollama and selecting one of the available models):
```
Error in validate_skeleton: api_key is not a valid character scalar. It is a <character>.
4: shiny::runApp
3: eval
2: eval
1: .rs.sourceWithProgress
```
I believe it pertains to the fact that:
Relevant log output
Session info
Created on 2024-10-14 with reprex v2.1.1
Code of Conduct