Pull Model 1 exited with code 0 #132
Comments
I also have Ollama installed and running on my Mac at the time I execute …
The quality of this software is below any acceptable level. After several tests I gave up.
The pull-model container is not intended to be long running; it is just there to make sure that the model is pulled. On a Mac, the value … I'm not sure why you're seeing this error, though. Ollama was either unable to pull … @sid-mandati, it could be helpful to run …
Hello, thanks for your reply. When I ping host.docker.internal from my Mac, I get a reply. I was also able to verify Ollama was running on my Mac by opening the Ollama base URL in my browser. I am guessing the Docker containers are not able to reach Ollama running on my Mac. If you believe that could be the issue, is there a fix you would recommend?
You can get more information about what went wrong using 'docker logs pull-model'. What does it show?
You have to add an entry in /etc/hosts as follows: 127.0.0.1 host.docker.internal
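One way to check whether that /etc/hosts entry took effect is to see what the hostname resolves to. A small Python sketch (illustrative only, not part of the project):

```python
import socket

def resolve(name: str):
    """Return the IPv4 address a hostname resolves to, or None if it doesn't resolve."""
    try:
        return socket.gethostbyname(name)
    except socket.gaierror:
        return None

# Once the /etc/hosts entry is in place, host.docker.internal should
# resolve to 127.0.0.1 for processes running on the Mac itself:
# resolve("host.docker.internal")
```

Note that inside containers, Docker Desktop on macOS normally provides host.docker.internal automatically; the /etc/hosts entry only affects processes running on the host.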
When I do docker compose up, I get the below error. Six of the seven containers are running; only pull-model-1 is not. Ollama is already running on my Mac (M1). I have the following in my .env file:

LLM=llama2:latest
EMBEDDING_MODEL=sentence_transformer
OLLAMA_BASE_URL=http://host.docker.internal:11434
Error Details:
pull-model-1 | pulling ollama model llama2 using http://host.docker.internal:11434
pull-model-1 exited with code 0
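Since the log above ends right after "pulling ollama model", one thing worth ruling out is whether anything is listening at OLLAMA_BASE_URL at all. A minimal Python sketch to test TCP reachability of the host and port from the .env file (the function name is made up; this is not part of the project):

```python
import socket
from urllib.parse import urlparse

def ollama_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the host:port in base_url succeeds."""
    parsed = urlparse(base_url)
    host = parsed.hostname or "localhost"
    port = parsed.port or 11434  # Ollama's default port
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the URL from the .env file
# ollama_reachable("http://host.docker.internal:11434")
```

Run from the Mac itself this checks the host side; to test the container side, the same kind of connection attempt would need to be made from inside one of the containers.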