Replies: 3 comments 1 reply
-
The tricky thing about containers is that localhost inside the container refers to the container itself, so to reach localhost on the host from inside the container, use something like your host's IP address. This is a change to this part of the command:
Let me know if that fixes the issue, and thanks for checking out the project.
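As a sketch of what that change can look like: two common ways to let a container reach a service listening on the host. The image name `ghosts-shadows` and the use of `OLLAMA_HOST` to point the API at ollama are assumptions for illustration, not the project's actual image or configuration.

```shell
# Option 1: map host.docker.internal to the host's gateway IP.
# Docker Desktop provides this hostname by default; on Linux, the
# --add-host flag below is needed.
docker run --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  ghosts-shadows

# Option 2: share the host's network namespace (Linux only), so that
# localhost inside the container IS the host's localhost.
docker run --network host ghosts-shadows
```

With option 1, any URL in the container that previously said `localhost` would need to say `host.docker.internal` instead; with option 2, `localhost` URLs can stay unchanged.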
-
That didn't work. Currently I have it running with just the API up and not the UI, since I can't figure out how to expose it correctly as described above.
-
Are you running ollama on your host machine or also in a container?
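One way to narrow this down is to check where ollama is actually reachable, first from the host and then from inside a container. The `/api/tags` endpoint is part of ollama's standard REST API; `curlimages/curl` is just a convenient throwaway image for the test.

```shell
# From the host: should return a JSON list of installed models
curl http://localhost:11434/api/tags

# From inside a container: localhost here is the container itself,
# so this is expected to fail unless ollama runs in that same container
docker run --rm curlimages/curl http://localhost:11434/api/tags

# Same check via the host gateway (Docker Desktop, or Linux with
# the --add-host flag shown here)
docker run --rm --add-host=host.docker.internal:host-gateway \
  curlimages/curl http://host.docker.internal:11434/api/tags
```

If the first and third commands succeed while the second fails, the problem is the `localhost` URL inside the container rather than ollama itself.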
-
Hi,
I'm having trouble accessing the API for GHOSTS SHADOWS when running it via Docker. The setup works perfectly on bare metal, but when I try to use Docker, I encounter issues.

Context:
GHOSTS SHADOWS provides several interfaces, including a REST API and a UI web interface. Endpoints include activities, chat, content, and social.

Bare Metal Setup:
- `ollama create` and `ollama run` commands are executed for various models.
- `python api.py` and `python ui.py` are run in separate terminals.
- `OLLAMA_HOST=0.0.0.0:11434 ollama serve` is used to make the API available beyond localhost.

Docker Setup:
- `curl`-ing the chat endpoint results in a connection error.

Error Details:
- `ConnectionRefusedError`, indicating that the service is not accessible at `http://localhost:11434/api/chat`.

Things I Have Tried:

Question:
Has anyone encountered a similar issue with Docker and GHOSTS SHADOWS? Are there specific Docker configurations or environment variables that might be affecting connectivity? Any suggestions or troubleshooting steps would be greatly appreciated.
Thank you!
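For reference, the working bare-metal sequence described above, written out as commands. The model name `llama3` is only a placeholder for whatever models are actually created and run.

```shell
# Terminal 1: serve ollama on all interfaces, not just localhost
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Terminal 2: create/run the models (placeholder name)
ollama run llama3

# Terminals 3 and 4: start the SHADOWS API and UI
python api.py
python ui.py
```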