
Newbie question: Run a serverless on-demand API backend on runpod.io or vast.ai #88

Open
gymnae opened this issue Aug 9, 2024 · 5 comments



gymnae commented Aug 9, 2024

Hi,
I'm a hobbyist and occasionally would like to create images or run queries, so serverless seems like a good fit for keeping costs under control.
I tried to use this image for a serverless installation on runpod, but I was unable to call the API as a backend from my local ComfyUI install. Would this work in a worker/serverless setting, or would I need to run a persistent instance on vast.ai or runpod?

Cheers

@robballantyne
Member

I'll have a demo for Vast's autoscaler ready soon.

You'll generally create a handler that's invoked by the serverless controller; it should interact with ComfyUI on localhost:18188, i.e. not the external port 8188, because that requires Caddy to run, and you won't want that since the container should start fast on serverless.
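A minimal sketch of what such a handler could look like, assuming a RunPod-style Python handler and ComfyUI's standard /prompt queueing endpoint on the internal port; the function names and event shape here are illustrative, not part of this image:

```python
"""Sketch of a serverless handler that forwards a ComfyUI workflow to the
ComfyUI instance listening on localhost:18188 (internal port, no Caddy).
The event shape and function names are illustrative assumptions."""
import json
import urllib.request

COMFYUI_URL = "http://localhost:18188"  # internal port, as noted above


def build_prompt_request(workflow: dict) -> urllib.request.Request:
    """Wrap a ComfyUI workflow (API format) in a POST to /prompt."""
    body = json.dumps({"prompt": workflow}).encode("utf-8")
    return urllib.request.Request(
        f"{COMFYUI_URL}/prompt",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def handler(event: dict) -> dict:
    """Entry point invoked by the serverless controller (hypothetical shape)."""
    request = build_prompt_request(event["input"]["workflow"])
    with urllib.request.urlopen(request) as response:
        # ComfyUI responds with the queued prompt's id; polling for the
        # finished result is left out of this sketch.
        return json.loads(response.read())
```

How the handler is registered depends on the platform (e.g. RunPod's SDK expects you to pass it to `runpod.serverless.start`); only the localhost:18188 target comes from this thread.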

You can achieve the fast start by declaring the environment variable SERVERLESS=true, or SUPERVISOR_NO_AUTOSTART=caddy,jupyter,syncthing as required.
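For example, those variables might be passed at container launch like this (the image tag and GPU flag are illustrative, not taken from this thread):

```shell
# Skip caddy, jupyter and syncthing at startup so the container
# cold-starts quickly in a serverless context (variable names as above;
# image tag and other flags are assumptions for illustration).
docker run --rm --gpus all \
  -e SERVERLESS=true \
  -e SUPERVISOR_NO_AUTOSTART="caddy,jupyter,syncthing" \
  ghcr.io/ai-dock/comfyui:latest
```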

I'm about 90% done with a universal async API wrapper for processing ComfyUI workflows, which should make serverless integration very easy - I just have to finish adding timings and webhooks: https://github.com/ai-dock/comfyui/tree/main/build/COPY_ROOT_1/opt/ai-dock/api-wrapper

@gymnae
Author

gymnae commented Aug 10, 2024

That sounds great :) I'd prefer to run it on vast, so cool :)

@field-mouse

field-mouse commented Oct 16, 2024

> I'm about 90% done with a universal async API wrapper for processing ComfyUI workflows which should make serverless integration very easy - I just have to finish adding timings and webhooks

Any updates?

@quannv4

quannv4 commented Nov 7, 2024

Hi, can you please update the status and explain how to make this work?

> You can achieve the fast start by declaring the environment variable SERVERLESS=true, or SUPERVISOR_NO_AUTOSTART=caddy,jupyter,syncthing as required

Then what's next?

If we deploy the Docker image to vast.ai and want to use it as serverless, do we have to stop and start the instance every time before we make a request?

Do I need to run supervisorctl [start|stop|restart] comfyui?

> You'll generally create a handler that's invoked by the serverless controller; it should interact with ComfyUI on localhost:18188, i.e. not the external port 8188, because that requires Caddy to run, and you won't want that since the container should start fast on serverless.

Where do we create this handler, and how?

Sorry for all the noob questions; answers would be appreciated.

@robballantyne
Member

robballantyne commented Nov 7, 2024 via email
