
haProxy config #1

Open
baderth opened this issue May 6, 2022 · 2 comments

baderth commented May 6, 2022

Hi,
First of all - sorry for bothering you.
I'm trying to get my head around load-balancing Seaside/Topaz FastCGI backends with HAProxy (instead of the nginx we currently use), and your repo here is the only spot on the whole interwebs that has at least something for me.

If I understand the repo correctly and you're doing nginx -> HAProxy -> FastCGI Topaz here, I have a question about:
https://github.com/talk-small-be-open/via-base-main/tree/master/deployment/files/haproxy/haproxy.cfg.j2

This looks like a normal HTTP pass-through to me, rather than the (needed?) FastCGI setup, since FastCGI backends in HAProxy would have "proto fcgi" at the end of the server line.
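For comparison, a FastCGI backend in HAProxy (2.1+) would look roughly like the sketch below. This is a hypothetical illustration, not from the repo; the app name, docroot, and gem ports are made up.

```haproxy
# Hypothetical sketch of a FastCGI backend in HAProxy 2.1+.
# An fcgi-app section is required; "docroot" is mandatory there.
fcgi-app seaside-fcgi
    docroot /var/www/seaside    # illustrative path

backend topaz_fcgi
    mode http
    use-fcgi-app seaside-fcgi
    # "proto fcgi" is what marks the server as a FastCGI upstream
    server gem1 127.0.0.1:9001 proto fcgi check
    server gem2 127.0.0.1:9002 proto fcgi check
```

Without `proto fcgi` (and the `fcgi-app`/`use-fcgi-app` pairing), HAProxy simply speaks HTTP to the server, which would explain 502s against a FastCGI-only upstream.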

Again, sorry: normally I would spend days investigating instead of bothering people, but I'm so close to giving up on this, because all I get is a 502 for every request towards HAProxy/Topaz/Seaside. And I just want nice load balancing without paying nginx for that one simple feature (queueing).

Thanks for any help, cheers from Vienna,
Tom

dassi (Contributor) commented May 6, 2022

Hi Tom! My setup is without FastCGI; it's pure HTTP. The Seaside web server inside each of the Topaz gems speaks HTTP (Zinc...). It used to be FastCGI in the past, but the GsDevKit docs use HTTP. (I am not even sure there is a FastCGI option.)

(BTW: HAProxy has proven to be very useful in the context of GemStone Seaside load balancing; somehow I was never happy with nginx's balancing features. HAProxy handles outages of single gems very nicely.)
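A plain-HTTP backend of the kind described might look like this minimal sketch (ports, balance algorithm, and health-check path are assumptions, not taken from the repo):

```haproxy
# Hypothetical plain-HTTP HAProxy backend for Topaz gems.
# "check" enables active health checks, so a crashed or hung gem
# is taken out of rotation automatically until it recovers.
backend topaz_http
    mode http
    balance leastconn            # assumed; roundrobin also works
    option httpchk GET /status   # assumed health-check path
    server gem1 127.0.0.1:9001 check
    server gem2 127.0.0.1:9002 check
```

The automatic ejection and re-admission of failed servers is what makes HAProxy handle single-gem outages gracefully.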

Why are you using FastCGI?

baderth (Author) commented May 6, 2022

Oh boy...
I/we have been running this FastCGI setup in production since... 2008, I guess. Never questioned it; took it as a given.
I'll have a look at the Zinc variant ASAP, thanks for pointing it out.

800 sessions at peak and a 500 GB repository over here, so it will need a bit of testing :)

@HAProxy over nginx: you can't be happy with the free options nginx gives you for load balancing if you want to queue requests in front of the upstreams/gems, which is an nginx Plus-only feature.
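In HAProxy, that queueing comes for free via per-server `maxconn`: requests beyond the limit wait in the backend queue (up to `timeout queue`) instead of being rejected. A hypothetical sketch, with ports and timeouts purely illustrative:

```haproxy
# Hypothetical sketch: queueing in front of single-threaded gems.
# With "maxconn 1", each gem processes one request at a time; excess
# requests wait in HAProxy's queue for up to "timeout queue".
backend topaz_http
    mode http
    timeout queue 30s            # assumed queue wait limit
    server gem1 127.0.0.1:9001 check maxconn 1
    server gem2 127.0.0.1:9002 check maxconn 1
```

`maxconn 1` suits one-request-at-a-time Seaside gems; a multi-threaded upstream would use a higher value.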
