the new bancho server for theta! built for scalability and speed

configuration is done either through `gamma.toml` or through environment variables. see `gamma.toml.example` for the available fields, and `src/settings.rs` / the rustdoc for the environment variable names.
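
as a rough illustration, overriding a single setting from the environment could look like this (the variable name below is made up - the real names are in `src/settings.rs` / the rustdoc):

```sh
# hypothetical variable name - see src/settings.rs for the actual ones
GAMMA_SERVER_PORT=13381 cargo run
```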
you can build a docker image using `nix build .#docker`, then load it into docker with `docker load < result`.
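
for example (`docker load` will print the name of the loaded image, which you can then `docker run` as usual):

```sh
# build the image with nix, then load the resulting tarball into docker
nix build .#docker
docker load < result
```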
you can use the `docker-compose.dev.yml` to set up the other required services, then copy `gamma.toml.example` to `gamma.toml`.

you can then `cargo run`, or use `nix run` / `nix build .#gamma`.
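
put together, a dev setup looks roughly like this (use `docker-compose` instead of `docker compose` on older installs):

```sh
# start the other required services in the background
docker compose -f docker-compose.dev.yml up -d

# create a local config from the example
cp gamma.toml.example gamma.toml

# run the server (alternatively: nix run, or nix build .#gamma)
cargo run
```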

you can change the domain that osu uses by passing it the `-devserver` argument. it will then access several subdomains of that base, which you should set via dns or via `/etc/hosts` (see the sketch after the list). e.g. for `-devserver localhost`, it will try:
- `a.localhost`
- `b.localhost`
- `c.localhost`
- `c[1-7].localhost`
- `ce.localhost`
- `osu.localhost`
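
for `-devserver localhost`, one way to cover all of these (with `c[1-7]` expanded) is a couple of `/etc/hosts` lines pointing them at this machine:

```sh
# append the osu! subdomains to /etc/hosts, all resolving to 127.0.0.1
cat <<'EOF' | sudo tee -a /etc/hosts
127.0.0.1 a.localhost b.localhost c.localhost ce.localhost osu.localhost
127.0.0.1 c1.localhost c2.localhost c3.localhost c4.localhost c5.localhost c6.localhost c7.localhost
EOF
```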

as well as this, osu's security is kinda bad - it forces the server to support tls 1.0, which most server software really doesn't want to do. debian, nix, and arch's nginx packages all seem to be built without support for it.

locally, you can use `mitmproxy` (which nix installs for you), i.e. `sudo ./scripts/proxy.sh`. this does change certain headers and other things it shouldn't, but it works for the most part. you will also need to import the CA that it auto-generates at `~/.mitmproxy/mitmproxy-ca-cert.pem`.
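
as a sketch, on a debian-style host that would look something like the following - where you actually import the CA depends on where the osu! client (and its certificate store) lives:

```sh
# start the mitmproxy-based tls proxy from the repo root
sudo ./scripts/proxy.sh

# trust the auto-generated CA system-wide (debian/ubuntu layout; adjust for
# your distro, or import it into the client's own store if osu! runs elsewhere)
sudo cp ~/.mitmproxy/mitmproxy-ca-cert.pem /usr/local/share/ca-certificates/mitmproxy.crt
sudo update-ca-certificates
```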
for deployment, you'll need to run a proper reverse proxy set to support tlsv1.
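
as a non-authoritative sketch, an nginx config for this could look like the following - assuming an nginx/openssl build that still permits tls 1.0 (per the note above, most distro packages don't) and a made-up backend address for gamma:

```nginx
server {
    listen 443 ssl;
    # cover the subdomains osu! hits for your -devserver domain
    server_name osu.example.com a.example.com b.example.com c.example.com ce.example.com;

    ssl_certificate     /etc/ssl/gamma/fullchain.pem;
    ssl_certificate_key /etc/ssl/gamma/privkey.pem;
    # allow the old protocol version the client needs
    ssl_protocols       TLSv1 TLSv1.1 TLSv1.2;

    location / {
        # hypothetical backend address - point this at wherever gamma listens
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
    }
}
```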

we use the `tracing` crate for instrumentation. by default the console output is in a human-friendly format that shows nested spans using indentation. the same data can also be exported to any OTLP collector (such as the opentelemetry collector) to visualise it and create metrics from it. you can control where it is sent via the config path `telem.endpoint`, which should be a gRPC endpoint.
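
for example, pointing at a collector on the conventional OTLP gRPC port (the exact table layout is an assumption - check `gamma.toml.example` for the real shape):

```toml
# export traces to an OTLP/gRPC collector; 4317 is the standard OTLP gRPC port
[telem]
endpoint = "http://localhost:4317"
```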
`docker-compose.dev.yml` will start grafana (`localhost:3000`) and a tempo server, which let you query these traces and events, with `gamma.toml.example` shipping to it automatically.

in deployment, you should set up tempo/grafana properly and likely use the grafana agent or another collector to also create loki logs from the spans.