
Memory leak? #18

Open
dcherniv opened this issue Dec 24, 2019 · 1 comment

@dcherniv
It appears that the exporter is consuming quite a lot of memory.

[graph: container memory usage climbing steadily, then dropping sharply at regular intervals]

The sharp dropoffs in the graph are OOMKilled events from Kubernetes. The exporter is configured as follows and appears to be working fine otherwise. It runs as a sidecar to pgbouncer.

  - args:
    - --config
    - /home/REDACTED/exporter.yaml
    command:
    - pgbouncer-exporter
    env:
    - name: PGBOUNCER_USER
      valueFrom:
        secretKeyRef:
          key: exporter_user
          name: prod-REDACTED-pgbouncer
    - name: PGBOUNCER_PASS
      valueFrom:
        secretKeyRef:
          key: exporter_password
          name: prod-REDACTED-pgbouncer
    image: REDACTED-pgbouncer:0.1.3
    imagePullPolicy: Always
    livenessProbe:
      failureThreshold: 2
      httpGet:
        path: /
        port: exporter
        scheme: HTTP
      periodSeconds: 3
      successThreshold: 1
      timeoutSeconds: 3
    name: pgbouncer-exporter
    ports:
    - containerPort: 9100
      name: exporter
      protocol: TCP
    readinessProbe:
      failureThreshold: 1
      httpGet:
        path: /
        port: exporter
        scheme: HTTP
      periodSeconds: 3
      successThreshold: 1
      timeoutSeconds: 3
    resources:
      limits:
        cpu: 200m
        memory: 256Mi
      requests:
        cpu: 50m
        memory: 128Mi

config file (exporter.yaml):

/home/REDACTED $ cat /home/REDACTED/exporter.yaml 
exporter_host: 0.0.0.0
exporter_port: 9100
pgbouncers:
  - dsn: postgresql://$(PGBOUNCER_USER):$(PGBOUNCER_PASS)@localhost:5432/pgbouncer
    connect_timeout: 5
    exclude_databases:
      - pgbouncer

invocation:

/home/REDACTED $ ps auxwww
PID   USER     TIME  COMMAND
    1 postgres  2:08 {pgbouncer-expor} /usr/bin/python3.7 /usr/bin/pgbouncer-exporter --config /home/REDACTED/exporter.yaml
21616 postgres  0:00 /bin/sh
21622 postgres  0:00 ps auxw
/home/REDACTED $ 

logs:

r$ kubectl logs -n prod prod-REDACTED pgbouncer-exporter
{"asctime": "2019-12-24 01:08:46", "levelname": "INFO", "message": "Exporter is starting up"}
{"asctime": "2019-12-24 01:08:46", "levelname": "INFO", "message": "Config file successfully read from /home/REDACTED/exporter.yaml"}
{"asctime": "2019-12-24 01:08:46", "levelname": "INFO", "message": "Exporter listening on 0.0.0.0:9100"}
r$ 

How much memory does it require?
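(For what it's worth, the growth can also be tracked without kubectl by scraping the exporter's own `/metrics` endpoint: on Linux, Prometheus Python clients expose a `process_resident_memory_bytes` gauge. A minimal sketch of pulling that value out of the text exposition format; the sample payload below is made up for illustration.)

```python
def parse_metric(text: str, name: str):
    """Extract a single gauge value from Prometheus text exposition format."""
    for line in text.splitlines():
        if line.startswith("#"):  # skip HELP/TYPE comment lines
            continue
        parts = line.split()
        if len(parts) == 2 and parts[0] == name:
            return float(parts[1])
    return None

# Example payload, as returned by GET http://localhost:9100/
sample = """# HELP process_resident_memory_bytes Resident memory size in bytes.
# TYPE process_resident_memory_bytes gauge
process_resident_memory_bytes 1.3418496e+08
"""

print(parse_metric(sample, "process_resident_memory_bytes"))  # → 134184960.0
```

Polling this in a loop (e.g. every minute) would show whether RSS grows per scrape or over wall-clock time, which helps locate the leak.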
@pracucci

I would say it definitely looks like a memory leak in the exporter. Do you have any time to investigate it, please?
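(One way to investigate, since the exporter runs under CPython: snapshot allocations with the standard-library `tracemalloc` module before and after a batch of scrapes and diff them. This is a generic sketch, not the exporter's actual code; the simulated leak stands in for whatever the collector allocates per scrape.)

```python
import tracemalloc


def top_growth(snap_before, snap_after, limit=5):
    """Return the allocation sites with the largest net memory growth."""
    return snap_after.compare_to(snap_before, "lineno")[:limit]


tracemalloc.start()
before = tracemalloc.take_snapshot()

# Stand-in for the suspected leak: e.g. objects retained across scrapes.
retained = [bytearray(1024) for _ in range(1000)]

after = tracemalloc.take_snapshot()
for stat in top_growth(before, after):
    print(stat)  # file:line, size delta, allocation count
```

In a real run one would trigger a few hundred scrapes between the two snapshots; the top entries of the diff point at the file and line retaining memory.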
