Problem description

When the journald files are corrupted (or the journald input otherwise fails), the sidecar does not know, and fails to report errors back to the Graylog server.

It's worth noting that filebeat doesn't exit when this occurs; it just stops the journald input. I'm pretty confident this isn't a situation the sidecar accounts for.

Possible upstream issue: elastic/beats#32782
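As a stopgap until the sidecar can surface this, an operator could scan filebeat's own log output for failed-input errors. Below is a minimal sketch in Go; the log path, and matching on "ERROR" plus "failed with:", are assumptions taken from the output in the reproduction steps, not a stable filebeat contract:

```go
// Sketch of an external health check: scan filebeat's log output for
// input-failure lines such as "Input 'journald' failed with: ...".
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// findFailedInputs returns log lines that look like a filebeat input
// has stopped. The markers are heuristics, not a stable format.
func findFailedInputs(logPath string) ([]string, error) {
	f, err := os.Open(logPath)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	var failures []string
	scanner := bufio.NewScanner(f)
	for scanner.Scan() {
		line := scanner.Text()
		if strings.Contains(line, "ERROR") && strings.Contains(line, "failed with:") {
			failures = append(failures, line)
		}
	}
	return failures, scanner.Err()
}

func main() {
	// Hypothetical path; in this setup filebeat actually logs via the journal.
	failures, err := findFailedInputs("/var/log/filebeat/filebeat.log")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(2)
	}
	for _, l := range failures {
		fmt.Println("input failure:", l)
	}
	// Non-zero exit lets a cron job or systemd timer flag the collector
	// as unhealthy instead of trusting the sidecar's "Running" status.
	if len(failures) > 0 {
		os.Exit(1)
	}
}
```

This is only a workaround; the real fix would be for the sidecar itself to notice the failed input.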
Steps to reproduce the problem

Corrupt a journald file (journalctl --verify will report the corruption), configure a filebeat collector with a journald input, and start the sidecar. The system log then shows filebeat starting and the journald input failing:

```
May 19 19:55:16 hostname systemd[1]: Started Wrapper service for Graylog controlled collector.
May 19 19:55:16 hostname graylog-sidecar[25062]: time="2023-05-19T19:55:16+08:00" level=info msg="Using node-id: <UUID>"
May 19 19:55:16 hostname graylog-sidecar[25062]: time="2023-05-19T19:55:16+08:00" level=info msg="No node name was configured, falling back to hostname"
May 19 19:55:16 hostname graylog-sidecar[25062]: time="2023-05-19T19:55:16+08:00" level=info msg="Starting signal distributor"
May 19 19:55:16 hostname graylog-sidecar[25062]: time="2023-05-19T19:55:16+08:00" level=info msg="Adding process runner for: filebeat-63a12208827d252d2f7931ca"
May 19 19:55:16 hostname graylog-sidecar[25062]: time="2023-05-19T19:55:16+08:00" level=info msg="[filebeat-63a12208827d252d2f7931ca] Configuration change detected, rewriting configuration file."
May 19 19:55:16 hostname filebeat[25072]: 2023-05-19T19:55:16.175+0800 WARN map[file.line:175 file.name:beater/filebeat.go] Filebeat is unable to load the ingest pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the ingest pipelines or are using Logstash pipelines, you can ignore this warning. {"ecs.version": "1.6.0"}
May 19 19:55:16 hostname graylog-sidecar[25062]: time="2023-05-19T19:55:16+08:00" level=info msg="[filebeat-63a12208827d252d2f7931ca] Starting (exec driver)"
May 19 19:55:16 hostname filebeat[25080]: 2023-05-19T19:55:16.237+0800 WARN map[file.line:175 file.name:beater/filebeat.go] Filebeat is unable to load the ingest pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the ingest pipelines or are using Logstash pipelines, you can ignore this warning. {"ecs.version": "1.6.0"}
May 19 19:55:16 hostname filebeat[25080]: 2023-05-19T19:55:16.287+0800 WARN map[file.line:307 file.name:beater/filebeat.go] Filebeat is unable to load the ingest pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the ingest pipelines or are using Logstash pipelines, you can ignore this warning. {"ecs.version": "1.6.0"}
May 19 19:55:16 hostname filebeat[25080]: 2023-05-19T19:55:16.287+0800 WARN [input] map[file.line:102 file.name:v2/loader.go] EXPERIMENTAL: The journald input is experimental {"ecs.version": "1.6.0"}
May 19 19:55:16 hostname filebeat[25080]: 2023-05-19T19:55:16.324+0800 ERROR [input.journald] map[file.line:124 file.name:compat/compat.go] Input 'journald' failed with: input.go:130: input everything failed (id=everything)
failed to read message field: bad message {"ecs.version": "1.6.0"}
```
Observe that the collector status still shows as "Running". Remove the corrupt file, restart the service, and view the collected logs.
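For what it's worth, the corruption itself is easy to detect: journalctl --verify exits non-zero when any journal file fails verification. A small pre-flight check along those lines could flag exactly the condition that currently goes unreported (a sketch; wiring it into the sidecar is my assumption, not an existing feature):

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

func main() {
	// journalctl --verify checks every journal file and exits
	// non-zero if any of them fails verification.
	cmd := exec.Command("journalctl", "--verify")
	out, err := cmd.CombinedOutput()
	if err != nil {
		fmt.Fprintf(os.Stderr, "journal verification failed: %v\n%s", err, out)
		os.Exit(1)
	}
	fmt.Println("journal files verified OK")
}
```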
Environment
Sidecar Version: 1.4.0
Graylog Version: 5.1
Operating System: Debian 11
Elasticsearch Version: 7.17.6
MongoDB Version: 5.0.18