## About

The grafana_metrics_used.py script extends the functionality of my Query Scraper.

It connects to the Grafana API and scrapes all the dashboards for two key items: variables and PromQL queries.

The goal is to gather all the metrics used in the dashboards. After collecting the queries and variables, the script preprocesses them to extract only the metric names, removes duplicates, and sorts the result alphabetically.
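To illustrate, here is a minimal sketch of that preprocessing step (not the script's actual implementation; the regex and the keyword list are rough simplifications):

    import re

    # PromQL identifiers: may contain colons (recording rules) and
    # cannot start with a digit.
    IDENT = re.compile(r"[a-zA-Z_:][a-zA-Z0-9_:]*")

    # Tokens that match the pattern but are PromQL syntax, not metrics.
    PROMQL_WORDS = {
        "sum", "avg", "min", "max", "count", "rate", "irate", "increase",
        "by", "without", "on", "ignoring", "and", "or", "unless", "topk",
        "bottomk", "histogram_quantile", "offset", "bool",
    }

    def extract_metric_names(queries):
        names = set()
        for query in queries:
            # Strip range selectors like [5m] so durations are not tokenized.
            query = re.sub(r"\[[^\]]*\]", "", query)
            for token in IDENT.findall(query):
                if token not in PROMQL_WORDS:
                    names.add(token)
        # A set handles deduplication; sorted() gives alphabetical order.
        return sorted(names)

    print(extract_metric_names(
        ['sum(rate(container_cpu_usage_seconds_total[5m])) by (pod)']
    ))
    # ['container_cpu_usage_seconds_total', 'pod'] - label names like
    # "pod" slip through this toy filter; the real script is stricter.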

Step 1: We go from this:

    {
        "dashboard": "MyDashboard / K8s / Cluster / Ephemeral Storage",
        "metric_names": [
            "node_filesystem_avail_bytes",
            "node_filesystem_size_bytes",
            "up"
        ]
    },
    {
        "dashboard": "MyDashboard / K8s / Cluster View",
        "metric_names": [
            "kube_namespace_created",
            "container_cpu_usage_seconds_total",
            "kube_ingress_info",

Step 2: ...to this. Clean :)

    "cluster:namespace:pod_cpu:active:kube_pod_container_resource_limits",
    "cluster:namespace:pod_cpu:active:kube_pod_container_resource_requests",
    "cluster:namespace:pod_memory:active:kube_pod_container_resource_limits",
    "cluster:namespace:pod_memory:active:kube_pod_container_resource_requests",
    "container_cpu_usage_seconds_total",
    "container_memory_cache",
    "container_memory_rss",
    "container_memory_swap",
    "container_memory_working_set_bytes",
    "container_network_receive_bytes_total",
    "container_network_receive_packets_total",
    "container_network_transmit_bytes_total",
    "container_network_transmit_packets_total",
    "kube_configmap_info",

With this list in hand, users can identify the exact metrics the existing dashboards need in order to function correctly. Any metrics outside the list can be restricted or deleted, reducing the number of metrics Prometheus scrapes. The result is faster queries, lower memory usage, and a more robust monitoring setup.
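As a rough, hypothetical follow-up (the Prometheus URL and the grafana_metrics_used.json file name below are assumptions, not part of this repo), you could diff the dashboard metrics against everything Prometheus currently scrapes to find candidates for dropping:

    import json

    import requests

    # Assumed location; adjust to your environment.
    PROMETHEUS_URL = "http://localhost:9090"

    # Assumed output file of the metrics script: a JSON list of names.
    with open("grafana_metrics_used.json") as f:
        used = set(json.load(f))

    # The Prometheus HTTP API lists every known metric name here.
    resp = requests.get(f"{PROMETHEUS_URL}/api/v1/label/__name__/values")
    resp.raise_for_status()
    scraped = set(resp.json()["data"])

    unused = sorted(scraped - used)
    print(f"{len(unused)} metrics are scraped but unused by any dashboard")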

## How to run the script?

You can run it in two ways:

1. Docker
2. Python

To run it with Python, modify the following section of grafana_query_scraper.py, providing your Grafana URL and a Grafana API key (see the next section for how to get one):

    if __name__ == "__main__":
        # Your Grafana URL
        grafana_url = "<YOUR-GRAFANA-URL>"
        # API key - see "Getting Grafana API key" below for instructions.
        api_key = "<YOUR-GRAFANA-API-KEY>"
        # File name where the result of the scrape will be saved
        file_name = 'grafana_queries.json'

Install the relevant dependencies and execute the script:

    pip install requests
    python3 grafana_query_scraper.py

## Getting Grafana API key

To connect to the Grafana API, an API key is required. The screenshots below walk through creating one in the Grafana UI.

*(Screenshots: Image 1 through Image 6.)*
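For reference, the key is sent as a bearer token. Here is a minimal sketch of an authenticated call against the Grafana HTTP API (/api/search is the standard endpoint for listing dashboards):

    import requests

    grafana_url = "<YOUR-GRAFANA-URL>"
    api_key = "<YOUR-GRAFANA-API-KEY>"

    # Grafana API keys go in the Authorization header as a bearer token.
    headers = {"Authorization": f"Bearer {api_key}"}

    # type=dash-db restricts the search to dashboards.
    resp = requests.get(f"{grafana_url}/api/search",
                        params={"type": "dash-db"}, headers=headers)
    resp.raise_for_status()
    print(f"Found {len(resp.json())} dashboards")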