Exporting Prometheus metrics into Google Sheets

 Note: this post is linked to the previous one; if you are curious, feel free to check out Exploring Car Metrics Visualization with Grafana and Prometheus

 


 

 

Picking up from the previous post, I ran into another challenge: exporting metrics to an external place (in my case an S3 bucket). That's where I wrote a small script/tool called P2C (or rather, ChatGPT helped me write it, as you can tell from the syntax).

P2C stands for Prometheus 2 CSV. This simple script/application lets you easily export data from Prometheus servers to CSV files and even upload them to an AWS S3 bucket if needed.

In this blog post, I'll walk you through the project and demonstrate how you can seamlessly integrate these metrics into your Google Sheets for data visualization.

Another goal of this little script was to discover new ways of sharing data. Playing Data Engineer for a weekend proved that Grafana dashboards "could be" replicated in Google Sheets, with more limited graphs (yet Excel is still the foundation of modern society 😆).




 


 

Prerequisites

Before diving into the project, ensure you have the following prerequisites:

  • Go: Prometheus 2 CSV is written in Go, so you'll need Go (tested with Go 1.21.1) installed on your system. If you don't already have it, you can download and install Go from the official Go website or use the provided Dockerfile/image.

  • AWS S3 Bucket (Optional): If you plan to upload the CSV file to an AWS S3 bucket, make sure you have your AWS credentials configured. The Go script will automatically use the default AWS credentials or take them from environment variables (AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID); see the snippet right after this list.

  • Prometheus: To use this tool, you must have access to a Prometheus server with a valid address to query data.
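
If you go the environment-variable route, a minimal sketch (with placeholder values, not real keys) could look like this:

export AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"        # placeholder access key
export AWS_SECRET_ACCESS_KEY="xxxxxxxxxxxxxxxxxxxx"    # placeholder secret key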

Demo

Before getting started, you can watch a quick demo of the Prometheus 2 CSV in action.

This demo showcases the project running on Ubuntu 22.04 and Golang 1.18/1.21.



Examples

Here are a few usage examples to give you a better idea of how I exported the data from Prometheus:

 

Export data from Prometheus to a local CSV file:

p2c --query "combustionFuelLevel_range{job='bmw_exporter'} \
currentMileages{job='bmw_exporter'} \
gas_fuel_price{job='bmw_exporter'}" \
--time-range "7d" --address "http://prometheus.example.com:9090"

 

Export data from Prometheus and upload it to an AWS S3 bucket:

p2c --query "combustionFuelLevel_range{job='bmw_exporter'} \
currentMileages{job='bmw_exporter'} \
gas_fuel_price{job='bmw_exporter'}" \
--time-range "7d" --address "http://prometheus.example.com:9090" \
--upload-s3 "s3://my-s3-bucket/folder/" --region "us-west-1"
 
In my specific use case, I am running p2c as a Kubernetes CronJob in order to scrape all the car metrics and push them directly into an S3 bucket; a rough sketch of the manifest is shown below.
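
For reference, a minimal CronJob sketch could look something like the following (the image name, schedule, and secret name are hypothetical placeholders rather than my exact manifest, and it assumes the container entrypoint is p2c):

apiVersion: batch/v1
kind: CronJob
metadata:
  name: p2c-export
spec:
  schedule: "0 0 * * *"                        # run every day at midnight
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: OnFailure
          containers:
            - name: p2c
              image: my-registry/p2c:latest    # hypothetical image location
              args:
                - --query
                - "combustionFuelLevel_range{job='bmw_exporter'} currentMileages{job='bmw_exporter'} gas_fuel_price{job='bmw_exporter'}"
                - --time-range
                - "7d"
                - --address
                - "http://prometheus.example.com:9090"
                - --upload-s3
                - "s3://my-s3-bucket/folder/"
                - --region
                - "us-west-1"
              envFrom:
                - secretRef:
                    name: aws-credentials      # hypothetical secret holding AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY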
 
For more details, you can head over to the GitHub repo here:

 

Automate Data Import to Google Sheets

One of the powerful use cases for this tool is to automatically export Prometheus metrics to S3, which can then be pulled into Google Sheets for easy visualization.

The S3 bucket is then exposed only to Cloudflare via IP-based conditions (refreshed by a Lambda function), while a WAF rule allows only my Google Sheet to reach the data; a rough sketch of the bucket policy is shown below.
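
As an illustration, such a bucket policy could look roughly like this, with a placeholder bucket name and just two of Cloudflare's published IPv4 ranges as examples (in my setup the range list is kept up to date by the Lambda function):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCloudflareOnly",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-s3-bucket/folder/*",
      "Condition": {
        "IpAddress": {
          "aws:SourceIp": ["173.245.48.0/20", "103.21.244.0/22"]
        }
      }
    }
  ]
}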

To retrieve the data in Google Sheets, I used the IMPORTDATA function, which can fetch any HTTP resource with a specific delimiter.

For example, you can use the following formula to import data from a remote CSV file:

=IMPORTDATA("https://my-s3-bucket/path/subpath/metrics.csv", ",")

Once you have the data in Google Sheets, you can start creating graphs and charts to visualize your Prometheus metrics however you prefer.

 

If you are using P2C as above, the tool will also include the current date of the retrieved data in the uploaded file names, e.g.:

  • 04-11-2023-16-54-_job_xyz_.csv
  • latest-_job_xyz_.csv


To always have fresh data in your Google Sheet, you can use the "latest" CSV; that way the graphs update automatically each day if you run the tool as a cronjob (e.g. every day at midnight), as in the formula below.
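
For example, reusing the same IMPORTDATA approach with the hypothetical bucket path and job name from above:

=IMPORTDATA("https://my-s3-bucket/path/subpath/latest-_job_xyz_.csv", ",")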


Conclusion


To conclude, I think one of the lessons beyond the technical aspect is that nowadays we strive for specialization (e.g. being a good Network Engineer, DevOps, Product Engineer, Data Scientist, etc.), and that's correct and how society developed, but we shouldn't forget to leave room for being versatile and going outside our scope/perimeter.

 

If you are a Developer, do not stop at your code

If you are a DevOps/SRE, do not stop at your infrastructure

If you are a Networker, do not stop at your network packets

If you are a Business/Product owner, do not stop at your own perimeter

Always try to get into the next topic outside your perimeter with the same methodology, as technical aspects change over time or can be deprecated very quickly.


:wq!
