Pgexplaindash: Logs Postgres EXPLAIN ANALYZE results for analysis in a dashboard


Showcase video:

showcase-pg-db-optimize.mp4

An integrated application using Python, Grafana, Vector, and Loki to store EXPLAIN ANALYZE query results and display them for analysis in a simple dashboard.

The motivation is to define many PostgreSQL EXPLAIN ANALYZE queries across several databases, execute them all in a single run, and then analyze the results interactively on the dashboard.

Inspired by pev2. Please check out this amazing tool.

Requirements

  • Python
  • A running PostgreSQL database
  • Grafana
  • Vector
  • Loki
  • Docker Engine or Docker Desktop

The Python program executes the EXPLAIN queries using the saved queries and databases. Results are logged with Python's logging module, and Vector picks the logs up from the Docker container running the program. Vector sends them to Loki for storage, and Grafana queries Loki to display the data in the dashboards.
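As a rough sketch of the logging step, the snippet below runs one EXPLAIN query and emits the plan as a single JSON log line that Vector can pick up. The psycopg driver, the connection string, and the helper name are illustrative assumptions, not the project's actual code.

import json
import logging

import psycopg  # assumption: psycopg 3 as the PostgreSQL driver

logging.basicConfig(level=logging.INFO, format="%(message)s")
logger = logging.getLogger("explain")

def log_explain(dsn: str, query: str) -> None:
    """Run an EXPLAIN (ANALYZE, FORMAT JSON) query and log the plan as one JSON line."""
    with psycopg.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(query)
        plan = cur.fetchone()[0]  # one row, one column holding the JSON plan
    if isinstance(plan, str):  # some driver setups return the plan as text
        plan = json.loads(plan)
    # One JSON object per line keeps the log easy for Vector to parse and ship to Loki.
    logger.info(json.dumps({"query": query, "plan": plan}))

log_explain(
    "postgresql://postgres:postgres@localhost:5432/postgres",  # hypothetical DSN
    "EXPLAIN (ANALYZE, FORMAT JSON) SELECT 1;",
)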

Keep in mind that this application only measures the elapsed time of SQL statements as they execute within the database engine itself. It does not account for any of the additional latencies and overhead you’ll encounter in a real‐world setting, including but not limited to:

  • Network latency between your application server and the database host

  • Client‐side processing, such as building and serializing the query or parsing and deserializing the result set

  • Driver and ORM overhead, including marshalling parameters, mapping rows to objects, and any client‐side caching

  • I/O contention on the database server (disk reads/writes, log flushing, buffer cache misses)

  • Resource throttling or CPU scheduling enforced by container orchestrators or hypervisors

Because these factors can often dominate end‐to‐end response times, be sure to profile your full application stack—including the network path and client code—if you need an accurate measure of total latency.

Please have Docker Engine or Docker Desktop installed.

Set up database connections

Databases running locally in other Docker containers must be on the same Docker network as the program. To achieve this, start each container running a PostgreSQL database on the same Docker network.

Create network

docker network create `name`

Set up the external network for PostgreSQL databases in docker-compose

services:
  pg-container:
    networks:
      - `name`

networks:
  `name`:
    external: true

Set the NETWORK env var in .env to this network name.
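For example, using the same network name as above:

NETWORK=`name`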

Add queries and databases

Do not include & in any database or query fields. The system doesn't sanitize &, which causes hard-to-diagnose errors or silent failures.

Only EXPLAIN ANALYZE queries in JSON format are supported. Always format SQL statements as:

EXPLAIN (ANALYZE, FORMAT JSON) SELECT ... ;
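For instance, a saved query against a hypothetical orders table might look like:

EXPLAIN (ANALYZE, FORMAT JSON) SELECT * FROM orders WHERE customer_id = 42;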
Navigate to the project directory:
cd `root`/db-optimize-logger

Option 1: Using uv

uv venv
uv run app/build_queries.py

Option 2: Using pip

pip install dearpygui pandas pydantic
python app/build_queries.py

After running either option, add databases and queries, then save them.

Run the application and access the dashboard

After adding queries and databases, you can run the rest of the application.

The start.sh script deletes any data from the previous run, executes the queries, starts the pipeline, and prepares the dashboard.

Fresh start (deletes previous data)

chmod +x start.sh
./start.sh
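As a rough sketch, a script like this typically resets state and brings the stack up; the actual start.sh may differ, and the data path and Compose usage here are assumptions:

rm -rf ./data                  # drop logs/results from the previous run (assumed path)
docker compose up -d --build   # start the query runner, Vector, Loki, and Grafana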

Preserve Existing Data

Note: query execution time depends on query complexity; the dashboard becomes available once processing finishes. Then open:

http://localhost:3000/a/ivarehaugland-explaindbdashboard-app/home
