pgBadger is a popular PostgreSQL log analyzer. This article describes how to use pgBadger to analyze BigAnimal logs stored in Azure Blob Storage or an AWS S3 bucket.
- BigAnimal has several options for logging solutions. Before exporting logs, make sure you are using the custom logging solution, which stores Postgres logs in Azure Blob Storage or an AWS S3 bucket, depending on the cloud provider you use.
- To get meaningful database analysis reports, you must enable and set up log configurations in your BigAnimal cluster.
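The richer the logging, the more complete the pgBadger report. A minimal sketch of Postgres parameters commonly enabled for pgBadger analysis (tune the values for your workload; `log_min_duration_statement = 0` logs every statement and can be expensive in production):

```ini
# Log the duration of every statement; raise this threshold in production
log_min_duration_statement = 0
log_checkpoints = on
log_connections = on
log_disconnections = on
log_lock_waits = on
log_temp_files = 0
log_autovacuum_min_duration = 0
```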
Clone the pgBadger git repository and install
git clone https://github.com/darold/pgbadger.git
cd pgbadger
perl Makefile.PL
make && sudo make install
Export BigAnimal logs from Azure Blob Storage
- Get the Azure Blob Storage account name from the BigAnimal Portal, on the Clusters - <Cluster Name> - Monitoring & Logging page:
- Log in to the Azure portal, go to Storage accounts, and find the storage account by the name you got from the BigAnimal portal:
- Click the BigAnimal storage account, then click Containers in the left menu. BigAnimal cluster logs are stored in a container named biganimal-logs.
- Inside the container, locate the logs by the BigAnimal cluster ID, which is part of each log file name.
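After downloading the blobs, you can pick out one cluster's files by the cluster ID embedded in each name. A small Python sketch with hypothetical file names (the real naming scheme may differ):

```python
# Select the downloaded log files that belong to one BigAnimal cluster.
# The file names below are hypothetical; the real scheme embeds the
# cluster ID somewhere in each name, which is all this filter relies on.
def logs_for_cluster(file_names, cluster_id):
    """Return only the log files whose name contains cluster_id."""
    return [name for name in file_names if cluster_id in name]

downloaded = [
    "p-a1b2c3d4e5/postgres-1.log.json",
    "p-a1b2c3d4e5/postgres-2.log.json",
    "p-f6g7h8i9j0/postgres-1.log.json",
]

print(logs_for_cluster(downloaded, "p-a1b2c3d4e5"))
```

Pointing pgBadger at a directory that contains only one cluster's files keeps the report from mixing statistics across clusters.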
Export BigAnimal logs from AWS S3 Bucket
- Get the AWS S3 bucket name from the BigAnimal Portal, on the Clusters - <Cluster Name> - Monitoring & Logging page:
- Log in to the AWS Management Console, go to the Amazon S3 service, and locate the bucket by the name obtained from the BigAnimal portal:
- Click the kubernetes-logs object and find the cluster logs by the cluster ID. If your cluster is highly available, you see log objects for each pod.
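Because each pod of a highly available cluster writes its own log objects, it can help to group the S3 keys by pod before downloading. A small Python sketch, assuming a hypothetical `kubernetes-logs/<pod-name>/<file>` key layout:

```python
from collections import defaultdict

# Group S3 log keys by the pod that produced them.
# The key layout here is an assumption for illustration: the actual
# BigAnimal layout under kubernetes-logs/ may differ.
def group_by_pod(keys):
    groups = defaultdict(list)
    for key in keys:
        # Take the second-to-last path segment as the pod name
        pod = key.rsplit("/", 2)[-2]
        groups[pod].append(key)
    return dict(groups)

keys = [
    "kubernetes-logs/p-a1b2c3d4e5-1/postgres.log.json",
    "kubernetes-logs/p-a1b2c3d4e5-1/postgres.1.log.json",
    "kubernetes-logs/p-a1b2c3d4e5-2/postgres.log.json",
]
print(group_by_pod(keys))
```

Keeping each pod's logs in its own directory lets you generate one pgBadger report per pod, or feed them all to pgBadger together for a cluster-wide view.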
Generate a pgBadger report
- Run pgBadger, setting the log format to jsonlog with the -f option and the output file with the -o option. To parse multiple logs, pass the log directory to pgBadger.
pgbadger -f jsonlog <log_files_directory> -o <output.html>
- An HTML file is generated as the pgBadger report:
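jsonlog files contain one JSON record per line, so a quick sanity check before running pgBadger can catch files truncated mid-write during export. A short Python sketch (the sample records are hypothetical, not actual BigAnimal log content):

```python
import json

# Return the 1-based line numbers that fail to parse as JSON.
# Useful for spotting records truncated during export before
# handing the files to pgBadger's jsonlog parser.
def bad_json_lines(lines):
    bad = []
    for lineno, line in enumerate(lines, start=1):
        if not line.strip():
            continue  # ignore blank lines
        try:
            json.loads(line)
        except json.JSONDecodeError:
            bad.append(lineno)
    return bad

sample = [
    '{"error_severity": "LOG", "message": "checkpoint complete"}',
    '{"message": ',  # a record cut off mid-write
]
print(bad_json_lines(sample))  # → [2]
```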