pgBadger is a popular PostgreSQL log analyzer. This article describes how to use pgBadger to analyze BigAnimal logs stored in Azure Log Analytics Workspaces and AWS CloudWatch.
Prerequisites
- BigAnimal offers several logging solutions. By default, logs from your Postgres clusters are forwarded to an Azure Log Analytics workspace or to AWS CloudWatch, depending on the cloud provider you use. Before exporting logs, make sure you are using the default logging solution.
- To get a meaningful database analysis report, you must enable and set up the Postgres logging configuration in your BigAnimal cluster; see the sketch after this list for one way to inspect the settings that matter most to pgBadger.
- When exporting BigAnimal cluster logs, make sure to split the logs into structured fields that match the Postgres csvlog format.
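As a quick check, you can connect to the cluster and inspect the logging parameters that drive most of pgBadger's metrics. This is a minimal sketch: the connection string is a placeholder, and which values are appropriate depends on how much detail you want in the report.

# Placeholder connection string -- substitute your cluster's hostname and user.
psql "host=<CLUSTER_HOST> port=5432 dbname=postgres user=edb_admin sslmode=require" <<'SQL'
SHOW log_min_duration_statement;  -- statement durations feed the query-time reports
SHOW log_connections;
SHOW log_disconnections;
SHOW log_lock_waits;
SHOW log_temp_files;
SQL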
Install pgBadger
- Clone the pgBadger git repository and install pgBadger. At the time of publishing this article, the current pgBadger version is 12.0.
git clone https://github.com/darold/pgbadger.git
cd pgbadger/
perl Makefile.PL
make && sudo make install
- Install an additional Perl module to enable pgBadger to parse logs in CSV format:
perl -MCPAN -e shell
install Text::CSV_XS
quit
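To confirm the installation, check that both pgBadger and the CSV parser module are visible. These one-liners assume pgbadger landed on your PATH:

pgbadger --version
perl -MText::CSV_XS -e 'print "Text::CSV_XS $Text::CSV_XS::VERSION\n"'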
Export BigAnimal logs from Azure Log Analytics Workspaces
- Log in to the Azure Portal and go to Log Analytics workspaces. Based on your cluster's location and resource group, select the Log Analytics workspace created by BigAnimal.
- Use the following Kusto Query Language (KQL) query to fetch the cluster logs for a given pg_cluster_id_s.
PostgresLogs_CL
| where pg_cluster_id_s == '<PG_CLUSTER_ID>'
| project record_log_time_s, record_user_name_s, record_database_name_s, record_process_id_s, record_connection_from_s, record_session_id_s, record_session_line_num_s, record_command_tag_s, record_session_start_time_s, record_virtual_transaction_id_s, record_transaction_id_s, record_error_severity_s, record_sql_state_code_s, record_message_s, record_detail_s, record_hint_s, record_context_s, record_query_s, record_query_pos_s, record_application_name_s, record_backend_type_s, record_query_id_s
| sort by record_log_time_s desc
- Once you have the query results, export them to a CSV file by clicking Export -> CSV (all columns):
(Note that Azure Log Analytics shows only the first 30,000 results per query. Refer to the Azure documentation for ways to narrow the result set, for example by adding a time filter to the query.)
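If the 30,000-result limit gets in the way, one option is to run the same KQL query from the command line and iterate over smaller time windows. This is a sketch only: it assumes the Azure CLI with the az monitor log-analytics query command available (shipped as the log-analytics extension in some CLI versions), and <WORKSPACE_ID> stands for the workspace GUID. The CLI returns JSON, so you would still need to convert the output to CSV before handing it to pgBadger.

az monitor log-analytics query \
  --workspace <WORKSPACE_ID> \
  --analytics-query "PostgresLogs_CL | where pg_cluster_id_s == '<PG_CLUSTER_ID>' | sort by record_log_time_s desc" \
  --timespan "P1D" \
  --output json > cluster-logs.json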
Export BigAnimal logs from AWS CloudWatch
- Log in to the AWS Management Console, and go to CloudWatch Logs Insights. Select the BigAnimal log group starting with the prefix /biganimal/PostgresLogs, and use the following query string to query the cluster logs for a given pg_cluster_id.
fields record.log_time, record.user_name, record.database_name, record.process_id, record.connection_from, record.session_id, record.session_line_num, record.command_tag, record.session_start_time, record.virtual_transaction_id, record.transaction_id, record.error_severity, record.sql_state_code, record.message, record.hint, record.query, record.application_name, record.backend_type, record.query_id
| filter pg_cluster_id == '<PG_CLUSTER_ID>'
| sort record.log_time desc
- Once you have the query results, download them to a CSV file by clicking Export results -> Download table (CSV):
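The same query can also be scripted with the AWS CLI if you want to avoid the console. A sketch under a few assumptions: the log group name is a placeholder (the real group name carries a cluster-specific suffix), date -d is GNU date syntax, and the results come back as JSON that you would convert to CSV yourself.

# Start a Logs Insights query over the last 24 hours (placeholder log group name).
QUERY_ID=$(aws logs start-query \
  --log-group-name '/biganimal/PostgresLogs-<SUFFIX>' \
  --start-time $(date -d '24 hours ago' +%s) \
  --end-time $(date +%s) \
  --query-string "fields record.log_time, record.message | filter pg_cluster_id == '<PG_CLUSTER_ID>' | sort record.log_time desc" \
  --output text --query queryId)

# Fetch the results once the query status reaches Complete.
aws logs get-query-results --query-id "$QUERY_ID" > cluster-logs.json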
Generate report
- Run pgBadger, setting the log format to csv with the -f option and the output file with the -o option.
pgbadger -f csv <log.csv> -o <output.html>
- An HTML file is generated as the pgBadger report:
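pgBadger also accepts several input files in a single run, which helps if you exported the logs in batches to stay under the portal limits. A sketch, with placeholder file names:

pgbadger -f csv -j 4 -o cluster-report.html export-part1.csv export-part2.csv
# -j 4 parses with four parallel jobs; open cluster-report.html in any browser.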