pgBadger is a popular PostgreSQL log analyzer. This article describes how to use pgBadger to analyze BigAnimal logs.
- Install pgBadger
Follow the installation steps to install pgBadger. At the time of writing, the latest version of pgBadger is v11.8.
- Install one additional Perl module so that pgBadger can parse CSV-format logs:
perl -MCPAN -e shell
install Text::CSV_XS
quit
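As a quick check that the installation worked (a minimal sketch, assuming pgbadger and perl are on your PATH), print the pgBadger version and confirm that the Text::CSV_XS module loads:
# Both commands should complete without errors
pgbadger --version
perl -MText::CSV_XS -e 'print "Text::CSV_XS $Text::CSV_XS::VERSION\n"'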
- Export BigAnimal logs
The recommended way to query BigAnimal logs is to use the logging services of the cloud providers.
Important: For pgBadger to parse the CSV log, make sure the exported logs are split into structured fields matching those of the Postgres csvlog format.
Azure: BigAnimal cluster logs are stored in the customer's Log Analytics workspace. Use the following query to retrieve the logs for a given cluster ID. Once you have the query results, export them to a CSV file by clicking Export -> Export to CSV - all columns.
PostgresLogs_CL
| where pg_cluster_id_s == '<PG_CLUSTER_ID>'
| project record_log_time_s, record_user_name_s, record_database_name_s, record_process_id_s, record_connection_from_s, record_session_id_s, record_session_line_num_s, record_command_tag_s, record_session_start_time_s, record_virtual_transaction_id_s, record_transaction_id_s, record_error_severity_s, record_sql_state_code_s, record_message_s, record_detail_s, record_hint_s, record_context_s, record_query_s, record_query_pos_s, record_application_name_s, record_backend_type_s, record_query_id_s
| sort by record_log_time_s desc
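If you prefer the command line over the portal, the same KQL can also be run with the Azure CLI. This is only a sketch: it assumes your Log Analytics workspace GUID is in the WORKSPACE_ID variable, it may require the Azure CLI log-analytics extension, and the output is JSON, so it would still need to be converted into csvlog-style CSV before pgBadger can read it.
# Run the cluster-log query against the Log Analytics workspace (JSON output)
az monitor log-analytics query \
  --workspace "$WORKSPACE_ID" \
  --analytics-query "PostgresLogs_CL | where pg_cluster_id_s == '<PG_CLUSTER_ID>' | sort by record_log_time_s desc"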
AWS: BigAnimal cluster logs are stored in log streams within CloudWatch log groups. Use the following CloudWatch Logs Insights query to retrieve the logs for a given cluster ID. Once you have the query results, export them to a CSV file by clicking Export results -> Download table (CSV).
fields record.log_time, record.user_name, record.database_name, record.process_id, record.connection_from, record.session_id, record.session_line_num, record.command_tag, record.session_start_time, record.virtual_transaction_id, record.transaction_id, record.error_severity, record.sql_state_code, record.message, record.hint, record.query, record.application_name, record.backend_type, record.query_id
| filter pg_cluster_id == '<PG_CLUSTER_ID>'
| sort record.log_time desc
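Depending on how the tool exports the results, the CSV file may start with a header row listing the column names. pgBadger expects rows in the raw csvlog layout, so you may need to strip that header before generating the report. A minimal sketch, assuming the exported file is named exported.csv:
# Drop the first (header) line and keep the remaining log rows
tail -n +2 exported.csv > log.csv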
- Generate report
pgbadger -f csv <log.csv> -o <output.html>
The -f option sets the log format to csv, and the -o option sets the output file. An HTML file is generated as the pgBadger report.
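pgBadger also accepts several input files at once, so multiple exported CSV files can be combined into a single report. A short example, with hypothetical file names:
# Build one report from two exported CSV files
pgbadger -f csv -o report.html cluster_log_1.csv cluster_log_2.csv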