BigAnimal supports viewing metrics and logs from your cloud provider. For clusters created on Azure, BigAnimal sends cluster logs to Azure Log Analytics.
By running KQL queries, you can retrieve the related Postgres logs. You can also use the example query we shared here.
This article introduces different ways to export logs from an Azure Log Analytics workspace (LAW).
Method 1 Export logs to CSV
Simple, no additional configuration
One-time export; it can't be scheduled to repeat, and it is limited to 30,000 rows
Click the Export button in the top action bar of the query window.
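If you need more than 30,000 rows, a common workaround is to narrow the query's time range and export in chunks. A rough sketch of the idea (the helper name and window size are our own, not part of Azure):

```python
from datetime import datetime, timedelta

def split_time_range(start, end, window_hours=6):
    """Split [start, end) into fixed-size windows so each export
    stays under the portal's 30,000-row limit. The window size is
    a guess; tune it to your cluster's log volume."""
    windows = []
    cursor = start
    step = timedelta(hours=window_hours)
    while cursor < end:
        windows.append((cursor, min(cursor + step, end)))
        cursor += step
    return windows

# Example: one day split into 6-hour windows -> 4 separate exports
day = split_time_range(datetime(2024, 1, 1), datetime(2024, 1, 2))
print(len(day))  # 4
```

Each window's start/end can then be plugged into the query's time-range filter before exporting.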
Method 2 Export logs to a Storage Account, Event Hubs, and other destinations
Customize your workflow
Choose the destination you want, for example Event Hubs or a Storage Account
Additional charges apply. Refer to Usage metering, billing, and pricing models for Azure Logic Apps and Logic Apps pricing
Step 1 Prepare your KQL query; the workflow will export this query's results to the destination.
Step 2 Go to Logic apps in the Azure portal and click '+ Add' to create your own Logic app
Step 3 Configure your Logic app. Pay particular attention to the Plan: Standard lets you configure multiple workflows in the same Logic app, while Consumption allows only one workflow per Logic app.
Step 4 You will be directed to the Logic Apps Designer automatically (Consumption plan). Start with the common trigger "Recurrence" and define the frequency for this workflow.
Step 5 Click Next Step and search for 'Azure Monitor Logs'
Step 6 Configure the 'Run query and list results' action and click Next Step
Query: the KQL query you prepared earlier.
If you want to send logs to Azure Event Hubs, continue with the steps below.
Step 7 Choose an operation -> Event Hubs -> Send event
Step 8 Configure Event Hubs as shown below
Get the Event Hubs connection string from your Event Hubs namespace
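The connection string is a semicolon-separated list of key=value pairs (Endpoint, SharedAccessKeyName, SharedAccessKey, and optionally EntityPath). A small sketch to sanity-check the string you pasted; the helper is our own and the values below are fabricated placeholders:

```python
def parse_connection_string(conn_str):
    """Split an Event Hubs connection string of the form
    'Endpoint=...;SharedAccessKeyName=...;SharedAccessKey=...'
    into a dict. partition() splits on the first '=', so keys
    whose values themselves end in '=' survive intact."""
    parts = {}
    for segment in conn_str.strip().rstrip(";").split(";"):
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

# Fabricated example values for illustration only
cs = ("Endpoint=sb://example-ns.servicebus.windows.net/;"
      "SharedAccessKeyName=RootManageSharedAccessKey;"
      "SharedAccessKey=FAKEKEY=;"
      "EntityPath=example-hub")
info = parse_connection_string(cs)
print(info["EntityPath"])  # example-hub
```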
Step 9 Configure the Send event action; the settings below were tested successfully
Content: 'value - Item'. With this setting, the Body is encoded
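Because the Body arrives encoded, a downstream consumer has to decode it before use. Assuming the encoding is Base64 over the JSON item (our reading of this Content setting; verify against your own events), a decode sketch with a fabricated payload:

```python
import base64
import json

# Stand-in for a Body field read from Event Hubs; the payload
# here is fabricated for illustration.
encoded = base64.b64encode(
    json.dumps({"Message": "connection received"}).encode("utf-8")
)

# What a consumer would do: Base64-decode, then parse the JSON item
decoded = json.loads(base64.b64decode(encoded))
print(decoded["Message"])  # connection received
```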
Step 10 Click Save, then Run Trigger to run the workflow manually and check whether it works.
If you want to send the query results to a Storage Account, follow this section.
Step 7 Add a 'Create blob (V2)' action as the third action
Create a connection to your Storage Account
You can get the Connection String from your Storage Account page with the steps below
After the connection is established, you can configure:
- Folder path: the container you want to send to
- Blob name: how to name the blob
- Blob content: Current item
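In the Logic App itself, Blob name is usually built from workflow expressions such as utcNow(). Purely to illustrate a timestamped naming scheme (the helper and layout are our own, not something the connector enforces):

```python
from datetime import datetime, timezone

def blob_name(prefix, when=None):
    """Build a timestamped blob name such as
    'pglogs/2024-01-01T00-00-00.json', so repeated workflow runs
    never overwrite each other's output."""
    when = when or datetime.now(timezone.utc)
    return f"{prefix}/{when.strftime('%Y-%m-%dT%H-%M-%S')}.json"

print(blob_name("pglogs", datetime(2024, 1, 1)))
# pglogs/2024-01-01T00-00-00.json
```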
Step 8 Save it and Run/Trigger, then go to your Storage Account to check the result.
Create CSV table
Step 7 After the 'Run query and list results' action, add a 'Create CSV table' action.
Step 8 In the 'From' field, choose 'value'. There are two kinds of 'value': plain 'value' and 'value - Item'; the latter creates one CSV per item, which is not what we want.
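To see why plain 'value' is the right choice: the whole 'value' array becomes one CSV table, whereas 'value - Item' would emit a separate single-row table per item. A local sketch of the 'value' behavior, with fabricated sample rows:

```python
import csv
import io

def rows_to_csv(rows):
    """Mimic the 'Create CSV table' action applied to the whole
    'value' array: one header row plus one line per query row."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Fabricated stand-in for the query's 'value' array
sample = [
    {"TimeGenerated": "2024-01-01T00:00:00Z", "Message": "connection received"},
    {"TimeGenerated": "2024-01-01T00:00:05Z", "Message": "connection authorized"},
]
print(rows_to_csv(sample).splitlines()[0])  # TimeGenerated,Message
```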
Please note: if you are using the Consumption plan and your workflow uses a For Each action, each item is billed as one action execution, so be careful with queries that may return thousands of rows. See Logic Apps pricing.
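To get a feel for the billing impact, you can estimate executions as the fixed actions in the workflow plus one execution per row per action inside the loop. The numbers below are illustrative only; check Logic Apps pricing for actual rates:

```python
def billed_action_executions(rows, actions_per_item=1, fixed_actions=3):
    """Rough estimate of Consumption-plan action executions when a
    For Each runs once per query row: every row multiplies the
    actions inside the loop. Defaults are illustrative guesses."""
    return fixed_actions + rows * actions_per_item

# A 10,000-row query with one action inside the loop
print(billed_action_executions(10_000))  # 10003
```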