Hello,
Welcome to Microsoft Q&A! Here is how you can pull Harness audit logs into Log Analytics with a Logic App and the Logs Ingestion API.
- Prepare Log Analytics to receive Harness logs
  - Create a custom table (e.g., Harness_Audit_CL) in your workspace via the portal. The table wizard also creates a Data Collection Rule (DCR) for you.
  - Note the ingestion endpoint and stream name from the DCR. The request URL format is:
    https://<your-dce-or-dcr-endpoint>.ingest.monitor.azure.com/dataCollectionRules/dcr-xxxx/streams/Custom-Harness_Audit_CL?api-version=2023-01-01
  - Give your sender identity permission on the DCR (IAM ➜ Monitoring Metrics Publisher is the role used in the official tutorial). A quick validation sketch follows this list.
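If you want to confirm the table, DCR, and role assignment before building the Logic App, a minimal sketch with the azure-monitor-ingestion Python SDK looks roughly like this. All values (endpoint, DCR immutable ID, stream name, column names) are placeholders to replace with your own.

```python
# A minimal validation sketch, assuming the azure-monitor-ingestion and
# azure-identity packages; all values below are placeholders for your own
# DCR endpoint, DCR immutable ID, stream name, and table columns.
from azure.core.exceptions import HttpResponseError
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient

endpoint = "https://<your-dce-or-dcr-endpoint>.ingest.monitor.azure.com"  # from the DCR overview
rule_id = "dcr-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"                          # the DCR immutable ID
stream_name = "Custom-Harness_Audit_CL"                                   # stream defined in the DCR

client = LogsIngestionClient(endpoint=endpoint, credential=DefaultAzureCredential())

# One test record shaped like the columns you defined on Harness_Audit_CL
# (column names here are illustrative, not required names).
test_record = [{
    "TimeGenerated": "2024-01-01T00:00:00Z",
    "Action": "INGESTION_TEST",
    "RawData": "hello from the ingestion test",
}]

try:
    client.upload(rule_id=rule_id, stream_name=stream_name, logs=test_record)
    print("Upload accepted by the Logs Ingestion API")
except HttpResponseError as e:
    # Typically a missing Monitoring Metrics Publisher assignment or a wrong stream name.
    print(f"Upload failed: {e}")
```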
- Build the Logic App workflow (Consumption or Standard)
  - Trigger: Recurrence (e.g., every 10 minutes).
  - Actions:
    - Get the 'since' window (variables or a key/value store) so you only pull deltas.
    - HTTP (to Harness): call …/audit/getauditeventlistv2 with query params for startTime/endTime (and paging). Add the Harness headers with your API key/secret; keep them in Azure Key Vault.
    - Transform the response JSON to the schema your DCR expects (you can also keep it raw and transform in the DCR). A Python sketch of the whole pull ➜ transform ➜ push loop follows right after this list.
    - HTTP (to Azure Monitor Logs Ingestion):
      - Method: POST
      - URL: your DCR/DCE ingestion URL (above)
      - Headers: Content-Type: application/json; charset=utf-8
      - Authentication: the Logic App's managed identity, with Audience https://monitor.azure.com/ (supported in the HTTP action). See https://free.blessedness.top/en-us/azure/logic-apps/authenticate-with-managed-identity?tabs=consumption
The records then show up in Harness_Audit_CL shortly after ingestion (usually within a few minutes). The official Logs Ingestion API docs and tutorial have end-to-end samples with the exact endpoint and fields:
https://free.blessedness.top/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-portal
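You can confirm rows landed by querying Harness_Audit_CL in Log Analytics, or from code with the azure-monitor-query SDK, roughly like this (the workspace ID is a placeholder; the table name matches the one above).

```python
# Optional verification sketch, assuming the azure-monitor-query package.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
result = client.query_workspace(
    workspace_id="<your-log-analytics-workspace-id>",
    query="Harness_Audit_CL | sort by TimeGenerated desc | take 10",
    timespan=timedelta(hours=1),
)
for table in result.tables:
    for row in table.rows:
        print(row)
```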
Please upvote and accept the answer if it helps!