---
title: Encoding modes for Azure logs
description: Understand the OTel Native and ECS logs encoding modes available in the EDOT Cloud Forwarder for Azure.
url: https://docs-v3-preview.elastic.dev/elastic/edot-cloud-forwarder-azure/tree/main/reference/edot-cf-azure/encoding-modes
products:
  - EDOT Cloud Forwarder
  - EDOT Cloud Forwarder for Azure
  - Elastic Cloud Serverless
  - Elastic Observability
applies_to:
  - Serverless Observability projects: Preview
  - Elastic Distribution of OpenTelemetry Cloud Forwarder for Azure: Preview since 0.7
---

# Encoding modes for Azure logs

The encoding mode controls how Azure resource logs are transformed before being sent to Elasticsearch.
EDOT Cloud Forwarder for Azure supports two encoding modes for logs.

| Mode                      | Description                                                                                                       | Data stream dataset                    | Prerequisites                                     |
|---------------------------|-------------------------------------------------------------------------------------------------------------------|----------------------------------------|---------------------------------------------------|
| **OTel Native** (default) | Parses logs into OpenTelemetry semantic conventions with structured attributes.                                   | `azure.<service>.otel`                 | None                                              |
| **ECS**                   | Preserves the original Azure log as a string and routes it through the Azure Logs integration for ECS processing. | `azure.events`, routed by category     | [Azure Logs integration](#ecs-mode-prerequisites) |


## ECS mode prerequisites

ECS mode requires the **Azure Logs** integration on your Elasticsearch cluster. The integration provides the ingest pipelines that parse the raw event, route it to the correct data stream based on the `category` field, and apply ECS field mappings.

To install the Azure Logs integration:
1. In your Elastic Cloud deployment or Serverless project, go to **Integrations**.
2. Search for **Azure Logs**.
3. Select **Install**.

Without the integration, documents remain unprocessed in `logs-azure.events-default` and might not be indexed correctly.
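The category-based routing the integration performs can be sketched as follows. This is an illustrative Python sketch, not the actual ingest pipeline: the `CATEGORY_TO_DATASET` mapping only contains the `Administrative` category shown in the example on this page, while the real integration handles many more categories.

```python
import json

# Illustrative mapping from Azure log category to final dataset.
# Only "Administrative" -> "azure.activitylogs" appears on this page;
# the real integration covers many more categories.
CATEGORY_TO_DATASET = {
    "Administrative": "azure.activitylogs",
}

def route(document: dict) -> str:
    """Parse the raw event preserved in `message` and choose the final
    data stream from its `category` field, falling back to the generic
    `azure.events` dataset when the category is unknown."""
    raw_event = json.loads(document["message"])
    dataset = CATEGORY_TO_DATASET.get(raw_event.get("category"), "azure.events")
    namespace = document["data_stream"]["namespace"]
    return f"logs-{dataset}-{namespace}"

doc = {
    "message": json.dumps({"category": "Administrative"}),
    "data_stream": {"namespace": "default"},
}
print(route(doc))  # logs-azure.activitylogs-default
```

The fallback to `azure.events` mirrors what happens without the integration: documents that cannot be routed stay in the generic data stream.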

## How encoding works

When an Azure resource log arrives at the Event Hub, EDOT Cloud Forwarder for Azure decodes and transforms it according to the selected encoding mode before sending it to Elasticsearch.
The following example shows how the same Azure Activity Log appears in Elasticsearch depending on the encoding mode.
This is a simplified Azure Activity Log as it arrives in the Event Hub:
<dropdown title="Log sample">
  ```json
  {
    "time": "2025-06-10T14:30:00.1234567Z",
    "resourceId": "/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM",
    "operationName": "Microsoft.Compute/virtualMachines/write",
    "category": "Administrative",
    "resultType": "Success",
    "correlationId": "abc-123-def-456",
    "level": "Information",
    "location": "westeurope",
    "callerIpAddress": "203.0.113.50",
    "properties": {
      "message": "Virtual machine updated successfully"
    }
  }
  ```
</dropdown>

In OTel Native mode (the default), the log is parsed into OpenTelemetry semantic conventions. Azure-specific fields are mapped to structured `attributes`, and cloud resource metadata is placed in `resource.attributes`:
<dropdown title="Log parsed to OTel semantic conventions">
  ```json
  {
    "@timestamp": "2025-06-10T14:30:00.123Z",
    "attributes": {
      "azure.category": "Administrative",
      "azure.correlation_id": "abc-123-def-456",
      "azure.operation.name": "Microsoft.Compute/virtualMachines/write",
      "azure.result.type": "Success",
      "network.peer.address": "203.0.113.50"
    },
    "body": {
      "text": "Virtual machine updated successfully"
    },
    "data_stream": {
      "dataset": "azure.activity.otel",
      "namespace": "default",
      "type": "logs"
    },
    "resource": {
      "attributes": {
        "cloud.provider": "azure",
        "cloud.region": "westeurope",
        "cloud.resource_id": "/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM"
      }
    },
    "severity_text": "Information"
  }
  ```
</dropdown>
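The mapping between the two documents above can be expressed as a small Python sketch. This is an illustrative simplification of the transformation, not the forwarder's implementation, and it only covers the fields shown in the samples:

```python
def to_otel(raw: dict) -> dict:
    """Map the raw Azure Activity Log fields shown above into the
    OTel Native document shape. Illustrative only; the data_stream
    fields and other metadata are omitted."""
    return {
        # Azure emits 100-nanosecond precision; truncate to milliseconds.
        "@timestamp": raw["time"][:23] + "Z",
        "attributes": {
            "azure.category": raw["category"],
            "azure.correlation_id": raw["correlationId"],
            "azure.operation.name": raw["operationName"],
            "azure.result.type": raw["resultType"],
            "network.peer.address": raw["callerIpAddress"],
        },
        "body": {"text": raw["properties"]["message"]},
        "resource": {
            "attributes": {
                "cloud.provider": "azure",
                "cloud.region": raw["location"],
                "cloud.resource_id": raw["resourceId"],
            }
        },
        "severity_text": raw["level"],
    }

sample = {
    "time": "2025-06-10T14:30:00.1234567Z",
    "resourceId": "/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM",
    "operationName": "Microsoft.Compute/virtualMachines/write",
    "category": "Administrative",
    "resultType": "Success",
    "correlationId": "abc-123-def-456",
    "level": "Information",
    "location": "westeurope",
    "callerIpAddress": "203.0.113.50",
    "properties": {"message": "Virtual machine updated successfully"},
}
print(to_otel(sample)["@timestamp"])  # 2025-06-10T14:30:00.123Z
```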

In ECS mode, EDOT Cloud Forwarder for Azure preserves the original Azure log as a string in the `message` field and sends it to the `logs-azure.events-default` data stream. What happens next depends on whether the Azure Logs integration is installed.
<tab-set>
  <tab-item title="Without the Azure Logs integration">
    The document is stored in `logs-azure.events-default` with the original event as an unparsed string. It is never processed into ECS fields, might not be indexed correctly, and lacks the expected field mappings.
    <dropdown title="Log sample">
      ```json
      {
        "message": "{\"time\":\"2025-06-10T14:30:00.1234567Z\",\"resourceId\":\"/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM\",\"operationName\":\"Microsoft.Compute/virtualMachines/write\",\"category\":\"Administrative\",\"resultType\":\"Success\",\"correlationId\":\"abc-123-def-456\",\"level\":\"Information\",\"location\":\"westeurope\",\"callerIpAddress\":\"203.0.113.50\",\"properties\":{\"message\":\"Virtual machine updated successfully\"}}",
        "data_stream": {
          "dataset": "azure.events",
          "namespace": "default",
          "type": "logs"
        },
        "event": {
          "dataset": "azure.events"
        },
        "input": {
          "type": "azure-eventhub"
        },
        "tags": ["forwarded", "preserve_original_event"]
      }
      ```
    </dropdown>
  </tab-item>

  <tab-item title="With the Azure Logs integration">
    The integration's ingest pipeline parses the `message` string, extracts the `category` field, and routes the document to the appropriate final data stream. Because this log has `"category": "Administrative"`, it is routed to `logs-azure.activitylogs-default` and processed into ECS fields:
    <dropdown title="Log sample">
      ```json
      {
        "@timestamp": "2025-06-10T14:30:00.123Z",
        "azure": {
          "activitylogs": {
            "category": "Administrative",
            "operation_name": "Microsoft.Compute/virtualMachines/write",
            "result_type": "Success"
          },
          "correlation_id": "abc-123-def-456",
          "resource": {
            "group": "TEST-RG",
            "id": "/SUBSCRIPTIONS/TEST-SUB-001/RESOURCEGROUPS/TEST-RG/PROVIDERS/MICROSOFT.COMPUTE/VIRTUALMACHINES/MY-VM",
            "provider": "MICROSOFT.COMPUTE"
          },
          "subscription_id": "TEST-SUB-001"
        },
        "client": {
          "ip": "203.0.113.50"
        },
        "cloud": {
          "provider": "azure"
        },
        "data_stream": {
          "dataset": "azure.activitylogs",
          "namespace": "default",
          "type": "logs"
        },
        "event": {
          "action": "Microsoft.Compute/virtualMachines/write",
          "kind": "event"
        },
        "log": {
          "level": "Information"
        },
        "tags": ["preserve_original_event"]
      }
      ```
    </dropdown>
  </tab-item>
</tab-set>
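Regardless of whether the integration is installed, the wrapping step the forwarder performs in ECS mode can be sketched as follows. This is illustrative only; the fixed field values simply mirror the sample documents above:

```python
import json

def to_ecs_event(raw: dict) -> dict:
    """Preserve the original Azure event verbatim as a JSON string in
    `message` and target the generic azure.events data stream, as in
    the samples above. Illustrative sketch, not the forwarder's code."""
    return {
        "message": json.dumps(raw, separators=(",", ":")),
        "data_stream": {"dataset": "azure.events", "namespace": "default", "type": "logs"},
        "event": {"dataset": "azure.events"},
        "input": {"type": "azure-eventhub"},
        "tags": ["forwarded", "preserve_original_event"],
    }

envelope = to_ecs_event({"category": "Administrative", "level": "Information"})
print(envelope["message"])  # {"category":"Administrative","level":"Information"}
```

Because the original event survives unmodified inside `message`, the integration's ingest pipeline can later parse it and route the document by `category`.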