Avro codec plugin

  • Plugin version: v3.4.1
  • Released on: 2023-10-16
  • Changelog

For other versions, see the Versioned plugin docs.

For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in GitHub. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.

Read serialized Avro records as Logstash events

This plugin is used to serialize Logstash events as Avro datums, as well as to deserialize Avro datums into Logstash events.

The plugin behaves the same regardless of ECS compatibility, except that when ECS compatibility is enabled the original message is also added to [event][original].

This codec is for serializing individual Logstash events as Avro datums that are Avro binary blobs. It does not encode Logstash events into an Avro file.

This codec is for deserializing individual Avro records. It is not for reading Avro files. Avro files have a unique format that must be handled upon input.

Partial deserialization

The Avro format supports partial deserialization of arbitrary fields: a schema containing a subset of the schema used to serialize the data can be provided. This codec does not support partial deserialization of arbitrary fields. Partial deserialization might work only when the provided schema contains the first N fields of the original schema, in the same order. For example, if data were serialized with a schema whose fields are, in order, id, name, and timestamp, decoding with a schema containing only id and name might work, while a schema containing only name and timestamp is not expected to work.

Example usage with Kafka input.

input {
  kafka {
    codec => avro {
        schema_uri => "/tmp/schema.avsc"
    }
  }
}
filter {
  ...
}
output {
  ...
}
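
The codec can also be used on the encode side with any output that accepts a codec. The following is a minimal sketch, assuming events are written to Kafka; the topic name is illustrative:

output {
  kafka {
    codec => avro {
        schema_uri => "/tmp/schema.avsc"
    }
    topic_id => "avro_events"
  }
}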
Setting            Input type                            Required
ecs_compatibility  string                                No
encoding           string, one of ["binary", "base64"]   No
schema_uri         string                                Yes
tag_on_failure     boolean                               No
target             string                                No

ecs_compatibility

  • Value type is string

  • Supported values are:

    • disabled: Avro data added at root level
    • v1, v8: Elastic Common Schema compliant behavior ([event][original] is also added)

Controls this plugin’s compatibility with the Elastic Common Schema (ECS).
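
For example, to pin the legacy behavior so that [event][original] is not added, the setting can be declared on the codec. This is a minimal sketch based on the Kafka example above:

input {
  kafka {
    codec => avro {
        schema_uri => "/tmp/schema.avsc"
        ecs_compatibility => "disabled"
    }
  }
}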

encoding

  • Value can be any of: binary, base64
  • Default value is base64

Set the encoding for Avro’s payload. Use base64 (default) to indicate that this codec sends or expects to receive base64-encoded bytes.

Set this option to binary to indicate that this codec sends or expects to receive binary Avro data.
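
As a sketch, a decoder configured for raw binary Avro datums, following the Kafka example above (the schema path is illustrative):

input {
  kafka {
    codec => avro {
        schema_uri => "/tmp/schema.avsc"
        encoding => "binary"
    }
  }
}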

schema_uri

  • This is a required setting.
  • Value type is string
  • There is no default value for this setting.

The URI to fetch the schema from. This can be an http or file scheme URI, for example:

  • http - http://example.com/schema.avsc
  • file - /path/to/schema.avsc
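
For instance, the codec can fetch the schema over HTTP instead of from the local filesystem; the URL below is the placeholder shown above:

input {
  kafka {
    codec => avro {
        schema_uri => "http://example.com/schema.avsc"
    }
  }
}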
tag_on_failure

  • Value type is boolean
  • Default value is false

Tag events with _avroparsefailure when decoding fails.
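
A sketch of how tagged failures might be routed downstream once the flag is enabled; the dead-letter file path here is illustrative:

input {
  kafka {
    codec => avro {
        schema_uri => "/tmp/schema.avsc"
        tag_on_failure => true
    }
  }
}
output {
  if "_avroparsefailure" in [tags] {
    file {
      path => "/tmp/avro_failures.log"
    }
  } else {
    elasticsearch {
      ...
    }
  }
}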

target

  • Value type is string
  • There is no default value for this setting.
  • This is only relevant when decoding data into an event.

Define the target field for placing the decoded values. If this setting is not set, the Avro data will be stored at the root (top level) of the event.

Example

input {
  kafka {
    codec => avro {
        schema_uri => "/tmp/schema.avsc"
        target => "[document]"
    }
  }
}