Avro codec plugin v3.2.2
- Plugin version: v3.2.2
- Released on: 2017-08-15
- Changelog
For other versions, see the overview list.
To learn more about Logstash, see the Logstash Reference.
For questions about the plugin, open a topic in the Discuss forums. For bugs or feature requests, open an issue in Github. For the list of Elastic supported plugins, please consult the Elastic Support Matrix.
Read serialized Avro records as Logstash events
This plugin serializes Logstash events as Avro datums and deserializes Avro datums into Logstash events.
This codec encodes each Logstash event as an individual Avro datum, a raw Avro binary blob. It does not write events into an Avro container file.
Likewise, this codec decodes individual Avro datums. It does not read Avro files: Avro files have their own container format, with an embedded schema and framing, and must be handled at the input stage.
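To make the datum-versus-file distinction concrete, here is a stdlib-only Python sketch of how a single Avro datum is laid out for a record with one string field. The helper names are invented for illustration; they are not part of the plugin. Note there is no magic header and no embedded schema, which is exactly why a reader needs the schema supplied separately (as this codec does via `schema_uri`):

```python
def zigzag_encode(n: int) -> bytes:
    """Avro's zigzag + variable-length encoding for a long."""
    z = (n << 1) ^ (n >> 63)
    out = bytearray()
    while True:
        b = z & 0x7F
        z >>= 7
        if z:
            out.append(b | 0x80)  # more bytes follow
        else:
            out.append(b)
            return bytes(out)

def encode_datum(message: str) -> bytes:
    """Encode a record with a single string field as a raw Avro datum:
    just the zigzag-varint byte length followed by the UTF-8 bytes.
    No magic bytes, no embedded schema, no file framing."""
    data = message.encode("utf-8")
    return zigzag_encode(len(data)) + data

blob = encode_datum("hi")
print(blob)  # b'\x04hi' : 0x04 is zigzag(2), then the two UTF-8 bytes
```

An Avro *file* wraps such datums in a container (magic `Obj\x01`, schema metadata, sync markers), which is why it cannot be decoded by this codec.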
Example usage with the Kafka input:
input {
  kafka {
    codec => avro {
      schema_uri => "/tmp/schema.avsc"
    }
  }
}
filter {
  ...
}
output {
  ...
}
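The codec works the same way on the output side, serializing each event as an Avro datum before it is published. A sketch of a Kafka output (the topic name and broker address are placeholders for the example):

```
output {
  kafka {
    codec => avro {
      schema_uri => "/tmp/schema.avsc"
    }
    # Placeholder topic and broker; adjust for your deployment.
    topic_id => "events_avro"
    bootstrap_servers => "localhost:9092"
  }
}
```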
Setting | Input type | Required
---|---|---
schema_uri | string | Yes
tag_on_failure | boolean | No
schema_uri
- This is a required setting.
- Value type is string
- There is no default value for this setting.
The path from which to fetch the schema. This can be an http or file scheme URI. Examples:
- http - http://example.com/schema.avsc
- file - /path/to/schema.avsc
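The file the URI points to is a standard Avro schema written in JSON. A minimal illustrative schema (the record and field names are made up for the example):

```
{
  "type": "record",
  "name": "ExampleEvent",
  "fields": [
    {"name": "message", "type": "string"},
    {"name": "count",   "type": "int"}
  ]
}
```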
tag_on_failure
- Value type is boolean
- Default value is false
Tag events with _avroparsefailure when decoding fails.
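To make use of the tag, enable it in the codec and branch on it later in the pipeline. A sketch, assuming you simply want to drop events that could not be decoded:

```
input {
  kafka {
    codec => avro {
      schema_uri => "/tmp/schema.avsc"
      tag_on_failure => true
    }
  }
}
filter {
  if "_avroparsefailure" in [tags] {
    drop { }   # discard events that failed Avro decoding
  }
}
```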