---
title: Deserializing Data
description: The plugins described in this section are useful for deserializing data into Logstash events. 
url: https://www.elastic.co/elastic/docs-builder/docs/3016/reference/logstash/data-deserialization
products:
  - Logstash
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# Deserializing Data
The plugins described in this section are useful for deserializing data into Logstash events.
<definitions>
  <definition term="avro codec">
    Reads serialized Avro records as Logstash events. This plugin deserializes individual Avro records. It is not for reading Avro files. Avro files have a unique format that must be handled upon input.
    The following config deserializes input from Kafka:
    ```json
    input {
      kafka {
        codec => avro {
          schema_uri => "/tmp/schema.avsc"
        }
      }
    }
    ...
    ```
  </definition>
  <definition term="csv filter">
    Parses comma-separated value data into individual fields. By default, the filter autogenerates field names (column1, column2, and so on), or you can specify a list of names. You can also change the column separator.
    The following config parses CSV data into the field names specified in the `columns` field:
    ```json
    filter {
      csv {
        separator => ","
        columns => [ "Transaction Number", "Date", "Description", "Amount Debit", "Amount Credit", "Balance" ]
      }
    }
    ```
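    The separator and column names are both optional. As a minimal sketch (assuming semicolon-delimited input), the following config changes the separator and relies on the autogenerated `column1`, `column2`, and so on field names:
    ```json
    filter {
      csv {
        # Semicolon-delimited input; field names default to column1, column2, and so on
        separator => ";"
      }
    }
    ```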
  </definition>
  <definition term="fluent codec">
    Reads the Fluentd `msgpack` schema.
    The following config decodes logs received from `fluent-logger-ruby`:
    ```json
    input {
      tcp {
        codec => fluent
        port => 4000
      }
    }
    ```
  </definition>
  <definition term="json codec">
    Decodes (via inputs) and encodes (via outputs) JSON formatted content, creating one event per element in a JSON array.
    The following config decodes the JSON formatted content in a file:
    ```json
    input {
      file {
        path => "/path/to/myfile.json"
        codec =>"json"
    }
    ```
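    Because the codec also encodes on outputs, a minimal sketch such as the following (assuming you want each event printed as JSON on standard output) reuses the same codec in an output:
    ```json
    output {
      stdout {
        # Write each event as a JSON document
        codec => "json"
      }
    }
    ```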
  </definition>
  <definition term="protobuf codec">
    Reads protobuf encoded messages and converts them to Logstash events. Requires the protobuf definitions to be compiled as Ruby files. You can compile them by using the [ruby-protoc compiler](https://github.com/codekitchen/ruby-protocol-buffers).
    The following config decodes events from a Kafka stream:
    ```json
    input {
      kafka {
        zk_connect => "127.0.0.1"
        topic_id => "your_topic_goes_here"
        codec => protobuf {
          class_name => "Animal::Unicorn"
          include_path => ['/path/to/protobuf/definitions/UnicornProtobuf.pb.rb']
        }
      }
    }
    ```
  </definition>
  <definition term="xml filter">
    Parses XML into fields.
    The following config parses the whole XML document stored in the `message` field:
    ```json
    filter {
      xml {
        source => "message"
      }
    }
    ```
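    To avoid writing the parsed fields into the event root, a sketch like the following (the `parsed_xml` field name is only an example) stores the result under a single target field:
    ```json
    filter {
      xml {
        source => "message"
        # Place the parsed structure under one field instead of the event root
        target => "parsed_xml"
      }
    }
    ```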
  </definition>
</definitions>