---
title: Use ingest pipelines for parsing
description: When you use Filebeat modules with Logstash, you can use the ingest pipelines provided by Filebeat to parse the data. You need to load the pipelines into...
url: https://www.elastic.co/elastic/docs-builder/docs/3028/reference/logstash/use-ingest-pipelines
products:
  - Logstash
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# Use ingest pipelines for parsing
When you use Filebeat modules with Logstash, you can use the ingest pipelines provided by Filebeat to parse the data. You need to load the pipelines into Elasticsearch and configure Logstash to use them.

**To load the ingest pipelines:**

On the system where Filebeat is installed, run the `setup` command with the `--pipelines` option specified to load ingest pipelines for specific modules. For example, the following command loads ingest pipelines for the nginx and system modules:
```shell
filebeat setup --pipelines --modules nginx,system
```

A connection to Elasticsearch is required for this setup step because Filebeat needs to load the ingest pipelines into Elasticsearch. If necessary, you can temporarily disable your configured output and enable the Elasticsearch output before running the command.
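For example, if your `filebeat.yml` is configured to send events to Logstash, you can override the output settings on the command line for just this setup step rather than editing the configuration file. The following sketch assumes Elasticsearch is reachable at `localhost:9200`; substitute your own host and add credentials as needed:

```shell
filebeat setup --pipelines --modules nginx,system \
  -E output.logstash.enabled=false \
  -E 'output.elasticsearch.hosts=["localhost:9200"]'
```

The `-E` flags apply only to this invocation, so your configured Logstash output remains unchanged afterwards.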
**To configure Logstash to use the pipelines:**

On the system where Logstash is installed, create a Logstash pipeline configuration that reads from a Logstash input, such as Beats or Kafka, and sends events to an Elasticsearch output. Set the `pipeline` option in the Elasticsearch output to `%{[@metadata][pipeline]}` to use the ingest pipelines that you loaded previously.

Here's an example configuration that reads data from the Beats input and uses Filebeat ingest pipelines to parse data collected by modules:
```conf
input {
  beats {
    port => 5044
  }
}

output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "https://061ab24010a2482e9d64729fdb0fd93a.us-east-1.aws.found.io:9243"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
      pipeline => "%{[@metadata][pipeline]}"
      user => "elastic"
      password => "secret"
    }
  } else {
    elasticsearch {
      hosts => "https://061ab24010a2482e9d64729fdb0fd93a.us-east-1.aws.found.io:9243"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
      user => "elastic"
      password => "secret"
    }
  }
}
```
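If your events arrive through Kafka instead of Beats, the input section changes but the output logic stays the same, because Filebeat serializes the `@metadata` fields into the JSON it writes to Kafka. The following is a sketch of the input side only; the broker address and topic name are placeholders for your own values:

```conf
input {
  kafka {
    # Placeholder broker address and topic; adjust for your deployment.
    bootstrap_servers => "localhost:9092"
    topics => ["filebeat"]
    # Filebeat writes JSON to Kafka, so decode it back into event fields,
    # including [@metadata][pipeline].
    codec => "json"
  }
}
```

Pair this input with the same conditional Elasticsearch output shown above, keeping `pipeline => "%{[@metadata][pipeline]}"` so events are parsed by the ingest pipelines they were loaded for.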

See the Filebeat [Modules](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3028/reference/beats/filebeat/filebeat-modules-overview) documentation for more information about setting up and running modules.

For a full example, see [Example: Set up Filebeat modules to work with Kafka and Logstash](https://www.elastic.co/elastic/docs-builder/docs/3028/reference/logstash/use-filebeat-modules-kafka).