---
title: Working with Winlogbeat modules
description: Winlogbeat comes packaged with pre-built modules that contain the configurations needed to collect, parse, enrich, and visualize data from various Windows...
url: https://www.elastic.co/elastic/docs-builder/docs/3028/reference/logstash/working-with-winlogbeat-modules
products:
  - Logstash
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# Working with Winlogbeat modules
Winlogbeat comes packaged with pre-built [modules](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3028/reference/beats/winlogbeat/winlogbeat-modules) that contain the configurations needed to collect, parse, enrich, and visualize data from various Windows logging providers. Each Winlogbeat module consists of one or more filesets that contain ingest node pipelines, Elasticsearch templates, Winlogbeat input configurations, and Kibana dashboards.

You can use Winlogbeat modules with Logstash, but you need to do some extra setup. The simplest approach is to [set up and use the ingest pipelines](#use-ingest-pipelines-for-parsing) provided by Winlogbeat.

## Use ingest pipelines for parsing

When you use Winlogbeat modules with Logstash, you can still use the ingest pipelines provided by Winlogbeat to parse the data. You need to load the pipelines into Elasticsearch and configure Logstash to use them.

**To load the ingest pipelines:**

On the system where Winlogbeat is installed, run the `setup` command with the `--pipelines` option to load the ingest pipelines for specific modules. For example, the following command loads the ingest pipelines for the `security` and `sysmon` modules:
```shell
winlogbeat setup --pipelines --modules security,sysmon
```

A connection to Elasticsearch is required for this setup step because Winlogbeat needs to load the ingest pipelines into Elasticsearch. If necessary, you can temporarily disable your configured output and enable the Elasticsearch output before running the command.
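One way to do this without editing `winlogbeat.yml` is to override the output settings on the command line with `-E` flags. A sketch of such an invocation follows; the Elasticsearch host and credentials are placeholders, so substitute your own:

```shell
# Hypothetical host and credentials -- adjust to your cluster.
# The -E overrides disable the Logstash output and enable the
# Elasticsearch output just for this setup run, leaving
# winlogbeat.yml untouched.
winlogbeat setup --pipelines --modules security,sysmon \
  -E output.logstash.enabled=false \
  -E 'output.elasticsearch.hosts=["https://localhost:9200"]' \
  -E output.elasticsearch.username=elastic \
  -E output.elasticsearch.password=secret
```

Because `-E` settings apply only to that single command, your regular output configuration takes effect again the next time Winlogbeat runs.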

**To configure Logstash to use the pipelines:**

On the system where Logstash is installed, create a Logstash pipeline configuration that reads from a Logstash input, such as Beats or Kafka, and sends events to an Elasticsearch output. Set the `pipeline` option in the Elasticsearch output to `%{[@metadata][pipeline]}` to use the ingest pipelines that you loaded previously.

Here's an example configuration that reads data from the Beats input and uses Winlogbeat ingest pipelines to parse data collected by modules:
```
input {
  beats {
    port => 5044
  }
}

output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "https://061ab24010a2482e9d64729fdb0fd93a.us-east-1.aws.found.io:9243"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}" 
      action => "create" 
      pipeline => "%{[@metadata][pipeline]}" 
      user => "elastic"
      password => "secret"
    }
  } else {
    elasticsearch {
      hosts => "https://061ab24010a2482e9d64729fdb0fd93a.us-east-1.aws.found.io:9243"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}" 
      action => "create"
      user => "elastic"
      password => "secret"
    }
  }
}
```
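For this to work end to end, Winlogbeat itself must send events to the Logstash Beats input rather than directly to Elasticsearch. A minimal sketch of the relevant `winlogbeat.yml` section, assuming Logstash is reachable at a placeholder host `logstash-host` on port 5044 (the port the Beats input listens on in the example above):

```yaml
# winlogbeat.yml (sketch) -- "logstash-host" is a placeholder.
# Make sure output.elasticsearch is disabled or removed, since
# only one output may be enabled at a time.
output.logstash:
  hosts: ["logstash-host:5044"]
```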

See the Winlogbeat [Modules](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3028/reference/beats/winlogbeat/winlogbeat-modules) documentation for more information about setting up and running modules.