---
title: "Tutorial: Extend Elastic Integrations with the Logstash elastic_integration filter"
description: You can use Logstash to transform events collected by Elastic Agents and paired with an Elastic integration. You get the benefits of Elastic integrations--such...
url: https://www.elastic.co/elastic/docs-builder/docs/3016/reference/logstash/ea-integration-tutorial
products:
  - Elastic Cloud Enterprise
  - Elastic Cloud Hosted
  - Elastic Cloud on Kubernetes
  - Elasticsearch
  - Logstash
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# Tutorial: Extend Elastic Integrations with the Logstash elastic_integration filter
You can use Logstash to transform events collected by Elastic Agents and paired with an [Elastic integration](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference).
You get the benefits of Elastic integrations, such as the simplicity of ingesting data from a wide variety of data sources and ensuring compliance with the [Elastic Common Schema (ECS)](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/ecs), combined with the extra processing power of Logstash.
This new functionality is made possible by the [elastic_integration filter](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-filters-elastic_integration) plugin.
When you include the `elastic_integration` filter in your configuration, Logstash reads certain field values generated by the Elastic Agent,
and uses them to apply the transformations from Elastic integrations.
This ability allows you to further process events in the Logstash pipeline before sending them to their configured destinations.
This tutorial walks you through adding the [CrowdStrike integration](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/crowdstrike) and sending the data to Elastic Cloud Hosted, Elastic Cloud Serverless, or self-managed Elasticsearch.

## Prerequisites

You need:
- A working Elastic Cloud Hosted or Elastic Cloud Serverless deployment, or self-managed Elasticsearch cluster
- A Logstash instance
- Fleet Server
- An [Elastic Agent installed](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/fleet/install-elastic-agents) on the hosts you want to collect data from, and configured to [send output to Logstash](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/fleet/logstash-output)
- An active Elastic [subscription](https://www.elastic.co/subscriptions)
- A user configured with the [minimum required privileges](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-filters-elastic_integration#plugins-filters-elastic_integration-minimum_required_privileges)

<note>
  Even though the focus of this tutorial is Fleet-managed agents, you can use the `elastic_integration` filter and this
  general approach with [self-managed agents](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/fleet/install-standalone-elastic-agent).
</note>


## Process overview

- [Configure Fleet to send data from Elastic Agent to Logstash](#ea-integrations-fleet)
- [Create an Elastic Agent policy with the necessary integration](#ea-integrations-create-policy)
- [Configure Logstash to use the `elastic_integration` filter plugin](#ea-integrations-pipeline)


## Configure Fleet to send data from Elastic Agent to Logstash

1. For Fleet-managed agents, go to Kibana and navigate to **Fleet > Settings**.
2. Create a new output and specify Logstash as the output type.
3. Add the Logstash hosts (host names or IP addresses) that the Elastic Agent should send data to.
4. Add the client SSL certificate and the client SSL certificate key to the configuration.
5. Click **Save and apply settings** in the bottom right-hand corner of the page.
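Each entry in the hosts list is a `host:port` pair. As a sketch (hypothetical host name), an entry might look like the following; the port must match the port that your Logstash `elastic_agent` input listens on:

```txt
your-logstash-host.example.com:5055
```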


## Create an Elastic Agent policy with the necessary integrations

1. In Kibana, navigate to **Fleet > Agent policies**, and select **Create agent policy**.
2. Give this policy a name, and then select **Advanced options**.
3. Change the **Output for integrations** setting to the Logstash output you created.
4. Click **Create agent policy**.
5. Select the policy name, and click **Add integration**.
   This step takes you to the Integrations browser, where you can select an integration that has everything necessary to _integrate_ the data source with your other data in the Elastic Stack.
   We'll use CrowdStrike as our example in this tutorial.
6. On the **CrowdStrike** integration overview page, click **Add CrowdStrike** to configure the integration.
7. Configure the integration to collect the data you need.
   In step 2 at the bottom of the page (**Where to add this integration?**), make sure that the **Existing hosts** option
   is selected and that the selected agent policy is the one you created with the Logstash output.
   This policy should be selected by default.
8. Click **Save and continue**.
   You have the option to add the Elastic Agent to your hosts. If you haven't already, [install the Elastic Agent](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/fleet/install-elastic-agents) on the host where you want to collect data.


## Configure Logstash to use the `elastic_integration` filter plugin

1. Create a new [Logstash pipeline](https://www.elastic.co/elastic/docs-builder/docs/3016/reference/logstash/creating-logstash-pipeline).
2. Be sure to include these plugins:
   - [`elastic_agent` input](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-inputs-elastic_agent)
   - [`elastic_integration` filter](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-filters-elastic_integration)
   - [`elasticsearch` output](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-outputs-elasticsearch)

Note that every event sent from the Elastic Agent to Logstash contains specific meta-fields.
Logstash expects events to contain a top-level `data_stream` field with `type`, `dataset`, and `namespace` sub-fields.
Logstash uses this information and its connection to Elasticsearch to determine which integrations to apply to the event before sending the event to its destination output.
Logstash frequently synchronizes with Elasticsearch to ensure that it has the most recent versions of the enabled integrations.
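As an illustration, an event produced by the CrowdStrike integration might carry `data_stream` sub-fields like these (illustrative values; the actual dataset depends on the integration, data stream, and policy):

```txt
"data_stream": {
  "type": "logs",
  "dataset": "crowdstrike.fdr",
  "namespace": "default"
}
```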

### Sample configuration: output to Elastic Cloud

This sample illustrates using the `elastic_agent` input and the `elastic_integration` filter for processing in Logstash, and then sending the output to Elastic Cloud Hosted or Elastic Cloud Serverless.
Check out the [`elastic_integration` filter](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-filters-elastic_integration) for the full list of configuration options.
```txt
input {
  elastic_agent { port => 5055 }
}

filter {
  elastic_integration {
    cloud_id => "your-cloud:id"
    api_key => "your-api-key"
  }
}

output {
  stdout {}
  elasticsearch {
    cloud_id => "your-cloud:id"
    api_key => "your-api-key"
  }
}
```

All processing occurs in Logstash before events are forwarded to Elastic Cloud Hosted or Elastic Cloud Serverless.
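If you don't have an API key yet, one way to create one is the Elasticsearch create API key endpoint, for example from Kibana Dev Tools. The key name below is arbitrary, and the key must grant the [minimum required privileges](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-filters-elastic_integration#plugins-filters-elastic_integration-minimum_required_privileges):

```txt
POST /_security/api_key
{
  "name": "logstash-elastic-integration"
}
```

The response includes `id` and `api_key` values; check the plugin documentation for the exact format each `api_key` option expects.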

### Sample configuration: output to self-managed Elasticsearch

This sample illustrates using the `elastic_agent` input and the `elastic_integration` filter for processing in Logstash, and then sending the output to Elasticsearch.
Check out the [`elastic_integration` filter](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-filters-elastic_integration) for the full list of configuration options.
Check out [minimum required privileges](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-filters-elastic_integration#plugins-filters-elastic_integration-minimum_required_privileges) for more info.
```txt
input {
  elastic_agent { port => 5055 }
}

filter {
  elastic_integration {
    hosts => ["{es-host}:9200"]
    ssl_enabled => true
    ssl_certificate_authorities => "/usr/share/logstash/config/certs/ca-cert.pem"
    username => "elastic"
    password => "changeme"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["{es-host}:9200"]
    user => "elastic"
    password => "changeme"
    ssl_certificate_authorities => "/usr/share/logstash/config/certs/ca-cert.pem"
  }
}
```
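Before starting the pipeline, you can ask Logstash to validate the configuration without running it (replace the path with the location of your pipeline file):

```txt
bin/logstash -f /path/to/your-pipeline.conf --config.test_and_exit
```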

<note>
  The user credentials that you specify in the `elastic_integration` filter must have sufficient privileges to get information about Elasticsearch and the integrations that you are using.
</note>

If your Elastic Agent and Logstash pipelines are configured correctly, events go to Logstash for processing before Logstash forwards them on to Elasticsearch.
If you encounter problems, check out the [Troubleshooting](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/lsr/plugins-filters-elastic_integration#plugins-filters-elastic_integration-troubleshooting) section in the `elastic_integration` filter docs.