
Tutorial: Extend Elastic Integrations with the Logstash elastic_integration filter


You can use Logstash to transform events collected by Elastic Agents and paired with an Elastic integration.

You get the benefits of Elastic integrations, such as simple ingestion from a wide variety of data sources and compliance with the Elastic Common Schema (ECS), combined with the extra processing power of Logstash.

This functionality is made possible by the elastic_integration filter plugin. When you include the elastic_integration filter in your pipeline configuration, Logstash reads certain field values generated by the Elastic Agent and uses them to apply the transformations from Elastic integrations. This lets you further process events in the Logstash pipeline before sending them to their configured destinations.

This tutorial walks you through adding the Crowdstrike integration and sending the data to Elastic Cloud Hosted, Elastic Cloud Serverless, or a self-managed Elasticsearch cluster.

You need an Elasticsearch cluster (Elastic Cloud Hosted, Elastic Cloud Serverless, or self-managed), Logstash with the elastic_integration filter plugin installed, and one or more Elastic Agents enrolled in Fleet.

Note

Even though the focus of this tutorial is Fleet-managed agents, you can use the elastic_integration filter and this general approach with self-managed agents.

First, configure a Logstash output in Fleet:

  1. For Fleet-managed agents, go to Kibana and navigate to Fleet > Settings.
  2. Create a new output and specify Logstash as the output type.
  3. Add the Logstash hosts (domain names or IP addresses) that the Elastic Agent should send data to.
  4. Add the Client SSL certificate and the Client SSL certificate key to the configuration. (The matching Logstash-side input is sketched after these steps.)
  5. Click Save and apply settings in the bottom right-hand corner of the page.
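
On the Logstash side, the elastic_agent input needs a matching SSL setup so that Logstash can verify connecting agents. A minimal sketch, assuming hypothetical certificate paths; the option names reflect recent versions of the elastic_agent input, and the key must be in PKCS#8 format:

input {
  elastic_agent {
    port => 5055
    ssl_enabled => true
    # Server certificate and key that Logstash presents to connecting agents
    ssl_certificate => "/path/to/logstash.crt"
    ssl_key => "/path/to/logstash.pkcs8.key"
    # CA used to verify the client certificate configured in Fleet
    ssl_certificate_authorities => ["/path/to/ca.crt"]
    ssl_client_authentication => "required"
  }
}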
Next, create an agent policy that uses the Logstash output, and add the integration:

  1. In Kibana navigate to Fleet > Agent policies, and select Create agent policy.

  2. Give this policy a name, and then select Advanced options.

  3. Change the Output for integrations setting to the Logstash output you created.

  4. Click Create agent policy.

  5. Select the policy name, and click Add integration.

    This step takes you to the Integrations browser, where you can select an integration that has everything necessary to integrate the data source with your other data in the Elastic Stack.

    We'll use Crowdstrike as our example in this tutorial.

  6. On the Crowdstrike integration overview page, click Add Crowdstrike to configure the integration.

  7. Configure the integration to collect the data you need. In step 2 at the bottom of the page (Where to add this integration?), make sure that the Existing hosts option is selected and that the selected Agent policy is the one you created for the Logstash output. This policy should be selected by default.

  8. Click Save and continue.

    You have the option to add the Elastic Agent to your hosts. If you haven't already, install the Elastic Agent on the host where you want to collect data.

  1. Create a new Logstash pipeline.
  2. Be sure to include these plugins: the elastic_agent input, the elastic_integration filter, and the elasticsearch output, as shown in the samples below.

Note that every event sent from the Elastic Agent to Logstash contains specific meta-fields. Logstash expects events to contain a top-level data_stream field with type, dataset, and namespace sub-fields.

Logstash uses this information and its connection to Elasticsearch to determine which integrations to apply to the event before sending the event to its destination output. Logstash frequently synchronizes with Elasticsearch to ensure that it has the most recent versions of the enabled integrations.
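
For illustration only (not part of the tutorial pipeline), these meta-fields can be referenced like any other event fields; a minimal sketch that tags events arriving without them (the tag name is hypothetical):

filter {
  # Tag events that are missing the data_stream meta-fields
  # that the elastic_integration filter relies on
  if ![data_stream][type] or ![data_stream][dataset] or ![data_stream][namespace] {
    mutate { add_tag => ["missing_data_stream_fields"] }
  }
}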

This sample illustrates using the elastic_agent input and the elastic_integration filter for processing in Logstash, and then sending the output to Elastic Cloud Hosted or Elastic Cloud Serverless.

Check out the elastic_integration filter for the full list of configuration options.

input {
  elastic_agent { port => 5055 }
}

filter {
  elastic_integration {
    cloud_id => "your-cloud:id"
    api_key => "your-api-key"
  }
}

output {
  stdout {}
  elasticsearch {
    cloud_id => "your-cloud:id"
    api_key => "your-api-key"
  }
}

All processing occurs in Logstash before events are forwarded to Elastic Cloud Hosted or Elastic Cloud Serverless.

This sample illustrates using the elastic_agent input and the elastic_integration filter for processing in Logstash, and then sending the output to Elasticsearch.

Check out the elastic_integration filter for the full list of configuration options, and see minimum required privileges for more info.

input {
  elastic_agent { port => 5055 }
}

filter {
  elastic_integration {
    hosts => ["{es-host}:9200"]
    ssl_enabled => true
    ssl_certificate_authorities => ["/usr/share/logstash/config/certs/ca-cert.pem"]
    username => "elastic"
    password => "changeme"
  }
}

output {
  # Print processed events to the console to help debug the data stream inputs
  stdout {
    codec => rubydebug
  }
  # Forward the processed events to Elasticsearch
  elasticsearch {
    hosts => ["{es-host}:9200"]
    ssl_enabled => true
    ssl_certificate_authorities => ["/usr/share/logstash/config/certs/ca-cert.pem"]
    user => "elastic"
    password => "changeme"
  }
}
Note

The user credentials that you specify in the elastic_integration filter must have sufficient privileges to get information about Elasticsearch and the integrations that you are using.
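
As an alternative to embedding basic-auth credentials, the elastic_integration filter also accepts an api_key option; a minimal sketch, assuming you have created an API key that carries the minimum required privileges:

filter {
  elastic_integration {
    hosts => ["{es-host}:9200"]
    ssl_enabled => true
    ssl_certificate_authorities => ["/usr/share/logstash/config/certs/ca-cert.pem"]
    # API key in id:secret format, replacing username/password
    api_key => "your-api-key"
  }
}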

If your Elastic Agent and Logstash pipelines are configured correctly, events go to Logstash for processing before Logstash forwards them on to Elasticsearch.

If you encounter problems, check out the Troubleshooting section in the elastic_integration filter docs.