
CrowdStrike Integration

<div class="condensed-table">
| | |
| --- | --- |
| Version | 1.49.0 |
| Compatible Kibana version(s) | 8.16.0 or higher |
| Supported Serverless project types | Security, Observability |
| Subscription level | Basic |
| Level of support | Elastic |

</div>
The CrowdStrike integration allows you to easily connect your CrowdStrike Falcon platform to Elastic for seamless onboarding of alerts and telemetry from CrowdStrike Falcon and Falcon Data Replicator. Elastic Security can leverage this data for security analytics, including correlation, visualization, and incident response. The integration supports four different modes for connecting CrowdStrike to Elastic:

  1. Falcon SIEM Connector: This is a pre-built integration designed to connect CrowdStrike Falcon with Security Information and Event Management (SIEM) systems. It streamlines the flow of security data from CrowdStrike Falcon to the SIEM, providing a standardized and structured way of feeding information into the SIEM platform. It includes the following datasets for receiving logs:

    • falcon dataset: consists of endpoint data and Falcon platform audit data forwarded from Falcon SIEM Connector.
  2. CrowdStrike REST API: This provides a programmatic interface to interact with the CrowdStrike Falcon platform. It allows users to perform various operations such as querying information about unified alerts and hosts/devices. It includes the following datasets for receiving logs:

    • alert dataset: It is typically used to retrieve detailed information about unified alerts generated by the CrowdStrike Falcon platform, via Falcon Intelligence Alert API - /alerts/entities/alerts/v2.
    • host dataset: It retrieves all the hosts/devices in your environment, providing information such as device metadata, configuration, and status generated by the CrowdStrike Falcon platform, via the Falcon Intelligence Host/Device API - /devices/entities/devices/v2. It focuses on management and monitoring information for devices, such as login details, status, policies, and configuration.
  3. Falcon Data Replicator: This collects events in near real time from your endpoints, cloud workloads, identities, and data. CrowdStrike Falcon Data Replicator (FDR) provides you with actionable insights to improve SOC performance. FDR contains near-real-time data collected by the Falcon platform’s single, lightweight agent. It includes the following datasets for receiving logs:

    • fdr dataset: consists of logs forwarded using the Falcon Data Replicator.

  4. CrowdStrike Event Stream: This streams security logs from CrowdStrike Event Stream, including authentication activity, cloud security posture management (CSPM), firewall logs, user activity, and XDR data. It captures real-time security events like user logins, cloud environment changes, network traffic, and advanced threat detections. The streaming integration provides continuous monitoring and analysis for proactive threat detection. It enhances visibility into user behavior, network security, and overall system health. This setup enables faster response capabilities to emerging security incidents. It includes the following datasets for receiving logs:

    • falcon dataset: consists of streaming data forwarded from CrowdStrike Event Stream.
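To make the REST API mode concrete, the sketch below composes the endpoint URLs this document names (the v2 alerts and devices endpoints) from an API base URL. The `/oauth2/token` path and the helper functions are illustrative assumptions, not part of this integration's code.

```python
# Illustrative only: compose the CrowdStrike API URLs referenced above.
# The helper names and the /oauth2/token path are assumptions.

def token_request(base_url: str) -> str:
    """Return an OAuth2 token URL for a CrowdStrike API base URL (assumed path)."""
    return base_url.rstrip("/") + "/oauth2/token"

def alerts_endpoint(base_url: str) -> str:
    """Return the v2 unified-alerts endpoint used by the alert dataset."""
    return base_url.rstrip("/") + "/alerts/entities/alerts/v2"

def hosts_endpoint(base_url: str) -> str:
    """Return the v2 devices endpoint used by the host dataset."""
    return base_url.rstrip("/") + "/devices/entities/devices/v2"

base = "https://api.crowdstrike.com"  # example API Endpoint URL
print(alerts_endpoint(base))  # https://api.crowdstrike.com/alerts/entities/alerts/v2
print(hosts_endpoint(base))   # https://api.crowdstrike.com/devices/entities/devices/v2
```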

This integration is compatible with CrowdStrike Falcon SIEM-Connector-v2.0, the REST API, and CrowdStrike Event Streaming. For REST API support, this module has been tested against CrowdStrike API versions v1/v2.

To collect logs through the CrowdStrike REST API, the following parameters are required:

  1. Client ID

  2. Client Secret

  3. Token URL

  4. API Endpoint URL

  5. Required scopes for each data stream:

    | Data Stream | Scope |
    | --- | --- |
    | Alert | read:alert |
    | Host | read:host |
To collect logs through the CrowdStrike Event Stream, the following parameters are required:

  1. Client ID

  2. Client Secret

  3. Token URL

  4. API Endpoint URL

  5. CrowdStrike App ID

  6. Required scopes for the event stream:

    | Data Stream | Scope |
    | --- | --- |
    | Event Stream | read: Event streams |
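Both the REST API and Event Stream modes exchange the Client ID and Client Secret for an OAuth2 token at the Token URL. The offline sketch below only builds the form body for that exchange; the credential values are placeholders and the exact form fields should be treated as an assumption.

```python
from urllib.parse import urlencode

# Illustrative only: build the x-www-form-urlencoded body POSTed to the
# Token URL in an OAuth2 client-credentials exchange. Values are placeholders.
def token_form(client_id: str, client_secret: str) -> str:
    return urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
    })

body = token_form("MY_CLIENT_ID", "MY_CLIENT_SECRET")
print(body)  # client_id=MY_CLIENT_ID&client_secret=MY_CLIENT_SECRET
```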

This is the Alert dataset.

This is the Falcon dataset. It contains endpoint data and CrowdStrike Falcon platform audit data forwarded from the Falcon SIEM Connector.

By default, the configuration file located at /opt/crowdstrike/etc/cs.falconhoseclient.cfg provides configuration options related to the events collected by the Falcon SIEM Connector.

The EventTypeCollection and EventSubTypeCollection sections of the configuration file provide the list of event types that the connector should collect.

Current supported event types are:

  • DetectionSummaryEvent
  • IncidentSummaryEvent
  • UserActivityAuditEvent
  • AuthActivityAuditEvent
  • FirewallMatchEvent
  • RemoteResponseSessionStartEvent
  • RemoteResponseSessionEndEvent
  • CSPM Streaming events
  • CSPM Search events
  • IDP Incidents
  • IDP Summary events
  • Mobile Detection events
  • Recon Notification events
  • XDR Detection events
  • Scheduled Report Notification events
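As an illustration of the EventTypeCollection flags described above, the sketch below parses an INI-style fragment and lists the enabled event types. The section and key names here are assumptions for demonstration, not the exact schema of cs.falconhoseclient.cfg.

```python
import configparser

# Illustrative only: the section/key names are assumed, not the real schema.
SAMPLE = """
[EventTypeCollection]
DetectionSummaryEvent = true
IncidentSummaryEvent = true
UserActivityAuditEvent = false
"""

parser = configparser.ConfigParser()
parser.optionxform = str  # preserve the CamelCase key names
parser.read_string(SAMPLE)

enabled = [name for name, value in parser.items("EventTypeCollection")
           if value.lower() == "true"]
print(enabled)  # ['DetectionSummaryEvent', 'IncidentSummaryEvent']
```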

The CrowdStrike Falcon Data Replicator (FDR) allows CrowdStrike users to replicate FDR data from CrowdStrike managed S3 buckets. CrowdStrike writes notification events to a CrowdStrike managed SQS queue when new data is available in S3.

This integration can be used in two ways. It can consume SQS notifications directly from the CrowdStrike managed SQS queue or it can be used in conjunction with the FDR tool that replicates the data to a self-managed S3 bucket and the integration can read from there.

In both cases SQS messages are deleted after they are processed. This allows you to operate more than one Elastic Agent with this integration if needed and not have duplicate events, but it means you cannot ingest the data a second time.
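The notification flow above can be sketched offline: each SQS message body points at one or more objects in the S3 bucket. The field names in this sample (`bucket`, `pathPrefix`, `files`) reflect a typical FDR notification but are illustrative, not a schema.

```python
import json

# Illustrative FDR-style SQS notification body (field names are assumptions).
NOTIFICATION = json.dumps({
    "bucket": "cs-fdr-bucket",
    "pathPrefix": "data/abcd1234",
    "fileCount": 2,
    "files": [
        {"path": "data/abcd1234/part-00000.gz", "size": 1024},
        {"path": "data/abcd1234/part-00001.gz", "size": 2048},
    ],
})

def s3_objects(message_body: str) -> list:
    """Return s3:// URLs for every file referenced by a notification."""
    note = json.loads(message_body)
    return ["s3://{}/{}".format(note["bucket"], f["path"]) for f in note["files"]]

print(s3_objects(NOTIFICATION))
```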

This is the simplest way to set up the integration, and it is also the default.

You need to set up the integration with the SQS queue URL provided by CrowdStrike FDR. Ensure the Is FDR queue option is enabled.

This option can be used if you want to archive the raw CrowdStrike data.

You need to follow the steps below:

  • Create an S3 bucket to receive the logs.
  • Create an SQS queue.
  • Configure your S3 bucket to send object created notifications to your SQS queue.
  • Follow the FDR tool instructions to replicate data to your own S3 bucket.
  • Configure the integration to read from your self-managed SQS queue.
  • Disable the Is FDR queue option in the integration.
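The "object created notifications" step above amounts to attaching a NotificationConfiguration document to the bucket (for example, via S3's put-bucket-notification-configuration API). The sketch below only builds that document; the queue ARN is a placeholder.

```python
import json

# Illustrative only: build the S3 NotificationConfiguration document that
# routes ObjectCreated events to an SQS queue. The ARN is a placeholder.
def s3_event_notification(queue_arn: str) -> dict:
    return {
        "QueueConfigurations": [
            {
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    }

config = s3_event_notification("arn:aws:sqs:us-east-1:123456789012:fdr-replica")
print(json.dumps(config, indent=2))
```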
Note

While the FDR tool can replicate the files from S3 to your local file system, this integration cannot read those files because they are gzip compressed, and the log file input does not support reading compressed files.

AWS credentials are required for running this integration if you want to use the S3 input.

  • access_key_id: first part of access key.
  • secret_access_key: second part of access key.
  • session_token: required when using temporary security credentials.
  • credential_profile_name: profile name in shared credentials file.
  • shared_credential_file: directory of the shared credentials file.
  • endpoint: URL of the entry point for an AWS web service.
  • role_arn: AWS IAM Role to assume.

There are three types of AWS credentials that can be used:

  • access keys,
  • temporary security credentials, and
  • IAM role ARN.

AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are the two parts of access keys. They are long-term credentials for an IAM user, or the AWS account root user. Please see AWS Access Keys and Secret Access Keys for more details.

Temporary security credentials have a limited lifetime and consist of an access key ID, a secret access key, and a security token, which is typically returned from GetSessionToken.

MFA-enabled IAM users would need to submit an MFA code while calling GetSessionToken. default_region identifies the AWS Region whose servers you want to send your first API request to by default.

This is typically the Region closest to you, but it can be any Region. Please see Temporary Security Credentials for more details.

The sts get-session-token AWS CLI command can be used to generate temporary credentials. For example, with MFA enabled:

aws> sts get-session-token --serial-number arn:aws:iam::1234:mfa/your-email@example.com --duration-seconds 129600 --token-code 123456

Because temporary security credentials are short-term, after they expire the user needs to generate new ones and manually update the package configuration in order to continue collecting data.

This will cause data loss if the configuration is not updated with new credentials before the old ones expire.

An IAM role is an IAM identity that you can create in your account that has specific permissions that determine what the identity can and cannot do in AWS.

A role does not have standard long-term credentials such as a password or access keys associated with it. Instead, when you assume a role, it provides you with temporary security credentials for your role session. IAM role Amazon Resource Name (ARN) can be used to specify which AWS IAM role to assume to generate temporary credentials.

Please see AssumeRole API documentation for more details.

  1. Use access keys: Access keys include access_key_id, secret_access_key, and/or session_token.
  2. Use role_arn: role_arn is used to specify which AWS IAM role to assume for generating temporary credentials. If role_arn is given, the package will check whether access keys are given. If not, the package will check for a credential profile name. If neither is given, the default credential profile will be used. Please make sure credentials are given under either a credential profile or access keys.
  3. Use credential_profile_name and/or shared_credential_file: If access_key_id, secret_access_key, and role_arn are all not given, then the package will check for credential_profile_name. If you use different credentials for different tools or applications, you can use profiles to configure multiple access keys in the same configuration file. If no credential_profile_name is given, the default profile will be used. shared_credential_file optionally specifies the directory of your shared credentials file. If it is empty, the default directory will be used. On Windows, the shared credentials file is at C:\Users\<yourUserName>\.aws\credentials. On Linux, macOS, or Unix, the file is located at ~/.aws/credentials. Please see [Create Shared Credentials File](https://docs.aws.amazon.com/ses/latest/DeveloperGuide/create-shared-credentials-file.html) for more details.
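The precedence rules above can be sketched as a small resolver. This mirrors the documented order (access keys, then role_arn, then credential profile, then the default profile); it is an illustration of the logic, not the package's actual code.

```python
# Illustrative only: mirror the documented credential-resolution order.
def resolve_credential_source(access_key_id=None, secret_access_key=None,
                              role_arn=None, credential_profile_name=None):
    if access_key_id and secret_access_key:
        return "access keys"
    if role_arn:
        # role_arn falls back to a profile when no access keys are present
        return "role_arn + " + (credential_profile_name or "default profile")
    if credential_profile_name:
        return "profile: " + credential_profile_name
    return "profile: default"

print(resolve_credential_source(access_key_id="AKIA...", secret_access_key="x"))
print(resolve_credential_source(role_arn="arn:aws:iam::1234:role/reader"))
print(resolve_credential_source())
```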

This is the Host dataset.