
Sublime Security

<div class="condensed-table">
| | |
| --- | --- |
| Version | 1.4.0 |
| Compatible Kibana version(s) | 8.16.2 or higher |
| Supported Serverless project types | Security, Observability |
| Subscription level | Basic |
| Level of support | Elastic |

</div>
Sublime Security is a programmable, AI-powered, cloud email security platform for Microsoft 365 and Google Workspace environments. It is used to block email attacks such as phishing, BEC, and malware, to hunt for threats, and to auto-triage user reports.

The Sublime Security integration collects data for Audit, Email Message (MDM schema), and Message Event logs using the REST API and either AWS S3 or AWS SQS:

  • REST API mode - The Sublime Security integration collects and parses data from the Sublime Security REST APIs.
  • AWS S3 polling mode - Sublime Security writes data to S3 and Elastic Agent polls the S3 bucket by listing its contents and reading new files.
  • AWS S3 SQS mode - Sublime Security writes data to S3, S3 pushes a new object notification to SQS, Elastic Agent receives the notification from SQS, and then reads the S3 object. Multiple Agents can be used in this mode.

The Sublime Security integration collects three types of logs:

  • [Audit](https://docs.sublime.security/reference/listeventsinauditlog) - Captures detailed records of all significant actions and changes within the platform, including changes to email security policies, user access to email data, and modifications to email configurations, ensuring traceability and compliance for all operations. A request sketch for this endpoint follows the list.
  • [Email Message](https://docs.sublime.security/docs/export-message-mdms) - Represents the flow of individual emails through the platform, including sender and recipient details, spam filtering outcomes, and overall email disposition, helping to secure and analyze email communication.
  • [Message Event](https://docs.sublime.security/reference/getmessage-1) - Represents specific actions taken on emails, such as spam detection or rule applications, providing detailed insights into how the platform processes and protects email communications.
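
As an illustration of the REST API mode, below is a minimal Python `requests` sketch that pulls one page of audit log events. The endpoint path, the Bearer-style Authorization header, and the pagination parameter names are assumptions based on the linked Audit reference, not values confirmed by this document; check that reference for the exact contract of your deployment.

```python
import requests

# Illustrative values; the real ones come from your Sublime Security tenant.
BASE_URL = "https://api.platform.sublimesecurity.com"  # cloud default; self-hosted deployments differ
API_KEY = "<your-api-key>"
AUDIT_EVENTS_PATH = "/v0/audit-log/events"  # assumed path; see the Audit reference linked above


def fetch_audit_events(limit: int = 50, offset: int = 0) -> dict:
    """Fetch one page of audit log events (paging parameter names are assumptions)."""
    response = requests.get(
        f"{BASE_URL}{AUDIT_EVENTS_PATH}",
        headers={"Authorization": f"Bearer {API_KEY}"},
        params={"limit": limit, "offset": offset},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(fetch_audit_events())
```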

Elastic Agent must be installed. For more details and installation instructions, please refer to the Elastic Agent Installation Guide.

There are several options for installing and managing Elastic Agent:

  • Fleet-managed Elastic Agent (recommended) - With this approach, you install Elastic Agent and use Fleet in Kibana to define, configure, and manage your agents in a central location. We recommend using Fleet management because it makes the management and upgrade of your agents considerably easier.
  • Standalone Elastic Agent (advanced users) - With this approach, you install Elastic Agent and manually configure the agent locally on the system where it’s installed. You are responsible for managing and upgrading the agents. This approach is reserved for advanced users only.
  • Elastic Agent in a containerized environment - You can run Elastic Agent inside a container, either with Fleet Server or standalone. Docker images for all versions of Elastic Agent are available from the Elastic Docker registry, and we provide deployment manifests for running on Kubernetes.

Please note, there are minimum requirements for running Elastic Agent. For more information, refer to the Elastic Agent Minimum Requirements.

To collect data from the Sublime Security REST API, you need the following:

  • Your API Key. This key is used later in the Elastic integration setup to authenticate and access the different Sublime Security logs.
  • The Base URL of your Sublime Security deployment, which is also required to configure the integration.
Note

Users with the Admin role are allowed to access Audit logs. For more information, refer here.

To collect data from an AWS S3 bucket, follow the steps below:

  • Create an Amazon S3 bucket. Refer to the link here. A minimal boto3 sketch of this step follows the list.
  • You can set the "Bucket List Prefix" parameter according to your requirements.
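
A minimal boto3 sketch of the bucket-creation step is shown below; the bucket name, region, and prefix are illustrative assumptions, not values required by the integration.

```python
import boto3

# Illustrative values; choose your own bucket name, region, and prefix.
bucket = "sublime-security-exports"
region = "us-east-2"
prefix = "exports/sublime_platform_audit_log/"  # reused later as the "Bucket List Prefix" setting

s3 = boto3.client("s3", region_name=region)
s3.create_bucket(
    Bucket=bucket,
    CreateBucketConfiguration={"LocationConstraint": region},  # omit this argument for us-east-1
)
# The prefix is only a key prefix under which Sublime Security writes its exports;
# no separate AWS resource has to be created for it.
```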

To collect data from AWS SQS, follow the steps below (a boto3 sketch of steps 2 and 3 appears after this list):

  1. If data forwarding to an AWS S3 bucket hasn’t been configured, first set up an AWS S3 bucket as described in the documentation above.

  2. To set up an SQS queue, follow "Step 1: Create an Amazon SQS queue" mentioned in the Documentation.

    • While creating the SQS queue, provide the same bucket ARN that was generated when creating the AWS S3 bucket.
  3. Set up event notifications for the S3 bucket. Follow this link.

    • Set the prefix parameter to the same value as the S3 Bucket List Prefix created earlier (for example, exports/sublime_platform_audit_log/ for the audit data stream).
    • Select the event type s3:ObjectCreated:*, select the destination type SQS Queue, and select the queue created in Step 2.
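
Below is a minimal boto3 sketch of steps 2 and 3, assuming the bucket and prefix from the earlier sketch; the queue name is an illustrative assumption. The console walkthroughs linked above remain the authoritative instructions.

```python
import json

import boto3

# Illustrative names, matching the earlier bucket sketch.
bucket = "sublime-security-exports"
queue_name = "sublime-platform-exports-queue"
prefix = "exports/sublime_platform_audit_log/"
region = "us-east-2"

sqs = boto3.client("sqs", region_name=region)
s3 = boto3.client("s3", region_name=region)

# Step 2: create the SQS queue that will receive object-created notifications.
queue_url = sqs.create_queue(QueueName=queue_name)["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Allow the export bucket (same ARN as referenced in step 2 of the list) to publish to the queue.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "s3.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnLike": {"aws:SourceArn": f"arn:aws:s3:::{bucket}"}},
        }
    ],
}
sqs.set_queue_attributes(QueueUrl=queue_url, Attributes={"Policy": json.dumps(policy)})

# Step 3: send s3:ObjectCreated:* events for keys under the export prefix to the queue.
s3.put_bucket_notification_configuration(
    Bucket=bucket,
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {"Key": {"FilterRules": [{"Name": "prefix", "Value": prefix}]}},
            }
        ]
    },
)
```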

NOTE:

  • Credentials for the above AWS S3 and SQS input types should be configured using the link.
  • Data collection via AWS S3 Bucket and AWS SQS are mutually exclusive in this case.
  • You can configure a global SQS queue for all data streams or a local SQS queue for each data stream. Configuring data stream specific SQS queues will enable better performance and scalability. Data stream specific SQS queues will always override any global queue definitions for that specific data stream.

To enable the integration in Elastic:

  1. In Kibana, navigate to Management > Integrations.
  2. In the "Search for integrations" search bar, type Sublime Security.
  3. Select the "Sublime Security" integration from the search results.
  4. Select "Add Sublime Security" to add the integration.
  5. Enable the integration to collect logs via the AWS S3 or API input.
  6. Under the AWS S3 input, there are two types of inputs: using an AWS S3 bucket or using SQS.
  7. Add all the required configuration parameters, including the API Key, Interval, Initial Interval, and Page Size for the API input, and the Access Key, Secret Key, and Session Token for the AWS input type, to enable data collection.
  8. Click "Save and continue" to save the integration.

NOTE:

  • The Base URL for Sublime Security cloud customers is https://api.platform.sublimesecurity.com. Depending on your type of deployment, yours may be different.
  • For SSO users, the session token is required in addition to the access key ID and secret access key when configuring the integration. For IAM users, the session token is optional and not required. See the sketch below.
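
The session-token requirement exists because temporary (SSO/STS) credentials are always issued as a triple. A purely illustrative boto3/STS sketch of how such a triple maps onto the integration fields:

```python
import boto3

# Illustrative only: temporary credentials always include a session token,
# which is why SSO users must fill in all three fields in the integration settings.
sts = boto3.client("sts")
creds = sts.get_session_token()["Credentials"]

access_key_id = creds["AccessKeyId"]          # -> Access Key
secret_access_key = creds["SecretAccessKey"]  # -> Secret Key
session_token = creds["SessionToken"]         # -> Session Token
```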

This is the audit dataset.

This is the email_message dataset.

This is the message_event dataset.