---
title: Get started with system logs
description: In this guide you can learn how to onboard system log data from a machine or server, then explore the data in Discover. Follow these steps to onboard...
url: https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/get-started-with-system-logs
products:
  - Elastic Cloud Serverless
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# Get started with system logs

In this guide, you'll learn how to onboard system log data from a machine or server and then explore that data in **Discover**.

## Prerequisites

<applies-switch>
  <applies-item title="stack:" applies-to="Elastic Stack: Generally available">
    To follow the steps in this guide, you need an Elastic Stack deployment that includes:
    - Elasticsearch for storing and searching data
    - Kibana for visualizing and managing data
    - A Kibana user with `All` privileges on Fleet and Integrations. Because many integration assets are shared across spaces, users need these Kibana privileges in all spaces.
    To get started quickly, create an Elastic Cloud Hosted deployment and host it on AWS, GCP, or Azure. [Try it out for free](https://cloud.elastic.co/registration?page=docs&placement=docs-body).
  </applies-item>

  <applies-item title="serverless:" applies-to="Elastic Cloud Serverless: Generally available">
    The **Admin** role or higher is required to onboard log data. To learn more, refer to [Assign user roles and privileges](/elastic/docs-builder/docs/3016/deploy-manage/users-roles/cloud-organization/manage-users#general-assign-user-roles).
  </applies-item>
</applies-switch>


## Onboard system log data

Follow these steps to onboard system log data.
<stepper>
  <step title="Open your project">
    Open an [Elastic Observability Serverless project](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/get-started) or Elastic Stack deployment.
  </step>

  <step title="Select data collection method">
    From the Observability UI, go to **Add data**. Under **What do you want to monitor?**, select **Host**, then select one of these options:
    <tab-set>
      <tab-item title="OpenTelemetry: Full Observability">
        Collect native OpenTelemetry metrics and logs using the Elastic Distribution of OpenTelemetry Collector (EDOT).

        **Recommended for**: Users who want to collect native OpenTelemetry data or are already using OpenTelemetry in their environment.
      </tab-item>

      <tab-item title="Elastic Agent: Logs & Metrics">
        Bring data from Elastic integrations using the Elastic Agent.

        **Recommended for**: Users who want to leverage Elastic's pre-built integrations and centralized management through Fleet.
      </tab-item>
    </tab-set>
  </step>

  <step title="Follow setup instructions">
    Follow the in-product steps to auto-detect your logs and install and configure your chosen data collector.
  </step>

  <step title="Verify data collection">
    After the agent is installed and successfully streaming log data, you can view the data in the UI:
    1. From the navigation menu, go to **Discover**.
    2. Select **All logs** from the **Data views** menu. This view shows all log datasets. From here, you can add fields, change the view, expand a document to see its details, and perform other actions to explore your data.
  </step>

  <step title="Explore and analyze your data">
    Now that you have logs flowing into Elasticsearch, you can start exploring and analyzing your data:
    - **[Explore logs in Discover](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/explore-logs)**: Search, filter, and tail all your logs from a central location
    - **[Parse and route logs](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/parse-route-logs)**: Extract structured fields from unstructured logs and route them to specific data streams
    - **[Filter and aggregate logs](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/filter-aggregate-logs)**: Filter logs by specific criteria and aggregate data to find patterns and gain insights
  </step>
</stepper>


## Other ways to collect log data

While the Elastic Agent and OpenTelemetry Collector are the recommended approaches for most users, Elastic provides additional tools for specific use cases:
<tab-set>
  <tab-item title="Filebeat">
    Filebeat is a lightweight data shipper that sends log data to Elasticsearch. It's ideal for:
    - Simple log collection: When you need to collect logs from specific files or directories.
    - Custom parsing: When you need to parse logs using ingest pipelines before indexing.
    - Legacy systems: When you can't install the Elastic Agent or OpenTelemetry Collector.
    For more information, refer to [Collecting log data with Filebeat](https://www.elastic.co/elastic/docs-builder/docs/3016/deploy-manage/monitor/stack-monitoring/collecting-log-data-with-filebeat) and [Ingest logs from applications using Filebeat](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/plaintext-application-logs).
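    For example, a minimal `filebeat.yml` sketch that tails system log files and ships them to Elasticsearch might look like this (the endpoint and API key are placeholders for your own deployment):

    ```yaml
    filebeat.inputs:
      - type: filestream        # recommended input type for log files
        id: system-logs
        paths:
          - /var/log/*.log      # collect every .log file in /var/log

    output.elasticsearch:
      hosts: ["https://my-deployment.es.example.com:443"]  # placeholder endpoint
      api_key: "id:api_key"                                # placeholder credentials
    ```

    You can run `filebeat test config` to validate the file before starting the shipper.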
  </tab-item>

  <tab-item title="Winlogbeat">
    Winlogbeat is specifically designed for collecting Windows event logs. It's ideal for:
    - Windows environments: When you need to collect Windows security, application, and system event logs.
    - Security monitoring: When you need detailed Windows security event data.
    - Compliance requirements: When you need to capture specific Windows event IDs.
    For more information, refer to the [Winlogbeat documentation](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/beats/winlogbeat).
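    As an illustration, a minimal `winlogbeat.yml` sketch that captures the standard Windows channels plus specific security event IDs (the endpoint and API key are placeholders for your own deployment):

    ```yaml
    winlogbeat.event_logs:
      - name: Application
      - name: System
      - name: Security
        event_id: 4624, 4625   # capture only successful and failed logons

    output.elasticsearch:
      hosts: ["https://my-deployment.es.example.com:443"]  # placeholder endpoint
      api_key: "id:api_key"                                # placeholder credentials
    ```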
  </tab-item>

  <tab-item title="Logstash">
    Logstash is a powerful data processing pipeline that can collect, transform, and enrich log data before sending it to Elasticsearch. It's ideal for:
    - Complex data processing: When you need to parse, filter, and transform logs before indexing.
    - Multiple data sources: When you need to collect logs from various sources and normalize them.
    - Advanced use cases: When you need data enrichment, aggregation, or routing to multiple destinations.
    - Extending Elastic integrations: When you want to add custom processing to data collected by Elastic Agent or Beats.
    For more information, refer to [Logstash](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/logstash) and [Using Logstash with Elastic integrations](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/logstash/using-logstash-with-elastic-integrations).
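    For instance, a minimal pipeline sketch that receives events from Beats, parses syslog-style lines with grok, and forwards them to Elasticsearch (the endpoint and API key are placeholders for your own deployment):

    ```
    input {
      beats {
        port => 5044
      }
    }

    filter {
      grok {
        # extract timestamp, host, and message from syslog-style lines
        match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_host} %{GREEDYDATA:syslog_message}" }
      }
    }

    output {
      elasticsearch {
        hosts => ["https://my-deployment.es.example.com:443"]  # placeholder endpoint
        api_key => "id:api_key"                                # placeholder credentials
      }
    }
    ```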
  </tab-item>

  <tab-item title="REST APIs">
    You can use Elasticsearch REST APIs to send log data directly to Elasticsearch. This approach is ideal for:
    - Custom applications: When you want to send logs directly from your application code.
    - Programmatic collection: When you need to collect logs using custom scripts or tools.
    - Real-time streaming: When you need to send logs as they're generated.
    For more information, refer to [Elasticsearch REST APIs](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/elasticsearch/rest-apis).
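    For example, you might index a log document directly into a data stream that follows the `logs-<dataset>-<namespace>` naming scheme (the data stream name here is a hypothetical example):

    ```console
    POST logs-myapp.access-default/_doc
    {
      "@timestamp": "2025-01-01T12:00:00.000Z",
      "log.level": "info",
      "message": "GET /index.html 200"
    }
    ```

    For high-volume ingestion, batch documents with the `_bulk` API instead of indexing them one at a time.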
  </tab-item>
</tab-set>


## Next steps

Now that you've added logs and explored your data, learn how to onboard other types of data:
- [Stream any log file](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/stream-any-log-file)
- [Stream application logs](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/stream-application-logs)
- [Get started with traces and APM](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/apm/get-started)

To onboard other types of data, select **Add data** from the main menu.