---
title: Plaintext application logs
description: Ingest and parse plaintext logs, including existing logs, from any programming language or framework without modifying your application or its configuration.
url: https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/plaintext-application-logs
products:
  - Elastic Cloud Serverless
  - Elastic Observability
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# Plaintext application logs
Ingest and parse plaintext logs, including existing logs, from any programming language or framework without modifying your application or its configuration.
Plaintext logs require additional setup that structured logs do not:
- To search, filter, and aggregate effectively, you need to parse plaintext logs using an ingest pipeline to extract structured fields. Parsing is based on log format, so you might have to maintain different settings for different applications.
- To [correlate plaintext logs](#correlate-plaintext-logs), you need to inject IDs into log messages and parse them using an ingest pipeline.

To ingest, parse, and correlate plaintext logs:
1. Ingest plaintext logs with [Filebeat](#ingest-plaintext-logs-with-filebeat) or [Elastic Agent](#ingest-plaintext-logs-with-the-agent) and parse them before indexing with an ingest pipeline.
2. [Correlate plaintext logs with an APM agent](#correlate-plaintext-logs).
3. [View logs in Discover](#view-plaintext-logs).


## Ingest logs

Send application logs to Elasticsearch using one of the following shipping tools:
- [Filebeat](#ingest-plaintext-logs-with-filebeat): A lightweight data shipper that sends log data to Elasticsearch.
- [Elastic Agent](#ingest-plaintext-logs-with-the-agent): A single agent that can collect logs, metrics, and security data, and protect your hosts. You can use Fleet to manage Elastic Agent policies and lifecycles directly from Kibana.

<tip>
  We recommend using the [Elastic Distribution of OpenTelemetry Collector](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/logs/stream-any-log-file-using-edot-collector) to collect application logs.
</tip>


### Ingest logs with Filebeat

Follow these steps to ingest application logs with Filebeat.

#### Step 1: Install Filebeat

Install Filebeat on the server you want to monitor by running the commands that align with your system:
<tab-set>
  <tab-item title="DEB">
    ```sh
    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-9.3.2-amd64.deb
    sudo dpkg -i filebeat-9.3.2-amd64.deb
    ```
  </tab-item>

  <tab-item title="RPM">
    ```sh
    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-9.3.2-x86_64.rpm
    sudo rpm -vi filebeat-9.3.2-x86_64.rpm
    ```
  </tab-item>

  <tab-item title="macOS">
    ```sh
    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-9.3.2-darwin-x86_64.tar.gz
    tar xzvf filebeat-9.3.2-darwin-x86_64.tar.gz
    ```
  </tab-item>

  <tab-item title="Linux">
    ```sh
    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-9.3.2-linux-x86_64.tar.gz
    tar xzvf filebeat-9.3.2-linux-x86_64.tar.gz
    ```
  </tab-item>

  <tab-item title="Windows">
    1. Download the [Filebeat Windows zip file](https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-9.3.2-windows-x86_64.zip).
    2. Extract the contents of the zip file into `C:\Program Files`.
    3. Rename the _filebeat-9.3.2-windows-x86_64_ directory to _Filebeat_.
    4. Open a PowerShell prompt as an Administrator (right-click the PowerShell icon and select **Run As Administrator**).
    5. From the PowerShell prompt, run the following commands to install Filebeat as a Windows service:
       ```powershell
       PS > cd 'C:\Program Files\Filebeat'
       PS C:\Program Files\Filebeat> .\install-service-filebeat.ps1
       ```
    If script execution is disabled on your system, you need to set the execution policy for the current session to allow the script to run. For example: `PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-filebeat.ps1`.
  </tab-item>
</tab-set>


#### Step 2: Connect to Elasticsearch

Connect to Elasticsearch using an API key to set up Filebeat. Set the following information in the `filebeat.yml` file:
```yaml
output.elasticsearch:
  hosts: ["your-projects-elasticsearch-endpoint"]
  api_key: "id:api_key"
```

1. Set the `hosts` to your deployment’s Elasticsearch endpoint. Copy the Elasticsearch endpoint from **Help menu (![help icon](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-help-icon.svg)) → Connection details**. For example, `https://my-deployment.es.us-central1.gcp.cloud.es.io:443`.
2. From **Developer tools**, run the following command to create an API key that grants `manage` permissions for the `cluster` and the `filebeat-*` indices:
   ```json
   POST /_security/api_key
   {
     "name": "filebeat_host001",
     "role_descriptors": {
       "filebeat_writer": {
         "cluster": ["manage"],
         "index": [
           {
             "names": ["filebeat-*"],
             "privileges": ["manage", "create_doc"]
           }
         ]
       }
     }
   }
   ```
   Refer to [Grant access using API keys](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/beats/filebeat/beats-api-keys) for more information.
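The `api_key` setting in `filebeat.yml` expects the `id` and `api_key` values from the create-API-key response joined with a colon. A minimal sketch (the response values below are hypothetical placeholders, not real credentials):

```python
# Sketch: building Filebeat's `api_key` setting from a create-API-key response.
# The `id` and `api_key` values below are hypothetical placeholders; use the
# values returned by your own request.
response = {
    "id": "TiNAGG4BaaMdaH1tRfuU",
    "name": "filebeat_host001",
    "api_key": "KnR6yE41RrSowb0kQ0HWoA",
}

# Filebeat expects the two values joined with a colon, matching the
# `api_key: "id:api_key"` form shown above.
filebeat_api_key = f"{response['id']}:{response['api_key']}"
print(filebeat_api_key)
```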


#### Step 3: Configure Filebeat

Add the following configuration to your `filebeat.yml` file to start collecting log data.
```yaml
filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - /path/to/logs.log
```
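To check what the filestream input will pick up, you can generate a few sample plaintext lines in the timestamp/level/IP/message shape that the ingest pipeline in Step 5 parses. A rough sketch (the file path and log contents are hypothetical):

```python
# Sketch: writing sample plaintext log lines in the
# `%{@timestamp} %{log.level} %{host.ip} %{message}` shape parsed later by the
# ingest pipeline. The file path and log contents are hypothetical.
import tempfile
from pathlib import Path

lines = [
    "2025-08-09T12:45:01.000Z INFO 192.168.1.2 Application started",
    "2025-08-09T12:45:07.000Z WARN 192.168.1.2 Disk usage exceeds 90%",
]

log_path = Path(tempfile.mkdtemp()) / "logs.log"
log_path.write_text("\n".join(lines) + "\n")

# Filebeat's filestream input reads each line as a separate event.
events = log_path.read_text().splitlines()
```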


#### Step 4: Set up and start Filebeat

Filebeat comes with predefined assets for parsing, indexing, and visualizing your data. To load these assets, set the [index template](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/data-store/templates) by running the command that aligns with your system from the Filebeat installation directory:
<tab-set>
  <tab-item title="DEB">
    ```sh
    filebeat setup -e
    ```
  </tab-item>

  <tab-item title="RPM">
    ```sh
    filebeat setup -e
    ```
  </tab-item>

  <tab-item title="macOS">
    ```sh
    ./filebeat setup -e
    ```
  </tab-item>

  <tab-item title="Linux">
    ```sh
    ./filebeat setup -e
    ```
  </tab-item>

  <tab-item title="Windows">
    ```powershell
    PS > .\filebeat.exe setup -e
    ```
  </tab-item>
</tab-set>

From the Filebeat installation directory, start Filebeat by running the command that aligns with your system:
<tab-set>
  <tab-item title="DEB">
    ```sh
    sudo service filebeat start
    ```

    <note>
      If you use an `init.d` script to start Filebeat, you can’t specify command line flags (see [Command reference](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/beats/filebeat/command-line-options)). To specify flags, start Filebeat in the foreground.
    </note>
    Also see [Filebeat and systemd](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/beats/filebeat/running-with-systemd).
  </tab-item>

  <tab-item title="RPM">
    ```sh
    sudo service filebeat start
    ```

    <note>
      If you use an `init.d` script to start Filebeat, you can’t specify command line flags (see [Command reference](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/beats/filebeat/command-line-options)). To specify flags, start Filebeat in the foreground.
    </note>
    Also see [Filebeat and systemd](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/beats/filebeat/running-with-systemd).
  </tab-item>

  <tab-item title="macOS">
    ```sh
    ./filebeat -e
    ```
  </tab-item>

  <tab-item title="Linux">
    ```sh
    ./filebeat -e
    ```
  </tab-item>

  <tab-item title="Windows">
    ```powershell
    PS C:\Program Files\Filebeat> Start-Service filebeat
    ```
    By default, Windows log files are stored in `C:\ProgramData\filebeat\Logs`.
  </tab-item>
</tab-set>


#### Step 5: Parse logs with an ingest pipeline

Use an ingest pipeline to parse the contents of your logs into structured, [Elastic Common Schema (ECS)](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/ecs)-compatible fields.
Create an ingest pipeline that defines a [dissect processor](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/enrich-processor/dissect-processor) to extract structured ECS fields from your log messages. In your project, navigate to **Developer Tools** and use a command similar to the following example:
```json
PUT _ingest/pipeline/your-pipeline
{
  "description": "Extracts the timestamp, log level, and host IP",
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "%{@timestamp} %{log.level} %{host.ip} %{message}"
      }
    }
  ]
}
```
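To see what the dissect pattern produces, here is a rough local emulation in Python. Dissect itself runs inside Elasticsearch; this sketch and the sample log line are only illustrative:

```python
import re

# Rough local emulation of the dissect pattern above, showing how a plaintext
# line maps onto structured fields. The real dissect processor runs inside
# Elasticsearch; the sample log line is hypothetical.
PATTERN = "%{@timestamp} %{log.level} %{host.ip} %{message}"

def dissect(line: str, pattern: str) -> dict:
    # Each %{key} in the pattern is separated by a single space, so split the
    # line into as many parts as there are keys; the last key takes the rest.
    keys = re.findall(r"%\{([^}]+)\}", pattern)
    parts = line.split(" ", len(keys) - 1)
    return dict(zip(keys, parts))

fields = dissect("2025-08-09T12:45:01.000Z WARN 192.168.1.2 Disk usage exceeds 90%", PATTERN)
```

Each key in the pattern becomes a field on the indexed document, which is what makes the log searchable and aggregatable.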

Refer to [Extract structured fields](/elastic/docs-builder/docs/3016/solutions/observability/logs/parse-route-logs#observability-parse-log-data-extract-structured-fields) for more on using ingest pipelines to parse your log data.
After creating your pipeline, specify the pipeline for Filebeat in the `filebeat.yml` file:
```yaml
output.elasticsearch:
  hosts: ["your-projects-elasticsearch-endpoint"]
  api_key: "id:api_key"
  pipeline: "your-pipeline"
```


### Ingest logs with the Elastic Agent

Follow these steps to ingest and centrally manage your logs using Elastic Agent and Fleet.

#### Step 1: Add the custom logs integration to your project

To add the custom logs integration to your project:
1. Find **Integrations** in the main menu or use the [global search field](https://www.elastic.co/elastic/docs-builder/docs/3016/explore-analyze/find-and-organize/find-apps-and-objects).
2. Type `custom` in the search bar and select **Custom Logs**.
3. Click **Add Custom Logs**.
4. Click **Install Elastic Agent** at the bottom of the page, and follow the instructions for your system to install the Elastic Agent.
5. After installing the Elastic Agent, configure the integration from the **Add Custom Logs integration** page.
6. Give your integration a meaningful name and description.
7. Add the **Log file path**. For example, `/var/log/your-logs.log`.
8. Give your agent policy a name. The agent policy defines the data your Elastic Agent collects.
9. Save your integration to add it to your deployment.


#### Step 2: Add an ingest pipeline to your integration

To aggregate or search for information in plaintext logs, use an ingest pipeline with your integration to parse the contents of your logs into structured, [Elastic Common Schema (ECS)](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/ecs)-compatible fields.
1. From the custom logs integration, select the **Integration policies** tab.
2. Select the integration policy you created in the previous section.
3. Click **Change defaults → Advanced options**.
4. Under **Ingest pipelines**, click **Add custom pipeline**.
5. Create an ingest pipeline with a [dissect processor](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/enrich-processor/dissect-processor) to extract structured fields from your log messages.
   Click **Import processors** and add JSON similar to the following example:
   ```json
   {
     "description": "Extracts the timestamp, log level, and host IP",
     "processors": [
       {
         "dissect": {
           "field": "message",
           "pattern": "%{@timestamp} %{log.level} %{host.ip} %{message}"
         }
       }
     ]
   }
   ```
6. Click **Create pipeline**.
7. Save and deploy your integration.


## Correlate logs

Correlate your application logs with trace events to:
- view the context of a log and the parameters provided by a user
- view all logs belonging to a particular trace
- easily move between logs and traces when debugging application issues

Log correlation works on two levels:
- at the service level: annotations with `service.name`, `service.version`, and `service.environment` allow you to link logs with APM services
- at the trace level: annotations with `trace.id` and `transaction.id` allow you to link logs with traces
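As a rough sketch, injecting trace-level IDs into a plaintext log line and parsing them back out might look like this. The line format and ID values are hypothetical; each APM agent documents its own log-correlation format:

```python
import re

# Sketch: appending correlation IDs to a plaintext log line and parsing them
# back out. The `key=value` line format and the ID values are hypothetical;
# each APM agent has its own log-correlation convention.
def format_log(message: str, trace_id: str, transaction_id: str) -> str:
    return f"{message} trace.id={trace_id} transaction.id={transaction_id}"

def parse_ids(line: str) -> dict:
    # Pull `trace.id` and `transaction.id` key-value pairs from the line.
    return dict(re.findall(r"\b(trace\.id|transaction\.id)=(\S+)", line))

line = format_log("Order created", "0af7651916cd43dd8448eb211c80319c", "00f067aa0ba902b7")
ids = parse_ids(line)
```

Once the IDs are parsed into fields by an ingest pipeline, Elasticsearch can match the log event to the trace and transaction documents that share the same IDs.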

Learn about correlating plaintext logs in the agent-specific ingestion guides:
- [Go](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/apm/agents/go/logs)
- [Java](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/apm/agents/java/logs#log-correlation-ids)
- [.NET](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/apm/agents/dotnet/logs)
- [Node.js](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/apm/agents/nodejs/logs)
- [Python](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/apm/agents/python/logs#log-correlation-ids)
- [Ruby](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/apm/agents/ruby/logs)


## View logs

To view logs ingested by Filebeat, go to **Discover** from the main menu and create a data view based on the `filebeat-*` index pattern. You can also select **All logs** from the **Data views** menu as it includes the `filebeat-*` index pattern by default. Refer to [Create a data view](https://www.elastic.co/elastic/docs-builder/docs/3016/explore-analyze/find-and-organize/data-views) for more information.