﻿---
title: "Example: Parse logs"
description: In this example tutorial, you’ll use an ingest pipeline to parse server logs in the Common Log Format before indexing. Before starting, check the prerequisites...
url: https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/ingest/transform-enrich/example-parse-logs
products:
  - Elasticsearch
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# Example: Parse logs
In this example tutorial, you’ll use an [ingest pipeline](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/ingest/transform-enrich/ingest-pipelines) to parse server logs in the [Common Log Format](https://en.wikipedia.org/wiki/Common_Log_Format) before indexing. Before starting, check the [prerequisites](/elastic/docs-builder/docs/3016/manage-data/ingest/transform-enrich/ingest-pipelines#ingest-prerequisites) for ingest pipelines.
The logs you want to parse look similar to this:
```txt
212.87.37.154 - - [05/May/2099:16:21:15 +0000] "GET /favicon.ico HTTP/1.1" 200 3638 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36"
```

These logs contain a timestamp, IP address, and user agent. You want to give each of these items its own field in Elasticsearch for faster searches and visualizations. You also want to know where the request is coming from.
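Before building the pipeline, it can help to see what the fields look like once extracted. The following Python sketch uses a plain regular expression to pull the same pieces out of the sample line. This is only an approximation of grok, whose `IPORHOST`, `HTTPDATE`, and `QS` patterns are stricter:

```python
import re

# A plain-regex approximation of the grok pattern used later in this
# tutorial. Grok's IPORHOST, HTTPDATE, and QS patterns are stricter,
# so treat this as an illustrative sketch, not a drop-in replacement.
LOG_RE = re.compile(
    r'(?P<ip>\S+) (?P<user_id>\S+) (?P<user_name>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) HTTP/(?P<version>\S+)" '
    r'(?P<status>\d+) (?P<bytes>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

line = ('212.87.37.154 - - [05/May/2099:16:21:15 +0000] '
        '"GET /favicon.ico HTTP/1.1" 200 3638 "-" '
        '"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) '
        'AppleWebKit/537.36 (KHTML, like Gecko) '
        'Chrome/52.0.2743.116 Safari/537.36"')

fields = LOG_RE.match(line).groupdict()
print(fields["ip"], fields["timestamp"], fields["user_agent"])
```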
1. In Kibana, go to the **Ingest Pipelines** management page using the navigation menu or the [global search field](https://www.elastic.co/elastic/docs-builder/docs/3016/explore-analyze/find-and-organize/find-apps-and-objects).
   ![Kibana's Ingest Pipelines list view](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/images/elasticsearch-reference-ingest-pipeline-list.png)
2. Click **Create pipeline > New pipeline**.
3. Set **Name** to `my-pipeline` and optionally add a description for the pipeline.
4. Add a [grok processor](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/enrich-processor/grok-processor) to parse the log message:
   1. Click **Add a processor** and select the **Grok** processor type.
   2. Set **Field** to `message` and **Patterns** to the following [grok pattern](https://www.elastic.co/elastic/docs-builder/docs/3016/explore-analyze/scripting/grok):
      ```text
      %{IPORHOST:source.ip} %{USER:user.id} %{USER:user.name} \[%{HTTPDATE:@timestamp}\] "%{WORD:http.request.method} %{DATA:url.original} HTTP/%{NUMBER:http.version}" %{NUMBER:http.response.status_code:int} (?:-|%{NUMBER:http.response.body.bytes:int}) %{QS:http.request.referrer} %{QS:user_agent}
      ```
   3. Set the processor description to `Extract fields from 'message'`.
   4. Click **Add** to save the processor.
5. Add processors for the timestamp, IP address, and user agent fields. Configure the processors as follows:
   | Processor type                                                                                                                       | Field        | Additional options                    | Description                                       |
   |--------------------------------------------------------------------------------------------------------------------------------------|--------------|---------------------------------------|---------------------------------------------------|
   | [**Date**](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/enrich-processor/date-processor)             | `@timestamp` | **Formats**: `dd/MMM/yyyy:HH:mm:ss Z` | `Format '@timestamp' as 'dd/MMM/yyyy:HH:mm:ss Z'` |
   | [**GeoIP**](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/enrich-processor/geoip-processor)           | `source.ip`  | **Target field**: `source.geo`        | `Add 'source.geo' GeoIP data for 'source.ip'`     |
   | [**User agent**](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/enrich-processor/user-agent-processor) | `user_agent` |                                       | `Extract fields from 'user_agent'`                |
   Your form should look similar to this:
   ![Processors for Ingest Pipelines](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/images/elasticsearch-reference-ingest-pipeline-processor.png)
   The four processors run sequentially: Grok > Date > GeoIP > User agent. You can reorder processors using the arrow icons.
   Alternatively, you can click the **Import processors** link and define the processors as JSON:
   ```json
   {
     "processors": [
       {
         "grok": {
           "description": "Extract fields from 'message'",
           "field": "message",
           "patterns": ["%{IPORHOST:source.ip} %{USER:user.id} %{USER:user.name} \\[%{HTTPDATE:@timestamp}\\] \"%{WORD:http.request.method} %{DATA:url.original} HTTP/%{NUMBER:http.version}\" %{NUMBER:http.response.status_code:int} (?:-|%{NUMBER:http.response.body.bytes:int}) %{QS:http.request.referrer} %{QS:user_agent}"]
         }
       },
       {
         "date": {
           "description": "Format '@timestamp' as 'dd/MMM/yyyy:HH:mm:ss Z'",
           "field": "@timestamp",
           "formats": [ "dd/MMM/yyyy:HH:mm:ss Z" ]
         }
       },
       {
         "geoip": {
           "description": "Add 'source.geo' GeoIP data for 'source.ip'",
           "field": "source.ip",
           "target_field": "source.geo"
         }
       },
       {
         "user_agent": {
           "description": "Extract fields from 'user_agent'",
           "field": "user_agent"
         }
       }
     ]
   }
   ```
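The Date processor's `dd/MMM/yyyy:HH:mm:ss Z` format string uses Java time syntax; in Python's `strptime` terms it corresponds roughly to `%d/%b/%Y:%H:%M:%S %z`. A minimal sketch of the conversion that processor performs on the timestamp grok extracts:

```python
from datetime import datetime

# Python strptime equivalent (an approximation) of the Java-style
# 'dd/MMM/yyyy:HH:mm:ss Z' format used by the Date processor.
raw = "05/May/2099:16:21:15 +0000"
ts = datetime.strptime(raw, "%d/%b/%Y:%H:%M:%S %z")
print(ts.isoformat())
```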
6. To test the pipeline, click **Add documents**.
7. In the **Documents** tab, provide a sample document for testing:
   ```json
   [
     {
       "_source": {
         "message": "212.87.37.154 - - [05/May/2099:16:21:15 +0000] \"GET /favicon.ico HTTP/1.1\" 200 3638 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36\""
       }
     }
   ]
   ```
8. Click **Run the pipeline** and verify the pipeline worked as expected.
9. If everything looks correct, close the panel, and then click **Create pipeline**.
   You’re now ready to index the logs data to a [data stream](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/data-store/data-streams).
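You can also exercise the pipeline outside Kibana with the simulate pipeline API (`POST _ingest/pipeline/my-pipeline/_simulate`), which accepts the same `docs` array shown in the test panel. A small sketch that builds that request body in Python; where you send it depends on your deployment:

```python
import json

# Build the body for POST _ingest/pipeline/my-pipeline/_simulate,
# mirroring the sample document used in the test panel above.
sample = ('212.87.37.154 - - [05/May/2099:16:21:15 +0000] '
          '"GET /favicon.ico HTTP/1.1" 200 3638 "-" '
          '"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) '
          'AppleWebKit/537.36 (KHTML, like Gecko) '
          'Chrome/52.0.2743.116 Safari/537.36"')

body = {"docs": [{"_source": {"message": sample}}]}
print(json.dumps(body, indent=2))
```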
10. Create an [index template](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/data-store/templates) with [data stream enabled](/elastic/docs-builder/docs/3016/manage-data/data-store/data-streams/set-up-data-stream#create-index-template).
   ```json
   {
     "index_patterns": [ "my-data-stream*" ],
     "data_stream": { },
     "priority": 500
   }
   ```
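The `index_patterns` value uses simple `*` wildcard matching, so this template applies to `my-data-stream` and any name that extends it. For `*`-only patterns, Python's `fnmatch` is a reasonable local stand-in for that matching (an assumption for illustration, not Elasticsearch's implementation):

```python
from fnmatch import fnmatch

# For '*'-only patterns, fnmatch behaves like index-pattern
# wildcard matching: '*' matches any run of characters.
pattern = "my-data-stream*"
for name in ["my-data-stream", "my-data-stream-alt", "other-stream"]:
    print(name, "->", fnmatch(name, pattern))
```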
11. Index a document with the pipeline you created.
   ```console
   POST my-data-stream/_doc?pipeline=my-pipeline
   {
     "message": "89.160.20.128 - - [05/May/2099:16:21:15 +0000] \"GET /favicon.ico HTTP/1.1\" 200 3638 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36\""
   }
   ```
12. To verify, search the data stream to retrieve the document. The following search uses [`filter_path`](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/elasticsearch/rest-apis/common-options#common-options-response-filtering) to return only the [document source](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/elasticsearch/mapping-reference/mapping-source-field).
   ```console
   GET my-data-stream/_search?filter_path=hits.hits._source
   ```
   The API returns:
   ```json
   {
     "hits": {
       "hits": [
         {
           "_source": {
             "@timestamp": "2099-05-05T16:21:15.000Z",
             "http": {
               "request": {
                 "referrer": "\"-\"",
                 "method": "GET"
               },
               "response": {
                 "status_code": 200,
                 "body": {
                   "bytes": 3638
                 }
               },
               "version": "1.1"
             },
             "source": {
               "ip": "89.160.20.128",
               "geo": {
                 "continent_name" : "Europe",
                 "country_name" : "Sweden",
                 "country_iso_code" : "SE",
                 "city_name" : "Linköping",
                 "region_iso_code" : "SE-E",
                 "region_name" : "Östergötland County",
                 "location" : {
                   "lon" : 15.6167,
                   "lat" : 58.4167
                 }
               }
             },
             "message": "89.160.20.128 - - [05/May/2099:16:21:15 +0000] \"GET /favicon.ico HTTP/1.1\" 200 3638 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36\"",
             "url": {
               "original": "/favicon.ico"
             },
             "user": {
               "name": "-",
               "id": "-"
             },
             "user_agent": {
               "original": "\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36\"",
               "os": {
                 "name": "Mac OS X",
                 "version": "10.11.6",
                 "full": "Mac OS X 10.11.6"
               },
               "name": "Chrome",
               "device": {
                 "name": "Mac"
               },
               "version": "52.0.2743.116"
             }
           }
         }
       ]
     }
   }
   ```
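When checking the result programmatically, the filtered response is ordinary JSON and can be walked directly. A short sketch that pulls a few of the enriched fields from a trimmed-down response shaped like the one above:

```python
# A trimmed stand-in for the filtered search response shown above.
response = {
    "hits": {"hits": [{"_source": {
        "@timestamp": "2099-05-05T16:21:15.000Z",
        "source": {
            "ip": "89.160.20.128",
            "geo": {"country_name": "Sweden", "city_name": "Linköping"},
        },
        "user_agent": {"name": "Chrome", "version": "52.0.2743.116"},
    }}]}
}

# hits.hits is a list; each hit's _source holds the enriched document.
doc = response["hits"]["hits"][0]["_source"]
print(doc["source"]["geo"]["country_name"], doc["user_agent"]["name"])
```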