---
title: Grok processor
description: The Grok processor parses unstructured log messages using a set of predefined patterns to match the log messages and extract the fields. The grok processor...
url: https://www.elastic.co/elastic/docs-builder/docs/3028/solutions/observability/streams/management/extract/grok
products:
  - Elastic Cloud Enterprise
  - Elastic Cloud Hosted
  - Elastic Cloud Serverless
  - Elastic Cloud on Kubernetes
  - Elastic Observability
  - Elastic Stack
  - Elasticsearch
  - Kibana
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available since 9.2, Preview in 9.1
---

# Grok processor
The **Grok** processor parses unstructured log messages by matching them against a set of predefined patterns and extracting the matched fields. The grok processor is powerful and can parse a wide variety of log formats.
You can provide multiple patterns to the grok processor. The processor tries to match the log message against each pattern in the order they are provided. When a pattern matches, it extracts the fields and the remaining patterns are skipped.
If a pattern doesn't match, the processor tries the next one. If no patterns match, the grok processor fails and you can troubleshoot the issue. Instead of writing grok patterns by hand, you can have Streams generate them for you. Refer to [Generate patterns](#streams-grok-patterns) for more information.
<tip>
  To improve pipeline performance, start with the most common patterns, then add more specific ones. This reduces the number of times the grok processor has to run.
</tip>
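Under the hood, each grok pattern expands to a regular expression, and patterns are tried in order until one matches. The following Python sketch illustrates that first-match-wins behavior; the two regexes are simplified stand-ins for grok patterns, not the actual grok definitions:

```python
import re

# Simplified regex equivalents of two grok patterns, ordered from
# most common to more specific (per the tip above).
patterns = [
    re.compile(r"^(?P<ip>\d{1,3}(?:\.\d{1,3}){3}) (?P<method>\w+) (?P<path>\S+)$"),
    re.compile(r"^(?P<level>[A-Z]+): (?P<msg>.*)$"),
]

def grok_like_match(message: str):
    """Try each pattern in order; the first match wins."""
    for pattern in patterns:
        m = pattern.match(message)
        if m:
            return m.groupdict()  # extracted fields
    return None  # no pattern matched -> the processor would fail

fields = grok_like_match("10.0.0.1 GET /index.html")
# fields == {"ip": "10.0.0.1", "method": "GET", "path": "/index.html"}
```

Because later patterns are only evaluated when earlier ones fail, putting the most frequently matching pattern first minimizes wasted regex evaluations.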

To parse a log message with a grok processor:
1. Set the **Source Field** to the field you want to match your grok patterns against.
2. Set the patterns you want to use in the **Grok patterns** field. Refer to the [example pattern](#streams-grok-example) for more information on patterns.

This functionality uses the Elasticsearch [Grok processor](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3028/reference/enrich-processor/grok-processor) internally, but you configure it in Streamlang. Streamlang doesn’t always have 1:1 parity with the ingest processor options and behavior. Refer to [Processor limitations and inconsistencies](/elastic/docs-builder/docs/3028/solutions/observability/streams/management/extract#streams-processor-inconsistencies).

## Example grok pattern

Grok patterns are defined in the following format:
```json
{
  "MY_DATE": "%{YEAR}-%{MONTHNUM}-%{MONTHDAY}"
}
```

where `MY_DATE` is the name of the custom pattern.
You can then use the pattern in the processor:
```
%{MY_DATE:date}
```
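Putting the two pieces together: the custom `MY_DATE` definition composes the built-in `YEAR`, `MONTHNUM`, and `MONTHDAY` patterns, and `%{MY_DATE:date}` captures the match into a `date` field. A rough Python equivalent of that expansion (the component regexes are simplified for illustration, not the real built-in definitions):

```python
import re

# Simplified regex equivalents of the built-in patterns referenced
# by the custom MY_DATE definition.
definitions = {
    "YEAR": r"\d{4}",
    "MONTHNUM": r"0?[1-9]|1[0-2]",
    "MONTHDAY": r"0?[1-9]|[12]\d|3[01]",
}

# %{MY_DATE:date} -> one named capture group over the composed pattern.
my_date = r"(?P<date>(?:{YEAR})-(?:{MONTHNUM})-(?:{MONTHDAY}))".format(**definitions)

m = re.search(my_date, "request completed at 2024-05-01 by worker-3")
# m.group("date") == "2024-05-01"
```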


## Generate patterns

<note>
  This feature requires a [Generative AI connector](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3028/reference/kibana/connectors-kibana/gen-ai-connectors).
</note>

Instead of writing the Grok patterns by hand, you can select **Generate Patterns** to have AI generate them for you.
Generated patterns work best on semi-structured data. For highly customized logs with a lot of free text, writing patterns manually generally produces more accurate results.
![](https://www.elastic.co/elastic/docs-builder/docs/3028/solutions/images/logs-streams-patterns.png)

To add a generated grok pattern:
1. Select **Create** → **Create processor**.
2. Select **Grok** from the **Processor** menu.
3. Select **Generate pattern**.
4. Select **Accept** to add a generated pattern to the list of patterns used by the grok processor.


### How does **Generate patterns** work?

Streams groups the samples from the **Data preview** table into categories of similar messages. For each category, Streams generates suggestions by sending samples to the LLM. Suggestions are then shown in the UI.
<warning>
  This can incur additional costs, depending on the LLM connector you are using. A single iteration typically uses between 1,000 and 5,000 tokens, varying with the number of identified categories and the length of the messages.
</warning>


## YAML reference

In [YAML mode](/elastic/docs-builder/docs/3028/solutions/observability/streams/management/extract#streams-editing-yaml-mode), configure the grok processor using the following parameters. For the complete Streamlang syntax, refer to the [Streamlang reference](https://www.elastic.co/elastic/docs-builder/docs/3028/solutions/observability/streams/management/streamlang).

| Parameter             | Type     | Required | Description                                                      |
|-----------------------|----------|----------|------------------------------------------------------------------|
| `from`                | string   | Yes      | Source field to parse.                                           |
| `patterns`            | string[] | Yes      | One or more grok patterns, tried in order.                       |
| `pattern_definitions` | object   | No       | Custom pattern definitions as key-value pairs.                   |
| `ignore_missing`      | boolean  | No       | When `true`, skip this processor if the source field is missing. |

```yaml
- action: grok
  from: body.message
  patterns:
    - "%{IP:attributes.client_ip} %{WORD:attributes.method} %{URIPATHPARAM:attributes.path}"
  pattern_definitions:
    MY_PATTERN: "%{YEAR}-%{MONTHNUM}-%{MONTHDAY}"
```
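Applied to an access-log style message, the pattern in the YAML example above extracts the client IP, HTTP method, and request path into the `attributes.*` fields. The following Python sketch shows the shape of the result; the regexes are simplified stand-ins for the real `IP`, `WORD`, and `URIPATHPARAM` grok patterns:

```python
import re

# Simplified stand-ins for the built-in IP, WORD, and URIPATHPARAM patterns.
pattern = re.compile(
    r"(?P<client_ip>\d{1,3}(?:\.\d{1,3}){3}) "
    r"(?P<method>\w+) "
    r"(?P<path>/\S*)"
)

m = pattern.match("192.168.1.10 POST /api/v1/items?limit=20")
# Grok writes each capture to the field named after the colon,
# here namespaced under attributes.*
fields = {f"attributes.{k}": v for k, v in m.groupdict().items()}
# fields == {"attributes.client_ip": "192.168.1.10",
#            "attributes.method": "POST",
#            "attributes.path": "/api/v1/items?limit=20"}
```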