﻿---
title: Manage general content with data streams
description: Data streams are specifically designed for time series data. If you want to manage general content (data without timestamps) with data streams, you can...
url: https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/lifecycle/index-lifecycle-management/tutorial-general-content-with-data-streams
products:
  - Elasticsearch
applies_to:
  - Elastic Stack: Generally available
---

# Manage general content with data streams
Data streams are specifically designed for time series data.
If you want to manage general content (data without timestamps) with data streams, you can set up [ingest pipelines](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/ingest/transform-enrich/ingest-pipelines) to transform and enrich your general content by adding a timestamp field at [ingest](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/ingest) time and get the benefits of time-based data management.
For example, search use cases such as knowledge bases, website content, e-commerce, or product catalogs might require you to frequently index general content (data without timestamps). As a result, your index can grow significantly over time, which might impact storage requirements, query performance, and cluster health. Following the steps in this procedure (adding a timestamp field and moving to ILM-managed data streams) helps you roll over your indices in a simpler way, based on their size or lifecycle phase.
To roll over your general content from indices to a data stream, you:
1. [Create an ingest pipeline](#manage-general-content-with-data-streams-ingest) to process your general content and add a `@timestamp` field.
2. [Create a lifecycle policy](#manage-general-content-with-data-streams-policy) that meets your requirements.
3. [Create an index template](#manage-general-content-with-data-streams-template) that uses the created ingest pipeline and lifecycle policy.
4. [Create a data stream](#manage-general-content-with-data-streams-create-stream).
5. *Optional:* If you have an existing, non-managed index and want to migrate your data to the data stream you created, [reindex with a data stream](#manage-general-content-with-data-streams-reindex).
6. [Update your ingest endpoint](#manage-general-content-with-data-streams-endpoint) to target the created data stream.
7. *Optional:* You can use the [ILM explain API](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-ilm-explain-lifecycle) to get status information for your managed indices.
   For more information, refer to [Check lifecycle progress](/elastic/docs-builder/docs/3016/manage-data/lifecycle/index-lifecycle-management/tutorial-time-series-with-data-streams#ilm-gs-check-progress).
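   For example, assuming the data stream created later in this tutorial is named `movetods`, you can check the lifecycle status of its backing indices with:
   ```console
   GET movetods/_ilm/explain
   ```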


## Create an ingest pipeline to transform your general content

You can create an ingest pipeline that uses the [`set` processor](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/enrich-processor/set-processor) to add a `@timestamp` field. Follow these steps in Kibana, or use the [create or update a pipeline](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-ingest-put-pipeline) API.
<tab-set>
  <tab-item title="Kibana">
    To add an ingest pipeline in Kibana:
    1. Go to the **Ingest Pipelines** management page using the navigation menu or the [global search field](https://www.elastic.co/elastic/docs-builder/docs/3016/explore-analyze/find-and-organize/find-apps-and-objects).
    2. Select **Create pipeline > New pipeline**.
    3. Configure the pipeline with a name, description, and a **Set** processor that adds the `@timestamp` field with a value of `{{_ingest.timestamp}}`.
    ![Create ingest pipeline](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/images/elasticsearch-reference-tutorial-ilm-general-content-ingest.png)
  </tab-item>

  <tab-item title="API">
    Use the API to add an ingest pipeline:
    ```console
    PUT _ingest/pipeline/ingest_time_1
    {
      "description": "Add an ingest timestamp",
      "processors": [
        {
          "set": {
            "field": "@timestamp",
            "value": "{{_ingest.timestamp}}"
          }
        }
      ]
    }
    ```
  </tab-item>
</tab-set>
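Before wiring the pipeline into an index template, you can verify it behaves as expected with the simulate pipeline API. The following request assumes the pipeline was saved as `ingest_time_1` (the name used later in this tutorial); the `title` field is only an illustrative sample document field:
```console
POST _ingest/pipeline/ingest_time_1/_simulate
{
  "docs": [
    {
      "_source": {
        "title": "Sample product page"
      }
    }
  ]
}
```
The simulated documents in the response should include the added `@timestamp` field.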


## Create a lifecycle policy

A lifecycle policy specifies the phases in the index lifecycle and the actions to perform in each phase. A lifecycle can have up to five phases: `hot`, `warm`, `cold`, `frozen`, and `delete`.
For example, you might define a policy named `indextods` that is configured to roll over when the shard size reaches 10 GB.
You can create the policy in Kibana or with the [create or update policy](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-ilm-put-lifecycle) API.
<tab-set>
  <tab-item title="Kibana">
    To create the policy in Kibana:
    1. Go to the **Index Lifecycle Policies** management page using the navigation menu or the [global search field](https://www.elastic.co/elastic/docs-builder/docs/3016/explore-analyze/find-and-organize/find-apps-and-objects).
    2. Click **Create policy**.
    In the **Hot phase**, by default, an ILM-managed index [rolls over](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/elasticsearch/index-lifecycle-actions/ilm-rollover) when either:
    - It reaches 30 days of age.
    - One or more primary shards reach 50 GB in size.
    Disable **Use recommended defaults** to adjust these values, and configure the policy to roll over when the primary shard reaches 10 GB.
    ![Create policy page](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/images/elasticsearch-reference-tutorial-ilm-rollover-general-content-create-policy.png)
  </tab-item>

  <tab-item title="API">
    Use the API to create a lifecycle policy:
    ```console
    PUT _ilm/policy/indextods
    {
      "policy": {
        "phases": {
          "hot": {
            "min_age": "0ms",
            "actions": {
              "set_priority": {
                "priority": 100
              },
              "rollover": {
                "max_primary_shard_size": "10gb"
              }
            }
          }
        }
      }
    }
    ```
  </tab-item>
</tab-set>

For more information about lifecycle phases and available actions, refer to [Create a lifecycle policy](/elastic/docs-builder/docs/3016/manage-data/lifecycle/index-lifecycle-management/configure-lifecycle-policy#ilm-create-policy).
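To confirm the policy was created with the expected rollover settings, you can retrieve it:
```console
GET _ilm/policy/indextods
```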

## Create an index template to apply the ingest pipeline and lifecycle policy

To apply the created lifecycle policy and ingest pipeline to new indices, configure an index template that references them.
When creating the index template, specify the following details:
- the name of the lifecycle policy, which in our example is `indextods`
- the ingest pipeline that enriches the data by adding the `@timestamp` field, which in our example is `ingest_time_1`
- that the template is data stream enabled by including the `data_stream` definition
- the index pattern, which ensures that this template will be applied to matching indices and in our example is `movetods`

You can create the template in Kibana or with the [create or update index template](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-indices-put-index-template) API.
<tab-set>
  <tab-item title="Kibana">
    To create an index template in Kibana, complete these steps:
    1. Go to the **Index Management** page using the navigation menu or the [global search field](https://www.elastic.co/elastic/docs-builder/docs/3016/explore-analyze/find-and-organize/find-apps-and-objects).
    2. In the **Index Templates** tab, select **Create template**.
       ![Create template page](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/images/elasticsearch-reference-tutorial-ilm-rollover-general-content-create-template.png)
    3. On the **Logistics** page:
       1. Specify the name of the template. For example, `index_to_dot`.
       2. Specify a pattern to match the indices you want to manage with the lifecycle policy. For example, `movetods`.
       3. Turn on the **Create data streams** toggle.
       4. Set the [index mode](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3016/reference/elasticsearch/index-settings/time-series) to **Standard**.
    4. Optional: On the **Component templates** page, use the search and filter tools to select any [component templates](/elastic/docs-builder/docs/3016/manage-data/data-store/templates#component-templates) to include in the index template. The index template will inherit the settings, mappings, and aliases defined in the component templates and apply them to indices when they're created.
    5. On the **Index settings** page, specify the lifecycle policy and ingest pipeline you want to use. For example, `indextods` and `ingest_time_1`:
       ```json
       {
         "lifecycle": {
           "name": "indextods"
         },
         "default_pipeline": "ingest_time_1"
       }
       ```
    6. On the **Mappings** page, customize the fields and data types used when documents are indexed into Elasticsearch. For example, select **Load JSON** and include these mappings:
       ```json
       {
         "_source": {
           "excludes": [],
           "includes": [],
           "enabled": true
         },
         "_routing": {
           "required": false
         },
         "dynamic": true,
         "numeric_detection": false,
         "date_detection": true,
         "dynamic_date_formats": [
           "strict_date_optional_time",
           "yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z"
         ]
       }
       ```
    7. On the **Review** page, confirm your selections. You can check your selected options, as well as both the format of the index template that will be created and the associated API request.
    The newly created index template will be used for all new indices with names that match the specified pattern, and the specified ILM policy will be applied to each of them. For more information about configuring templates in Kibana, refer to [Manage index templates](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/data-store/templates).
  </tab-item>

  <tab-item title="API">
    Use the [create index template API](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-indices-put-index-template) to create an index template that specifies the created ingest pipeline and lifecycle policy:
    ```console
    PUT _index_template/index_to_dot
    {
      "template": {
        "settings": {
          "index": {
            "lifecycle": {
              "name": "indextods"
            },
            "default_pipeline": "ingest_time_1"
          }
        },
        "mappings": {
          "_source": {
            "excludes": [],
            "includes": [],
            "enabled": true
          },
          "_routing": {
            "required": false
          },
          "dynamic": true,
          "numeric_detection": false,
          "date_detection": true,
          "dynamic_date_formats": [
            "strict_date_optional_time",
            "yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z"
          ]
        }
      },
      "index_patterns": [
        "movetods"
      ],
      "data_stream": {
        "hidden": false,
        "allow_custom_routing": false
      }
    }
    ```
  </tab-item>
</tab-set>
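Optionally, you can preview the settings and mappings that the template would apply to a matching index with the simulate index API, using an index name that matches the template's pattern:
```console
POST _index_template/_simulate_index/movetods
```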


## Create a data stream

Create a data stream using the [_data_stream API](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-indices-create-data-stream):
```console
PUT _data_stream/movetods
```

You can [check the lifecycle status of your data stream](https://www.elastic.co/elastic/docs-builder/docs/3016/manage-data/lifecycle/index-lifecycle-management/policy-view-status), including details about its associated ILM policy.
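For example, to view the data stream's backing indices, its associated index template, and the ILM policy applied to it:
```console
GET _data_stream/movetods
```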

### Optional: Reindex your data with a data stream

If you want to copy your documents from an existing index to the data stream you created, reindex with a data stream using the [_reindex API](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-reindex):
```console
POST _reindex
{
  "source": {
    "index": "indextods"
  },
  "dest": {
    "index": "movetods",
    "op_type": "create"
  }
}
```

For more information, check [Reindex with a data stream](/elastic/docs-builder/docs/3016/manage-data/data-store/data-streams/use-data-stream#reindex-with-a-data-stream).
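As a quick sanity check after the reindex completes (assuming the source index `indextods` and the data stream `movetods` from the examples above), you can compare document counts:
```console
GET indextods/_count
GET movetods/_count
```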

## Update your ingest endpoint to target the created data stream

If you use Elastic clients, scripts, or any other third-party tool to ingest data into Elasticsearch, make sure you update these to target the created data stream instead of the old index.
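
For example, a raw API request that appends a document to the data stream looks like the following; the field names are only illustrative, and the `@timestamp` field is added automatically by the `ingest_time_1` pipeline configured in the index template:
```console
POST movetods/_doc
{
  "title": "Sample product page",
  "description": "General content without a timestamp"
}
```
Note that data streams only accept the `create` operation for new documents, which `POST <data-stream>/_doc` uses automatically.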