---
title: GCP Dataflow templates
description: In this tutorial, you’ll learn how to ship logs directly from the Google Cloud Console with the Dataflow template for analyzing GCP Audit Logs in the...
url: https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/observability/cloud/gcp-dataflow-templates
products:
  - Elastic Observability
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# GCP Dataflow templates
In this tutorial, you’ll learn how to ship logs directly from the Google Cloud Console with the Dataflow template for analyzing GCP Audit Logs in the Elastic Stack.

## What you’ll learn

You’ll learn how to:
- Export GCP audit logs through Pub/Sub topics and subscriptions.
- Ingest logs using [Google Dataflow](https://cloud.google.com/dataflow) and view those logs in Kibana.


## Before you begin

Create an [Elastic Cloud Hosted](https://cloud.elastic.co/registration?page=docs&placement=docs-body) deployment or [Elastic Observability Serverless](https://www.elastic.co/elastic/docs-builder/docs/3016/deploy-manage/deploy/elastic-cloud/create-serverless-project) project. Both include an Elasticsearch cluster for storing and searching your data and Kibana for visualizing and managing your data.
This tutorial assumes the Elasticsearch cluster is already running.
<applies-switch>
  <applies-item title="stack: ga" applies-to="Elastic Stack: Generally available">
    For Elastic Cloud Hosted deployments, you need your **Cloud ID** and an **API key**. To find the Cloud ID of your [deployment](https://cloud.elastic.co/deployments), go to the deployment’s **Overview** page.
    ![Cloud ID](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-cloud-id.png)
  </applies-item>

  <applies-item title="serverless: ga" applies-to="Elastic Cloud Serverless: Generally available">
    For Elastic Observability Serverless projects, you need your **Elasticsearch endpoint URL** and an **API key**. To find your endpoint URL, select **Manage** next to your project, then find the Elasticsearch endpoint under **Application endpoints, cluster and component IDs**. Alternatively, open your project, select the help icon, then select **Connection details**.
  </applies-item>
</applies-switch>

Use Kibana to [create a Base64-encoded API key](/elastic/docs-builder/docs/3016/deploy-manage/api-keys/elasticsearch-api-keys#create-api-key) to authenticate on your deployment.
<important>
  You can optionally restrict the privileges of your API key; otherwise, the key is a point-in-time snapshot of the permissions of the authenticated user. For this tutorial, the data is written to the `logs-gcp.audit-default` data stream.
</important>
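If you prefer the command line over the Kibana UI, a restricted key can also be created through the Elasticsearch security API. This is a sketch, not a prescribed setup: the endpoint URL and credentials are placeholders, the key name (`gcp-dataflow-key`) and role descriptor name (`gcp_audit_writer`) are arbitrary, and `jq` is assumed to be installed.

```shell
# Create an API key whose write access is limited to the
# logs-gcp.audit-default data stream (credentials are placeholders).
curl -s -u elastic:changeme -X POST "$ES_URL/_security/api_key" \
  -H 'Content-Type: application/json' -d '{
    "name": "gcp-dataflow-key",
    "role_descriptors": {
      "gcp_audit_writer": {
        "cluster": ["monitor"],
        "indices": [
          {
            "names": ["logs-gcp.audit-default"],
            "privileges": ["auto_configure", "create_doc"]
          }
        ]
      }
    }
  }' > api_key.json

# The Dataflow template expects the key Base64-encoded as "id:api_key".
echo -n "$(jq -r '.id' api_key.json):$(jq -r '.api_key' api_key.json)" | base64
```

Keep the Base64 output handy; it is the value you will paste into the Dataflow template's API key field later in this tutorial.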


## Step 1: Install the GCP integration

You’ll start by installing the Elastic GCP integration, which adds pre-built dashboards, ingest node configurations, and other assets that help you get the most out of the GCP logs you ingest.
1. Find **Integrations** in the main menu or use the [global search field](https://www.elastic.co/elastic/docs-builder/docs/3016/explore-analyze/find-and-organize/find-apps-and-objects).
2. Search for `gcp`.
   ![Kibana integrations](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-kibana-integrations.png)
3. Click the Elastic Google Cloud Platform (GCP) integration to see more details about it, then click **Add Google Cloud Platform (GCP)**.
   ![GCP integration](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-integration.png)
4. Click **Save integration**.


## Step 2: Create a Pub/Sub topic and subscription

Before configuring the Dataflow template, create a Pub/Sub topic and subscription from your Google Cloud Console where you can send your logs from Google Operations Suite. There are three available filesets: `audit`, `vpcflow`, and `firewall`. This tutorial covers the `audit` fileset.
1. Go to the **Logs Router** page to configure GCP to export logs to a Pub/Sub topic. Use the search bar to find the page:
   ![Navigate to Logs Router page](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-navigate-logs-router.png)
   To set up the logs routing sink, click **Create sink**. Set the **Sink name** to `monitor-gcp-audit-sink`. Select **Cloud Pub/Sub topic** as the **Sink service**, and create a new Cloud Pub/Sub topic named `monitor-gcp-audit`:
   ![Create Pub/Sub topic](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-create-pubsub-topic.png)
   Finally, under **Choose logs to include in sink**, add `logName:"cloudaudit.googleapis.com"` (this includes all audit logs). Click **Create sink**. It will look something like the following:
   ![Create logs routing sink](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-create-sink.png)
2. Now go to the **Pub/Sub** page to add a subscription to the topic you just created. Use the search bar to find the page:
   ![GCP Pub/Sub](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-pub-sub.png)
   To add a subscription to the `monitor-gcp-audit` topic, click **Create subscription**:
   ![Create GCP Pub/Sub Subscription](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-pub-sub-create-subscription.png)
   Set `monitor-gcp-audit-sub` as the **Subscription ID** and leave the **Delivery type** as **Pull**:
   ![GCP Pub/Sub Subscription ID](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-pub-sub-subscription-id.png)
   Finally, scroll down and click **Create**.
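The console steps above can also be scripted with the `gcloud` CLI. This is a hedged sketch: `PROJECT_ID` is a placeholder for your own project, and after creating the sink you must grant its writer identity the Pub/Sub Publisher role on the topic before logs will flow.

```shell
# Create the topic and a pull subscription for audit logs.
gcloud pubsub topics create monitor-gcp-audit
gcloud pubsub subscriptions create monitor-gcp-audit-sub \
  --topic=monitor-gcp-audit

# Route all audit logs into the topic with a logging sink.
gcloud logging sinks create monitor-gcp-audit-sink \
  pubsub.googleapis.com/projects/PROJECT_ID/topics/monitor-gcp-audit \
  --log-filter='logName:"cloudaudit.googleapis.com"'
```

The `sinks create` command prints the sink's writer service account; grant that identity the `roles/pubsub.publisher` role on the `monitor-gcp-audit` topic so the sink can publish.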


## Step 3: Configure the Google Dataflow template

After creating a Pub/Sub topic and subscription, go to the **Dataflow Jobs** page and configure your template to use them. Use the search bar to find the page:
![GCP Dataflow Jobs](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-dataflow-jobs.png)

To create a job, click **Create Job From Template**. Set **Job name** as `auditlogs-stream` and select `Pub/Sub to Elasticsearch` from the **Dataflow template** dropdown menu:
![GCP Dataflow Pub/Sub to Elasticsearch](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-dataflow-pub-sub-elasticsearch.png)

Before running the job, fill in required parameters:
![GCP Dataflow Required Parameters](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-dataflow-required-parameters.png)

<note>
  For **Cloud Pub/Sub subscription**, use the subscription you created in the previous step. Use the values you obtained earlier for the following fields:
  - For Elastic Cloud Hosted deployments, your **Cloud ID**.
  - For Serverless projects, your **Elasticsearch endpoint URL**, in the format `https://<hostname>:<port>`.
  - Your **Base64-encoded API key**.
  If you don’t have an **Error output topic**, create one like you did in the previous step.
</note>

After filling in the required parameters, click **Show Optional Parameters** and add `audit` as the log type parameter.
![GCP Dataflow Optional Parameters](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-dataflow-optional-parameters.png)

When you are all set, click **Run Job** and wait for Dataflow to execute the template, which takes a few minutes.
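Equivalently, the same job can be launched from the `gcloud` CLI. A sketch under assumptions: the region, `PROJECT_ID`, error topic name, and environment variables are placeholders, and the parameter names (`inputSubscription`, `connectionUrl`, `apiKey`, `errorOutputTopic`, `dataset`) follow Google's published reference for the Pub/Sub to Elasticsearch template — verify them against the template's current documentation.

```shell
# Launch the Pub/Sub to Elasticsearch flex template
# (region, PROJECT_ID, and topic names are placeholders).
gcloud dataflow flex-template run auditlogs-stream \
  --region=us-central1 \
  --template-file-gcs-location=gs://dataflow-templates/latest/flex/PubSub_to_Elasticsearch \
  --parameters inputSubscription=projects/PROJECT_ID/subscriptions/monitor-gcp-audit-sub \
  --parameters connectionUrl="$CLOUD_ID_OR_ES_URL" \
  --parameters apiKey="$BASE64_API_KEY" \
  --parameters errorOutputTopic=projects/PROJECT_ID/topics/monitor-gcp-audit-errors \
  --parameters dataset=audit
```

Scripting the launch this way is useful when you want to recreate the pipeline reproducibly, for example from CI or an infrastructure-as-code wrapper.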
Finally, navigate to Kibana to see your logs parsed and visualized in the **[Logs GCP] Audit** dashboard.
![GCP audit overview dashboard](https://www.elastic.co/elastic/docs-builder/docs/3016/solutions/images/observability-monitor-gcp-dataflow-audit-dashboard.png)

Besides collecting audit logs from your Google Cloud Platform, you can also use Dataflow integrations to ingest data directly into Elastic from [Google BigQuery](https://www.elastic.co/blog/ingest-data-directly-from-google-bigquery-into-elastic-using-google-dataflow) and [Google Cloud Storage](https://www.elastic.co/blog/ingest-data-directly-from-google-cloud-storage-into-elastic-using-google-dataflow).