
Google Cloud Platform Integration

<div class="condensed-table">
| | |
| --- | --- |
| Version | 2.41.0 |
| Compatible Kibana version(s) | 8.13.0 or higher; 9.0.0 or higher |
| Supported Serverless project types | Security, Observability |
| Subscription level | Basic |
| Level of support | Elastic |

</div>
The Google Cloud integration collects and parses Google Cloud Audit Logs, VPC Flow Logs, Firewall Rules Logs, and Cloud DNS Logs that have been exported from Cloud Logging to a Google Pub/Sub topic sink, and collects Google Cloud metrics and metadata from Google Cloud Monitoring.

To use this Google Cloud Platform (GCP) integration, you need to set up a Service Account with a Role and a Service Account Key to access data on your GCP project.

First, you need to create a Service Account. A Service Account (SA) is a particular type of Google account intended to represent a non-human user that needs to access GCP resources.

The Elastic Agent uses the SA to access data on Google Cloud Platform using the Google APIs.

If you haven’t already, this might be a good moment to check out the best practices for securing service accounts guide.

You need to grant your Service Account (SA) access to Google Cloud Platform resources by assigning a role to the account. In order to assign minimal privileges, create a custom role that has only the privileges required by Agent. Those privileges are:

  • compute.instances.list (required for GCP Compute instance metadata collection) **

  • monitoring.metricDescriptors.list

  • monitoring.timeSeries.list

  • pubsub.subscriptions.consume

  • pubsub.subscriptions.create *

  • pubsub.subscriptions.get

  • pubsub.topics.attachSubscription *

\* Only required if Agent is expected to create a new subscription. If you create the subscriptions yourself, you may omit these privileges.

\*\* Only required if the corresponding collection is enabled.

After you have created the custom role, assign the role to your service account.
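
If you prefer the command line, here is a sketch of the same setup using the gcloud CLI. The project ID, role ID, and service account name below are placeholders for illustration; substitute your own:

```bash
# Create a custom role with the minimal permissions listed above.
gcloud iam roles create elasticAgentMinimal \
  --project=my-gcp-project \
  --title="Elastic Agent Minimal" \
  --permissions=compute.instances.list,monitoring.metricDescriptors.list,monitoring.timeSeries.list,pubsub.subscriptions.consume,pubsub.subscriptions.create,pubsub.subscriptions.get,pubsub.topics.attachSubscription

# Assign the custom role to the service account.
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:elastic-agent@my-gcp-project.iam.gserviceaccount.com" \
  --role="projects/my-gcp-project/roles/elasticAgentMinimal"
```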

Now, with your brand new Service Account (SA) with access to Google Cloud Platform (GCP) resources, you need some credentials to associate with it: a Service Account Key.

From the list of SA:

  1. Click the one you just created to open the detailed view.
  2. From the Keys section, click "Add key" > "Create new key" and select JSON as the type.
  3. Download and store the generated private key securely (remember, the private key can’t be recovered from GCP if lost).
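
Alternatively, you can create and download the key with the gcloud CLI (service account name hypothetical):

```bash
# Generate a JSON key for the service account and save it locally.
# Keep this file safe: GCP cannot recover the private key if lost.
gcloud iam service-accounts keys create credentials.json \
  --iam-account=elastic-agent@my-gcp-project.iam.gserviceaccount.com
```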

The next step is to configure the general integration settings used for all logs from the supported services (Audit, DNS, Firewall, and VPC Flow).

The "Project Id" and either the "Credentials File" or "Credentials JSON" will need to be provided in the integration UI when adding the Google Cloud Platform integration.

The Project Id is the Google Cloud project ID where your resources exist.

Based on your preference, specify the information in either the Credentials File OR the Credentials JSON field.

Save the JSON file with the private key in a secure location of the file system, and make sure that the Elastic Agent has at least read-only privileges to this file.

Specify the file path in the Elastic Agent integration UI in the "Credentials File" field. For example: /home/ubuntu/credentials.json.
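
For example, on a Linux host you might lock the key file down so that only the user running the Elastic Agent can read it. The path and owner below are assumptions; adjust them to your deployment:

```bash
# Elastic Agent typically runs as root when installed as a service;
# restrict the key file so no other user can read it.
sudo chown root:root /home/ubuntu/credentials.json
sudo chmod 400 /home/ubuntu/credentials.json
```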

Specify the content of the JSON file you downloaded from Google Cloud Platform directly in the Credentials JSON field in the Elastic Agent integration.

Elastic recommends using the Credentials File, since with this method the credential information doesn't leave your Google Cloud Platform environment. When using Credentials JSON, the integration stores the credentials in Elasticsearch, and access is controlled based on policy permissions or access to the underlying Elasticsearch data.

With a properly configured Service Account and the integration settings in place, it’s time to start collecting some logs.

You need to create a few dedicated Google Cloud resources before starting:

  • Log Sink
  • Pub/Sub Topic
  • Subscription

Elastic recommends separate Pub/Sub topics for each of the log types so that they can be parsed and stored in a specific data stream.

Here’s an example of collecting Audit Logs using a Pub/Sub topic, a subscription, and a Log Router. We will create the resources in the Google Cloud Console and then configure the Google Cloud Platform integration.

At a high level, the steps required are:

  • Visit "Logging" > "Log Router" > "Create Sink" and provide a sink name and description.
  • In "Sink destination", select "Cloud Pub/Sub topic" as the sink service. Select an existing topic or "Create a topic". Note the topic name, as it will be provided in the Topic field in the Elastic agent configuration.
  • If you created a new topic, you must remember to go to that topic and create a subscription for it. A subscription directs messages on a topic to subscribers. Note the "Subscription ID", as it will need to be entered in the "Subscription name" field in the integration settings.
  • Under "Choose logs to include in sink", for example add logName:"cloudaudit.googleapis.com" in the "Inclusion filter" to include all audit logs.

This is just an example; you will need to create your filter expression to select the log types you want to export to the Pub/Sub topic.
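
If you prefer to script the setup, the same resources can be created with the gcloud CLI. The sketch below uses hypothetical names and the audit-log inclusion filter from the example above:

```bash
# Create the Pub/Sub topic and a subscription for it.
gcloud pubsub topics create elastic-audit-logs
gcloud pubsub subscriptions create elastic-audit-logs-sub \
  --topic=elastic-audit-logs

# Create a Log Router sink that exports matching entries to the topic.
gcloud logging sinks create elastic-audit-sink \
  pubsub.googleapis.com/projects/my-gcp-project/topics/elastic-audit-logs \
  --log-filter='logName:"cloudaudit.googleapis.com"'

# The sink writes with its own service account; look up its writer
# identity and grant it permission to publish to the topic.
gcloud logging sinks describe elastic-audit-sink --format='value(writerIdentity)'
gcloud pubsub topics add-iam-policy-binding elastic-audit-logs \
  --member="<WRITER_IDENTITY_FROM_ABOVE>" \
  --role="roles/pubsub.publisher"
```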

More example filters for different log types:

#
# VPC Flow: logs for specific subnet
#
resource.type="gce_subnetwork" AND
log_id("compute.googleapis.com/vpc_flows") AND
resource.labels.subnetwork_name="[SUBNET_NAME]"
#
# Audit: Google Compute Engine firewall rule deletion
#
resource.type="gce_firewall_rule" AND
log_id("cloudaudit.googleapis.com/activity") AND
protoPayload.methodName:"firewalls.delete"
#
# DNS: all DNS queries
#
resource.type="dns_query"
#
# Firewall: logs for a given country
#
resource.type="gce_subnetwork" AND
log_id("compute.googleapis.com/firewall") AND
jsonPayload.remote_location.country=[COUNTRY_ISO_ALPHA_3]

Start working on your query using the Google Cloud Logs Explorer, so you can preview and pinpoint the exact log types you want to forward to your Elastic Stack.

To learn more, please read how to Build queries in the Logs Explorer, and take a look at the Sample queries using the Logs Explorer page in the Google Cloud docs.
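
You can also preview a candidate filter from the command line; for example, to sample the DNS filter shown above:

```bash
# Fetch the five most recent log entries matching the filter.
gcloud logging read 'resource.type="dns_query"' --limit=5 --format=json
```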

Visit "Management" > "Integrations" > "Installed Integrations" > "Google Cloud Platform" and select the "Integration Policies" tab. Select the integration policy you previously created.

From the list of services, select "Google Cloud Platform (GCP) audit logs (gcp-pubsub)" and:

  • On the "Topic" field, specify the "topic name" you noted before on the Google Cloud Console.
  • On the "Subscription Name", specify the short subscription name you noted before on the Google Cloud Console (note: do NOT use the full subscription name in the form projects/PROJECT_ID/subscriptions/SUBSCRIPTION_ID; use just the Subscription ID from the Google Cloud Console).
  • Click on "Save Integration", and make sure the Elastic Agent gets the updated policy.

If you don’t see Audit logs showing up, check the Agent logs to see if there are errors.

Common error types:

  • Missing roles in the Service Account
  • Misconfigured settings, like "Project Id", "Topic" or "Subscription Name" fields

If your Service Account (SA) does not have the required roles, you might find errors like this one in the elastic_agent.filebeat dataset:

failed to subscribe to pub/sub topic: failed to check if subscription exists: rpc error: code = PermissionDenied desc = User not authorized to perform this action.

Solution: make sure your SA has all the required roles.

If you specify the wrong "Topic" or "Subscription Name", you might find errors like this one in the elastic_agent.filebeat dataset:

[elastic_agent.filebeat][error] failed to subscribe to pub/sub topic: failed to check if subscription exists: rpc error: code = InvalidArgument desc = Invalid resource name given (name=projects/project/subscriptions/projects/project/subscriptions/non-existent-sub). Refer to https://cloud.google.com/pubsub/docs/admin#resource_names for more information.

Solution: double check the integration settings.
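
One quick way to double-check is to look the subscription up with the gcloud CLI (names hypothetical):

```bash
# Confirm the subscription exists; the last path segment of its full
# resource name is the short Subscription ID to enter in the
# "Subscription Name" field.
gcloud pubsub subscriptions describe elastic-audit-logs-sub \
  --project=my-gcp-project

# List all subscriptions attached to the topic.
gcloud pubsub topics list-subscriptions elastic-audit-logs
```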

With a properly configured Service Account and the integration settings in place, it’s time to start collecting some metrics.

No additional requirements are needed to collect metrics.

If you don’t see metrics showing up, check the Agent logs to see if there are errors.

Common error types:

  • Period is lower than 60 seconds
  • Missing roles in the Service Account
  • Misconfigured settings, like "Project Id"

The usual minimum collection period for GCP metrics is 60 seconds. Any value lower than that causes an error when retrieving the metric metadata. When this error happens, the affected metric is skipped at the metric collection stage, resulting in no data being sent.

If your Service Account (SA) does not have the required roles, you might find errors related to accessing GCP resources.

To check, you can add the Monitoring Viewer and Compute Viewer roles (built-in GCP roles) to your SA. These roles contain the permissions added in the previous step and expand them with additional permissions. You can then analyze any missing permissions from the GCP Console > IAM > click the down arrow near the roles on the same line as your SA > View analyzed permissions. The table shown lists which permissions from the role the SA is actively using; they should match what you configured in your custom role.
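
For example, to grant the two built-in roles and then list every role bound to the SA (project and service account names hypothetical):

```bash
# Add the built-in viewer roles to the service account.
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:elastic-agent@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/monitoring.viewer"
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:elastic-agent@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/compute.viewer"

# List the roles currently bound to the service account.
gcloud projects get-iam-policy my-gcp-project \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:elastic-agent@my-gcp-project.iam.gserviceaccount.com" \
  --format="table(bindings.role)"
```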

If you specify a wrong setting, you will probably find errors related to missing GCP resources.

Make sure the settings are correct and the SA has proper permissions for the given "Project Id".

The audit dataset collects audit logs of administrative activities and accesses within your Google Cloud resources.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The firewall dataset collects logs from Firewall Rules in your Virtual Private Cloud (VPC) networks.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The vpcflow dataset collects logs sent from and received by VM instances, including instances used as GKE nodes.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The dns dataset collects queries that name servers resolve for your Virtual Private Cloud (VPC) networks, as well as queries from an external entity directly to a public zone.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The loadbalancing_logs dataset collects logs of the requests sent to and handled by GCP Load Balancers.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The billing dataset collects GCP Billing information from the Google Cloud BigQuery daily cost detail table.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The compute dataset is designed to fetch metrics for Compute Engine Virtual Machines in Google Cloud Platform.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The dataproc dataset is designed to fetch metrics from Dataproc in Google Cloud Platform.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The firestore dataset fetches metrics from Firestore in Google Cloud Platform.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The gke dataset is designed to fetch metrics from GKE in Google Cloud Platform.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The loadbalancing_metrics dataset is designed to fetch HTTPS, HTTP, and Layer 3 metrics from Load Balancing in Google Cloud Platform.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The redis dataset is designed to fetch metrics from GCP Memorystore for Redis in Google Cloud Platform.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.

The storage dataset fetches metrics from Storage in Google Cloud Platform.

ECS Field Reference

Please refer to the following document for detailed information on ECS fields.