
terraform-google-log-export's Introduction

Terraform Log Export Module

This module allows you to create log exports at the project, folder, organization, or billing account level. Submodules are also available to configure the destination resource that will store all exported logs. The resources/services/activations/deletions that this module will create/trigger are:

  • An Aggregated log export on the project-level, folder-level, organization-level, or billing-account-level
  • A Service account (logsink writer)
  • A Destination (Cloud Storage bucket, Cloud Pub/Sub topic, BigQuery dataset)

Compatibility

This module is meant for use with Terraform 0.13+ and tested using Terraform 1.0+. If you find incompatibilities using Terraform >=0.13, please open an issue. If you haven't upgraded and need a Terraform 0.12.x-compatible version of this module, the last released version intended for Terraform 0.12.x is v5.1.0.

Usage

The examples directory contains directories for each destination, and within each destination directory are directories for each parent resource level. Consider the following example that will configure a Cloud Storage destination and a log export at the project level:

module "log_export" {
  source                 = "terraform-google-modules/log-export/google"
  version                = "~> 7.0"
  destination_uri        = module.destination.destination_uri
  filter                 = "severity >= ERROR"
  log_sink_name          = "storage_example_logsink"
  parent_resource_id     = "sample-project"
  parent_resource_type   = "project"
  unique_writer_identity = true
}

module "destination" {
  source                   = "terraform-google-modules/log-export/google//modules/storage"
  version                  = "~> 7.0"
  project_id               = "sample-project"
  storage_bucket_name      = "storage_example_bucket"
  log_sink_writer_identity = module.log_export.writer_identity
}

At first glance this example looks like a circular dependency, since each module declaration uses an output from the other. However, Terraform builds its dependency graph at the resource level rather than the module level, so it can order the individual resources such that every dependency is met.

Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|----------|
| bigquery_options | (Optional) Options that affect sinks exporting data to BigQuery. use_partitioned_tables - (Required) Whether to use BigQuery's partitioned tables. | object({ use_partitioned_tables = bool }) | null | no |
| description | A description of this sink. The maximum length of the description is 8000 characters. | string | null | no |
| destination_uri | The self_link URI of the destination resource (available as an output from one of the destination submodules). | string | n/a | yes |
| disabled | (Optional) If set to true, the sink is disabled and does not export any log entries. | bool | false | no |
| exclusions | (Optional) A list of sink exclusion filters. | list(object({ name = string, description = string, filter = string, disabled = bool })) | [] | no |
| filter | The filter to apply when exporting logs. Only log entries that match the filter are exported. The default is "", which exports all logs. | string | "" | no |
| include_children | Only valid if 'organization' or 'folder' is chosen as var.parent_resource_type. Determines whether to include child organizations/folders in the sink export. If true, logs associated with child projects are also exported; otherwise only logs relating to the provided organization/folder are included. | bool | false | no |
| log_sink_name | The name of the log sink to be created. | string | n/a | yes |
| parent_resource_id | The ID of the GCP resource in which you create the log sink. If var.parent_resource_type is set to 'project', this is the project ID (and likewise for the other types). | string | n/a | yes |
| parent_resource_type | The GCP resource type in which you create the log sink. The value must not be computed, and must be one of 'project', 'folder', 'billing_account', or 'organization'. | string | "project" | no |
| unique_writer_identity | Whether to create a unique identity associated with this sink. If false (the default), the writer_identity used is serviceAccount:cloud-logs@system.gserviceaccount.com. If true, a unique service account is created and used for the logging sink. | bool | false | no |

Outputs

| Name | Description |
|------|-------------|
| filter | The filter to be applied when exporting logs. |
| log_sink_resource_id | The resource ID of the log sink that was created. |
| log_sink_resource_name | The resource name of the log sink that was created. |
| parent_resource_id | The ID of the GCP resource in which the log sink was created. |
| writer_identity | The service account that Logging uses to write log entries to the destination. |

Requirements

Terraform plugins

Configure a Service Account

In order to execute this module you must have a Service Account with the following:

Roles

The service account should have the following roles:

  • roles/logging.configWriter on the logsink's project, folder, or organization (to create the logsink)
  • roles/resourcemanager.projectIamAdmin on the destination project (to grant write permissions for the logsink service account)
  • roles/serviceusage.serviceUsageAdmin on the destination project (to enable destination APIs)

Pub/Sub roles

To use a Google Cloud Pub/Sub topic as the destination:

  • roles/pubsub.admin on the destination project (to create a pub/sub topic)

To integrate the logsink with Splunk, you'll need a topic subscriber (service account):

  • roles/iam.serviceAccountAdmin on the destination project (to create a service account for the logsink subscriber)

Storage role

To use a Google Cloud Storage bucket as the destination:

  • roles/storage.admin on the destination project (to create a storage bucket)

BigQuery role

To use a BigQuery dataset as the destination, you must grant:

  • roles/bigquery.dataEditor on the destination project (to create a BigQuery dataset)

BigQuery Options

To use the BigQuery use_partitioned_tables argument, you must also set unique_writer_identity to true.

Usage in module:

bigquery_options = {
  use_partitioned_tables = true
}

Enabling this option stores logs in a single table that is internally partitioned by day, which can improve query performance.

Enable APIs

In order to operate with the Service Account you must activate the following APIs on the base project where the Service Account was created:

  • Cloud Resource Manager API - cloudresourcemanager.googleapis.com
  • Cloud Billing API - cloudbilling.googleapis.com
  • Identity and Access Management API - iam.googleapis.com
  • Service Usage API - serviceusage.googleapis.com
  • Stackdriver Logging API - logging.googleapis.com
  • Cloud Storage JSON API - storage-api.googleapis.com
  • BigQuery API - bigquery.googleapis.com
  • Cloud Pub/Sub API - pubsub.googleapis.com

Install

Terraform

Be sure you have the correct Terraform version (0.13+); you can download the appropriate binary from the official Terraform releases page.


terraform-google-log-export's Issues

The storage submodule should allow public access prevention

TL;DR

Add the option for public_access_prevention in modules/storage/main.tf.

Terraform Resources

https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/storage_bucket#public_access_prevention

Detailed design

CIS 5.1 requirement - Ensure that Cloud Storage bucket is not anonymously or publicly accessible.
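
A minimal sketch of what the change could look like (the variable name mirrors the provider attribute; the surrounding resource is abbreviated and is not the module's actual source):

variable "public_access_prevention" {
  description = "Prevents public access to the bucket. One of \"inherited\" or \"enforced\"."
  type        = string
  default     = "inherited"
}

resource "google_storage_bucket" "storage" {
  # ... existing arguments unchanged ...
  public_access_prevention = var.public_access_prevention
}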

Additional information

No response

Add Support for Specifying Pub/Sub Subscriber Service Account ID - Pub/Sub Module

Feature Request

Specify the name of the Subscriber Service Account ID via an optional variable.

Existing Naming Convention

Currently, the module bases the Subscriber Service Account ID purely on the name output of the google_pubsub_topic resource.

Recommended Solution

  • New variable definition (subscriber_service_account_id) in variables.tf, with default value of "" (empty string), to set the Subscriber Service Account ID.
  • New local variable in main.tf to be used in google_service_account.pubsub_subscriber as the account_id, conditionally using a non-empty subscriber_service_account_id value or the existing topic-based name (see the sketch after this list).
  • Re-generate the module's README.md based on the new input variables.
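
A sketch of the proposed conditional; the fallback name and variable wiring are assumptions, not the module's actual code:

variable "subscriber_service_account_id" {
  description = "The ID to use for the Pub/Sub subscriber service account. Defaults to a topic-based name when empty."
  type        = string
  default     = ""
}

locals {
  # Prefer the caller-supplied ID; otherwise fall back to the topic-based name.
  subscriber_id = var.subscriber_service_account_id != "" ? var.subscriber_service_account_id : "${local.topic_name}-subscriber"
}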

Not compatible with terraform 0.15

When using the module with terraform 0.15.0, I get the following error:

│ Error: Error in function call
│ 
│   on .terraform/modules/log_export_audit_rapid7_pubsub/main.tf line 27, in locals:
│   27:   log_sink_writer_identity = local.is_project_level ? element(concat(google_logging_project_sink.sink.*.writer_identity, list("")), 0) : local.is_folder_level ? element(concat(google_logging_folder_sink.sink.*.writer_identity, list("")), 0) : local.is_org_level ? element(concat(google_logging_organization_sink.sink.*.writer_identity, list("")), 0) : local.is_billing_level ? element(concat(google_logging_billing_account_sink.sink.*.writer_identity, list("")), 0) : ""
│ 
│ Call to function "list" failed: the "list" function was deprecated in Terraform v0.12 and is no longer available; use tolist([ ... ]) syntax to write a literal list.

list is deprecated as explained here: https://www.terraform.io/upgrade-guides/0-15.html#legacy-configuration-language-features
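
The fix is mechanical: replace each list("") call with a list literal. For the project-level branch, for example (the other branches are analogous):

# Before (fails on Terraform 0.15+):
element(concat(google_logging_project_sink.sink.*.writer_identity, list("")), 0)

# After:
element(concat(google_logging_project_sink.sink.*.writer_identity, [""]), 0)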

Provider 3.x support

Hi,

The terraform-google provider has now landed at version 3.x. What can be expected from this (and probably other) modules regarding support for it?

I'm willing to submit PRs and help testing if desired.

Thanks,
Wietse

Make quickstart script

Make a quickstart script to activate needed APIs on the service account's host project.

partition_expiration_days does not apply to partitions already created

TL;DR

partition_expiration_days does not apply to partitions that were already created. After my modification, existing partitions stay on "Partitions do not expire", but when a new partition is created, it does get the partition expiration value I set.

Expected behavior

No response

Observed behavior

No response

Terraform Configuration

module "log_export_to_biqquery" {
  source                 = "terraform-google-modules/log-export/google"
  version                = "~> 7.6.0"
  destination_uri        = "bigquery.googleapis.com/projects/${var.project_id}/datasets/logs_${var.environment}"
  filter                 = local.bq_logs_filter
  log_sink_name          = "sk-ccoe-logging-bq-${var.environment}"
  parent_resource_id     = var.organization_id
  parent_resource_type   = "organization"
  include_children       = true
  unique_writer_identity = true
  bigquery_options = {
    use_partitioned_tables = true
  }
}

module "bigquery_destination" {
  source                     = "terraform-google-modules/log-export/google//modules/bigquery"
  version                    = "~> 7.6.0"
  project_id                 = var.project_id
  dataset_name               = "logs_${var.environment}"
  location                   = var.default_location
  log_sink_writer_identity   = module.log_export_to_biqquery.writer_identity
  expiration_days            = var.audit_logs_table_expiration_days
  partition_expiration_days  = var.partition_expiration_days
  delete_contents_on_destroy = var.audit_logs_table_delete_contents_on_destroy
}

Terraform Version

Terraform v1.6.3
on linux_amd64
+ provider registry.terraform.io/hashicorp/google v4.84.0

Additional information

No response

create_push_subscriber description is probably misleading

TL;DR

The pubsub submodule's input parameter descriptions state that a service account with IAM permissions will be created when create_push_subscriber = true. Looking at the code, it appears the account is only created for pull subscriptions (when var.create_subscriber = true), not for push subscriptions.

Expected behavior

The parameter description reflects what is actually happening.

Observed behavior

The description is not correct.

Terraform Configuration

#-----------------------------------------------#
# Pub/Sub topic subscription (for integrations) #
#-----------------------------------------------#
resource "google_service_account" "pubsub_subscriber" {
  count        = var.create_subscriber ? 1 : 0
  account_id   = local.subscriber_id
  display_name = "${local.topic_name} Topic Subscriber"
  project      = var.project_id
}

resource "google_pubsub_subscription_iam_member" "pubsub_subscriber_role" {
  count        = var.create_subscriber ? 1 : 0
  role         = "roles/pubsub.subscriber"
  project      = var.project_id
  subscription = local.pubsub_subscription
  member       = "serviceAccount:${google_service_account.pubsub_subscriber[0].email}"
}

resource "google_pubsub_topic_iam_member" "pubsub_viewer_role" {
  count   = var.create_subscriber ? 1 : 0
  role    = "roles/pubsub.viewer"
  project = var.project_id
  topic   = local.topic_name
  member  = "serviceAccount:${google_service_account.pubsub_subscriber[0].email}"
}

resource "google_pubsub_subscription" "pubsub_subscription" {
  count   = var.create_subscriber ? 1 : 0
  name    = "${local.topic_name}-subscription"
  project = var.project_id
  topic   = local.topic_name
  labels  = var.subscription_labels
}

resource "google_pubsub_subscription" "pubsub_push_subscription" {
  count   = var.create_push_subscriber ? 1 : 0
  name    = "${local.topic_name}-push-subscription"
  project = var.project_id
  topic   = local.topic_name

  push_config {
    push_endpoint = var.push_endpoint
  }
}

Terraform Version

master version

Additional information

No response

log-export module fails to create service accounts and export writer_identity

TL;DR

When replicating a basic example of creating an export to a logging bucket, the base log-export module outputs an empty writer_identity string that cannot be used as an input to the logbucket submodule. The root cause appears to be that the log-export module is not creating service accounts even though unique_writer_identity is true.

Expected behavior

The parent module should create a service account and output a valid writer_identity string that can be used as an input to the logbucket submodule.

Observed behavior

output "log_export_1_writer_identity" {
  value = module.log_export_1.writer_identity
}

results in

log_export_1_writer_identity = ""

Terraform Configuration

module "log_export_1" {
  source  = "terraform-google-modules/log-export/google"
  version = "7.4.2"

  destination_uri        = module.destination_1.destination_uri
  filter                 = "severity >= ERROR"
  log_sink_name          = "log_bucket_1"
  parent_resource_id     = var.project_id
  parent_resource_type   = "project"
  unique_writer_identity = true
}

module "destination_1" {
  source  = "terraform-google-modules/log-export/google//modules/logbucket"
  version = "7.4.2"

  project_id               = var.project_id
  name                     = "log_bucket_1"
  log_sink_writer_identity = module.log_export_1.writer_identity
}

Terraform Version

Terraform v1.3.5
on darwin_arm64
+ provider registry.terraform.io/hashicorp/google v4.44.0
+ provider registry.terraform.io/hashicorp/google-beta v4.44.1

Additional information

I am unable to test with unique_writer_identity = false because I get the error:

Error 400: Advanced sink options require using per sink service accounts. Use uniqueWriterIdentity=true to create a unique service account for this sink, badRequest

Under which circumstances can unique_writer_identity be false?

Add labels to pubsub_push_subscription

TL;DR

Hello everybody...
We can define topic_labels and subscription_labels, but we don't know why we can't define labels for pubsub_push_subscription as well.
It would be very useful to have support for pubsub_push_subscription labels; they may be required for IaC policy or compliance reasons.

Terraform Resources

https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/pubsub_subscription

Detailed design

No response

Additional information

Resource:
https://github.com/terraform-google-modules/terraform-google-log-export/blob/master/modules/pubsub/main.tf#L101-L110

Similar implementation:
https://github.com/terraform-google-modules/terraform-google-log-export/blob/master/modules/pubsub/main.tf#L98
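
A minimal sketch of the requested change; push_subscription_labels is a hypothetical new variable mirroring the existing subscription_labels:

resource "google_pubsub_subscription" "pubsub_push_subscription" {
  count   = var.create_push_subscriber ? 1 : 0
  name    = "${local.topic_name}-push-subscription"
  project = var.project_id
  topic   = local.topic_name
  labels  = var.push_subscription_labels # hypothetical variable, not yet in the module

  push_config {
    push_endpoint = var.push_endpoint
  }
}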

Add labels to pubsub subscription

You can set topic_labels, but it would be nice to support subscription_labels as well, since they are supported in the UI, the API, and even in the Terraform resource.

[module storage] Flexible storage lifecycle rule

As of now, the storage module lifecycle rule does not support SetStorageClass as an action type.

Accepting a list of lifecycle rules could allow users to customize actions on their logs (a sketch that consumes such a list follows the proposal below).

  • Current implementation
  dynamic "lifecycle_rule" {
    for_each = var.expiration_days == null ? [] : [var.expiration_days]
    content {
      action {
        type = "Delete"
      }
      condition {
        age        = var.expiration_days
        with_state = "ANY"
      }
    }
  }
  • Proposed parameter for lifecycle rules, with naming that references the google_storage_bucket resource
lifecycle_rules = [{
  action_type          = "SetStorageClass"
  action_storage_class = "NEARLINE"
  condition_age        = 90
  condition_with_state = "ANY"
}, {
  action_type          = "SetStorageClass"
  action_storage_class = "COLDLINE"
  condition_age        = 180
  condition_with_state = "ANY"
}, {
  action_type          = "Delete"
  condition_age        = 365
  condition_with_state = "ANY"
}]
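
One way the module could consume such a list, sketched as a dynamic block (names follow the proposal above; this is illustrative, not the module's implementation):

dynamic "lifecycle_rule" {
  for_each = var.lifecycle_rules
  content {
    action {
      type          = lifecycle_rule.value.action_type
      # storage_class only applies to SetStorageClass actions.
      storage_class = try(lifecycle_rule.value.action_storage_class, null)
    }
    condition {
      age        = lifecycle_rule.value.condition_age
      with_state = lifecycle_rule.value.condition_with_state
    }
  }
}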

Error converting from days to ms

I am using the submodule "terraform-google-modules/log-export/google//modules/bigquery", and when I supply 90 for the expiration_days variable, Terraform throws the following error:

Error: Attribute must be a whole number, got 7.776e+09

│ with module.log_export.module.bq_destination.google_bigquery_dataset.dataset,
│ on .terraform/modules/log_export.bq_destination/modules/bigquery/main.tf line 47, in resource "google_bigquery_dataset" "dataset":
│ 47: default_table_expiration_ms = var.expiration_days == null ? null : var.expiration_days * 8.64 * pow(10, 7)

This error is caused by the way the value is converted from days to ms.
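
In Terraform, 8.64 * pow(10, 7) goes through floating-point math, which can produce a value that fails the provider's whole-number check even when it is numerically whole. One possible fix (an assumption, not necessarily the fix the maintainers chose) is to keep the arithmetic integral:

default_table_expiration_ms = var.expiration_days == null ? null : var.expiration_days * 86400000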

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Pending Status Checks

These updates await pending status checks. To force their creation now, click the checkbox below.

  • chore(deps): Update Terraform terraform-google-modules/project-factory/google to v15

Open

These updates have all been created already. Click a checkbox below to force a retry/rebase of any.

Detected dependencies

gomod
test/integration/go.mod
  • go 1.21
  • go 1.21.9
  • github.com/GoogleCloudPlatform/cloud-foundation-toolkit/infra/blueprint-test v0.14.0
  • github.com/stretchr/testify v1.9.0
npm
modules/bq-log-alerting/logging/cloud_function/package.json
  • @google-cloud/bigquery ^7.0.0
  • @google-cloud/security-center >=3.0.1
  • crypto-js ^4.2.0
regex
Makefile
  • cft/developer-tools 1.20
build/int.cloudbuild.yaml
  • cft/developer-tools 1.20
build/lint.cloudbuild.yaml
  • cft/developer-tools 1.20
terraform
examples/bigquery/billing_account/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/bigquery/billing_account/versions.tf
  • hashicorp/terraform >= 0.13
examples/bigquery/folder/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/bigquery/folder/versions.tf
  • hashicorp/terraform >= 0.13
examples/bigquery/organization/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/bigquery/organization/versions.tf
  • hashicorp/terraform >= 0.13
examples/bigquery/project/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/bigquery/project/versions.tf
  • hashicorp/terraform >= 0.13
examples/bq-log-alerting/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
examples/bq-log-alerting/versions.tf
  • hashicorp/terraform >= 0.13
examples/datadog-sink/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/datadog-sink/versions.tf
  • hashicorp/terraform >= 0.13
examples/logbucket/folder/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/logbucket/folder/versions.tf
  • hashicorp/terraform >= 0.13
examples/logbucket/organization/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/logbucket/organization/versions.tf
  • hashicorp/terraform >= 0.13
examples/logbucket/project/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/logbucket/project/providers.tf
examples/logbucket/project/versions.tf
  • hashicorp/terraform >= 0.13
examples/project/project/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/project/project/versions.tf
  • hashicorp/terraform >= 0.13
examples/pubsub/billing_account/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/pubsub/billing_account/versions.tf
  • hashicorp/terraform >= 0.13
examples/pubsub/folder/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/pubsub/folder/versions.tf
  • hashicorp/terraform >= 0.13
examples/pubsub/organization/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/pubsub/organization/versions.tf
  • hashicorp/terraform >= 0.13
examples/pubsub/project/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/pubsub/project/versions.tf
  • hashicorp/terraform >= 0.13
examples/splunk-sink/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/splunk-sink/versions.tf
  • hashicorp/terraform >= 0.13
examples/storage/billing_account/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/storage/billing_account/versions.tf
  • hashicorp/terraform >= 0.13
examples/storage/folder/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/storage/folder/versions.tf
  • hashicorp/terraform >= 0.13
examples/storage/organization/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/storage/organization/versions.tf
  • hashicorp/terraform >= 0.13
examples/storage/project/main.tf
  • terraform-google-modules/log-export/google ~> 8.0
  • terraform-google-modules/log-export/google ~> 8.0
examples/storage/project/versions.tf
  • hashicorp/terraform >= 0.13
modules/bigquery/versions.tf
  • google >= 3.53, < 6
  • hashicorp/terraform >= 0.13
modules/bq-log-alerting/main.tf
  • terraform-google-modules/scheduled-function/google ~> 4.0
modules/bq-log-alerting/versions.tf
  • google >= 3.53, < 6
  • random >= 2.1
  • hashicorp/terraform >= 0.13
modules/logbucket/versions.tf
  • google >= 4.59, < 6
  • hashicorp/terraform >= 0.13
modules/project/versions.tf
  • google >= 3.53, < 6
  • hashicorp/terraform >= 0.13
modules/pubsub/versions.tf
  • google >= 3.53, < 6
  • hashicorp/terraform >= 0.13
modules/storage/versions.tf
  • google >= 4.42, < 6
  • hashicorp/terraform >= 0.13
test/fixtures/bigquery/folder/main.tf
test/fixtures/bigquery/organization/main.tf
test/fixtures/bigquery/organization/versions.tf
  • hashicorp/terraform >=0.12.6
test/fixtures/bigquery/project/main.tf
test/fixtures/bigquery/project/versions.tf
  • hashicorp/terraform >=0.12.6
test/fixtures/bq-log-alerting/main.tf
test/fixtures/bq-log-alerting/versions.tf
  • hashicorp/terraform >=0.13.0
test/fixtures/computed_values/main.tf
test/fixtures/computed_values/versions.tf
  • hashicorp/terraform >=0.12.6
test/fixtures/pubsub/folder/main.tf
test/fixtures/pubsub/folder/versions.tf
  • hashicorp/terraform >=0.12.6
test/fixtures/pubsub/organization/main.tf
test/fixtures/pubsub/organization/versions.tf
  • hashicorp/terraform >=0.12.6
test/fixtures/pubsub/project/main.tf
test/fixtures/pubsub/project/versions.tf
  • hashicorp/terraform >=0.12.6
test/fixtures/storage/folder/main.tf
test/fixtures/storage/folder/versions.tf
  • hashicorp/terraform >=0.12.6
test/fixtures/storage/organization/main.tf
test/fixtures/storage/organization/versions.tf
  • hashicorp/terraform >=0.12.6
test/fixtures/storage/project/main.tf
test/setup/main.tf
  • terraform-google-modules/project-factory/google ~> 14.0
  • terraform-google-modules/project-factory/google ~> 14.0
test/setup/versions.tf
  • google >= 3.53.0, < 6
  • google-beta >= 3.53.0, < 6
  • hashicorp/terraform >= 0.13
versions.tf
  • google >= 3.53, < 6
  • hashicorp/terraform >= 0.13

  • Check this box to trigger a request for Renovate to run again on this repository

Main log export defaults

While reviewing the main log export module, I noticed a few improvements could also be made to its defaults.

  • unique_writer_identity should be true.
    https://cloud.google.com/logging/docs/api/tasks/exporting-logs
    When exporting logs, Logging adopts this identity for authorization. For increased security, new sinks get their own unique service account:

  • include_children should be true, since most audit users probably want resources within folders and orgs to be captured by default.

We can probably wait till v6 (whenever that is) to make the improvements.

How to use two depending on each other modules in `for_each` loop?

Hi,
is it possible to use two modules that depend on each other in a for_each loop? Example below:

locals {
  project_id = "xxx"
  sinks = {
    logs_resource_usage = {
      filter  = "resource.type=\"k8s_container\" \nlabels.log_name=\"resource-usage\""
      dataset = "logs_resource_usage_eu"
    }
  }
}


module "log_export_bigquery" {
  source  = "terraform-google-modules/log-export/google"
  version = "7.4.3"

  for_each = local.sinks

  destination_uri        = module.destination[each.key].destination_uri
  filter                 = each.value.filter
  log_sink_name          = each.key
  parent_resource_id     = local.project_id
  parent_resource_type   = "project"
  unique_writer_identity = true
}

module "destination" {
  source  = "terraform-google-modules/log-export/google//modules/bigquery"
  version = "7.4.3"

  for_each = local.sinks

  project_id               = local.project_id
  dataset_name             = each.value.dataset
  log_sink_writer_identity = module.log_export_bigquery[each.key].writer_identity
}

But unfortunately it gives an error:

Error: Cycle: module.log_export_bigquery (close), module.log_export_bigquery.local.log_sink_parent_id (expand), module.log_export_bigquery.output.parent_resource_id (expand), module.log_export_bigquery.local.log_sink_resource_name (expand), module.log_export_bigquery.output.log_sink_resource_name (expand), module.log_export_bigquery.google_logging_folder_sink.sink, module.log_export_bigquery.google_logging_project_sink.sink, module.log_export_bigquery.google_logging_organization_sink.sink, module.log_export_bigquery.local.log_sink_resource_id (expand), module.log_export_bigquery.output.log_sink_resource_id (expand), module.log_export_bigquery.output.writer_identity (expand), module.destination.var.log_sink_writer_identity (expand), module.destination.google_project_iam_member.bigquery_sink_member, module.destination (close), module.log_export_bigquery.var.destination_uri (expand), module.log_export_bigquery.google_logging_billing_account_sink.sink, module.log_export_bigquery.local.log_sink_writer_identity (expand)

Is there any way to make it work like this?
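
One workaround (a sketch, not from the thread) is to build the destination URI from values you already know instead of referencing the destination module's output, which removes one edge of the cycle:

module "log_export_bigquery" {
  source  = "terraform-google-modules/log-export/google"
  version = "7.4.3"

  for_each = local.sinks

  # Built from known values, so this module no longer depends on module.destination.
  destination_uri        = "bigquery.googleapis.com/projects/${local.project_id}/datasets/${each.value.dataset}"
  filter                 = each.value.filter
  log_sink_name          = each.key
  parent_resource_id     = local.project_id
  parent_resource_type   = "project"
  unique_writer_identity = true
}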

Any update on next release?

We want to upgrade to version 5.4.0 of the Google provider, and since this module pins the provider to < 5.0.0, we are unable to. Any update on when 5.4.0 support might be released?

logbucket submodule contains provider configuration block

TL;DR

Terraform is unable to use terraform-google-modules/log-export/google//modules/logbucket now that it contains provider blocks.

Expected behavior

Existing Terraform states applied/planned with release v7.8 should continue working after upgrading to v7.8.1.

Observed behavior

│ Error: Module is incompatible with count, for_each, and depends_on
│ 
│   on ../../modules/centralized-logging/main.tf line 105, in module "destination_logbucket":
│  105:   count = var.logbucket_options != null ? 1 : 0
│ 
│ The module at module.logs_export.module.destination_logbucket is a legacy module which contains its own local provider configurations, and so calls to it may not use the count, for_each, or depends_on
│ arguments.
│ 
│ If you also control the module "registry.terraform.io/terraform-google-modules/log-export/google//modules/logbucket", consider updating this module to instead expect provider configurations to be passed by its
│ caller.

Terraform Configuration

# Source: https://github.com/terraform-google-modules/terraform-example-foundation/blob/d4cb8783e99ce160447663047644b0ce6c4888b0/1-org/modules/centralized-logging/main.tf#L96-L111

module "destination_logbucket" {
  source  = "terraform-google-modules/log-export/google//modules/logbucket"
  version = "~> 7.7"

  count = var.logbucket_options != null ? 1 : 0

  project_id                    = var.logging_destination_project_id
  name                          = coalesce(var.logbucket_options.name, local.logging_tgt_name.lbk)
  log_sink_writer_identity      = module.log_export["${local.value_first_resource}_lbk"].writer_identity
  location                      = var.logbucket_options.location
  enable_analytics              = var.logbucket_options.enable_analytics
  linked_dataset_id             = var.logbucket_options.linked_dataset_id
  linked_dataset_description    = var.logbucket_options.linked_dataset_description
  retention_days                = var.logbucket_options.retention_days
  grant_write_permission_on_bkt = false
}

Terraform Version

Terraform v1.5.7
on linux_amd64

Additional information

I believe this pr/commit is the culprit: https://github.com/terraform-google-modules/terraform-google-log-export/pull/195/files#diff-f783bd693fdc1cfb43bef82e6ae39e95aa7b030c729aa05ac3f72b1511122e04

Refactor examples for level instead of destination types

Currently, the examples are organized as follows:

  • bigquery
  • pubsub
  • storage

Since someone who wants to set up a logsink usually thinks about project/folder/organization sinks before the destination, we should refactor the examples like so:

  • project-level
  • folder-level
  • org-level

Update storage module to include versioning

The log export storage module should support the versioning parameter so it can be set on log export buckets.

Not having object versioning enabled on a bucket causes a finding in Security Health Analytics against the CIS Benchmark for Google Cloud. This is especially important on a log export bucket, in case an actor attempts to modify or destroy the logs.
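
A minimal sketch of what the parameter could look like in the storage module (names are assumptions):

variable "versioning" {
  description = "Enable object versioning on the log export bucket."
  type        = bool
  default     = false
}

resource "google_storage_bucket" "storage" {
  # ... existing arguments unchanged ...
  versioning {
    enabled = var.versioning
  }
}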

Error when reading or editing Dataset: googleapi: Error 400: Dataset <resource-name> is still in use, resourceInUse

TL;DR

We previously applied the Terraform conditionally via a module (i.e. count = 1). Now when we run the same "setup code" and set the conditional to 0 on said module, we get a failure to delete the dataset saying it's still in use. We're not exactly sure how to proceed here. Any help would be appreciated.

EDIT: deleting through the console works as expected, but Terraform is failing.

Expected behavior

No errors to happen :)

Observed behavior

We get the below error when trying to delete:

╷
│ Error: Error when reading or editing Dataset: googleapi: Error 400: Dataset <resource-name> is still in use, resourceInUse
│
╵
╷
│ Error: Error creating Dataset: googleapi: Error 409: Already Exists: Dataset <resource-name>, duplicate
│
│   with module.integration.module.logging_framework[0].module.destination_bq.google_bigquery_dataset.dataset,
│   on .terraform/modules/integration.logging_framework.destination_bq/modules/bigquery/main.tf line 41, in resource "google_bigquery_dataset" "dataset":
│   41: resource "google_bigquery_dataset" "dataset" {
╵

Terraform Configuration

module "log_bq_sink" {
  source                 = "terraform-google-modules/log-export/google"
  destination_uri        = module.destination_bq.destination_uri
  log_sink_name          = local.bq_logsink_name
  filter                 = "resource.labels.project_id=${var.AccountID}"
  //exclusions = []
  bigquery_options       = {
                             use_partitioned_tables = true
                           }
  parent_resource_id     = var.AccountID
  parent_resource_type   = "project"
  unique_writer_identity = true  //Must be set to true for BQ option
}

module "destination_bq" {
  source                   = "terraform-google-modules/log-export/google//modules/bigquery"
  description              = "Data set which contains all logs generated from pega GKE cluster"
  project_id               = var.AccountID
  dataset_name             = local.bq_dataset_name
  location                 = var.Region
  log_sink_writer_identity = module.log_bq_sink.writer_identity
  expiration_days          = local.bq_expiration_in_days
  delete_contents_on_destroy = local.delete_contents_on_destroy
  labels = {
    project = var.AccountID
    owner = var.Owner
    location = var.Region
  }
}

Terraform Version

1.0.11

Additional information

No response

Allow for specifying the service account to use with `terraform-google-modules/log-export/google`

TL;DR

I cannot reuse the terraform-google-modules/log-export/google module with for_each because it generates circular reference errors with submodules on terraform plan. If I could specify an existing service account as the writer identity in the module, I could overcome this.

Terraform Resources

No response

Detailed design

This fails

locals {
  projects = ["project1", "project2"]
  api_key = "api_key"
}

module "gcp_log_export_qa" {
  for_each = local.projects
  source                 = "terraform-google-modules/log-export/google"
  destination_uri        = module.datadog_log_pub_sub_exporter_qa[each.value].destination_uri
  filter                 = "resource.type=\"cloud_run_revision\" OR resource.type=\"http_load_balancer\""
  log_sink_name          = "datadog-log-sink"
  parent_resource_id     = each.value
  parent_resource_type   = "project"
  unique_writer_identity = true
}

module "datadog_log_pub_sub_exporter_qa" {
  for_each = local.projects
  source                   = "terraform-google-modules/log-export/google//modules/pubsub"
  project_id               = each.value
  topic_name               = "datadog-log-exporter"
  log_sink_writer_identity = module.gcp_log_export_qa[each.value].writer_identity
  create_subscriber        = false
  create_push_subscriber   = true
  push_endpoint            = "https://gcp-intake.logs.datadoghq.eu/v1/input/${local.api_key}/"
}

with error:

╷
│ Error: Cycle: module.datadog.module.datadog_log_pub_sub_exporter_qa.google_pubsub_topic_iam_member.pubsub_sink_member, module.datadog.module.gcp_log_export_qa.local.log_sink_resource_id (expand), module.datadog.module.gcp_log_export_qa.output.log_sink_resource_id (expand), module.datadog.module.gcp_log_export_qa.local.log_sink_resource_name (expand), module.datadog.module.gcp_log_export_qa.output.log_sink_resource_name (expand), module.datadog.module.gcp_log_export_qa.google_logging_project_sink.sink, module.datadog.module.gcp_log_export_qa.google_logging_organization_sink.sink, module.datadog.module.gcp_log_export_qa.google_logging_billing_account_sink.sink, module.datadog.module.gcp_log_export_qa.local.log_sink_parent_id (expand), module.datadog.module.gcp_log_export_qa.output.parent_resource_id (expand), module.datadog.module.datadog_log_pub_sub_exporter_qa.var.log_sink_writer_identity (expand), module.datadog.module.datadog_log_pub_sub_exporter_qa (close), module.datadog.module.gcp_log_export_qa.var.destination_uri (expand), module.datadog.module.gcp_log_export_qa.google_logging_folder_sink.sink, module.datadog.module.gcp_log_export_qa.local.log_sink_writer_identity (expand), module.datadog.module.gcp_log_export_qa.output.writer_identity (expand), module.datadog.module.gcp_log_export_qa (close)

To break this circular reference, we could just create a service account and specify it like this:

module "gcp_log_export_qa" {
  for_each = local.projects
  source                 = "terraform-google-modules/log-export/google"
  destination_uri        = module.datadog_log_pub_sub_exporter_qa[each.value].destination_uri
  filter                 = "resource.type=\"cloud_run_revision\" OR resource.type=\"http_load_balancer\""
  log_sink_name          = "datadog-log-sink"
  parent_resource_id     = each.value
  parent_resource_type   = "project"
  // instead of specifying a generated SA (unique or not), we use an existing SA
  writer_identity        = google_service_account.log_exporter.email 
}

module "datadog_log_pub_sub_exporter_qa" {
  for_each = local.projects
  source                   = "terraform-google-modules/log-export/google//modules/pubsub"
  project_id               = each.value
  topic_name               = "datadog-log-exporter"
  log_sink_writer_identity = google_service_account.log_exporter.email // no ref to module
  create_subscriber        = false
  create_push_subscriber   = true
  push_endpoint            = "https://gcp-intake.logs.datadoghq.eu/v1/input/${local.api_key}/"
}


### Additional information

_No response_

GCS bucket options

We should have an option to change the default 'MULTI_REGIONAL' bucket to other types of GCS buckets, like 'COLDLINE', 'NEARLINE' or 'REGIONAL'.

We should also be able to specify an expiration for objects in the bucket, using the 'lifecycle' option of the GCS Terraform resource.

Error: google-beta does not support "google_project_service_identity".

make docker_test_lint is failing with:

terraform_validate ./test/setup

Error: Invalid resource type

  on .terraform/modules/project/modules/project_services/main.tf line 38, in resource "google_project_service_identity" "project_service_identities":
  38: resource "google_project_service_identity" "project_service_identities" {

The provider provider.google-beta does not support resource type
"google_project_service_identity".

It is related to "Support service_identity" (#448) in the new release of project-factory.

To fix, we could bump the version of the google-beta provider.

From:

provider "google-beta" {
  version = "~> 3.36.0"
}

to

provider "google-beta" {
  version = "~> 3.38.0"
}

OR

add more control over the major version of the project factory module in the test setup and bump the google-beta provider later.

From:

module "project" {
  source  = "terraform-google-modules/project-factory/google"
  version = "~> 9.0"

To:

module "project" {
  source  = "terraform-google-modules/project-factory/google"
  version = "~> 9.0.0"

Support default_partition_expiration_ms on bq datasets

TL;DR

The terraform provider supports default_table_expiration_ms but not default_partition_expiration_ms. We just want to add a variable for this and update the module to use it.
I will implement this feature and link the issue.
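
A sketch of the addition; the variable name and days-based unit are assumptions that mirror the existing expiration_days handling:

variable "partition_expiration_days" {
  description = "Default partition expiration for partitioned tables in the dataset, in days."
  type        = number
  default     = null
}

resource "google_bigquery_dataset" "dataset" {
  # ... existing arguments unchanged ...
  default_partition_expiration_ms = var.partition_expiration_days == null ? null : var.partition_expiration_days * 86400000
}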

Terraform Resources

https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/bigquery_dataset#default_partition_expiration_ms

Detailed design

No response

Additional information

No response

Error with bigquery-json.googleapis.com. Should it be bigquery.googleapis.com?

Started using Project Factory version v6, which in turn downloaded a new Google provider for Terraform. Since then, any reference to "bigquery-json.googleapis.com" fails, as the provider expects just "bigquery.googleapis.com".

I believe this is the cause of the following error when attempting to use terraform-google-log-export.

Are we able to get a new version that supports "bigquery.googleapis.com"?

Error: expected service to not match any of [dataproc-control.googleapis.com source.googleapis.com stackdriverprovisioning.googleapis.com bigquery-json.googleapis.com], got bigquery-json.googleapis.com

  on .terraform/modules/security.log_export_bigquery_sink_dest/terraform-google-modules-terraform-google-log-export-ae0dd08/modules/bigquery/main.tf line 32, in resource "google_project_service" "enable_destination_api":
  32: resource "google_project_service" "enable_destination_api" {

Regards,
Doug.

Problem having log-export and logbucket destination on same project

TL;DR

When trying to use the log-export module for a project "X" and create the destination logbucket in the same project "X", it raises an error.

Expected behavior

Create the log sink resource (based on the log-export module) and the logbucket as the destination of the sink.

Observed behavior

It raised the following error because the field log_sink_writer_identity in module.destination_logbucket has a blank value:

Error: Request `Create IAM Members roles/logging.bucketWriter  for project "my-project"` returned error: Error applying IAM policy for project "my-project": Error setting IAM policy for project "my-project": googleapi: Error 400: Policy members must be of the form "<type>:<value>".
Details:
[
  {
    "@type": "type.googleapis.com/google.rpc.BadRequest",
    "fieldViolations": [
      {
        "description": "Policy members must be prefixed of the form '\u003ctype\u003e:\u003cvalue\u003e', where \u003ctype\u003e is 'domain', 'group', 'serviceAccount', or 'user'.",
        "field": "policy.bindings.member"
      }
    ]
  },
  {
    "@type": "type.googleapis.com/google.rpc.ErrorInfo",
    "domain": "cloudresourcemanager.googleapis.com",
    "reason": "PROJECT_SET_IAM_DISALLOWED_MEMBER_TYPE"
  },
  {
    "@type": "type.googleapis.com/google.rpc.ResourceInfo",
    "resourceName": "projects/my-project"
  }
]
, badRequest

  on .terraform/modules/destination_logbucket/modules/logbucket/main.tf line 45, in resource "google_project_iam_member" "logbucket_sink_member":
  45: resource "google_project_iam_member" "logbucket_sink_member" {

Terraform Configuration

module "log_export" {
  source  = "terraform-google-modules/log-export/google"
  version = "~> 7.3.0"

  destination_uri        = module.destination_logbucket.destination_uri
  filter                 = ""
  log_sink_name          = "my-sink-name"
  parent_resource_id     = "my-project"
  parent_resource_type   = "project"
  unique_writer_identity = true
  include_children       = true
}

module "destination_logbucket" {
  source  = "terraform-google-modules/log-export/google//modules/logbucket"
  version = "~> 7.4.0"

  project_id               = "my-project"
  name                     = "my-log-bucket-name"
  log_sink_writer_identity = module.log_export.writer_identity
  location                 = "us-east4"
  retention_days           = "30"
}

Terraform Version

Terraform v0.13.7
+ provider registry.terraform.io/hashicorp/google v4.27.0
+ provider registry.terraform.io/hashicorp/google-beta v4.27.0
+ provider registry.terraform.io/hashicorp/random v3.3.2

Additional information

According to Configure and manage sinks documentation:

If you're using a sink to route logs between Logging buckets in the same Cloud project, no new service account is created; the sink works without the unique writer identity.

Add support for project as log export destination

TL;DR

Add support for project as log export destination

Terraform Resources

https://cloud.google.com/logging/docs/routing/overview#destinations

Destination URI will be destination_uri = "logging.googleapis.com/projects/${var.project_id}"

Detailed design

No response

Additional information

No response

modules/local-log-export does not provide a way to set the locked attribute

TL;DR

modules/local-log-export does not provide a way to set the locked attribute value.

Expected behavior

Provide the ability to define the locked parameter; otherwise it is set to null by default and therefore to false.

Observed behavior

The default value is false.
When working with a locked bucket, we get the following change, which is not what we want:

  # module.local_log_export["access_approval"].module.destination_log_bucket.google_logging_project_bucket_config.bucket will be updated in-place
  ~ resource "google_logging_project_bucket_config" "bucket" {
        id     = "projects/prj-shared-log-core-02ec/locations/europe-west1/buckets/shared-log-core-access-approval"
      - locked = true -> null
        name   = "projects/prj-shared-log-core-02ec/locations/europe-west1/buckets/shared-log-core-access-approval"
        # (6 unchanged attributes hidden)
    }

Terraform Configuration

module "local_log_export" {
  source           = "../../modules/local-log-export"
  for_each         = var.log-export-config
  log_sink_filter  = each.value.log_sink_filter
  log_destination_name  = each.value.log_destination_name
  project_id       = lookup(var.projects, "prj-shared-log-core","")
  org_id           = var.org_id
  depends_on       = [google_project_iam_member.sa_stack_permissions]
}

Terraform Version

1.3.9

Additional information

No response

pubsub_subscriber_role applied to subscription (vs topic)

I believe that this IAM role should be applied to the subscription (instead of the topic).

Here's the resource (currently line 76 of https://github.com/terraform-google-modules/terraform-google-log-export/blob/master/modules/pubsub/main.tf):

resource "google_pubsub_topic_iam_member" "pubsub_subscriber_role" {
  count   = var.create_subscriber ? 1 : 0
  role    = "roles/pubsub.subscriber"
  project = var.project_id
  topic   = local.topic_name
  member  = "serviceAccount:${google_service_account.pubsub_subscriber[0].email}"
}

As far as I have been able to tell from using this module and manual testing in the UI, this permission does not get inherited from the Pub/Sub topic to the subscription. To get something to pull messages using the destination pub/sub module, I had to manually add the subscriber created by this module to the subscription also created by this module.

I think the correction is just applying that role to the subscription instead of the topic:

resource "google_pubsub_subscription_iam_member" "pubsub_subscriber_role" {
  count   = var.create_subscriber ? 1 : 0
  role    = "roles/pubsub.subscriber"
  project = var.project_id
  subscription = google_pubsub_subscription.pubsub_subscription.name
  member  = "serviceAccount:${google_service_account.pubsub_subscriber[0].email}"
}

Add ttl_days fields to storage and bq destination

It would be nice to have a helper field ttl_days for expiration in days.

Then customers can easily set how long they want their logs to be stored for.

Days is also an appropriate unit: most users will want some log exports kept for < 1 year and some for > 1 year, so it is neither so small as to be annoying to set (BigQuery currently takes expiration in ms) nor so large as to be inflexible.

Deprecate expiration_days in favor of table_expiration_days

TL;DR

From review of #143 ...
Once this PR has been merged, we should rename expiration_days to table_expiration_days to make the difference between it and partition_expiration_days clear.
I'll add a PR that can wait until we're ready to release a new module version.

Terraform Resources

No response

Detailed design

No response

Additional information

No response

Add support for Log Analytics in `logbucket` module

TL;DR

Add support for the Log Analytics feature in the logbucket module.

Terraform Resources

- https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/logging_project_bucket_config
- https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/logging_linked_dataset

Detailed design

Add a variable for `enable_analytics` with default value `false`, so as not to affect existing log buckets.
Also add the option to link the bucket with a BQ dataset; a sketch follows.
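
A sketch under those assumptions (resource and variable names are illustrative, not the module's final implementation):

resource "google_logging_project_bucket_config" "bucket" {
  project        = var.project_id
  location       = var.location
  bucket_id      = var.name
  retention_days = var.retention_days

  # Off by default so existing log buckets are unaffected.
  enable_analytics = var.enable_analytics
}

# Optionally expose the analytics-enabled bucket as a BigQuery linked dataset.
resource "google_logging_linked_dataset" "linked" {
  count       = var.linked_dataset_id != null ? 1 : 0
  link_id     = var.linked_dataset_id
  bucket      = google_logging_project_bucket_config.bucket.id
  description = var.linked_dataset_description
}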

Additional information

No response