Provision a GCS Bucket Using GitHub and Terraform Cloud

A hands-on Terraform lab to build GCP infrastructure

DevOps Lab Notes - Saturday Drop

Hey, it's Anuj 👋

Welcome to DevOps Lab Notes, a weekly dose of hands-on learning.

Every Saturday, you'll get one actionable lab or real-world DevOps use case: no fluff, just practical insights you can apply.

Built for engineers who learn best by doing.

Welcome back to DevOps Lab Notes, where we bridge theory with hands-on practice.

In Lab #1, we set up Terraform the right way. Now it's time to do something real: create a Google Cloud Storage (GCS) bucket using a proper, production-grade Terraform setup.

In this lab, you'll:

  • Use GitHub to store Terraform code

  • Use Terraform Cloud (TFC) to manage execution and state

  • Provision a GCS bucket in Google Cloud Platform (GCP)

Let's go!

🎯 What This Lab Covers

  • Creating a reusable Terraform module

  • Defining a real-world directory structure

  • Connecting GitHub to Terraform Cloud

  • Using TFC to provision resources on GCP

🧩 Real-World Setup Overview

Both Terraform Cloud and Google Cloud Platform offer generous free tiers that make it possible to complete this lab at no cost:

  • Terraform Cloud (TFC) Free tier includes:

    • Unlimited users

    • One concurrent run

    • Remote state management

    • VCS (GitHub) integration

  • Google Cloud Platform (GCP) Free tier includes:

    • 5 GB of Standard storage in GCS (US regions)

    • 1 GB of network egress per month

    • No cost to create and delete storage buckets

🧪 For testing or learning in the free tier, you can temporarily use a service account key file to authenticate Terraform with GCP.

⚠️ In production, the recommended approach is to use Workload Identity Federation (WIF), available in TFC's paid plan.

Traditionally, Terraform authenticates to cloud providers using service account keys: long-lived credentials that can be accidentally leaked or abused. Workload Identity Federation (WIF) solves this problem.

With WIF, Terraform Cloud uses short-lived identity tokens to impersonate a Google Cloud service account. This way:

  • No keys are ever stored

  • Credentials are rotated automatically

  • Authentication is scoped to only what's needed
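
To preview what the GCP side of WIF looks like (Lab #3 walks through it end to end), here is a minimal, hypothetical sketch of the workload identity pool and provider that Terraform Cloud federates with. Every ID, name, and the service account reference below are illustrative assumptions, not part of this lab's code:

# Hypothetical GCP-side resources for WIF with Terraform Cloud (names are illustrative)
resource "google_iam_workload_identity_pool" "tfc" {
  workload_identity_pool_id = "terraform-cloud-pool"
  display_name              = "Terraform Cloud"
}

resource "google_iam_workload_identity_pool_provider" "tfc" {
  workload_identity_pool_id          = google_iam_workload_identity_pool.tfc.workload_identity_pool_id
  workload_identity_pool_provider_id = "terraform-cloud-oidc"

  # Map and restrict incoming TFC tokens to your organization
  attribute_mapping = {
    "google.subject"                        = "assertion.sub"
    "attribute.terraform_organization_name" = "assertion.terraform_organization_name"
  }
  attribute_condition = "assertion.terraform_organization_name == \"devopslabnotes-org\""

  oidc {
    issuer_uri = "https://app.terraform.io"
  }
}

# Let tokens from that organization impersonate the runner service account
resource "google_service_account_iam_member" "tfc_impersonation" {
  service_account_id = "projects/YOUR_PROJECT/serviceAccounts/terraform-executor@YOUR_PROJECT.iam.gserviceaccount.com"
  role               = "roles/iam.workloadIdentityUser"
  member             = "principalSet://iam.googleapis.com/${google_iam_workload_identity_pool.tfc.name}/attribute.terraform_organization_name/devopslabnotes-org"
}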

🗂️ Directory Structure

terraform-gcp-buckets/
├── modules/
│   └── gcs_bucket/
│       ├── main.tf
│       ├── variables.tf
│       └── outputs.tf
└── env/
    └── dev/
        ├── main.tf
        ├── backend.tf
        ├── provider.tf
        └── variables.tf

🧱 Step 1: Create a Reusable Module

modules/gcs_bucket/main.tf

terraform {
  required_version = ">= 1.5.0"

  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 5.0.0"
    }
  }
}

# One GCS bucket with uniform bucket-level access, optional versioning,
# and a lifecycle rule that deletes objects older than var.lifecycle_age days.
resource "google_storage_bucket" "this" {
  name          = var.bucket_name
  location      = var.location
  force_destroy = var.force_destroy

  uniform_bucket_level_access = true

  versioning {
    enabled = var.versioning_enabled
  }

  lifecycle_rule {
    action {
      type = "Delete"
    }
    condition {
      age = var.lifecycle_age
    }
  }
}

variables.tf and outputs.tf support reusability across environments.

modules/gcs_bucket/variables.tf

variable "bucket_name" {
  description = "Name of the GCS bucket"
  type        = string
}

variable "location" {
  description = "GCP region for the bucket"
  type        = string
}

variable "force_destroy" {
  description = "Whether to force destroy bucket"
  type        = bool
  default     = false
}

variable "versioning_enabled" {
  description = "Enable versioning on the bucket"
  type        = bool
  default     = true
}

variable "lifecycle_age" {
  description = "Number of days after which objects should be deleted"
  type        = number
  default     = 30
}

modules/gcs_bucket/outputs.tf

output "bucket_name" {
  description = "Name of the created bucket"
  value       = google_storage_bucket.this.name
}
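
If a caller needs more than the name, the bucket resource also exposes a gs:// URL and an API self link. These extra outputs are optional additions, not part of the lab's module:

# Optional extra outputs (not required for this lab)
output "bucket_url" {
  description = "gs:// URL of the bucket"
  value       = google_storage_bucket.this.url
}

output "bucket_self_link" {
  description = "API self link of the bucket"
  value       = google_storage_bucket.this.self_link
}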

🌎 Step 2: Configure the Environment (env/dev)

env/dev/main.tf

module "gcs_bucket" {
  source             = "../../modules/gcs_bucket"
  bucket_name        = var.bucket_name
  location           = var.region
  force_destroy      = false
  versioning_enabled = true
  lifecycle_age      = 30
}
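
Because the bucket logic lives in a module, adding another bucket to the same environment is just one more module block, and you can re-export the module's output so it shows up in the run results. A sketch, with the second bucket and its settings purely illustrative:

# Illustrative second instance of the same module, e.g. for short-lived logs
module "gcs_bucket_logs" {
  source             = "../../modules/gcs_bucket"
  bucket_name        = "${var.bucket_name}-logs"
  location           = var.region
  force_destroy      = true
  versioning_enabled = false
  lifecycle_age      = 7
}

# Surface the module output at the environment level
output "bucket_name" {
  value = module.gcs_bucket.bucket_name
}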

env/dev/backend.tf (only needed for CLI-driven runs; this remote backend points at Terraform Cloud, not GCS)

terraform {
  backend "remote" {
    organization = "devopslabnotes-org"

    workspaces {
      name = "dev-environment"
    }
  }
}

If you use the VCS-driven workflow, as this lab does, Terraform Cloud manages the workspace state for you and no backend block is required, so you can leave this file empty. We keep it empty in this example.
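
If you later want to run plans from your own machine against the same workspace, note that on Terraform 1.1+ the cloud block is the successor to backend "remote". Reusing the names above, it would look like this:

terraform {
  cloud {
    organization = "devopslabnotes-org"

    workspaces {
      name = "dev-environment"
    }
  }
}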

env/dev/provider.tf

provider "google" {
  # The service account key is stored base64-encoded in Terraform Cloud,
  # so decode it before passing it to the provider.
  credentials = base64decode(var.google_credentials)
  project     = var.project_id
  region      = var.region
}
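
The base64 round trip exists because multi-line JSON is awkward to store in a Terraform variable. An alternative, not used in this lab, is to put the key JSON (with newlines stripped) in a GOOGLE_CREDENTIALS environment variable on the workspace; the provider then needs no explicit credentials argument:

provider "google" {
  # Credentials are read from the GOOGLE_CREDENTIALS environment
  # variable configured on the Terraform Cloud workspace.
  project = var.project_id
  region  = var.region
}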

env/dev/variables.tf

variable "bucket_name" {
  description = "Name of the GCS bucket"
  type        = string
  default     = "devopslabnotes-bucket-dev"
}

variable "project_id" {
  description = "GCP project ID"
  type        = string
}

variable "region" {
  description = "Region where resources will be deployed"
  type        = string
}

variable "google_credentials" {
  description = "Service account credentials in JSON format"
  type        = string
}

🔑 Step 3: Create and Download Service Account Key

  1. Go to Google Cloud Console → IAM & Admin → Service Accounts.

  2. Click Create Service Account.

  3. Give it a name like terraform-executor, click Create and Continue.

  4. Assign it the role Storage Admin, click Continue, then Done.

  5. Click the three dots (⋮) on the created service account → Manage keys.

  6. Click Add Key → Create new key → JSON.

  7. Download the service-account-key.json file; keep it only long enough to copy its contents.

  8. Base64-encode the file contents (for example with the base64 command-line tool) and paste the result into the google_credentials variable in Terraform Cloud in the next step; provider.tf decodes it with base64decode().

  9. Delete the local copy to avoid accidental commits.
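
If you would rather codify the service account itself instead of clicking through the console, a minimal sketch is below. It assumes you run it locally with your own user credentials, and it deliberately stops short of generating the key (keys created by Terraform end up in state); the resource names are illustrative:

resource "google_service_account" "terraform_executor" {
  account_id   = "terraform-executor"
  display_name = "Terraform executor for TFC runs"
}

# Grant only what this lab needs: permission to manage GCS buckets and objects
resource "google_project_iam_member" "terraform_storage_admin" {
  project = var.project_id
  role    = "roles/storage.admin"
  member  = "serviceAccount:${google_service_account.terraform_executor.email}"
}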

⚙️ Step 4: Connect GitHub Repo to Terraform Cloud

  1. Log in to Terraform Cloud.

  2. Create a new organization (e.g., devopslabnotes-org).

  3. Create a workspace (e.g., dev-environment).

  4. Under Version Control Workflow, connect your GitHub account and select the terraform-gcp-buckets repo.

  5. Under General Settings, set the working directory to env/dev.

🔐 Step 5: Add Required Variables in Terraform Cloud

Go to your workspace → Variables tab and add the following:

  • project_id (Terraform variable, string): your GCP Project ID

  • region (Terraform variable, string): e.g., us-central1

  • bucket_name (Terraform variable, string): e.g., devopslabnotes-bucket-dev

  • google_credentials (Terraform variable, string, marked Sensitive): the base64-encoded contents of your service-account-key.json
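
For the non-sensitive values, an alternative to the Variables tab is a tfvars file committed in the working directory; Terraform auto-loads any *.auto.tfvars it finds there, and Terraform Cloud runs do the same. A hypothetical env/dev/dev.auto.tfvars (never commit google_credentials):

# env/dev/dev.auto.tfvars (hypothetical file; values are examples)
project_id  = "my-gcp-project-id"
region      = "us-central1"
bucket_name = "devopslabnotes-bucket-dev"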

▶️ Step 6: Run Terraform via Terraform Cloud

Once variables are configured and GitHub is integrated:

  1. Make a small change (e.g., add a newline) and push to GitHub.

  2. Terraform Cloud will automatically:

    • Detect the change

    • Run terraform plan

    • Await approval for terraform apply (click Confirm & Apply)

After the run completes, you'll have a GCS bucket provisioned in GCP 🎉

🧹 Optional: Cleanup Resources

When you're done, delete the bucket to avoid charges: either remove it from the GCP console, or queue a destroy plan from the Terraform Cloud workspace (Settings → Destruction and Deletion).

✨ Summary

You've now:

  • Structured Terraform code for real-world environments

  • Created reusable modules

  • Used Terraform Cloud to orchestrate provisioning

  • Created a secure and version-controlled GCS bucket setup

โญ๏ธ Up Next

In Lab #3, we'll modernize this setup by replacing service account keys with Workload Identity Federation (WIF), a secure, keyless way to authenticate Terraform runs on Google Cloud.

From my terminal to yours,
~ Anuj