Provision a GCS Bucket Using GitHub and Terraform Cloud
A hands-on Terraform lab to build GCP infrastructure
DevOps Lab Notes — Saturday Drop
Hey — it’s Anuj 👋
Welcome to DevOps Lab Notes, a weekly dose of hands-on learning.
Every Saturday, you’ll get one actionable lab or real-world DevOps use case — no fluff, just practical insights you can apply.
Built for engineers who learn best by doing.
Welcome back to DevOps Lab Notes — where we bridge theory with hands-on practice.
In Lab #1, we set up Terraform the right way. Now it’s time to do something real: create a Google Cloud Storage (GCS) bucket using a proper, production-grade Terraform setup.
In this lab, you’ll:
Use GitHub to store Terraform code
Use Terraform Cloud (TFC) to manage execution and state
Provision a GCS bucket in Google Cloud Platform (GCP)
Let’s go!
🎯 What This Lab Covers
Creating a reusable Terraform module
Defining a real-world directory structure
Connecting GitHub to Terraform Cloud
Using TFC to provision resources on GCP
🧩 Real-World Setup Overview
Both Terraform Cloud and Google Cloud Platform offer generous free tiers that make it possible to complete this lab at no cost:
Terraform Cloud (TFC) Free tier includes:
Unlimited users
One concurrent run per workspace
Remote state management
VCS (GitHub) integration
Google Cloud Platform (GCP) Free tier includes:
5 GB of storage in GCS
1 GB of egress/month
No cost to create and delete storage buckets
🧪 For testing or learning in the free tier, you can temporarily use a service account key file to authenticate Terraform with GCP.
⚠️ In production, the recommended approach is to use Workload Identity Federation (WIF) — available in TFC’s paid plan.
Traditionally, Terraform authenticates to cloud providers using service account keys — long-lived credentials that can be accidentally leaked or abused. Workload Identity Federation (WIF) solves this problem.
With WIF, Terraform Cloud uses short-lived identity tokens to impersonate a Google Cloud service account. This way:
No keys are ever stored
Credentials are rotated automatically
Authentication is scoped to only what's needed
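To make that more concrete, here is a rough preview of the GCP side of WIF in Terraform (we'll build this properly in Lab #3; the pool and provider names below are illustrative, and the Terraform Cloud side is configured with workspace environment variables rather than HCL):

resource "google_iam_workload_identity_pool" "tfc" {
  workload_identity_pool_id = "terraform-cloud-pool" # illustrative name
}

resource "google_iam_workload_identity_pool_provider" "tfc" {
  workload_identity_pool_id          = google_iam_workload_identity_pool.tfc.workload_identity_pool_id
  workload_identity_pool_provider_id = "terraform-cloud-oidc" # illustrative name

  # Trust tokens issued by Terraform Cloud, but only for our organization.
  oidc {
    issuer_uri = "https://app.terraform.io"
  }
  attribute_mapping = {
    "google.subject"                        = "assertion.sub"
    "attribute.terraform_organization_name" = "assertion.terraform_organization_name"
  }
  attribute_condition = "assertion.terraform_organization_name == \"devopslabnotes-org\""
}

# The service account then gets roles/iam.workloadIdentityUser scoped to this pool,
# so Terraform Cloud runs can impersonate it without any stored key.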
🗂️ Directory Structure
terraform-gcp-buckets/
├── modules/
│   └── gcs_bucket/
│       ├── main.tf
│       ├── variables.tf
│       └── outputs.tf
└── env/
    └── dev/
        ├── main.tf
        ├── backend.tf
        ├── provider.tf
        └── variables.tf
🧱 Step 1: Create a Reusable Module
modules/gcs_bucket/main.tf
terraform {
  required_version = ">= 1.5.0"

  required_providers {
    google = {
      source  = "hashicorp/google"
      version = ">= 5.0.0"
    }
  }
}

resource "google_storage_bucket" "this" {
  name                        = var.bucket_name
  location                    = var.location
  force_destroy               = var.force_destroy
  uniform_bucket_level_access = true

  versioning {
    enabled = var.versioning_enabled
  }

  lifecycle_rule {
    action {
      type = "Delete"
    }
    condition {
      age = var.lifecycle_age
    }
  }
}
variables.tf and outputs.tf support reusability across environments.
modules/gcs_bucket/variables.tf
variable "bucket_name" {
description = "Name of the GCS bucket"
type = string
}
variable "location" {
description = "GCP region for the bucket"
type = string
}
variable "force_destroy" {
description = "Whether to force destroy bucket"
type = bool
default = false
}
variable "versioning_enabled" {
description = "Enable versioning on the bucket"
type = bool
default = true
}
variable "lifecycle_age" {
description = "Number of days after which objects should be deleted"
type = number
default = 30
}
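Optionally, you can catch invalid bucket names at plan time instead of waiting for GCP to reject them. Here's a sketch of a validation block for bucket_name using a simplified version of the GCS naming rules (3 to 63 characters; lowercase letters, digits, dots, hyphens, underscores):

variable "bucket_name" {
  description = "Name of the GCS bucket"
  type        = string

  # Simplified check; see the GCS documentation for the full naming rules.
  validation {
    condition     = can(regex("^[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]$", var.bucket_name))
    error_message = "Bucket names must be 3-63 characters of lowercase letters, digits, dots, hyphens, or underscores."
  }
}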
modules/gcs_bucket/outputs.tf
output "bucket_name" {
description = "Name of the created bucket"
value = google_storage_bucket.this.name
}
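If you also want the gs:// URL handy (for gsutil or for wiring into other modules), the bucket resource exposes a url attribute you could surface as a second output, for example:

output "bucket_url" {
  description = "gs:// URL of the created bucket"
  value       = google_storage_bucket.this.url
}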
🌎 Step 2: Configure the Environment (env/dev)
env/dev/main.tf
module "gcs_bucket" {
source = "../../modules/gcs_bucket"
bucket_name = var.bucket_name
location = var.region
force_destroy = false
versioning_enabled = true
lifecycle_age = 30
}
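Because the module is parameterized, reuse is just another module block. For example, a hypothetical second bucket for logs (name and values below are purely illustrative) could look like this:

module "gcs_bucket_logs" {
  source = "../../modules/gcs_bucket"

  bucket_name        = "${var.bucket_name}-logs" # illustrative naming convention
  location           = var.region
  force_destroy      = false
  versioning_enabled = false
  lifecycle_age      = 7
}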
env/dev/backend.tf (only needed if you run Terraform from the CLI against Terraform Cloud; shown here for reference)
terraform {
  backend "remote" {
    organization = "devopslabnotes-org"

    workspaces {
      name = "dev-environment"
    }
  }
}
With the VCS-driven workflow used in this lab, Terraform Cloud manages state automatically, so you can leave this file empty. That is what we do in this example.
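If you are on Terraform 1.1 or newer and do want the CLI-driven workflow, the cloud block is the successor to the remote backend and is equivalent for this setup:

terraform {
  cloud {
    organization = "devopslabnotes-org"

    workspaces {
      name = "dev-environment"
    }
  }
}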
env/dev/provider.tf
provider "google" {
credentials = base64decode(var.google_credentials)
project = var.project_id
region = var.region
}
env/dev/variables.tf
variable "bucket_name" {
description = "Name of the GCS bucket"
type = string
default = "devopslabnotes-bucket-dev"
}
variable "project_id" {
description = "GCP project ID"
type = string
}
variable "region" {
description = "Region where resources will be deployed"
type = string
}
variable "google_credentials" {
description = "Service account credentials in JSON format"
type = string
}
🔑 Step 3: Create and Download Service Account Key
1. Go to Google Cloud Console → IAM & Admin → Service Accounts.
2. Click Create Service Account.
3. Give it a name like terraform-executor, then click Create and Continue.
4. Assign it the Storage Admin role, click Continue, then Done.
5. Click the three dots (⋮) on the new service account → Manage keys.
6. Click Add Key → Create new key → JSON.
7. Download the service-account-key.json file only temporarily to copy its contents.
8. Base64-encode the contents and paste them into the Terraform Cloud variable in the next step.
9. Delete the local copy to avoid accidental commits.
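Prefer not to click through the console? Here is a hedged sketch of the same setup as a small bootstrap Terraform config, run locally with your own user credentials (resource names are illustrative). Conveniently, google_service_account_key exposes the key as base64-encoded JSON, which is exactly the format our google_credentials variable expects:

# Illustrative bootstrap config; run locally, e.g. after `gcloud auth application-default login`.
# Note: the generated key will also be stored in this config's state file, so protect it.
resource "google_service_account" "terraform_executor" {
  account_id   = "terraform-executor"
  display_name = "Terraform executor for Terraform Cloud runs"
}

# Grant the bucket-management role used in this lab.
resource "google_project_iam_member" "terraform_executor_storage_admin" {
  project = var.project_id
  role    = "roles/storage.admin"
  member  = "serviceAccount:${google_service_account.terraform_executor.email}"
}

resource "google_service_account_key" "terraform_executor" {
  service_account_id = google_service_account.terraform_executor.name
}

# private_key is base64-encoded JSON; paste it straight into the TFC variable.
output "terraform_executor_key_base64" {
  value     = google_service_account_key.terraform_executor.private_key
  sensitive = true
}

Because the output is marked sensitive, retrieve it with terraform output -raw terraform_executor_key_base64.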
⚙️ Step 4: Connect GitHub Repo to Terraform Cloud
1. Log in to Terraform Cloud.
2. Create a new organization (e.g., devopslabnotes-org).
3. Create a workspace (e.g., dev-environment).
4. Under Version Control Workflow, connect your GitHub account and select the terraform-gcp-buckets repo.
5. Under General Settings, set the working directory to env/dev.
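If you'd rather manage the Terraform Cloud side as code too, the tfe provider can create the workspace for you. A minimal sketch, assuming your GitHub OAuth client is already connected in the organization (the identifier and OAuth token ID below are placeholders):

# Authenticates with the token in the TFE_TOKEN environment variable by default.
provider "tfe" {}

resource "tfe_workspace" "dev" {
  name              = "dev-environment"
  organization      = "devopslabnotes-org"
  working_directory = "env/dev"

  vcs_repo {
    identifier     = "<your-github-user>/terraform-gcp-buckets" # placeholder
    oauth_token_id = "ot-REPLACE_ME"                            # placeholder: ID of your connected GitHub OAuth client
  }
}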
🔐 Step 5: Add Required Variables in Terraform Cloud
Go to your workspace → Variables tab and add the following:
| Variable Name | Type | Value | Category |
|---|---|---|---|
| project_id | string | Your GCP project ID | Terraform |
| region | string | Region for the bucket (e.g., the region closest to you) | Terraform |
| bucket_name | string | A globally unique name (e.g., devopslabnotes-bucket-dev) | Terraform |
| google_credentials | string | Base64-encoded contents of your service-account-key.json | Terraform (Sensitive) |
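If you took the optional tfe provider route from Step 4, the workspace variables can be declared as code as well. A hedged example for the sensitive credential, assuming the key file is still present on the machine running that bootstrap config:

resource "tfe_variable" "google_credentials" {
  workspace_id = tfe_workspace.dev.id
  key          = "google_credentials"
  value        = filebase64("service-account-key.json") # base64-encodes the key file for you
  category     = "terraform"
  sensitive    = true
}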
▶️ Step 6: Run Terraform via Terraform Cloud
Once variables are configured and GitHub is integrated:
Make a small change (e.g., add a newline) and push to GitHub.
Terraform Cloud will then automatically:
Detect the change
Run terraform plan
Pause for your approval, then run terraform apply once you click Confirm & Apply
After the run completes, you’ll have a GCS bucket provisioned in GCP 🎉
🧹 Optional: Cleanup Resources
When you're done, delete the created resources to avoid charges: queue a destroy run from your Terraform Cloud workspace (Settings → Destruction and Deletion → Queue destroy plan), or delete the bucket directly in the GCP console.
✨ Summary
You’ve now:
Structured Terraform code for real-world environments
Created reusable modules
Used Terraform Cloud to orchestrate provisioning
Created a secure and version-controlled GCS bucket setup
⏭️ Up Next
In Lab #3, we’ll modernize this setup by replacing service account keys with Workload Identity Federation (WIF) — a secure, keyless way to authenticate Terraform runs on Google Cloud.
From my terminal to yours,
~ Anuj