
Google Cloud Storage Setup

This tutorial walks you through configuring Google Cloud Storage (GCS) as a storage destination for your FireBackup backups. GCS offers excellent integration with Firebase projects since both are Google Cloud services.

What You'll Learn

  • Create and configure a GCS bucket with security best practices
  • Set up a service account with least-privilege permissions
  • Configure lifecycle policies for cost optimization
  • Enable dual-region or multi-region storage for resilience
  • Connect GCS to FireBackup

Prerequisites

  • A Google Cloud account with billing enabled
  • Project Owner or Editor role in your GCP project
  • Access to FireBackup dashboard as an organization admin
  • gcloud CLI installed (optional but recommended)

Time Required

Approximately 20-25 minutes


Step 1: Create a GCS Bucket

Using Google Cloud Console

  1. Navigate to Cloud Storage

  2. Click Create bucket

  3. Configure bucket settings:

    | Setting | Recommended Value |
    | --- | --- |
    | Name | your-company-firebackup-prod |
    | Location type | Region, Dual-region, or Multi-region |
    | Storage class | Standard |
  4. Choose location:

    • Region: Lowest cost, single-region availability
    • Dual-region: Automatic failover between two regions
    • Multi-region: Highest availability across a continent
  5. Storage class:

    • Select Standard for frequently accessed backups
    • Lifecycle rules will transition older backups automatically
  6. Access control:

    • Select Uniform (recommended)
    • This enforces IAM-only access control
  7. Protection tools:

    • Object versioning: Enable
    • Retention policy: Configure based on compliance needs
  8. Click Create

Using gcloud CLI

# Set your project
gcloud config set project YOUR_PROJECT_ID

# Create bucket (single region)
gcloud storage buckets create gs://your-company-firebackup-prod \
--location=us-central1 \
--uniform-bucket-level-access \
--public-access-prevention

# Or create with dual-region
gcloud storage buckets create gs://your-company-firebackup-prod \
--location=us \
--placement=us-central1,us-east1 \
--uniform-bucket-level-access \
--public-access-prevention

# Enable versioning
gcloud storage buckets update gs://your-company-firebackup-prod \
--versioning

# Set default storage class
gcloud storage buckets update gs://your-company-firebackup-prod \
--default-storage-class=STANDARD

Step 2: Create a Service Account

Create a dedicated service account for FireBackup with minimal permissions.

Using Cloud Console

  1. Navigate to IAM & Admin → Service Accounts

  2. Click Create Service Account

  3. Enter service account details:

    • Name: firebackup-storage
    • ID: firebackup-storage
    • Description: Service account for FireBackup storage access
  4. Click Create and Continue

  5. Skip the optional steps for now (we'll grant bucket-specific permissions in Step 3)

  6. Click Done

Using gcloud CLI

# Create service account
gcloud iam service-accounts create firebackup-storage \
--display-name="FireBackup Storage Access" \
--description="Service account for FireBackup to access Cloud Storage"

# Verify creation
gcloud iam service-accounts list --filter="email:firebackup-storage"

Step 3: Grant Bucket Permissions

Apply least-privilege permissions at the bucket level, not project level.

Using Cloud Console

  1. Go to Cloud Storage

  2. Click on your bucket name

  3. Go to Permissions tab

  4. Click Grant Access

  5. Add the service account and roles:

    | Principal | Role |
    | --- | --- |
    | firebackup-storage@PROJECT_ID.iam.gserviceaccount.com | Storage Object User |

Using gcloud CLI

# Grant Storage Object User role on the bucket
gcloud storage buckets add-iam-policy-binding gs://your-company-firebackup-prod \
--member="serviceAccount:firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/storage.objectUser"

Custom IAM Role (Optional)

For stricter access control, create a custom role:

# firebackup-role.yaml
title: "FireBackup Storage Role"
description: "Minimal permissions for FireBackup storage operations"
stage: "GA"
includedPermissions:
- storage.objects.create
- storage.objects.get
- storage.objects.list
- storage.objects.delete
- storage.multipartUploads.create
- storage.multipartUploads.abort
- storage.multipartUploads.listParts

Apply the custom role:

# Create custom role
gcloud iam roles create FireBackupStorage \
--project=YOUR_PROJECT_ID \
--file=firebackup-role.yaml

# Apply to bucket
gcloud storage buckets add-iam-policy-binding gs://your-company-firebackup-prod \
--member="serviceAccount:firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
--role="projects/YOUR_PROJECT_ID/roles/FireBackupStorage"

Step 4: Generate Service Account Key

Create a JSON key file for authentication.

Using Cloud Console

  1. Go to Service Accounts

  2. Click on firebackup-storage service account

  3. Go to Keys tab

  4. Click Add Key → Create new key

  5. Select JSON format

  6. Click Create

  7. The key file downloads automatically - store it securely!

Using gcloud CLI

# Create and download key
gcloud iam service-accounts keys create firebackup-sa-key.json \
--iam-account=firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com

# Verify key was created
cat firebackup-sa-key.json

Security Warning
  • Never commit service account keys to version control
  • Store keys in a secure secrets manager
  • Rotate keys regularly (every 90 days recommended)
  • Delete unused keys immediately
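The rotation guidance above is easy to automate. Here is a minimal Python sketch (illustrative, not part of FireBackup) that flags keys older than the 90-day window; it assumes input shaped like the JSON from `gcloud iam service-accounts keys list --format=json`, where each record's `validAfterTime` is the key's creation timestamp:

```python
from datetime import datetime, timedelta, timezone

ROTATION_WINDOW = timedelta(days=90)

def stale_keys(keys: list, now: datetime) -> list:
    """Return names of keys created more than ROTATION_WINDOW ago."""
    stale = []
    for key in keys:
        # validAfterTime is RFC 3339, e.g. "2023-01-01T00:00:00Z"
        created = datetime.fromisoformat(key["validAfterTime"].replace("Z", "+00:00"))
        if now - created > ROTATION_WINDOW:
            stale.append(key["name"])
    return stale

# Hypothetical key records for illustration:
keys = [
    {"name": "keys/old-key", "validAfterTime": "2023-01-01T00:00:00Z"},
    {"name": "keys/fresh-key", "validAfterTime": "2024-01-10T00:00:00Z"},
]
now = datetime(2024, 1, 15, tzinfo=timezone.utc)
print(stale_keys(keys, now))  # ['keys/old-key']
```

Run this on a schedule and create a replacement key (then delete the stale one) whenever it reports anything.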

Key File Contents

The JSON key file looks like:

{
"type": "service_account",
"project_id": "your-project-id",
"private_key_id": "key-id",
"private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
"client_email": "firebackup-storage@your-project-id.iam.gserviceaccount.com",
"client_id": "123456789",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token"
}
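Before uploading the key anywhere, it can help to sanity-check it locally. A small sketch (assuming the field set shown above) that reports common problems:

```python
import json

REQUIRED_FIELDS = {
    "type", "project_id", "private_key_id", "private_key",
    "client_email", "client_id", "auth_uri", "token_uri",
}

def validate_key(raw: str) -> list:
    """Return a list of problems found in a service-account key JSON."""
    try:
        key = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - key.keys())]
    if key.get("type") != "service_account":
        problems.append("type is not 'service_account'")
    if not key.get("private_key", "").startswith("-----BEGIN PRIVATE KEY-----"):
        problems.append("private_key does not look like a PEM block")
    return problems
```

An empty list means the file has the expected shape; it does not prove the key is still active in IAM.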

Step 5: Configure Lifecycle Rules

Set up lifecycle rules to automatically manage storage costs.

Using Cloud Console

  1. Go to your bucket → Lifecycle tab

  2. Click Add a rule

  3. Configure rules:

Rule 1: Transition to Nearline

| Setting | Value |
| --- | --- |
| Action | Set storage class to Nearline |
| Condition | Age > 30 days |

Rule 2: Transition to Coldline

| Setting | Value |
| --- | --- |
| Action | Set storage class to Coldline |
| Condition | Age > 90 days |

Rule 3: Transition to Archive

| Setting | Value |
| --- | --- |
| Action | Set storage class to Archive |
| Condition | Age > 365 days |

Rule 4: Delete old versions

| Setting | Value |
| --- | --- |
| Action | Delete object |
| Condition | Number of newer versions > 3 |

Using gcloud CLI

Create a lifecycle configuration file named lifecycle.json:

{
"lifecycle": {
"rule": [
{
"action": {
"type": "SetStorageClass",
"storageClass": "NEARLINE"
},
"condition": {
"age": 30
}
},
{
"action": {
"type": "SetStorageClass",
"storageClass": "COLDLINE"
},
"condition": {
"age": 90
}
},
{
"action": {
"type": "SetStorageClass",
"storageClass": "ARCHIVE"
},
"condition": {
"age": 365
}
},
{
"action": {
"type": "Delete"
},
"condition": {
"numNewerVersions": 3
}
},
{
"action": {
"type": "AbortIncompleteMultipartUpload"
},
"condition": {
"age": 7
}
}
]
}
}

Apply the configuration:

gcloud storage buckets update gs://your-company-firebackup-prod \
--lifecycle-file=lifecycle.json
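The transition rules above can be summarized as a simple age-to-class mapping. A Python sketch of how the rules resolve (GCS applies each `SetStorageClass` rule once an object's age meets the rule's condition):

```python
def storage_class_for_age(age_days: int) -> str:
    """Resolve which class the lifecycle rules leave an object in.

    Mirrors lifecycle.json: 30d -> NEARLINE, 90d -> COLDLINE,
    365d -> ARCHIVE; anything younger stays STANDARD.
    """
    if age_days >= 365:
        return "ARCHIVE"
    if age_days >= 90:
        return "COLDLINE"
    if age_days >= 30:
        return "NEARLINE"
    return "STANDARD"

for age in (5, 45, 200, 400):
    print(age, storage_class_for_age(age))
```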

Cost Comparison

| Storage Class | $/GB/month | Retrieval | Use Case |
| --- | --- | --- | --- |
| Standard | $0.020 | Free | Active backups (< 30 days) |
| Nearline | $0.010 | $0.01/GB | Monthly access (30-90 days) |
| Coldline | $0.004 | $0.02/GB | Quarterly access (90-365 days) |
| Archive | $0.0012 | $0.05/GB | Yearly access (> 365 days) |
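To see what the lifecycle rules buy you, here is a back-of-envelope estimate in Python using the prices from the table (prices are illustrative and change; check current GCS pricing). It approximates a month as 30 days and assumes a backup follows the schedule above: 30 days Standard, 60 days Nearline, then Coldline for the rest of the year:

```python
# $/GB/month by storage class, from the cost table above
RATES = {"STANDARD": 0.020, "NEARLINE": 0.010, "COLDLINE": 0.004, "ARCHIVE": 0.0012}

def yearly_cost_gb(size_gb: float) -> float:
    """Rough first-year storage cost for one backup under the
    lifecycle rules: 30d Standard, 60d Nearline, 275d Coldline."""
    phases = [("STANDARD", 30), ("NEARLINE", 60), ("COLDLINE", 275)]
    return sum(size_gb * RATES[cls] * (days / 30) for cls, days in phases)

print(f"${yearly_cost_gb(1000):.2f}")  # 1 TB retained for one year
```

For comparison, keeping the same 1 TB in Standard all year would cost 12 × $20 = $240, so the tiering roughly cuts the storage bill by two thirds (ignoring retrieval fees).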

Step 6: Enable Object Retention (Compliance)

For compliance requirements, configure retention policies.

Retention Policy

# Set 90-day retention (objects cannot be deleted for 90 days)
gcloud storage buckets update gs://your-company-firebackup-prod \
--retention-period=90d

# Lock the retention policy (CANNOT be undone!)
gcloud storage buckets update gs://your-company-firebackup-prod \
--lock-retention-period
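A quick sketch of the retention arithmetic, assuming the 90-day period above: an object becomes deletable only 90 days after it was written.

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)

def deletable_after(upload_time: datetime) -> datetime:
    """Earliest moment an object can be deleted under a 90-day policy."""
    return upload_time + RETENTION

uploaded = datetime(2024, 1, 15, tzinfo=timezone.utc)
print(deletable_after(uploaded).date())  # 2024-04-14
```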

Object Hold

# Enable default event-based hold
gcloud storage buckets update gs://your-company-firebackup-prod \
--default-event-based-hold

Warning

Locked retention policies cannot be removed or shortened. Only use for compliance requirements.


Step 7: Connect to FireBackup

Using the Dashboard

  1. Log in to FireBackup

  2. Navigate to Settings → Storage

  3. Click Add Storage Destination

  4. Select Google Cloud Storage

  5. Enter configuration:

    | Field | Value |
    | --- | --- |
    | Name | Production GCS |
    | Bucket | your-company-firebackup-prod |
    | Project ID | your-project-id |
    | Credentials | Upload or paste JSON key |
    | Path Prefix | backups/ (optional) |
  6. Click Test Connection

  7. Click Save

Using the API

# First, base64 encode the service account key
# (-w 0 disables line wrapping; omit it on macOS, where base64 does not wrap)
CREDENTIALS=$(base64 -w 0 < firebackup-sa-key.json)

# Create storage destination
curl -X POST https://api.firebackup.io/api/v1/storage \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"name": "Production GCS",
"type": "gcs",
"config": {
"bucket": "your-company-firebackup-prod",
"projectId": "your-project-id",
"credentials": "'"$CREDENTIALS"'",
"prefix": "backups/"
}
}'
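If you are scripting this in Python instead of curl, the request body can be built like so. This is an illustrative sketch that only mirrors the field names from the curl example above; `sample_key` is a stand-in for the contents of firebackup-sa-key.json:

```python
import base64
import json

def gcs_destination_payload(name: str, bucket: str, project_id: str,
                            key_json: bytes, prefix: str = "backups/") -> str:
    """Build the JSON body for POST /api/v1/storage."""
    return json.dumps({
        "name": name,
        "type": "gcs",
        "config": {
            "bucket": bucket,
            "projectId": project_id,
            # The API expects the raw key file base64-encoded
            "credentials": base64.b64encode(key_json).decode("ascii"),
            "prefix": prefix,
        },
    })

sample_key = b'{"type": "service_account"}'  # stand-in for the real key file
body = gcs_destination_payload("Production GCS", "your-company-firebackup-prod",
                               "your-project-id", sample_key)
```

Send `body` with your usual HTTP client, keeping the `Authorization: Bearer YOUR_API_KEY` header from the curl example.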

Step 8: Verify the Setup

Run a Test Backup

  1. Go to Projects in FireBackup

  2. Select a project

  3. Click Run Backup Now

  4. Select your GCS storage destination

  5. Monitor the progress

Verify in GCS

# List backups
gcloud storage ls gs://your-company-firebackup-prod/backups/ --recursive

# Check object details
gcloud storage objects describe \
gs://your-company-firebackup-prod/backups/proj_abc123/2024-01-15/backup_full_1705312245.enc

Expected output from the listing:

gs://your-company-firebackup-prod/backups/proj_abc123/2024-01-15/backup_full_1705312245.enc
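When auditing backups in bulk, it is handy to decompose object paths. A sketch that parses the layout inferred from the example listing above (prefix/project/date/backup_TYPE_UNIXTS.enc — an assumption about FireBackup's naming, based only on that one example):

```python
from datetime import datetime, timezone

def parse_backup_object(path: str) -> dict:
    """Split a backup object path into its components, e.g.
    backups/proj_abc123/2024-01-15/backup_full_1705312245.enc"""
    _prefix, project, date, filename = path.split("/")
    stem = filename.rsplit(".", 1)[0]          # backup_full_1705312245
    _, backup_type, ts = stem.split("_")
    return {
        "project": project,
        "date": date,
        "type": backup_type,
        "created": datetime.fromtimestamp(int(ts), tz=timezone.utc),
    }

info = parse_backup_object(
    "backups/proj_abc123/2024-01-15/backup_full_1705312245.enc")
print(info["project"], info["type"], info["created"].date())
```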

Troubleshooting

"403 Forbidden" Error

Cause: Service account lacks required permissions

Solution:

# Verify bucket permissions
gcloud storage buckets get-iam-policy gs://your-company-firebackup-prod

# Add missing permission
gcloud storage buckets add-iam-policy-binding gs://your-company-firebackup-prod \
--member="serviceAccount:firebackup-storage@PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/storage.objectUser"

"Bucket Not Found" Error

Cause: Bucket name typo or wrong project

Solution:

# List buckets in project
gcloud storage buckets list --project=YOUR_PROJECT_ID

# Verify bucket exists
gcloud storage buckets describe gs://your-company-firebackup-prod

"Invalid Credentials" Error

Cause: Malformed or expired service account key

Solution:

  1. Verify the JSON key file is complete and valid
  2. Check if the key has been disabled in IAM console
  3. Generate a new key if needed

Upload Failures for Large Backups

Cause: Resumable upload timeout

Solution:

  1. GCS automatically handles resumable uploads for large files
  2. Check network stability
  3. Verify no rate limiting is occurring
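When transient network errors do interrupt transfers, the standard remedy is retry with exponential backoff and jitter, the same pattern GCS client libraries apply to resumable-upload chunks. A generic Python sketch of that pattern (not FireBackup's actual implementation):

```python
import random
import time

def with_backoff(op, max_attempts: int = 5, base_delay: float = 1.0):
    """Retry a flaky operation, doubling the delay after each failure
    and adding jitter so concurrent clients don't retry in lockstep."""
    for attempt in range(max_attempts):
        try:
            return op()
        except (ConnectionError, TimeoutError):
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.5))
```

Call it as `with_backoff(lambda: upload_chunk(data))`, where `upload_chunk` is whatever transfer step you need to harden.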

Security Best Practices

Enable Audit Logging

# Enable data access audit logs
gcloud projects get-iam-policy YOUR_PROJECT_ID > policy.yaml
# Edit policy.yaml to add an auditConfigs entry for storage.googleapis.com
# (DATA_READ and DATA_WRITE log types), then apply:
gcloud projects set-iam-policy YOUR_PROJECT_ID policy.yaml

Use VPC Service Controls (Enterprise)

Create a service perimeter to restrict data access:

gcloud access-context-manager perimeters create firebackup-perimeter \
--title="FireBackup Perimeter" \
--resources=projects/YOUR_PROJECT_NUMBER \
--restricted-services=storage.googleapis.com \
--policy=POLICY_ID

Workload Identity (GKE)

For Kubernetes deployments, use Workload Identity instead of service account keys:

# Link Kubernetes service account to GCP service account
gcloud iam service-accounts add-iam-policy-binding \
firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com \
--role="roles/iam.workloadIdentityUser" \
--member="serviceAccount:YOUR_PROJECT_ID.svc.id.goog[firebackup/firebackup-sa]"

Next Steps