Google Cloud Storage Setup
This tutorial walks you through configuring Google Cloud Storage (GCS) as a storage destination for your FireBackup backups. GCS offers excellent integration with Firebase projects since both are Google Cloud services.
What You'll Learn
- Create and configure a GCS bucket with security best practices
- Set up a service account with least-privilege permissions
- Configure lifecycle policies for cost optimization
- Enable dual-region or multi-region storage for resilience
- Connect GCS to FireBackup
Prerequisites
- A Google Cloud account with billing enabled
- Project Owner or Editor role in your GCP project
- Access to FireBackup dashboard as an organization admin
- gcloud CLI installed (optional but recommended)
Time Required
Approximately 20-25 minutes
Step 1: Create a GCS Bucket
Using Google Cloud Console
1. Navigate to Cloud Storage
2. Click Create bucket
3. Configure bucket settings:

| Setting | Recommended Value |
|---|---|
| Name | your-company-firebackup-prod |
| Location type | Region, Dual-region, or Multi-region |
| Storage class | Standard |

4. Choose a location type:
   - Single region: lowest cost, single-region availability
   - Dual-region: automatic failover between two regions
   - Multi-region: highest availability across a continent
5. Choose a storage class:
   - Select Standard for frequently accessed backups
   - Lifecycle rules (Step 5) will transition older backups automatically
6. Set access control:
   - Select Uniform (recommended)
   - This enforces IAM-only access control
7. Configure protection tools:
   - Object versioning: Enable
   - Retention policy: configure based on compliance needs
8. Click Create
Using gcloud CLI
# Set your project
gcloud config set project YOUR_PROJECT_ID
# Create bucket (single region)
gcloud storage buckets create gs://your-company-firebackup-prod \
--location=us-central1 \
--uniform-bucket-level-access \
--public-access-prevention
# Or create with dual-region
gcloud storage buckets create gs://your-company-firebackup-prod \
--location=us \
--placement=us-central1,us-east1 \
--uniform-bucket-level-access \
--public-access-prevention
# Enable versioning
gcloud storage buckets update gs://your-company-firebackup-prod \
--versioning
# Set default storage class
gcloud storage buckets update gs://your-company-firebackup-prod \
--default-storage-class=STANDARD
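Before moving on, it's worth sanity-checking the new bucket. The command below simply prints the bucket's configuration so you can confirm the location, storage class, versioning, and uniform bucket-level access settings match what you intended:

# Review the bucket's configuration
gcloud storage buckets describe gs://your-company-firebackup-prod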
Step 2: Create a Service Account
Create a dedicated service account for FireBackup with minimal permissions.
Using Cloud Console
1. Navigate to IAM & Admin → Service Accounts
2. Click Create Service Account
3. Enter service account details:
   - Name: firebackup-storage
   - ID: firebackup-storage
   - Description: Service account for FireBackup storage access
4. Click Create and Continue
5. Skip the optional steps for now (we'll add bucket-specific permissions in Step 3)
6. Click Done
Using gcloud CLI
# Create service account
gcloud iam service-accounts create firebackup-storage \
--display-name="FireBackup Storage Access" \
--description="Service account for FireBackup to access Cloud Storage"
# Verify creation
gcloud iam service-accounts list --filter="email:firebackup-storage"
Step 3: Grant Bucket Permissions
Apply least-privilege permissions at the bucket level, not project level.
Using Cloud Console
1. Go to Cloud Storage
2. Click on your bucket name
3. Go to the Permissions tab
4. Click Grant Access
5. Add the service account and role:

| Principal | Role |
|---|---|
| firebackup-storage@PROJECT_ID.iam.gserviceaccount.com | Storage Object User |
Using gcloud CLI
# Grant Storage Object User role on the bucket
gcloud storage buckets add-iam-policy-binding gs://your-company-firebackup-prod \
--member="serviceAccount:firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/storage.objectUser"
Custom IAM Role (Optional)
For stricter access control, create a custom role:
# firebackup-role.yaml
title: "FireBackup Storage Role"
description: "Minimal permissions for FireBackup storage operations"
stage: "GA"
includedPermissions:
- storage.objects.create
- storage.objects.get
- storage.objects.list
- storage.objects.delete
- storage.multipartUploads.create
- storage.multipartUploads.abort
- storage.multipartUploads.listParts
Apply the custom role:
# Create custom role
gcloud iam roles create FireBackupStorage \
--project=YOUR_PROJECT_ID \
--file=firebackup-role.yaml
# Apply to bucket
gcloud storage buckets add-iam-policy-binding gs://your-company-firebackup-prod \
--member="serviceAccount:firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
--role="projects/YOUR_PROJECT_ID/roles/FireBackupStorage"
Step 4: Generate Service Account Key
Create a JSON key file for authentication.
Using Cloud Console
1. Go to Service Accounts
2. Click on the firebackup-storage service account
3. Go to the Keys tab
4. Click Add Key → Create new key
5. Select JSON format
6. Click Create
7. The key file downloads automatically. Store it securely!
Using gcloud CLI
# Create and download key
gcloud iam service-accounts keys create firebackup-sa-key.json \
--iam-account=firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com
# Verify key was created
cat firebackup-sa-key.json
Handle the key file carefully:

- Never commit service account keys to version control
- Store keys in a secure secrets manager
- Rotate keys regularly (every 90 days recommended; a rotation sketch follows this list)
- Delete unused keys immediately
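A minimal rotation sketch using standard gcloud commands (the new key's file name is illustrative; OLD_KEY_ID comes from the list output):

SA=firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com
# List existing keys and note the ID of the key being replaced
gcloud iam service-accounts keys list --iam-account="$SA"
# Create a replacement key
gcloud iam service-accounts keys create firebackup-sa-key-new.json --iam-account="$SA"
# After updating FireBackup with the new key, delete the old one
gcloud iam service-accounts keys delete OLD_KEY_ID --iam-account="$SA"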
Key File Contents
The JSON key file looks like:
{
"type": "service_account",
"project_id": "your-project-id",
"private_key_id": "key-id",
"private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
"client_email": "firebackup-storage@your-project-id.iam.gserviceaccount.com",
"client_id": "123456789",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token"
}
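Before handing the key to FireBackup, you can verify it works by authenticating with it locally. Note that this switches your active gcloud account, so switch back afterwards (YOUR_EMAIL is your own account):

# Authenticate as the service account and try listing the bucket
gcloud auth activate-service-account --key-file=firebackup-sa-key.json
gcloud storage ls gs://your-company-firebackup-prod/
# Switch back to your own account
gcloud config set account YOUR_EMAIL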
Step 5: Configure Lifecycle Rules
Set up lifecycle rules to automatically manage storage costs.
Using Cloud Console
1. Go to your bucket → Lifecycle tab
2. Click Add a rule
3. Configure the rules below:
Rule 1: Transition to Nearline
| Setting | Value |
|---|---|
| Action | Set storage class to Nearline |
| Condition | Age > 30 days |
Rule 2: Transition to Coldline
| Setting | Value |
|---|---|
| Action | Set storage class to Coldline |
| Condition | Age > 90 days |
Rule 3: Transition to Archive
| Setting | Value |
|---|---|
| Action | Set storage class to Archive |
| Condition | Age > 365 days |
Rule 4: Delete old versions
| Setting | Value |
|---|---|
| Action | Delete object |
| Condition | 3 or more newer versions exist |
Using gcloud CLI
Create a lifecycle configuration file named lifecycle.json:
{
"lifecycle": {
"rule": [
{
"action": {
"type": "SetStorageClass",
"storageClass": "NEARLINE"
},
"condition": {
"age": 30
}
},
{
"action": {
"type": "SetStorageClass",
"storageClass": "COLDLINE"
},
"condition": {
"age": 90
}
},
{
"action": {
"type": "SetStorageClass",
"storageClass": "ARCHIVE"
},
"condition": {
"age": 365
}
},
{
"action": {
"type": "Delete"
},
"condition": {
"numNewerVersions": 3
}
},
{
"action": {
"type": "AbortIncompleteMultipartUpload"
},
"condition": {
"age": 7
}
}
]
}
}
Apply the configuration:
gcloud storage buckets update gs://your-company-firebackup-prod \
--lifecycle-file=lifecycle.json
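To confirm the rules took effect, describe the bucket and look for the lifecycle section in the output:

# The lifecycle rules should appear in the JSON output
gcloud storage buckets describe gs://your-company-firebackup-prod --format=json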
Cost Comparison
| Storage Class | $/GB/month | Retrieval | Use Case |
|---|---|---|---|
| Standard | $0.020 | Free | Active backups (< 30 days) |
| Nearline | $0.010 | $0.01/GB | Monthly access (30-90 days) |
| Coldline | $0.004 | $0.02/GB | Quarterly access (90-365 days) |
| Archive | $0.0012 | $0.05/GB | Yearly access (> 365 days) |
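As a rough worked example using the rates above: a 100 GB backup set costs about $2.00/month in Standard, drops to $1.00/month after the 30-day transition to Nearline, $0.40/month in Coldline, and $0.12/month in Archive. The lifecycle rules therefore cut steady-state storage cost by roughly 94% for year-old backups, with retrieval fees applying only when you actually restore.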
Step 6: Enable Object Retention (Compliance)
For compliance requirements, configure retention policies.
Retention Policy
# Set 90-day retention (objects cannot be deleted for 90 days)
gcloud storage buckets update gs://your-company-firebackup-prod \
--retention-period=90d
# Lock the retention policy (CANNOT be undone!)
gcloud storage buckets update gs://your-company-firebackup-prod \
--lock-retention-period
Object Hold
# Enable default event-based hold
gcloud storage buckets update gs://your-company-firebackup-prod \
--default-event-based-hold
Locked retention policies cannot be removed or shortened. Only use for compliance requirements.
Step 7: Connect to FireBackup
Using the Dashboard
1. Log in to FireBackup
2. Navigate to Settings → Storage
3. Click Add Storage Destination
4. Select Google Cloud Storage
5. Enter the configuration:

| Field | Value |
|---|---|
| Name | Production GCS |
| Bucket | your-company-firebackup-prod |
| Project ID | your-project-id |
| Credentials | Upload or paste JSON key |
| Path Prefix | backups/ (optional) |

6. Click Test Connection
7. Click Save
Using the API
# First, base64-encode the service account key without line wrapping
# (GNU base64 wraps at 76 characters by default, which would corrupt the
# JSON payload below; plain `base64` on macOS already outputs one line)
CREDENTIALS=$(base64 -w 0 < firebackup-sa-key.json)
# Create storage destination
curl -X POST https://api.firebackup.io/api/v1/storage \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"name": "Production GCS",
"type": "gcs",
"config": {
"bucket": "your-company-firebackup-prod",
"projectId": "your-project-id",
"credentials": "'"$CREDENTIALS"'",
"prefix": "backups/"
}
}'
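If the request fails with a credentials error, first check that the encoded string round-trips back to valid JSON:

# Should print the start of the original key file
# (on older macOS, use base64 -D instead of --decode)
echo "$CREDENTIALS" | base64 --decode | head -c 80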
Step 8: Verify the Setup
Run a Test Backup
1. Go to Projects in FireBackup
2. Select a project
3. Click Run Backup Now
4. Select your GCS storage destination
5. Monitor the progress
Verify in GCS
# List backups
gcloud storage ls gs://your-company-firebackup-prod/backups/ --recursive
# Check object details
gcloud storage objects describe \
gs://your-company-firebackup-prod/backups/proj_abc123/2024-01-15/backup_full_1705312245.enc
Expected output from the ls command:
gs://your-company-firebackup-prod/backups/proj_abc123/2024-01-15/backup_full_1705312245.enc
Troubleshooting
"403 Forbidden" Error
Cause: Service account lacks required permissions
Solution:
# Verify bucket permissions
gcloud storage buckets get-iam-policy gs://your-company-firebackup-prod
# Add missing permission
gcloud storage buckets add-iam-policy-binding gs://your-company-firebackup-prod \
--member="serviceAccount:firebackup-storage@PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/storage.objectUser"
"Bucket Not Found" Error
Cause: Bucket name typo or wrong project
Solution:
# List buckets in project
gcloud storage buckets list --project=YOUR_PROJECT_ID
# Verify bucket exists
gcloud storage buckets describe gs://your-company-firebackup-prod
"Invalid Credentials" Error
Cause: Malformed or expired service account key
Solution:
- Verify the JSON key file is complete and valid
- Check if the key has been disabled in IAM console
- Generate a new key if needed
Upload Failures for Large Backups
Cause: Resumable upload timeout
Solution:
- GCS automatically handles resumable uploads for large files
- Check network stability
- Verify no rate limiting is occurring
Security Best Practices
Enable Audit Logging
# Export the current IAM policy
gcloud projects get-iam-policy YOUR_PROJECT_ID > policy.yaml
# Edit policy.yaml to add an auditConfigs section for storage.googleapis.com
# (see the sketch below), then re-apply the policy
gcloud projects set-iam-policy YOUR_PROJECT_ID policy.yaml
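The auditConfigs stanza to add should look roughly like this (standard GCP IAM policy format; it sits at the top level of policy.yaml):

auditConfigs:
- service: storage.googleapis.com
  auditLogConfigs:
  - logType: ADMIN_READ
  - logType: DATA_READ
  - logType: DATA_WRITE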
Use VPC Service Controls (Enterprise)
Create a service perimeter to restrict data access:
gcloud access-context-manager perimeters create firebackup-perimeter \
--title="FireBackup Perimeter" \
--resources=projects/YOUR_PROJECT_NUMBER \
--restricted-services=storage.googleapis.com \
--policy=POLICY_ID
Workload Identity (GKE)
For Kubernetes deployments, use Workload Identity instead of service account keys:
# Link Kubernetes service account to GCP service account
gcloud iam service-accounts add-iam-policy-binding \
firebackup-storage@YOUR_PROJECT_ID.iam.gserviceaccount.com \
--role="roles/iam.workloadIdentityUser" \
--member="serviceAccount:YOUR_PROJECT_ID.svc.id.goog[firebackup/firebackup-sa]"
Next Steps
- Configure backup schedules for automatic backups
- Set up PITR for point-in-time recovery
- Enable webhooks for notifications
Related
- AWS S3 Setup - Alternative cloud storage
- DO Spaces Setup - DigitalOcean storage
- Storage API Reference - API documentation