@firebase-backup-platform/compression

The compression package provides high-performance data compression utilities for FireBackup. It supports Brotli and Gzip algorithms with configurable compression levels.

Installation

npm install @firebase-backup-platform/compression
# or
yarn add @firebase-backup-platform/compression

Overview

This package offers:

  • Brotli compression - Best compression ratio, ideal for storage
  • Gzip compression - Faster compression, widely compatible
  • Streaming support - Memory-efficient processing of large files
  • Auto-detection - Automatically detect compression format

Compression Comparison

Algorithm | Ratio | Speed  | Best For
Brotli    | Best  | Slower | Long-term storage
Gzip      | Good  | Faster | Quick backups
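
To pick an algorithm programmatically, a small helper can encode this trade-off. The helper below is a sketch; the purpose labels are illustrative, not part of the package API:

import { compress, CompressOptions } from '@firebase-backup-platform/compression';

// Hypothetical helper: 'archive' favors ratio (Brotli), 'quick' favors speed (Gzip)
function optionsFor(purpose: 'archive' | 'quick'): CompressOptions {
  return purpose === 'archive'
    ? { algorithm: 'brotli', level: 9 }
    : { algorithm: 'gzip', level: 3 };
}

const archived = await compress(data, optionsFor('archive'));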

Compression Levels

Level        | Description
Low (1-3)    | Fast compression, larger files
Medium (4-6) | Balanced speed and compression
High (7-9)   | Best compression, slower
Max (10-11)  | Maximum compression (Brotli only)

Quick Start

Basic Compression

import { compress, decompress } from '@firebase-backup-platform/compression';

// Compress data
const data = Buffer.from(JSON.stringify(backupData));
const compressed = await compress(data);

console.log('Original size:', data.length);
console.log('Compressed size:', compressed.length);
console.log('Ratio:', ((1 - compressed.length / data.length) * 100).toFixed(1) + '%');

// Decompress
const decompressed = await decompress(compressed);
const restored = JSON.parse(decompressed.toString());

With Options

// Brotli compression (best ratio)
const brotliCompressed = await compress(data, {
  algorithm: 'brotli',
  level: 6
});

// Gzip compression (faster)
const gzipCompressed = await compress(data, {
  algorithm: 'gzip',
  level: 6
});

API Reference

compress(data, options)

Compresses data using the specified algorithm.

function compress(
  data: Buffer | string,
  options?: CompressOptions
): Promise<Buffer>

CompressOptions:

  • algorithm ('brotli' | 'gzip', default: 'brotli') - Compression algorithm
  • level (number, default: 6) - Compression level
  • chunkSize (number, default: 16384) - Processing chunk size

Compression Levels:

Algorithm | Min | Max | Recommended
Brotli    | 0   | 11  | 6
Gzip      | 1   | 9   | 6

Example:

// Maximum Brotli compression
const maxCompressed = await compress(data, {
  algorithm: 'brotli',
  level: 11
});

// Fast Gzip compression
const fastCompressed = await compress(data, {
  algorithm: 'gzip',
  level: 1
});

decompress(data, options)

Decompresses data (auto-detects algorithm).

function decompress(
  data: Buffer,
  options?: DecompressOptions
): Promise<Buffer>

DecompressOptions:

  • algorithm ('brotli' | 'gzip' | 'auto', default: 'auto') - Decompression algorithm

Example:

// Auto-detect compression format
const decompressed = await decompress(compressedData);

// Explicit algorithm
const brotliDecompressed = await decompress(compressedData, {
  algorithm: 'brotli'
});

detectAlgorithm(data)

Detects the compression algorithm used.

function detectAlgorithm(data: Buffer): 'brotli' | 'gzip' | 'none'

Example:

const algorithm = detectAlgorithm(compressedData);

if (algorithm === 'none') {
  console.log('Data is not compressed');
} else {
  console.log('Compression algorithm:', algorithm);
}

CompressionService Class

A high-level service for compression operations.

Constructor

new CompressionService(config?: CompressionConfig)

CompressionConfig:

  • algorithm (string, default: 'brotli') - Default algorithm
  • level (number, default: 6) - Default compression level
  • minSize (number, default: 1024) - Minimum size (bytes) to compress

Example:

const compressionService = new CompressionService({
  algorithm: 'brotli',
  level: 6,
  minSize: 1024 // Only compress data > 1KB
});

Methods

compress(data)

async compress(data: Buffer | string): Promise<CompressionResult>

CompressionResult:

interface CompressionResult {
  data: Buffer;
  algorithm: string;
  originalSize: number;
  compressedSize: number;
  ratio: number;
  compressed: boolean; // false if below minSize
}

decompress(data)

async decompress(data: Buffer): Promise<Buffer>

getStats()

getStats(): CompressionStats
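
The full CompressionStats shape is not spelled out here; judging from the usage below, a plausible shape (an assumption, not the package's declared type) is:

interface CompressionStats {
  totalCompressed: number; // assumed: cumulative compressed output (inferred from the example below)
  averageRatio: number;    // assumed: mean compression ratio across calls
}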

Example:

const service = new CompressionService();

// Compress
const result = await service.compress(data);
console.log(`Compressed ${result.originalSize} -> ${result.compressedSize} bytes`);
console.log(`Compression ratio: ${(result.ratio * 100).toFixed(1)}%`);

// Get cumulative stats
const stats = service.getStats();
console.log('Total compressed:', stats.totalCompressed);
console.log('Average ratio:', stats.averageRatio);
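
To restore a payload produced by the service, pass the stored bytes back through decompress(). This sketch assumes you persist the compressed flag from CompressionResult alongside the payload so you know whether decompression is needed:

// Skip decompression if the payload was stored uncompressed (below minSize)
const restored = result.compressed
  ? await service.decompress(result.data)
  : result.data;

console.log('Restored bytes:', restored.length);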

Streaming

For large files, use streaming compression:

createCompressStream(options)

Creates a compression transform stream.

function createCompressStream(options?: CompressOptions): Transform

Example:

import { createCompressStream, createDecompressStream } from '@firebase-backup-platform/compression';
import { pipeline } from 'stream/promises';
import fs from 'fs';

// Compress a file
await pipeline(
  fs.createReadStream('large-backup.json'),
  createCompressStream({ algorithm: 'brotli', level: 6 }),
  fs.createWriteStream('large-backup.json.br')
);

// Decompress a file
await pipeline(
  fs.createReadStream('large-backup.json.br'),
  createDecompressStream(),
  fs.createWriteStream('large-backup.restored.json')
);

Streaming with Progress

import { createCompressStream } from '@firebase-backup-platform/compression';
import { pipeline } from 'stream/promises';
import { Transform } from 'stream';
import fs from 'fs';

// Track progress against the known input size
const totalSize = fs.statSync('large-file.json').size;
let bytesProcessed = 0;

// Create progress tracking stream
const progressStream = new Transform({
  transform(chunk, encoding, callback) {
    bytesProcessed += chunk.length;
    const percent = (bytesProcessed / totalSize * 100).toFixed(1);
    console.log(`Progress: ${percent}%`);
    callback(null, chunk);
  }
});

await pipeline(
  fs.createReadStream('large-file.json'),
  progressStream,
  createCompressStream(),
  fs.createWriteStream('large-file.json.br')
);

Algorithm Comparison

Brotli vs Gzip

import { compress, benchmark } from '@firebase-backup-platform/compression';

// Benchmark both algorithms
const data = Buffer.from(JSON.stringify(largeObject));

const brotliResult = await benchmark(data, { algorithm: 'brotli', level: 6 });
const gzipResult = await benchmark(data, { algorithm: 'gzip', level: 6 });

console.log('Brotli:');
console.log(` Size: ${brotliResult.compressedSize} bytes`);
console.log(` Time: ${brotliResult.compressionTime}ms`);
console.log(` Ratio: ${(brotliResult.ratio * 100).toFixed(1)}%`);

console.log('Gzip:');
console.log(` Size: ${gzipResult.compressedSize} bytes`);
console.log(` Time: ${gzipResult.compressionTime}ms`);
console.log(` Ratio: ${(gzipResult.ratio * 100).toFixed(1)}%`);

Typical Results

Data Type | Brotli Ratio | Gzip Ratio | Brotli Speed | Gzip Speed
JSON      | 85-92%       | 80-88%     | Slower       | Faster
Text      | 70-85%       | 65-80%     | Slower       | Faster
Binary    | 30-50%       | 25-45%     | Slower       | Faster

Choosing a Compression Level

Level Selection Guide

// Level 1-3: Fast compression
// Good for: Real-time processing, temporary files
const fast = await compress(data, { level: 1 });

// Level 4-6: Balanced (recommended)
// Good for: Daily backups, regular storage
const balanced = await compress(data, { level: 6 });

// Level 7-9: High compression
// Good for: Long-term archival, bandwidth-limited transfers
const high = await compress(data, { level: 9 });

// Level 10-11: Maximum compression (Brotli only)
// Good for: Cold storage, rarely accessed archives
const max = await compress(data, { algorithm: 'brotli', level: 11 });

Level Benchmarks

Benchmarks for a 100MB JSON file:

Brotli:

Level | Size    | Time | Ratio
1     | 18.2 MB | 0.8s | 81.8%
6     | 12.5 MB | 3.2s | 87.5%
11    | 10.1 MB | 45s  | 89.9%

Gzip:

Level | Size    | Time | Ratio
1     | 22.1 MB | 0.5s | 77.9%
6     | 15.8 MB | 1.8s | 84.2%
9     | 15.2 MB | 6.5s | 84.8%
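
Results like these can be reproduced with the benchmark helper shown earlier. A sketch (exact timings will vary with hardware and input):

import { benchmark } from '@firebase-backup-platform/compression';
import fs from 'fs';

const data = fs.readFileSync('backup.json');

// Probe representative levels for each algorithm
for (const level of [1, 6, 11]) {
  const r = await benchmark(data, { algorithm: 'brotli', level });
  console.log(`brotli ${level}: ${r.compressedSize} bytes in ${r.compressionTime}ms`);
}
for (const level of [1, 6, 9]) {
  const r = await benchmark(data, { algorithm: 'gzip', level });
  console.log(`gzip ${level}: ${r.compressedSize} bytes in ${r.compressionTime}ms`);
}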

Memory Management

Chunk Processing

For very large files:

import { createCompressStream } from '@firebase-backup-platform/compression';

// Configure chunk size based on available memory
const compressStream = createCompressStream({
  algorithm: 'brotli',
  level: 6,
  chunkSize: 64 * 1024 // 64KB chunks
});

Memory Limits

// Set maximum memory for compression
const compressed = await compress(data, {
  algorithm: 'brotli',
  level: 6,
  maxMemory: 256 * 1024 * 1024 // 256MB limit
});

Error Handling

import {
  compress,
  decompress,
  CompressionError,
  DecompressionError,
  InvalidFormatError,
  UnsupportedAlgorithmError
} from '@firebase-backup-platform/compression';

try {
  const decompressed = await decompress(data);
} catch (error) {
  if (error instanceof InvalidFormatError) {
    console.error('Data is not in a recognized compression format');
  } else if (error instanceof DecompressionError) {
    console.error('Failed to decompress:', error.message);
  } else if (error instanceof UnsupportedAlgorithmError) {
    console.error('Unsupported algorithm:', error.algorithm);
  } else if (error instanceof CompressionError) {
    console.error('Compression error:', error.message);
  }
}

TypeScript Support

import {
  compress,
  decompress,
  detectAlgorithm,
  createCompressStream,
  createDecompressStream,
  CompressionService,
  CompressOptions,
  DecompressOptions,
  CompressionResult,
  CompressionStats,
  CompressionConfig,
  CompressionError
} from '@firebase-backup-platform/compression';

// Typed options
const options: CompressOptions = {
  algorithm: 'brotli',
  level: 6
};

// Typed results
const result: Buffer = await compress(data, options);

// Typed service
const service = new CompressionService({
  algorithm: 'brotli',
  level: 6
});

const compressionResult: CompressionResult = await service.compress(data);

Integration with Backup Core

import { BackupCore } from '@firebase-backup-platform/backup-core';

const backupCore = new BackupCore({
  firebase: { ... },
  storage: { ... },
  compression: {
    enabled: true,
    algorithm: 'brotli',
    level: 6
  },
  encryption: {
    enabled: true,
    key: encryptionKey
  }
});

// Backup flow: Export → Compress → Encrypt → Upload
const result = await backupCore.backup({
  collections: ['users']
});

Best Practices

// For scheduled backups (storage optimization)
// For scheduled backups (storage optimization)
const archiveConfig = {
  algorithm: 'brotli',
  level: 9
};

// For on-demand backups (speed priority)
const quickConfig = {
  algorithm: 'gzip',
  level: 4
};

// For real-time replication
const realtimeConfig = {
  algorithm: 'gzip',
  level: 1
};
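
Any of these configs can be passed straight to compress(), for example:

const archived = await compress(data, archiveConfig);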

Compression Order

Always compress before encrypting.

Why? Encrypted data looks random and is therefore incompressible; compressing first maximizes storage savings.
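
A minimal sketch of that ordering with Node's built-in crypto module (the key handling here is illustrative only; use proper key management in practice):

import { compress, decompress } from '@firebase-backup-platform/compression';
import { createCipheriv, createDecipheriv, randomBytes } from 'crypto';

const key = randomBytes(32); // illustrative AES-256 key
const iv = randomBytes(12);  // GCM nonce

// 1. Compress first, while the data still has structure to exploit
const compressed = await compress(data, { algorithm: 'brotli', level: 6 });

// 2. Then encrypt; the ciphertext looks random and would not compress
const cipher = createCipheriv('aes-256-gcm', key, iv);
const encrypted = Buffer.concat([cipher.update(compressed), cipher.final()]);
const authTag = cipher.getAuthTag();

// Restoring reverses the order: decrypt first, then decompress
const decipher = createDecipheriv('aes-256-gcm', key, iv);
decipher.setAuthTag(authTag);
const decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]);
const original = await decompress(decrypted);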

Skip Compression for Small Data

const service = new CompressionService({
  minSize: 1024 // Don't compress data < 1KB
});

// Small data may actually get larger after compression
// due to compression headers and metadata