Backup Capacity Calculator

Enter your Dataset Size, Data Change Rate, and Retention Policy — the number of full and incremental backups to keep per week, per month, and in total. Set your Compression and Deduplication ratios, your target Backup Windows, and your Storage Cost per TB/month. Click Calculate to see total backup storage requirements broken down by full, incremental, weekly, monthly, and total capacity — compared across non-compressed, compressed, deduped, and combined reduction scenarios. Required throughput and estimated backup duration over common link speeds are also computed.

Backup Configuration

Percentage of data changed between backups

e.g. 4 weekly + 12 monthly + 1 yearly = 17

GFS Retention Policy Reference

Retention Strategy        | Full/Week | Full/Month | Total Fulls | Incr/Week | Incr/Month | Total Incr
GFS — 1 Year              | 1         | 5          | 17          | 6         | 25         | 25
GFS — 7 Year Archive      | 1         | 5          | 89          | 6         | 25         | 25
Daily + Weekly (1 month)  | 1         | 1          | 4           | 6         | 25         | 25
Incremental Forever (CBT) | 0         | 0          | 14          | 0         | 0          | 30

GFS 1-Year: 4 weekly + 12 monthly + 1 yearly = 17 full backups. Incremental Forever (CBT): 1 initial full + 12 monthly synthetic + 1 yearly synthetic = 14 fulls, 30 daily incrementals.
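The retention counting above is simple addition per tier. A minimal sketch (the function names are illustrative, not part of the calculator):

```python
def total_fulls(weekly: int, monthly: int, yearly: int) -> int:
    """Total full backups retained across GFS tiers."""
    return weekly + monthly + yearly

def cbt_total_fulls(monthly_synthetic: int, yearly_synthetic: int) -> int:
    """Incremental Forever: one initial full plus synthetic fulls."""
    return 1 + monthly_synthetic + yearly_synthetic

print(total_fulls(4, 12, 1))      # GFS 1-Year -> 17
print(cbt_total_fulls(12, 1))     # Incremental Forever (CBT) -> 14
```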

Understanding Backup Storage Requirements

A backup capacity calculator is essential for storage architects and IT administrators who need to estimate how much disk space is required to store all backup data across a defined retention policy. Total backup storage requirements depend on the source dataset size, daily data change rate, number of full and incremental backups retained across each retention tier, and any data reduction techniques such as backup compression and deduplication. Accurately sizing your backup storage prevents both under-provisioning — which causes backup jobs to fail — and costly over-provisioning that wastes infrastructure budget. See also: estimate backup window duration.
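One common way to express that raw (pre-reduction) estimate, assuming each retained full is a complete copy of the dataset and each incremental is roughly dataset size times the daily change rate:

```python
def raw_backup_capacity_tb(dataset_tb: float, change_rate: float,
                           fulls_retained: int, incrementals_retained: int) -> float:
    """Raw backup capacity before compression or deduplication.

    change_rate is the daily fraction of data changed, e.g. 0.05 for 5%.
    """
    full_tb = dataset_tb * fulls_retained
    incr_tb = dataset_tb * change_rate * incrementals_retained
    return full_tb + incr_tb

# Example: 10 TB dataset, 5% daily change, 17 fulls and 25 incrementals retained
print(raw_backup_capacity_tb(10, 0.05, 17, 25))  # 170 + 12.5 = 182.5 TB
```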

The Grandfather-Father-Son (GFS) backup rotation scheme is the most widely used retention model in enterprise environments. In a GFS retention policy, daily incremental backups (sons) roll into weekly full backups (fathers), which are retained and eventually superseded by monthly archives (grandfathers), and finally by annual or long-term retention copies. This tiered approach balances backup storage efficiency against recovery point objectives (RPO). When calculating backup capacity, each GFS retention tier must be counted separately because backup generations are held for different durations before expiration.

Backup compression ratio and deduplication ratio are the two primary data reduction levers that determine how much physical storage is actually consumed on backup disk or tape. A 2:1 compression ratio means the backup software compresses each backup stream to half its original size before writing to the backup repository. A 3:1 deduplication ratio means that across all stored backups in the repository, only one-third of the total raw backup data is stored on disk — eliminating redundant data blocks shared between backup generations. Combined, these data reduction ratios can shrink backup storage requirements by 80–90% for structured data such as databases and virtual machine images, dramatically reducing backup infrastructure costs. See also: calculate backup storage cost per TB.
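Because the two ratios stack multiplicatively, the physical footprint is raw capacity divided by their product. A sketch, using the 2:1 compression and 3:1 deduplication figures above:

```python
def stored_tb(raw_tb: float, compression_ratio: float, dedup_ratio: float) -> float:
    """Physical capacity after compression and dedup (they stack multiplicatively)."""
    return raw_tb / (compression_ratio * dedup_ratio)

# 2:1 compression combined with 3:1 dedup gives a 6:1 overall reduction,
# i.e. roughly 83% less physical storage than the raw estimate.
print(stored_tb(182.5, 2.0, 3.0))  # ~30.4 TB
```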

Backup throughput — typically measured in GB/hour or Mbit/s — determines whether your backup jobs can complete within the available backup window. If the required backup throughput exceeds your storage network or target device capacity, backups will miss their window and create gaps in recovery point coverage. The backup window for a full backup is calculated by dividing the total dataset size by the target throughput. Always calculate required backup throughput alongside total storage capacity when designing new backup infrastructure, selecting a backup appliance, or evaluating cloud backup solutions. Related: Convert backup throughput units.

Key Concepts

GFS (Grandfather-Father-Son): A hierarchical backup retention scheme. Daily (son) backups roll into weekly (father) fulls, which roll into monthly (grandfather) archives retained for extended periods.

Incremental Backup: Captures only data changed since the last backup — either full or incremental. Requires chaining previous backups to restore. Smaller and faster than full backups.

Incremental Forever (CBT): One initial full backup followed entirely by incrementals. Synthetic fulls are assembled from existing backup data without re-reading the source. Common in VMware vSphere and Veeam environments using Change Block Tracking.

Data Change Rate: The percentage of total data modified between backup cycles. Typical enterprise workloads: file servers 2–5%, databases 5–15%, virtual machines 3–8% per day.

Backup Window: The scheduled time slot for backups to complete without impacting production. Required throughput = dataset size ÷ backup window hours.

Deduplication vs Compression: Deduplication removes duplicate data blocks across the entire backup repository (cross-backup). Compression reduces each backup stream individually using lossless algorithms. They stack multiplicatively.

StorageMath.org — Free data storage calculators and unit converters for storage professionals. Convert GB to TB, Mbps to MB/s, calculate RAID capacity, IOPS, transfer time, storage cost per TB, and deduplication ratios. Supports decimal (SI) and binary (IEC) standards.