A hybrid NAS and cloud setup is the most practical data protection strategy for home users and small businesses in 2026. By the end of this guide, you will have a clear framework for deciding which data belongs on your NAS and which belongs in the cloud, a step-by-step process for setting up automated cloud backup from Synology and QNAP devices, and a tested restore workflow so you know it actually works when you need it.
In short: Keep frequently accessed, large, and locally streamed data on your NAS. Push irreplaceable data, documents, photos, and critical files to the cloud as offsite backup. The NAS is your fast local storage tier. The cloud is your protection against the NAS failing. Neither one alone is a complete backup strategy.
Why Hybrid NAS and Cloud Is the Right Architecture
Pure cloud storage is too slow and too expensive for large data sets. Accessing 10TB of photos and video from a cloud service is painful, and monthly subscription costs for that much cloud storage add up fast. Pure NAS storage is fast and cheap per GB, but a NAS in your home is vulnerable to everything that can affect your home: theft, fire, flood, and hardware failure.
The hybrid model solves both problems by using each tier for what it does best. The NAS provides fast local access for active files, media streaming, and large repositories that would be impractical to access remotely. The cloud stores encrypted copies of irreplaceable data as an offsite safety net. You do not need to back up everything to the cloud, and you do not need everything fast. Understanding that distinction is what makes a hybrid setup both affordable and effective.
The planning framework has three steps: categorise your data by access pattern and replaceability, assign each category to the right storage tier, and automate the workflow so it runs without you thinking about it.
Step 1: Categorise Your Data
Before configuring anything, categorise your data across four dimensions:
- How frequently do you access it? Files you open daily behave differently to annual tax records.
- How large is it? A 10TB Plex library has different cost implications than 50GB of documents.
- How irreplaceable is it? Purchased media can be re-downloaded. Your children's photos cannot be recreated.
- How fast does access need to be? Streaming 4K video requires local throughput that cloud cannot reliably provide.
Running these dimensions against your actual data produces three categories: data that belongs primarily on the NAS, data that belongs in the cloud or synced to both, and data that needs to exist everywhere with full redundancy.
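The tiering decision above can be sketched as a small helper function. This is a hypothetical illustration of the framework, not a vendor tool; the category names and the 500GB threshold are assumptions chosen for the example.

```python
# Hypothetical sketch of the Step 1 tiering decision. The tier labels and
# the 500GB size threshold are illustrative assumptions, not fixed rules.

def assign_tier(irreplaceable: bool, size_gb: float, needs_fast_access: bool) -> str:
    """Map a data set's properties to a storage tier."""
    if irreplaceable and size_gb < 500:
        return "cloud + NAS"                  # small and irreplaceable: full redundancy
    if irreplaceable:
        return "NAS primary, cloud backup"    # large but irreplaceable: back it up anyway
    if needs_fast_access or size_gb >= 500:
        return "NAS only"                     # replaceable bulk data stays local
    return "NAS only, optional cloud"

print(assign_tier(True, 120, False))   # family photos -> cloud + NAS
print(assign_tier(False, 8000, True))  # Plex library  -> NAS only
```

Running your own folder inventory through a function like this makes the later backup-scope decisions mechanical rather than ad hoc.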
What to Keep on the NAS (Local Storage Tier)
The following data types belong on your NAS as primary storage, with cloud backup as the protection layer rather than active sync:
- Plex and media libraries: Large, frequently streamed, slow to upload. 4K content in particular requires local access to avoid buffering. A 20Mbps NBN upload is nowhere near fast enough to stream 4K remotely from cloud storage. Keep your media on the NAS.
- Virtual machine images and Docker containers: Large, frequently read and written, and extremely slow to restore from cloud. Run these locally, snapshot them periodically, and back up the snapshots.
- Active projects: Files you work on daily, creative projects in progress, large CAD or video editing projects. These need the low latency and throughput only local storage provides.
- Downloaded content and software archives: Re-downloadable content, ISO libraries, software installers, and game backups. This data is large but can be replaced if lost. It is a NAS-first candidate with optional cloud backup depending on your budget.
- Surveillance footage: High write rates and large volume make cloud storage expensive for continuous recording. Store footage on the NAS and, if you want cloud protection, upload only motion-triggered clips.
What to Push to the Cloud (Offsite Backup Tier)
The following data types should be backed up to cloud object storage as a priority, because they are either irreplaceable or small enough to upload efficiently:
- Personal photos and videos: Irreplaceable. Upload all photos and home videos to cloud object storage with versioning enabled. On a typical NBN 100 connection, 100GB of compressed photos uploads in under 12 hours. This is the most important cloud backup category for most home users.
- Documents and financial records: Tax returns, legal documents, insurance policies, contracts. Small in size and irreplaceable. These should be in cloud backup with a long version history. 10GB of documents costs pennies per month on any object storage service.
- Password vaults and authentication data: Sync to a dedicated cloud service (Bitwarden, 1Password) rather than relying on NAS backup. These need real-time sync, not nightly batch backup.
- System configuration and backup metadata: NAS configuration exports, Hyper Backup repository indices, and hypervisor configurations. Small, critical, and often overlooked. Back these up to the cloud alongside your data.
- Important email archives: If you archive email locally via IMAP, include that archive in your cloud backup scope.
The 3-2-1 rule applies here: 3 copies of irreplaceable data, on 2 different media types, with 1 copy offsite. Your NAS is one copy. An external drive or USB backup device is the second (different media). Cloud object storage is the third, offsite. If your NAS and your home are affected by the same event, the cloud copy survives.
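The 3-2-1 rule is simple enough to express as a check. The `Copy` structure below is a hypothetical model for illustration, not part of any NAS vendor's API:

```python
# Minimal sketch of a 3-2-1 compliance check: 3 copies, at least 2 media
# types, at least 1 offsite. The Copy model is a hypothetical illustration.
from dataclasses import dataclass

@dataclass
class Copy:
    media: str      # e.g. "nas", "usb", "cloud"
    offsite: bool

def meets_3_2_1(copies: list[Copy]) -> bool:
    return (
        len(copies) >= 3
        and len({c.media for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

setup = [Copy("nas", False), Copy("usb", False), Copy("cloud", True)]
print(meets_3_2_1(setup))  # True: NAS + external drive + offsite cloud
```

Note that a NAS with RAID is still a single `Copy` in this model, which is why RAID alone never satisfies the rule.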
What Can Optionally Go to Both (Sync or Backup Depending on Need)
Some data types sit between the two tiers and the right approach depends on your budget and risk tolerance:
- Photo and video archives beyond the most recent few years: If your library is in the terabytes, you may choose to back up only the last 2-3 years to cloud and keep older archives NAS-only with local redundancy. The most recent data is typically the most important.
- Purchased media (ripped Blu-rays, legally obtained content): Replaceable but time-consuming to re-rip. Cloud backup is optional depending on storage cost tolerance.
- Shared family or team files: Consider Synology Drive or QNAP Qsync for active collaboration files. These can sync to the cloud as part of a backup job while also being accessible remotely via the NAS.
Step 2: Choose Your Cloud Object Storage Provider
For NAS backup, object storage is the right cloud tier. The three leading options for home NAS use are Backblaze B2, Wasabi, and Cloudflare R2. Each has distinct pricing characteristics.
Backblaze B2 is the most NAS-friendly option: Synology's Hyper Backup has a native B2 connector, setup takes minutes, and B2 has supported versioning and object lock for years. Storage costs approximately USD $0.006/GB/month, with free egress to the Cloudflare network. For most home users backing up under 1TB, the monthly cost is under AUD $10.
Wasabi has no egress fees and a Sydney region, making it the better choice for larger archives where you expect to do regular restores, or for users who want their data onshore. The 90-day minimum storage charge per object means it is less suitable for backup jobs with high object churn and short retention windows.
Cloudflare R2 has permanently free egress and no minimum storage duration, but lacks object lock support as of early 2026, which limits ransomware protection. It suits technically confident users with QNAP devices (QNAP's Hybrid Backup Sync handles the S3 path more reliably than Synology Hyper Backup for R2).
Most home users starting out should begin with Backblaze B2 and revisit Wasabi once their repository exceeds 1TB.
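At B2's quoted rate of roughly USD $0.006/GB/month, the storage cost is easy to estimate up front. A back-of-envelope sketch (USD only; converting to AUD would require assuming an exchange rate):

```python
# Back-of-envelope monthly storage cost at Backblaze B2's quoted rate of
# approximately USD $0.006 per GB per month. Egress and API calls excluded.

B2_USD_PER_GB_MONTH = 0.006

def monthly_cost_usd(gb: float) -> float:
    return gb * B2_USD_PER_GB_MONTH

for gb in (100, 500, 1000):
    print(f"{gb}GB: USD ${monthly_cost_usd(gb):.2f}/month")
```

At 1TB this comes to about USD $6/month, consistent with the under-AUD-$10 figure above for sub-1TB repositories.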
Step 3: Set Up Automated Cloud Backup on Synology
Synology's Hyper Backup is the primary tool for configuring automated cloud backup. The following process uses Backblaze B2 as the destination, but the same workflow applies with modified credentials for Wasabi or R2.
- Create your Backblaze account and bucket. In the B2 account, create a private bucket. Note the bucket name and the endpoint URL shown on the bucket details page. Create an Application Key with Read and Write access scoped to that specific bucket. Save the Key ID and Application Key before closing the window (you cannot retrieve the Application Key again after creation).
- Open Hyper Backup on your Synology. In the Backup Wizard, select Backblaze B2 as the backup destination. Enter your Key ID, Application Key, and bucket name. Click Test Connection to verify access.
- Select your backup source folders. Include your photos, documents, and other irreplaceable data. Exclude large media libraries unless you have the budget and upload bandwidth to support them.
- Configure client-side encryption. Set a strong passphrase. Store it in a password manager and write it down in a physical location separate from your NAS. Losing this passphrase means losing access to your backup permanently.
- Set your backup schedule. Daily incremental backup at 2am is a reasonable default for most home users. Hyper Backup only uploads changed blocks after the initial seed, so daily jobs are fast once the repository is established.
- Configure retention policy. Keep at least 30 days of versions. This ensures that if ransomware encrypts your NAS and triggers a backup job, you can restore from a version before the encryption. 90 days is better for important data if your storage costs allow.
Hyper Backup tip: After your first backup completes, use Hyper Backup's Restore function to do a test restore of a small folder to a separate location on your NAS (not the original source location). Confirm the files are intact and readable. This takes 10 minutes and eliminates the risk of discovering a configuration error only when you desperately need to restore.
Step 4: Set Up Automated Cloud Backup on QNAP
QNAP's Hybrid Backup Sync (HBS 3) handles cloud backup from QTS or QuTS Hero NAS devices. The process is similar to Synology but uses S3-compatible endpoint configuration for most providers.
- Install and open Hybrid Backup Sync. It is a free download from the QNAP App Center if it is not already installed.
- Create a Storage Space. In HBS 3, navigate to Storage Spaces and add a new space. Select S3 as the type, enter the provider's endpoint URL, and supply your access key and secret key. For Backblaze B2, the S3-compatible endpoint is listed in your bucket settings. For Wasabi Sydney, the endpoint is s3.ap-southeast-2.wasabisys.com.
- Create a Backup Job. Go to Backup, create a new job, select the local source folders, and select your cloud storage space as the destination. Configure encryption, scheduling, and versioning.
- Enable Smart Versioning. QTS's Smart Versioning allows configuring a daily, weekly, and monthly snapshot count separately. For most home users, 30 daily versions and 12 monthly versions is a reasonable policy.
- Test the connection and run an initial manual backup to confirm credentials and permissions are correct before leaving it to run on schedule.
QNAP users on QuTS Hero benefit from ZFS snapshot integration. Configure scheduled ZFS snapshots on your storage pool alongside the Hybrid Backup Sync job. ZFS snapshots provide near-instant point-in-time recovery for most accidental deletions, while HBS 3 provides the offsite copy for disaster scenarios.
NBN Upload Speeds: Planning Your Initial Seed
Your NBN upload speed determines how long initial seeding takes. NBN 100 plans typically deliver 17-20 Mbps upload. NBN 250 plans typically deliver 20-25 Mbps upload. Higher-tier plans on FTTP with 50+ Mbps upload are available but less common for home users.
| Initial seed size and speed | Approximate upload time |
|---|---|
| 100GB at 20 Mbps upload (NBN 100) | 11 hours |
| 500GB at 20 Mbps upload (NBN 100) | 56 hours (2.3 days) |
| 1TB at 20 Mbps upload (NBN 100) | 111 hours (4.6 days) |
| 1TB at 50 Mbps upload (NBN 1000 FTTP) | 44 hours (1.8 days) |

These estimates assume the full source data size is uploaded. Hyper Backup and HBS 3 compress data before upload, so the actual upload volume, and therefore the seed time, will usually be lower.
For large initial seeds, throttle the upload speed in your backup application to avoid saturating your connection during business hours. Hyper Backup's bandwidth control settings let you limit upload speed to a specified Mbps. Set it to 50-70% of your available upload during the initial seed to keep your connection usable for other purposes, then remove the throttle for ongoing nightly incremental jobs.
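The arithmetic behind the seed-time table and the throttle suggestion is straightforward. A sketch, assuming no compression savings and using decimal units (1GB = 8000 megabits):

```python
# Seed-time and throttle arithmetic, assuming no compression savings.
# Uses decimal units: 1 GB = 8000 megabits.

def seed_hours(size_gb: float, upload_mbps: float) -> float:
    """Hours to upload size_gb at a sustained upload_mbps."""
    return size_gb * 8000 / upload_mbps / 3600

def throttle_mbps(upload_mbps: float, fraction: float = 0.6) -> float:
    """Cap the backup job at a fraction of available upload bandwidth."""
    return upload_mbps * fraction

print(f"{seed_hours(100, 20):.0f} hours")   # 100GB on NBN 100: ~11 hours
print(f"{throttle_mbps(20):.0f} Mbps cap")  # 60% of a 20 Mbps uplink
```

The 0.6 default fraction sits in the 50-70% range suggested above; enter the resulting figure into Hyper Backup's bandwidth control settings during the initial seed.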
Users on CGNAT connections (common with some NBN providers and all mobile broadband) are unaffected here. CGNAT blocks inbound connections to your NAS, which matters for remote access, but outbound upload to cloud storage works normally. Only remote access needs a workaround (Tailscale, Synology QuickConnect, or similar); backup jobs do not.
Step 5: Verify Your Backup Is Working
A backup that has never been tested is a hypothesis, not a backup. Run through this verification checklist after initial setup and schedule a repeat every six months:
- Check job completion logs. In Hyper Backup or HBS 3, review the last 30 days of job history. Every job should show Completed or Completed with warnings. Failed jobs need immediate investigation.
- Verify bucket contents. Log into your cloud storage provider's console and confirm objects are present in your bucket with recent modification dates.
- Test a partial restore. Select 3-5 random files from your backup, restore them to a different local folder (not the original source), and confirm they open correctly. Do this in Hyper Backup's Restore function or HBS 3's Restore mode.
- Test version rollback. If versioning is enabled, restore a file from 7 days ago and confirm that older version is accessible and intact. This verifies your ransomware protection is functional.
- Confirm the encryption passphrase. Attempt a test restore using your stored passphrase to confirm it is correct. A wrong passphrase discovered during a real emergency is a serious problem.
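The partial-restore check can be made rigorous by hashing files rather than just opening them. A hedged sketch, assuming Python 3 is available on the NAS or on a workstation with the relevant shares mounted; the function names and paths are illustrative:

```python
# Sketch of a restore-verification check: hash every file in the original
# source folder and confirm an identical copy exists in the restore
# destination. Function names and directory layout are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """SHA-256 digest of a file, read in chunks to handle large files."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def restored_files_match(source_dir: str, restore_dir: str) -> bool:
    """True if every file under source_dir has an identical copy in restore_dir."""
    src, dst = Path(source_dir), Path(restore_dir)
    for f in src.rglob("*"):
        if f.is_file():
            twin = dst / f.relative_to(src)
            if not twin.is_file() or sha256_of(twin) != sha256_of(f):
                return False
    return True
```

Run this against the original folder and the test-restore location; a `False` result means a file is missing or corrupt and the job needs investigating before you rely on it.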
Hardware Considerations for Hybrid Backup
Most current NAS devices are capable of running cloud backup jobs without impacting other tasks. Synology DS425+ (from $785 at Mwave, Scorptec, and others) and DS925+ (from $980) run Hyper Backup efficiently even during active use. The 2-bay DS225+ (from $538) suits smaller home setups.
On the QNAP side, the TS-433 (from $639 at Mwave Australia, Scorptec) handles concurrent backup jobs and local file serving without issue. For users who want additional protection through ZFS snapshots, QNAP models running QuTS Hero (which requires a minimum of 8GB RAM) provide an extra local recovery layer. The TS-464 (from $989) supports QuTS Hero and has enough RAM headroom to run deduplication and compression alongside backup jobs.
The NAS hardware choice matters less for cloud backup than for other use cases. Any NAS with a working cloud backup client and a reliable network connection will handle the task. Where hardware matters more is recovery speed: restoring 500GB from cloud to a fast NAS over a gigabit home network is much faster than restoring to a slower single-core device.
When purchasing a NAS from Australian retailers like Scorptec, PLE Computers, or Mwave, the pre-sales guidance available from specialist retailers is worth using. The backup configuration questions you have are the same questions their staff answer regularly. For first-time NAS buyers in particular, buying from a specialist retailer rather than a generic platform gives you a point of contact when configuration questions arise, which they will.
Troubleshooting Common Issues
Backup job fails with authentication error: Your API key permissions are likely too restrictive. The key needs Read, Write, Delete, and List access on the target bucket. Generate a new key with full bucket access and update the backup job credentials.
First backup job runs but subsequent incremental jobs upload far more data than expected: Check whether your NAS is writing temporary files or caches into the backup source folders. Plex database files, Synology package data, and Docker container logs should be excluded from backup scope. In Hyper Backup, use the exclusion filter to add directories that should not be backed up.
Upload speed is consistently much lower than expected: Check whether another process on your NAS is competing for bandwidth (e.g., a Plex transcoder, active file sync, or another backup task). Also confirm your router's QoS settings are not limiting the NAS's outbound traffic. If using Wi-Fi, a wired connection to the NAS is significantly more reliable for large backup jobs.
Restore test fails with decryption error: The passphrase stored in your password manager may have an extra space, incorrect capitalisation, or other subtle error. Type the passphrase manually rather than pasting to eliminate clipboard formatting issues. If the passphrase is genuinely wrong, there is no recovery path for an encrypted Hyper Backup repository. This is why testing early is critical.
Job reports completed but file count in bucket seems low: This is expected. Hyper Backup and HBS 3 store data in a proprietary, deduplicated repository format, not as individual files, so the object count in the bucket will not match your file count. Verify integrity using the built-in verify function within the backup application, not by comparing file counts.
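The reason object counts diverge from file counts can be shown with a toy deduplication model. This is not the actual Hyper Backup or HBS 3 repository format, just a sketch of the principle: files are split into chunks, and identical chunks are stored once, keyed by hash.

```python
# Toy model of deduplicated chunk storage: files are split into fixed-size
# chunks and identical chunks are stored only once, keyed by their hash.
# This illustrates the principle, not any vendor's actual repository format.
import hashlib

def store_chunks(files: dict[str, bytes], chunk_size: int = 4) -> dict[str, bytes]:
    """Return the unique chunks a deduplicating 'repository' would hold."""
    repo: dict[str, bytes] = {}
    for data in files.values():
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            repo[hashlib.sha256(chunk).hexdigest()] = chunk
    return repo

files = {"a.txt": b"hello world!", "b.txt": b"hello there!"}
repo = store_chunks(files)
print(len(files), "files ->", len(repo), "unique chunks")  # 2 files -> 5 unique chunks
```

The shared leading chunk is stored once, so two files produce five stored objects rather than two. Real backup repositories add encryption and metadata objects on top, pushing the bucket's object count even further from your file count.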
Do I need to back up my entire NAS to the cloud?
No. Back up what is irreplaceable first: personal photos, home videos, documents, financial records, passwords. Large media libraries (movies, TV shows, purchased games) can be excluded unless the cost and upload time are acceptable to you. The goal is to protect data that cannot be recreated if lost. Re-downloadable content and re-rippable media are lower priority. Start with your most critical folders and expand the backup scope over time as you assess cost and upload impact.
Can I use Google Drive, OneDrive, or Dropbox instead of object storage?
You can, but consumer cloud sync services are not designed for large encrypted backup repositories. They sync individual files rather than deduplicated chunks, which is inefficient for large data sets. They also have per-account storage limits that become expensive at scale. Object storage (B2, Wasabi, R2) is purpose-built for this use case and is cheaper per GB at scale. That said, for a small document and photo backup on a tight budget, Synology's native Google Drive and OneDrive connectors in Hyper Backup are functional and reasonable for under 100GB of data.
How do I back up my NAS configuration, not just my data?
Both Synology DSM and QNAP QTS include a configuration backup feature. On Synology, go to Control Panel, Update and Restore, and export a configuration backup. On QNAP, use the Backup/Restore section in the System Administration panel. Save these configuration files to both your NAS and include them in your cloud backup scope. In a hardware failure scenario, restoring the configuration to a replacement NAS is much faster than rebuilding from scratch. Store your Hyper Backup or HBS 3 encryption passphrase with this configuration export.
What happens to my cloud backup if my NAS is stolen or destroyed?
Your cloud backup is unaffected. Object storage is offsite, independent of your NAS hardware. With a new NAS, install Hyper Backup or HBS 3, configure it to connect to your existing bucket with the same credentials and encryption passphrase, and restore your data. The restore process to a new NAS over a typical NBN connection is slow (recovering 500GB at 100 Mbps download takes roughly 11 hours), but all your irreplaceable data is accessible. This is exactly the scenario the hybrid setup is designed to handle.
Should I use Synology C2 or QNAP-specific cloud services instead of third-party object storage?
Synology C2 and QNAP's cloud services are tightly integrated with their respective NAS platforms and simpler to configure. The trade-off is vendor lock-in and typically higher per-GB cost than independent object storage providers. If simplicity is the priority and you are committed to staying with one NAS brand long-term, the first-party service is a reasonable choice. If you want platform independence, lower cost at scale, or the ability to access your bucket from tools other than your NAS's backup application, independent object storage (B2, Wasabi, R2) is the better option.
Is my data protected if I only have it on the NAS with RAID?
No. RAID protects against drive failure. It does not protect against the NAS controller failing, ransomware encrypting all drives simultaneously, accidental deletion, fire, theft, or flood. A NAS with RAID is one copy of your data in one physical location. That is not a backup. The 3-2-1 rule exists because single-location storage fails in too many real-world scenarios. Cloud backup is the minimum offsite protection layer. A NAS alone, regardless of RAID configuration, is not a complete data protection strategy.
Comparing Backblaze B2, Wasabi, and Cloudflare R2 in detail? Read the full object storage comparison to choose the right cloud backup tier for your NAS.
Compare B2 vs Wasabi vs R2