I'm working on a NAS setup that replicates data every 3 hours to a secondary unit, but it's got me thinking about the risks. If a file gets created or changed right after a replication run and the main pool fails before the next one, that data could be gone for good.
The current setup uses standard scheduled replication, and I'm weighing options like more frequent syncs or automated snapshots to shrink that loss window. For context, the goal is a good balance between safety and not overloading the system, something in line with common practices for homelab storage.
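To make the "more frequent syncs plus snapshots" idea concrete, here's the kind of thing I'm picturing, as a minimal Python sketch. The paths and layout are hypothetical (local paths for simplicity; a real secondary unit would likely go over ssh), and it leans on rsync's --link-dest so unchanged files are hardlinked against the previous snapshot, keeping frequent runs cheap on space and transfer:

```python
#!/usr/bin/env python3
"""Sketch: timestamped rsync snapshots, hardlinked against the last run.

SOURCE and DEST_ROOT are placeholder paths; adjust to your own pools.
"""
import subprocess
from datetime import datetime, timezone
from pathlib import Path

SOURCE = "/mnt/tank/data/"                 # trailing slash: copy contents, not the dir itself
DEST_ROOT = Path("/mnt/backup/snapshots")  # where timestamped snapshots accumulate

def take_snapshot() -> Path:
    DEST_ROOT.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    dest = DEST_ROOT / stamp
    # Existing snapshots sort chronologically because of the stamp format.
    snapshots = sorted(p for p in DEST_ROOT.iterdir() if p.is_dir())
    cmd = ["rsync", "-a", "--delete"]
    if snapshots:
        # Hardlink any file unchanged since the newest snapshot instead
        # of copying it again.
        cmd.append(f"--link-dest={snapshots[-1]}")
    cmd += [SOURCE, str(dest)]
    subprocess.run(cmd, check=True)
    return dest

if __name__ == "__main__":
    print(f"snapshot written to {take_snapshot()}")
```

Run from cron at whatever interval feels right; since each run only transfers changed files, cutting the interval from 3 hours to 30 minutes shouldn't multiply the load by six.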
What's everyone else doing in their setups? Any experiences with tools like rsync for quicker transfers, or with setting up versioning so you can recover specific files? How do you decide on backup intervals based on your own usage, and have you run into performance hits from running them more often?
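On the versioning side, the piece I haven't settled on is retention. Here's a sketch of the pruning I'm considering, assuming the timestamped snapshot layout from the script above; keep_last is an arbitrary knob, not a recommendation:

```python
import shutil
from pathlib import Path

def prune_snapshots(root: Path, keep_last: int = 48) -> None:
    """Delete all but the newest `keep_last` snapshot directories.

    Assumes directory names sort chronologically (e.g. the
    YYYYMMDD-HHMMSS stamps used above) and keep_last >= 1.
    """
    snapshots = sorted(p for p in root.iterdir() if p.is_dir())
    for old in snapshots[:-keep_last]:
        shutil.rmtree(old)

# Example: keep two days' worth of half-hourly snapshots.
# prune_snapshots(Path("/mnt/backup/snapshots"), keep_last=96)
```

Because unchanged files are hardlinked between snapshots, keeping dozens of them mostly costs metadata, so I'd rather err on the side of keeping more. Curious how others pick their retention numbers.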
Open to tips and comparisons to make this more solid!