from Hacker News

Ask HN: What backup strategy do you use for your websites and databases?

by pankratiev on 7/14/11, 12:38 PM with 9 comments

  • by agj on 7/20/11, 12:57 PM

    For backups, I use rdiff-backup across 60G+ of user data, nightly. rdiff-backup uses librsync to transfer files, but it also handles incremental backups and seems fairly efficient at storing increments. On a given night, at most about 10% of the user data changes, and backups complete in less than 2 hours. Load is low enough that I could run it several times a day if I needed to.
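
    A minimal sketch of that kind of nightly schedule, as a hypothetical crontab fragment (the paths, host name, and 30-day retention are assumptions, not the commenter's actual setup):

    ```
    # Hypothetical crontab entries: nightly incremental backup of the
    # user data tree to a backup host, then pruning of old increments.
    # rdiff-backup's host::/path syntax runs the transfer over SSH.
    0 3 * * * rdiff-backup /srv/userdata backuphost::/backups/userdata
    30 5 * * * rdiff-backup --remove-older-than 30D backuphost::/backups/userdata
    ```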

    Databases should be properly dumped to a file before being backed up; copying a live database's files directly can produce an inconsistent, unrestorable copy.
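
    One way to arrange this, sketched as a hypothetical crontab entry that runs before the file-level backup (paths are assumptions; cron requires % to be escaped):

    ```
    # Hypothetical crontab entry: dump all MySQL databases to a dated,
    # compressed file an hour before the 3 a.m. file backup runs.
    # --single-transaction gives a consistent snapshot for InnoDB tables.
    0 2 * * * mysqldump --single-transaction --all-databases | gzip > /var/backups/mysql/all-$(date +\%F).sql.gz
    ```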

    I've also tried BackupPC, which was a great project but probably not the best fit for this case. I was running it in a virtualized container and ran into a lot of memory issues backing up large servers. That was likely down to the constrained environment rather than BackupPC itself, but I dropped it because backups commonly took around 6-8 hours, when they didn't silently hang on me.

  • by ScottWhigham on 7/14/11, 2:24 PM

    I make local backups and then have a routine that downloads those backups. I've relied on web hosts for backups one too many times...
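
    One shape such a download routine could take, as a hypothetical crontab entry on a separate machine you control (the host name and paths are assumptions):

    ```
    # Hypothetical crontab entry: pull the web host's backup directory
    # down each morning, so a server failure can't take the only copies
    # with it. rsync over SSH transfers only what changed.
    30 6 * * * rsync -a --delete backup@web01:/var/backups/ /srv/offsite/web01/
    ```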
  • by adyus on 7/14/11, 2:35 PM

    Hmm, given that the question was asked on HN, there could be a good business idea there... File and DB backups offered as SaaS.
  • by latch on 7/22/11, 10:03 AM

    I actually check that I can restore from my backups once a week (it's the first thing I do when I wake up Saturday morning).
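
    A minimal sketch of such a restore check, assuming tar.gz archives; the temporary directories below are stand-ins for the real data and backup locations:

    ```shell
    set -eu

    # Stand-in directories: in practice SRC would be the live data tree
    # and ARCHIVE the most recent nightly backup.
    SRC=$(mktemp -d)
    echo "hello" > "$SRC/file.txt"

    ARCHIVE="$(mktemp -d)/backup.tar.gz"
    tar -czf "$ARCHIVE" -C "$SRC" .

    # Restore into a scratch directory, never over the live tree.
    SCRATCH=$(mktemp -d)
    tar -xzf "$ARCHIVE" -C "$SCRATCH"

    # A backup you cannot restore is not a backup: fail on any difference.
    diff -r "$SRC" "$SCRATCH"
    echo "restore check passed"
    ```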
  • by sander on 7/16/11, 12:33 PM

    There actually is a service that backs up your website over FTP regularly; I'm just trying to think of the name...
  • by timdev on 7/20/11, 1:10 PM

    rsnapshot, with some scripts that prepare database dumps so the MySQL data is included in the snapshot. Works very well.
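
    A hypothetical rsnapshot.conf excerpt showing that pattern (the script path and dump directory are assumptions; rsnapshot requires tabs, not spaces, between fields):

    ```
    # Run a dump script before each snapshot, then include the dump
    # directory in the snapshot alongside the rest of the filesystem.
    backup_script	/usr/local/bin/dump-mysql.sh	mysql/
    backup	/var/backups/mysql/	localhost/
    ```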
  • by Zakuzaa on 7/17/11, 10:04 PM

    I rsync my backups to bqbackup.
  • by LaggedOut on 7/16/11, 5:12 PM

    Backups, who needs them ;)