I've got a standard laptop/desktop system with a single hard drive and a CD burner. I can't back up my entire system onto a CD, nor do I really want to. I would like to back up enough stuff so that I can rebuild my system without too much trouble. Here are a few things I know I want to back up:
But I know there are many other config files lying around that might be non-trivial to restore.
What is your backup policy? What do you back up?
I've seen several messages posted here from people who trashed important configuration files with etc-update and a bit too quick of a trigger finger. It seems to me that it would make sense to take regular snapshots of /etc especially right before an etc-update. There is a Snapshot Howto that shows how to do this (and more).
Anyway, it seems to me that most desktop users are probably not backing up as well or as often as they should. I've been thinking that a generic desktop backup ebuild might be very useful. All the tools are already in ebuilds; the backup ebuild would mostly be glue. At the very least it should take regular snapshots of /etc, and if a CD burner is available, remind the user (once a week?) to insert a blank CD and run some preconfigured program.
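As a sketch of the /etc snapshot idea, something like this could be fired from cron. The demo uses throwaway temp directories instead of the real /etc and backup partition, so the paths here are stand-ins, not a working policy:

```shell
#!/bin/sh
# Dated tarball snapshot of a config directory, as one might run
# right before etc-update. SRC and SNAPDIR are demo stand-ins; in
# real use they would be /etc and a directory on a backup partition.
set -e
SRC=$(mktemp -d)                       # stand-in for /etc
echo 'demo config' > "$SRC/make.conf"
SNAPDIR=$(mktemp -d)                   # stand-in for the backup area
STAMP=$(date +%Y%m%d-%H%M%S)
tar czf "$SNAPDIR/etc-$STAMP.tar.gz" -C "$SRC" .
```

A cron entry pointing at such a script, plus a wrapper that runs it before etc-update, would cover the quick-trigger-finger case.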
After Perl everything else is just assembly language.
Only one guy answered. That's pretty sad, I'd say.
Either no one else is using backups, or they do it without a clear policy...
C'mon now, give a newbie (me) some hints about how, when, where, and what he should back up in various environments.
I have a 120Gb drive and a 60Gb drive. The 60Gb drive holds tarballs of /, /usr, /opt, /home, and all of /downloads except movies. The only reason I have /usr and /opt backed up is because they're reiser4. / is backed up because I thought I was going to switch it to reiser4, but it turns out I can't very easily (no /boot partition). I do my backups manually, when I think enough stuff has changed that it's worth backing up again.
I adapted that tutorial that sets up daily and hourly historical snapshots to a server. Any time I go anywhere now, I sync the laptop beforehand via ssh (and burn the server's backups to CD every week or so). Also, cron jobs do some stuff.
I was just using rsync before, but that tutorial is cool: it hard-links anything that hasn't changed.
It is so convenient, every backup after the first is an incremental (i.e., fast) but appears to be a full backup.
I have two backup servers, each backs up all my important dirs (/root, /home, /etc, /data) every 4 hours, staggered, so everything gets backed up somewhere every 2 hours.
When restores are needed, you don't have to hassle with tar; just use 'ls -l' to find the file you want and 'cp' to restore it.
I wrote up some perl scripts to automate all this, they're fired by cron. I should post my scripts somewhere sometime...
I ended up making one script for my local lan and one script for when I was out. I think making a script that checks ifconfig and then fires off the local one would be a good idea, though I haven't gotten around to it. It could be something as simple as:
if ifconfig | grep -q 192.168.2.21; then doit.sh; fi
.. but I haven't gotten around to doing it up. Are you a laptop user?
I personally back up /etc, /usr/portage, and /home. Why? Well, /home is my users, obviously. /etc is for my configuration stuff that I'd just be too lazy to restore. /usr/portage because that's where portage lives, of course. The restore procedure is as follows:
Un-tar stage3 tarball. Un-tar backups. Re-merge everything I had installed w/ -k option:
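Roughly, that procedure outlines as follows (the tarball names here are examples, not the poster's actual files):

```
# 1. Unpack a stage3 into the target root
tar xjpf stage3-x86-*.tar.bz2 -C /mnt/gentoo
# 2. Unpack the backups of /etc, /home and /usr/portage over it
tar xzpf backup-etc.tar.gz -C /mnt/gentoo
tar xzpf backup-home.tar.gz -C /mnt/gentoo
tar xzpf backup-portage.tar.gz -C /mnt/gentoo
# 3. Re-merge everything recorded as installed, preferring prebuilt
#    binary packages (-k) where they exist
emerge -ek world
```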
I like the newer k3b and gaim better. If there is actually an easier way to do this, I would love to know. I was actually wondering whether copying your world file over and doing 'emerge -ke world' would work or not...
Great post! I just lost a drive over the weekend, and with it /etc, /usr, /home, and /var... I even tried sticking the drive in the freezer for a while... long story short, I re-installed...
I was looking around for backup systems and policies that didn't involve tar'ing my whole drive... that rsync tutorial sounds perfect! Incremental backups so that you're not backing up things that haven't changed is brilliant.
Thanks for the pointers, guys; this thread's timing is perfect.
There's a link to one guy who set up the scripts to go the reverse direction: he runs a command on his laptop and it pushes the data to the backup server. It still uses the rsync hard-linking idea; it just runs on demand.
Yes, that's what I do. It's convenient, but you have to remember to do it... what I do is ssh into the backup server. But I'd only want it to happen in a cron job if I'm local. I'll check out that link, thank you.
Oh, another thing I'm using with the backup server is shfs. It's a great module in general, and for this purpose it's very convenient. On another computer on the LAN (not the laptop), a backup script runs that backs up key directories to a home directory (that's really another partition...). The backup server mounts that "user"'s home folder over ssh, does the snapshot thing, and then releases it.
Shfs is pretty cool for working with remote files with all your favorite programs, too, it's very convenient.
I have cron jobs to back up my GnuCash data (every other day) and /home and /etc once a week. I wrote a little script using tar for cron to execute, saving the archive with the date in the name to a backup partition.
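That schedule could be expressed as crontab entries along these lines (the script paths are hypothetical, just to show the shape):

```
# Every other day at 03:00: back up the GnuCash data
0 3 */2 * *  /home/me/bin/backup-gnucash.sh
# Sundays at 04:00: tar up /home and /etc with the date in the name
0 4 * * 0    /home/me/bin/backup-home-etc.sh
```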
System backups happen whenever I feel like it. I think I'm going to start doing those more frequently, though, since I just got snagged by a bad backup. My last one was on 9/9/03, and I didn't want to go back that far. Thankfully, I had the others backed up with cron jobs.
I haven't had any luck with mondoarchive. It always fails: the backups get made, but I get shared-file errors from the CD when I try to run the compare step.
Partimage is starting to be a problem too. I can copy the image to CD and it cmp's fine, but it often chokes on the magic-string check and won't let you bypass that.
I am with some of the others and going to use a home-grown solution now.
So far I have succeeded in using afio to back up the system to a spare partition, borrowing a little from mondo's approach and breaking the job into pieces.
So far it compressed a system of 5.976251M to 2.069096M in 43m14s, and did a verify in 23m34s. That's just gzipping with afio, while still on the live system. Using bzip2 instead of gzip with afio, 7.157571M went to 1.963610M, but compression took 83m2s, so it didn't really save much space for the time it took. I'll keep the script on afio's default of gzip.
If my calculations are correct (reduction = 1 - compressed/original), gzip with afio gave a 65.37% reduction and bzip2 gave a 72.56% reduction. There is a mistake in there, though, because distfiles were included in the system size but not in the compressed files; I rsynced a few files in and removed a few. So the system size for the bzip2 run is probably closer to the gzipped one, which makes bzip2 look even worse: probably closer to a 67.14% reduction (a fudge factor for not having the distfiles in the compressed ones, but it should still be close, as I don't have that much in there). So I'll probably stick with gzip in afio.
Going to remove the distfiles and recalculate to see if correct.
Last edited by Decibels on Sat Jan 31, 2004 2:09 pm, edited 2 times in total.
Support bacteria – they’re the only culture some people have.
I use three external 200 GB SATA disks. I take a backup every day using robocopy to mirror the servers to the external disk, and at the end of the week I swap disks, so I always have three weeks of backups.
The disk is shared so it is easy to get the files back if you deleted one by mistake or for some other reason.
And yes, it is a Windows 2000 Server box, using Microsoft's Robocopy from the Resource Kit.
(It will change to Gentoo in the future, when I have more experience with it.)
Experience will come as you do stuff with it.
In November 2003 I had (almost) zero experience with Linux; now I have two servers in production (with different roles) and they are doing great.
So, my advice: learn as you go, BACK UP, and firewall.
Backup policy? Since installing Gentoo on my desktop, my policies died out quickly... Before learning the system and Linux in general (I'm a complete n00b), I had to reinstall, i.e. wipe out the whole system, several times.
I wrote a small backup script which saves the backups into my homedir tertian. Every now and then I save this to my MO drive. It should be more or less self-explanatory: all items in the compress array are saved to the given target dir...
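In that spirit, here is a minimal sketch of such a script. It is plain /bin/sh, so the "array" is a space-separated list, and the demo tars throwaway temp directories; in real use COMPRESS and TARGET would point at /etc, /home, and the backup partition:

```shell
#!/bin/sh
# Every item in COMPRESS is tarred into TARGET with the date in the name.
set -e
DEMO=$(mktemp -d)                    # demo stand-ins for /etc, /home, ...
mkdir -p "$DEMO/etc" "$DEMO/home"
echo x > "$DEMO/etc/conf"; echo y > "$DEMO/home/notes"
COMPRESS="$DEMO/etc $DEMO/home"      # in real use: /etc /home ...
TARGET=$(mktemp -d)                  # in real use: the backup target dir
STAMP=$(date +%Y-%m-%d)
for item in $COMPRESS; do
    tar czf "$TARGET/$(basename "$item")-$STAMP.tar.gz" \
        -C "$(dirname "$item")" "$(basename "$item")"
done
```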