Backing Up Linux Files For Archiving



I have a quick question:

I have a Linux host where I want to archive files with tar - what I want to do is:

1. Take ALL files in /home/bsbaker (including the dotfiles) and tar them up and give the file a name like KUS-home-xx-xx-xx.tar (with xx-xx-xx being the date of the backup file)

2. Take ALL files in /home/bsbaker/public_html/ (including the dotfiles) tar them up and give the file a name like KUS-web-xx-xx-xx.tar (with xx-xx-xx being the date of the backup file)

3. Take ALL files in /home/bsbaker/public_html/secure/ (including the dotfiles) tar them up and give the file a name like KUS-secure-xx-xx-xx.tar (with xx-xx-xx being the date of the backup file)

4. Take ALL files in /home/bsbaker/public_html/secure/dke/ (including the dotfiles) tar them up and give the file a name like DKE-web-xx-xx-xx.tar (with xx-xx-xx being the date of the backup file)

5. When completed, have a message mailed to my email account on the machine, telling me that the job is done. I want to do this MONTHLY, then download the tar files to my machine by going in and using SFTP.

Is there a script I can make to run this process? NOTE: I am NOT root on this machine - just a user - so the script has to do the backup with ordinary user permissions, from a single script run.

Any ideas, Team?

Brian


A bash script would work great for this. Linux really makes automating stuff relatively easy. Do you know how to do this manually? If so, just put the commands that you normally use into a list (or script). I could help you script it if you can show me, for example, what tar command you are using.


Questions:

  1. When you say "all files", does that include the files that will be in the other tarballs? For example, does the backup of /home/bsbaker include /home/bsbaker/public_html?
  2. Do you really need four tarballs? Could you tar up the whole $HOME and break it apart later if needed?
  3. What's the root directory of the tarballs? For example, should DKE-web expand to /home/bsbaker/public_html/secure/dke/, or public_html/secure/dke/, or dke/, or what?
  4. Does your host provide a less stupid way to back up user accounts? ;) rsync or something?

Questions:
  1. When you say "all files", does that include the files that will be in the other tarballs? For example, does the backup of /home/bsbaker include /home/bsbaker/public_html?
  2. Do you really need four tarballs? Could you tar up the whole $HOME and break it apart later if needed?
  3. What's the root directory of the tarballs? For example, should DKE-web expand to /home/bsbaker/public_html/secure/dke/, or public_html/secure/dke/, or dke/, or what?
  4. Does your host provide a less stupid way to back up user accounts? ;) rsync or something?

This is what I wanted.

My home directory tar file, with everything I have, is 64 MB. My public_html web directory is about 40 MB, and the secure directory is about 5 MB. I work for D&K Enterprises, and the secure/dke directory is their website, which I am running as a test on my Linux host. I have to back this up as well.

Rather than taking the whole 64 MB every time, I want to break it down so that /home/bsbaker is ONE file, /home/bsbaker/public_html/ is ANOTHER file, and /home/bsbaker/public_html/secure is a third file. If necessary I will add /home/bsbaker/public_html/secure/dke/ so that I pick up all the files in their website.

That way, it does not turn into 100 MB of tar file that I could lose, and if I wanted to restore just the web files, I could upload that TAR file and voilà!

Hope this makes sense :)

Brian
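[That restore path is indeed a one-liner. The sketch below builds a throwaway "web" tarball under /tmp as a stand-in for a real KUS-web archive (the file names and paths here are invented for the demo), then unpacks it somewhere else, the same way a downloaded backup would be restored.]

```shell
#!/bin/sh
# Build a stand-in for a KUS-web archive from a demo tree under /tmp.
mkdir -p /tmp/restore-demo/public_html
echo '<html></html>' > /tmp/restore-demo/public_html/index.html
tar -cf /tmp/KUS-web-demo.tar -C /tmp/restore-demo public_html

# Restoring just the web files: unpack that one tarball wherever needed.
mkdir -p /tmp/restored
tar -xf /tmp/KUS-web-demo.tar -C /tmp/restored
ls /tmp/restored/public_html
```

Because the archive stores paths relative to public_html's parent (thanks to -C at creation time), extraction recreates public_html under whatever directory you unpack in.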

A bash script would work great for this. Linux really makes automating stuff relatively easy. Do you know how to do this manually? If so, just put the commands that you normally use into a list (or script). I could help you script it if you can show me, for example, what tar command you are using.

Cool - a script is exactly what I need - I just need to know how to build it :)

The command I use is as follows (assuming my home directory is /home/bsbaker):

tar -cvf KUS-home-xx-xx-xx.tar * (tells the system to grab ALL files; however, I am missing the dotfiles [xx-xx-xx is the date of the file])

I also want to grab the dotfiles, and have this script email me when it has completed.

Then I do the same for the public_html directory and the public_html/secure/dke directory, because I am IN them when I tar up the files - after that, I download all of these :)

I use tcsh as my default shell, but I think I could use bash for this script :)

Thanks, shanenin :>

Brian
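[On the dotfile problem: it is the shell glob `*` that skips hidden files, not tar itself - tar never even sees them. Naming the directory instead of globbing picks up everything, dotfiles included. A small self-contained demo (the /tmp paths are just for illustration):]

```shell
#!/bin/sh
# Set up a directory containing one hidden file and one visible file.
mkdir -p /tmp/dotdemo
echo x > /tmp/dotdemo/.hidden
echo y > /tmp/dotdemo/visible

# Archive the directory by name, relative to its parent via -C.
# Everything inside comes along, dotfiles and all.
tar -cf /tmp/dotdemo.tar -C /tmp dotdemo

# Listing the archive shows both .hidden and visible.
tar -tf /tmp/dotdemo.tar
```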


The tar commands would be similar to

tar -cf KUS-home-${date}.tar --exclude 'public_html' ${dir}
tar -cf KUS-web-${date}.tar --exclude 'secure' ${dir}/public_html
tar -cf KUS-secure-${date}.tar --exclude 'dke' ${dir}/public_html/secure
tar -cf DKE-web-${date}.tar ${dir}/public_html/secure/dke

where

# sh
date=$(date +%F)
dir=/home/bsbaker
# tcsh
set date=`date +%F`
set dir=/home/bsbaker

except that the --exclude options have to be tweaked.
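[Putting the pieces of this thread together, here is one sketch of the whole monthly job. It runs against a throwaway tree under /tmp so it can be tried safely; for real use, point `dir` at /home/bsbaker, delete the demo-setup lines, and uncomment the mail line (this assumes the host provides a `mail` command, which would need to be confirmed). The `-C` flags keep archive member paths relative, so restores land wherever you unpack.]

```shell
#!/bin/bash
# --- demo setup: a stand-in for /home/bsbaker (remove for real use) ---
dir=/tmp/demo-home/bsbaker
mkdir -p "$dir/public_html/secure/dke"
echo home > "$dir/.profile"            # dotfile: included, since we tar the dir by name
echo web  > "$dir/public_html/index.html"
echo sec  > "$dir/public_html/secure/login.html"
echo dke  > "$dir/public_html/secure/dke/index.html"
# ----------------------------------------------------------------------

date=$(date +%F)
dest=/tmp/demo-backups
mkdir -p "$dest"

# Each tarball covers one level only; --exclude keeps the deeper
# levels out so they land in their own archives instead.
tar -cf "$dest/KUS-home-$date.tar"   -C "$dir/.."              --exclude 'bsbaker/public_html' bsbaker
tar -cf "$dest/KUS-web-$date.tar"    -C "$dir"                 --exclude 'public_html/secure'  public_html
tar -cf "$dest/KUS-secure-$date.tar" -C "$dir/public_html"     --exclude 'secure/dke'          secure
tar -cf "$dest/DKE-web-$date.tar"    -C "$dir/public_html/secure"                              dke

ls "$dest"
# Notify by local mail when done (uncomment for real use):
# echo "Backup done: $date" | mail -s "Monthly backup" bsbaker
```

To run this monthly without root, an ordinary user crontab entry such as `0 3 1 * * $HOME/bin/backup.sh` (added with `crontab -e`) would do, assuming the host allows user cron jobs.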

