backup - bash script to tar multiple domain folders

2014-07
  • Heihachi

    #!/bin/bash

    sitedir="$HOME/domains"
    logs="$HOME/site_backups/log"
    
    tbackups="$HOME/site_backups/today"
    ybackups="$HOME/site_backups/yesterday"
    
    echo "`date`"  > $logs/backups.log
    
    for i in `ls`; do
        cd $sitedir/$i
        tar -czf $tbackups/$i".tar.gz" /public_html >> $logs/backups.log
    done
    exit 0
    

    I want to back up every public_html folder inside a domain directory ($HOME/domains/site.com/public_html) and create a tar archive named after the domain in $HOME/site_backups/today. However, when I run my bash script I get these errors:

    test.sh: line 12: cd: /home/user/domains/log: No such file or directory
    test.sh: line 12: cd: /home/user/domains/test.sh: No such file or directory
    test.sh: line 12: cd: /home/user/domains/today: No such file or directory
    test.sh: line 12: cd: /home/user/domains/yesterday: No such file or directory
    

    Why does it cd to /home/user/domains/..? I specified $sitedir/$i, which should expand to /home/user/domains/domain.name.

  • Answers
  • John1024

    ls runs in whatever the current directory is. From your results, it must have been /home/user/domains/. If you want to work on the directories in $sitedir, change to that directory first:

    cd "$sitedir"
    for i in */ ; do
        ( cd "$i" && tar -czf $tbackups/${i%%/}".tar.gz" public_html >> $logs/backups.log ; )
    done
    

    By using */ in place of ls, the shell expands the pattern to a list of directories, which seems to be what you want, rather than just any file.

    Lastly, I put the processing of "$i" in parentheses (a subshell) so that, when the processing is over, the script is back in "$sitedir" and ready for the next iteration.
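
    Putting the two pieces together, a minimal sketch of the complete revised script could look like this. The directory names come from the question; the || exit guards and the 2>&1 redirection are assumptions added for illustration:

    #!/bin/bash

    sitedir="$HOME/domains"
    logs="$HOME/site_backups/log"
    tbackups="$HOME/site_backups/today"

    date > "$logs/backups.log"

    # Stop early if the source directory is missing.
    cd "$sitedir" || exit 1

    for i in */ ; do
        # Subshell: the outer loop stays in $sitedir after each iteration.
        (
            cd "$i" || exit
            tar -czf "$tbackups/${i%%/}.tar.gz" public_html >> "$logs/backups.log" 2>&1
        )
    done

    exit 0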


  • Related Question

    ubuntu - FTP from bash script running from cron not working
  • TheVillageIdiot

    I have a script that is run by cron to create a backup of a MySQL database and some files. After creating a tarball and encrypting it with openSSH, I have to put it on a remote FTP server. Following is the code for the FTP part:

        HOST='abcd.dyndns.biz'
        USER='username'
        PASSWD='password'
        FILE='myBack-'${LOCAL_HOST}'-'${DATENAME}'.enc.tar.gz'
        DIRNAME='/usr/local/backups/'
    
        cd ${DIRNAME}
    
        ftp -n ${HOST} <<END_SCRIPT
        quote USER ${USER}
        quote PASS ${PASSWD}
        cd backup
        lcd ${DIRNAME}
        put ${FILE}
        quit
    
    END_SCRIPT
    

    If I run the script directly from the command line with sudo ./mybackup.sh, it runs smoothly and puts the backup file on the FTP server, but when it runs from cron it has never put the file there, although the other pre-FTP and post-FTP steps (like consolidating the log file and emailing the outcome) work fine. I am not able to get anything from any log files either, or to pinpoint the cause.

    NOTE: Our dyndns.biz IP does not change, as we have a paid plan.
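
    A common first step when a script behaves differently under cron is to make the ftp session's output visible somewhere. This is only a sketch, not part of the original post, and the log path is an assumption:

    # Hypothetical debugging aid: -v makes ftp verbose, and everything it prints
    # (including errors) is appended to an assumed log file.
    ftp -nv ${HOST} >> /usr/local/backups/ftp.log 2>&1 <<END_SCRIPT
    quote USER ${USER}
    quote PASS ${PASSWD}
    cd backup
    lcd ${DIRNAME}
    put ${FILE}
    quit
    END_SCRIPT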


  • Related Answers
  • Teddy

    Use Curl's upload functionality instead:

    curl --upload-file "$FILE" --user "$USER:$PASSWD" "ftp://$HOST/backup/"
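
    As a follow-up (not part of the original answer), curl also reports failures through its exit status, which makes silent cron runs easier to diagnose. A sketch reusing the question's variables; the log path is an assumption:

    # --fail makes curl exit non-zero on FTP errors; stderr goes to an assumed log file.
    if ! curl --fail --silent --show-error --upload-file "$FILE" \
            --user "$USER:$PASSWD" "ftp://$HOST/backup/" 2>> /usr/local/backups/ftp.log
    then
        echo "$(date): upload of $FILE failed" >> /usr/local/backups/ftp.log
    fi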
    
  • Ярослав Рахматуллин

    Umm.. I see several reasons why it could fail.

    1. You don't know that ftp has opened a connection when you start sending input, though it may take care of that for you (I'm not familiar with your ftp program; it could be anything as far as I know).
    2. You may not be separating your FTP commands correctly; try adding newlines or semicolons, depending on what your program expects.
    3. You are not checking that the file actually exists before trying to put it (see the sketch after this list).
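
    A minimal sketch of the check from point 3, reusing the variables defined in the question (the log path is an assumption):

    # Hypothetical guard: skip the upload and record the problem if the archive is missing.
    if [ ! -f "${DIRNAME}${FILE}" ]; then
        echo "$(date): ${DIRNAME}${FILE} not found, skipping FTP upload" >> /usr/local/backups/ftp.log
        exit 1
    fi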