Shell script to copy all files recursively and upload them to remote FTP server

Categories: Backup, FTP | Last updated: April 9, 2008
#!/bin/bash
# Shell script to copy all files recursively and upload them to 
# remote FTP server (copy local all directories/tree to remote ftp server)
#
# If you want to use this script from cron, make sure you have the
# file pointed to by $AUTHFILE (see below) and add these lines to it:
# host ftp.mycorp.com
# user myftpuser
# pass mypassword
# 
# This is a free shell script under GNU GPL version 2.0 or above
# Copyright (C) 2005 nixCraft
# Feedback/comment/suggestions : http://cyberciti.biz/fb/
# -------------------------------------------------------------------------
# This script is part of nixCraft shell script collection (NSSC)
# Visit http://bash.cyberciti.biz/ for more information.
# -------------------------------------------------------------------------
 
FTP="/usr/bin/ncftpput"
CMD=""
AUTHFILE="/root/.myupload"
 
if [ -f "$AUTHFILE" ]; then
  # Use the auth file for ftp login details; when running non-interactively
  # (e.g. from cron), also set $remotedir and $localdir in this script.
  CMD="$FTP -m -R -f $AUTHFILE $myf $remotedir $localdir"
else
  echo "*** To terminate at any point hit [ CTRL + C ] ***"
  read -p "Enter ftpserver name : " myf
  read -p "Enter ftp username : " myu
  read -s -p "Enter ftp password : " myp
  echo ""
  read -p "Enter ftp remote directory [/] : " remotedir
  read -p "Enter local directory to upload path [.] : " localdir
  [ "$remotedir" == "" ] && remotedir="/" || :
  [ "$localdir" == "" ] && localdir="." || :
  CMD="$FTP -m -R -u $myu -p $myp $myf $remotedir $localdir"
fi
 
$CMD
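
For unattended runs (for example from cron, as the header comments mention), the file pointed to by $AUTHFILE supplies the login details and a crontab entry starts the script on a schedule. A minimal sketch follows; the host, credentials, script path, and schedule are placeholders, and, as a commenter notes below, $remotedir and $localdir should also be set inside the script when the interactive prompts are skipped:

# /root/.myupload -- login details read by ncftpput's -f option
# (keep this file private, e.g. chmod 600 /root/.myupload)
host ftp.mycorp.com
user myftpuser
pass mypassword

# Example crontab entry (edit with: crontab -e)
# Run the upload script every Sunday at 3:00 AM; the script path is a placeholder.
0 3 * * 0 /root/bin/ftpupload.sh >/dev/null 2>&1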


Comments

  1. Hi All,
    We are using the command as follows:

    /usr/bin/ncftpput -u $username -p $passwd

    But it does not overwrite a file on the remote server if it is already present.
    Any suggestions???

  2. Is there a way to skip, sir? It’s very important, as I’m doing a massive amount of files and our destination server’s connection isn’t reliable. Cheers

  3. Hi, I have a shell script to sftp files from one server to another. When executed manually it works fine, but when the script is scheduled with crontab it fails.
    Below is the piece of code. Please help me in solving this issue. Thanks in advance. (A cron-friendly approach is sketched after the comments below.)

    #!/bin/bash
    USER=username
    PASSWORD=password
    HOST=hostname
    dt=`date +%Y%m%d`
    fldt=`date --date "$dt 1 days ago" +%Y%m%d`
    ldir=/home/localdir

    sudo -u xyz sftp $USER@$HOST <<EOF
    $PASSWORD
    cd rmtdir
    mget filename_$fldt*.dat.txt $ldir
    quit
    EOF
    if [ $? -ne 0 ]; then
      echo "sftp failed" | mailx -s "`date` :Sftp failed" pavan@ymail.com
    else
      echo "sftp success" | mailx -s "`date` :Sftp success" pavan@ymail.com
    fi
    exit 0
    
  4. Hello,
    I had to delete “$myf” in the line below:
    CMD="$FTP -m -R -f $AUTHFILE $myf $remotedir $localdir"

    Then it all went well.

    Thank you for a good and simple script!
    /Pelle

  5. Can we add this script to cron and have it back up automatically? I am looking to back up an entire directory every week, at around 3 am when there is the least amount of traffic, but I don’t want to have to be around for this to happen.

  6. I am taking the server name from a file and everything is working fine, but my problem is that if a server in the list is not connecting, the script does not move on to the next available server. How can I get the FTP status in the script, and if FTP is not connecting, how can I skip to the next available server? (See the failover sketch after the comments below.)

  7. If using the AUTHFILE option, don’t forget to set the following variables in this script:

    myf="my-ftp-server.com"
    remotedir="/my/remote/dir"
    localdir="/my/local/dir"

    And yes, you will need ncftp installed if it is not already.
    For OS X with MacPorts:
    sudo port install ncftp
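
Regarding the sftp-and-cron question in comment 3: sftp does not read a password from standard input, which is usually why such a script works at an interactive terminal but fails under cron. A minimal cron-friendly sketch using key-based authentication and an sftp batch file is shown below; the user, host, remote directory, local directory, file pattern, and mail address are placeholders carried over from that comment, and it assumes an SSH key has already been set up for the account the cron job runs as:

#!/bin/bash
# Cron-friendly sftp download: relies on SSH key authentication
# (no password on stdin), so it behaves the same under cron as it
# does at an interactive terminal.
USER=username
HOST=hostname
fldt=$(date --date "1 days ago" +%Y%m%d)
ldir=/home/localdir
batch=$(mktemp)

# Build the sftp batch file; shell variables are expanded here,
# before sftp runs the commands.
cat > "$batch" <<EOF
cd rmtdir
get filename_${fldt}*.dat.txt $ldir
quit
EOF

# -b runs the batch non-interactively, and sftp exits non-zero on failure.
if sftp -b "$batch" "$USER@$HOST"; then
    echo "sftp success" | mailx -s "$(date) :Sftp success" pavan@ymail.com
else
    echo "sftp failed" | mailx -s "$(date) :Sftp failed" pavan@ymail.com
fi
rm -f "$batch"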

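
For the question in comment 6 about skipping to the next server: ncftpput exits with a non-zero status when a transfer fails, so the script can test that status and try the next host from the list. A minimal sketch, assuming a plain-text file with one server name per line and the same -m/-R options used by the script above; the file name, credentials, and directories are placeholders:

#!/bin/bash
# Try each FTP server listed in $SERVERS (one host name per line)
# until an upload succeeds; servers that fail are skipped.
FTP="/usr/bin/ncftpput"
SERVERS="/root/ftp-servers.txt"
myu="myftpuser"
myp="mypassword"
remotedir="/"
localdir="/path/to/upload"

while read -r server; do
  [ -z "$server" ] && continue   # skip blank lines
  if $FTP -m -R -u "$myu" -p "$myp" "$server" "$remotedir" "$localdir"; then
    echo "Upload to $server succeeded."
    break                        # stop after the first server that works
  else
    echo "Upload to $server failed (exit status $?), trying next server..." >&2
  fi
done < "$SERVERS"

Drop the break if the files should be uploaded to every reachable server rather than just the first one that works.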