
Move a complete directory from one server to another (with tar)

Category: Linux   — Published by tengo on March 11, 2008 at 11:52 am

There are multiple ways of moving folders from one machine to another. You could connect via FTP, download everything and then upload the folder to the other machine. But this approach suffers from slow connections and an intermediate step that is not really needed. So the best and easiest way is to transfer the data directly between the servers.

In order to do so, use this set of commands. To prepare for the process, connect via SSH to each of the machines; then, on the target machine, change to the directory where the files need to end up.

Next prepare the source machine: go to the directory you'd like to copy and do this:

tar --dereference -cvf archive.tar . && gzip archive.tar

or via tar's compression switch directly (lowercase "z" is for gzip; the capital "Z" switch selects the older compress format instead):

tar --dereference -czvf archive.tar.gz .

Here we use --dereference because, for example in the /etc dir, many files are just symlinks, and since the target machine may be laid out differently, we could end up with many dead links in the archive. So we make sure the actual data is there, losing the "just a link" information.
Also, you might get an error when you try to wrap up too many files. In this case use this variant:

find . -iname '*' > tarfiles.list
tar -cvf archive.tar --files-from tarfiles.list && gzip archive.tar
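The effect of --dereference described above can be seen in a small self-contained demo (the /tmp paths and file names here are hypothetical, just for illustration):

```shell
# Hypothetical demo paths under /tmp.
mkdir -p /tmp/deref-demo/src /tmp/deref-demo/out
cd /tmp/deref-demo/src
echo "real data" > real.txt
ln -sf real.txt link.txt

# With --dereference, link.txt is stored as a regular file containing the
# target's data; without it, only the symlink itself would be archived.
tar --dereference -czf ../archive.tar.gz .

tar -xzf ../archive.tar.gz -C ../out
ls -l ../out/link.txt    # a regular file, not a symlink
```

On the target machine, the extracted link.txt holds the real contents even if /tmp/deref-demo/src/real.txt does not exist there.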

The result is a tar archive, which we will now move over to the destination directory on the other machine. To do so, switch to your SSH session on the target machine, change into the destination directory and enter:

wget http://<name of source machine>/<source directory>/archive.tar.gz

A progress bar will be shown and the file will arrive. (This assumes the source directory is reachable through a web server running on the source machine; see the note at the end if it is not.) After that you need to extract the files:

tar -xzvpf archive.tar.gz

(the -p switch tries to recreate the original permissions.) On older systems (this guide was tested on Debian 4.0), two steps are sometimes needed to achieve the same:

gunzip archive.tar.gz && tar -xvf archive.tar
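Before extracting, it can be worth verifying that the archive arrived intact: run md5sum (or sha256sum) on both machines and compare the hashes. A sketch with stand-in files (the /tmp names are hypothetical; in practice you would run one command per machine):

```shell
# Stand-in for the archive on the source machine:
echo "pretend this is the archive" > /tmp/demo-archive.tar.gz
# Stand-in for the file that wget downloaded on the target machine:
cp /tmp/demo-archive.tar.gz /tmp/demo-arrived.tar.gz
# The two printed hashes must match before you extract:
md5sum /tmp/demo-archive.tar.gz /tmp/demo-arrived.tar.gz
```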

Done. Finally clean up your temporary files on both machines with:

rm archive.tar.gz tarfiles.list

(tarfiles.list exists only on the source machine, and only if you used the --files-from variant.)

A note at the end

Once I had the problem that the source server refused to hand out the file via wget (403 Forbidden). When you encounter this problem, there are two other ways to move the data:

1.

Do the move via FTP, but with an FTP session that you start on the destination machine (only possible if an FTP daemon is running on the source machine):

To make things easier, move the archive file from the subdirectory to the FTP root:

mv archive.tar.gz /path/to/ftp/root/archive.tar.gz

then connect via FTP:

ftp
FTP> open sourceserver.com
FTP> *username*
FTP> *password*
FTP> get archive.tar.gz

2.

Also possible is the move via secure copy (scp); this approach skips the tar-ing of the directory altogether:

Change into destination dir (on the target machine), then:

scp -rp source_username@<source ip>:/path/to/directory/to/move .

(the "." at the end says: move the data into the current directory)

Please remember that scp will create all files locally under the current user, so you might want to switch to the correct user via su beforehand.

In case you want to resume a formerly interrupted sync, or if you don't want to compress the files beforehand, read this post to find out how to use rsync to transfer a directory.
