The fastest (easiest) way to move files from server-to-server.
Uploading an entire directory of files can take a long time with the traditional FTP method, but if you have SSH access to your server, transferring with wget (or SCP) is much faster. You'll also need to know where on the remote machine (the one receiving the directories/files) the data should land, so you don't overwrite anything already stashed there, like source code for instance!
This approach requires a bit more technical skill, which may deter some people given how intuitively we use these tools nowadays without thinking about their complexity. But it lets us do our jobs efficiently and make full use of the available resources, rather than wasting hours waiting around.
It could take hours to download a 20GB file over FTP. With wget, the transfer can be done in just minutes!
Steps to using wget:
- Make file publicly available on source server
This means placing it somewhere within public_html (RHEL/CentOS) or www (Debian/Ubuntu).
Placing it in a public directory makes it available at a frontend URL (e.g. https://domain.com/backup.tar.gz).
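A minimal sketch of this first step on the source server. The paths and the `jimbob` account are placeholders; adjust them to your own layout:

```shell
# On the source server: bundle the data into one compressed archive.
# (/home/jimbob/site-data is an example path.)
tar -czf backup.tar.gz -C /home/jimbob site-data

# Move it somewhere web-accessible so wget can reach it later.
mv backup.tar.gz /home/jimbob/public_html/
```

One archive means one HTTP request on the other end, which is the whole point of this method.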
- Download file from destination server.
From receiving server, navigate to the directory you’d like to download it to.
Type wget https://domain.com/backup.tar.gz – to download external file to your current directory.
(Of course, your file path and filename may vary.)
Wait for it to finish transferring. It should be super quick.
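Put together, the download step looks like this on the destination server (the directory and domain are placeholders):

```shell
# Navigate to wherever the archive should land.
cd /home/jimbob/public_html

# Pull the file down over HTTP(S) into the current directory.
wget https://domain.com/backup.tar.gz
```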
- Change file ownership to match correct USER:GROUP.
Type ls -la to see what the other files are owned by. You’ll probably see other files listed as jimbob:jimbob or jimbob:nobody, while the one you just downloaded is probably root:root (or whichever Linux sudo user you used).
Type chown YOURuser:YOURgroup backup.tar.gz – using the correct user, group, and filename.
Don’t forget to do this. Otherwise, you won’t be able to extract the archive later!
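The ownership fix and extraction, sketched with the same placeholder user from above:

```shell
# Check current ownership; the archive will likely show root:root
# if you downloaded it while sudo'd.
ls -la backup.tar.gz

# Hand it to the account that owns the rest of the site files.
# "jimbob:jimbob" is a placeholder; match whatever ls showed.
chown jimbob:jimbob backup.tar.gz

# Now the archive can be safely unpacked by that user.
tar -xzf backup.tar.gz
```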
You can use wget 123.123.123.123/wp-content/backup.tar.gz --header "Host: domain.com" – a nice little trick to download from a specific IP. Useful during DNS changes that haven’t propagated yet.
If you're unable to compress files on the source server, use rsync instead. I have two guides that will help get your downloads going:
- Rsync Guide #1 – How To Compress Files Using Rsync For Downloading Large Amounts Of Data In Seconds (Without Losing Speed)
Large transfers can also be slow and even disconnect before you download the whole file. You can try rate-limiting your wget.
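For flaky or slow connections, wget's built-in flags help: --limit-rate caps the bandwidth, and -c (--continue) resumes a partial download instead of starting over. The URL below is a placeholder:

```shell
# Cap the transfer at 1 MB/s and resume if the connection drops.
wget --limit-rate=1m -c https://domain.com/backup.tar.gz
```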