Hey fellow Linux enthusiasts! Today, I want to share my favorite command-line tools for downloading and sharing files. As someone who practically lives in the terminal, I've found these tools to be indispensable in my daily workflow.
Downloading Files: wget and curl
When it comes to grabbing files from the web, two tools always come to mind: wget and curl.
wget: My Reliable Downloader
I love wget for its simplicity and power. Here's how I typically use it:
# Basic download
wget https://example.com/file.zip
# Resuming an interrupted download
wget -c https://example.com/large-file.iso
# Downloading with a custom filename
wget -O my-file.zip https://example.com/file-with-long-name.zip
Pro tip: If you're downloading a file over a flaky connection, try wget --tries=0 --retry-connrefused. It'll keep retrying until the download succeeds.
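Combining that with the -c flag from earlier gives a download that survives both dropped connections and interrupted sessions. A sketch (the URL is just a placeholder):

```shell
# Resume partial downloads, retry forever, and pause between attempts
wget -c --tries=0 --retry-connrefused --waitretry=10 \
  https://example.com/large-file.iso
```

The --waitretry option adds a delay between retries so you're not hammering a struggling server.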
curl: More Than Just Downloads
While wget is great, curl is my Swiss Army knife for web interactions:
# Basic file download
curl -O https://example.com/file.txt
# Following redirects (super handy!)
curl -L -O https://example.com/redirected-file.txt
# Downloading with authentication
curl -u username:password -O https://example.com/secure-file.zip
I often use curl when I need to inspect headers or make specific HTTP requests alongside downloads.
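For instance, here's how I peek at what a server will send before committing to a download (example.com stands in for a real host):

```shell
# Fetch only the response headers (sends a HEAD request)
curl -sI https://example.com/file.zip

# Show full request/response details while discarding the body
curl -v -o /dev/null https://example.com/
```

The -I output tells you things like Content-Length and Content-Type up front, which is handy before starting a large download.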
Sharing Files: scp and sftp
When it comes to moving files between machines, security is key. That's why I stick with scp and sftp.
scp: Quick and Secure File Transfers
scp is my go-to for quick file transfers:
# Copying a local file to a remote system
scp /path/to/local/file user@remote:/path/to/destination
# Copying from remote to local
scp user@remote:/path/to/remote/file /path/to/local/destination
# Transferring entire directories
scp -r /local/directory user@remote:/remote/path
I find scp particularly useful for its simplicity – it works just like cp, but over SSH.
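A few scp options I reach for regularly; note that user@remote is a placeholder and the port and rate values are just examples:

```shell
# Copy over a non-standard SSH port (scp uses capital -P, unlike ssh)
scp -P 2222 file.txt user@remote:/path/to/destination

# Preserve modification times, access times, and permissions
scp -p file.txt user@remote:/path/to/destination

# Limit bandwidth to 1000 Kbit/s so the transfer doesn't hog the link
scp -l 1000 big-file.iso user@remote:/path/to/destination
```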
sftp: When You Need More Interaction
For more complex file management tasks, I turn to sftp:
# Connect to remote system
sftp user@remote
# Once connected:
get remotefile.txt # Download file
put localfile.txt # Upload file
ls # List remote files
lls # List local files
I love sftp for its interactive nature. It's perfect when I need to browse remote directories or perform multiple file operations.
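When the same session needs to run unattended, sftp also has a batch mode that reads commands from a file. The file names here are made up for illustration:

```shell
# batch.txt contains one sftp command per line, e.g.:
#   get remotefile.txt
#   put localfile.txt
sftp -b batch.txt user@remote
```

With -b, sftp aborts on the first failed command, which makes it safer for scripting than piping commands in by hand.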
Bonus Tip: Downloading from a List of URLs
Sometimes, I find myself needing to download multiple files from a list of URLs. Here's a neat trick I use with both wget and curl to handle this scenario:
Using wget with a file list
If you have a file (let's call it urls.txt) containing URLs, each on a separate line, you can use wget like this:
wget -i urls.txt
This command tells wget to read the URLs from the file and download each one. It's super handy when you're dealing with a bunch of files!
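A minimal end-to-end sketch of what that looks like (the URLs are placeholders):

```shell
# Create the list, one URL per line
cat > urls.txt <<'EOF'
https://example.com/file1.zip
https://example.com/file2.zip
EOF

# Download everything on the list
wget -i urls.txt
```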
Using curl with a file list
Curl can also handle this task, though it requires a bit more setup:
xargs -n 1 curl -O < urls.txt
This command uses xargs to feed each line from urls.txt to curl, which then downloads the file.
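One nice twist on the curl approach: xargs can run several downloads in parallel, which plain wget -i can't do on its own (the -P value here is just an example):

```shell
# Download up to 4 files at a time
xargs -n 1 -P 4 curl -sO < urls.txt
```

Newer curl releases (7.66 and later, if I remember right) also have a built-in --parallel (-Z) flag that achieves something similar without xargs.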
My personal preference
Honestly, I tend to lean towards wget for this task. It's simpler and doesn't require additional commands. Plus, wget has some cool features for this scenario:
# Download files in the background
wget -b -i urls.txt
# Limit the download speed (useful to avoid network saturation)
wget --limit-rate=500k -i urls.txt
# Continue partially downloaded files
wget -c -i urls.txt
These options make wget incredibly flexible when working with file lists.
Wrapping Up
These four tools – wget, curl, scp, and sftp – cover almost all of my file transfer needs in Linux. They're powerful, secure, and incredibly versatile. Plus, using them from the terminal just feels so much more efficient than clicking around in a GUI.
What are your favorite file transfer tools in Linux? Do you use these, or do you have some hidden gems I should know about? Drop a comment below – I'm always eager to learn new tricks!
Until next time, happy file transferring!