A few years ago, I ran a simple script to delete old log files.
At least, that’s what I thought I was doing.
In reality, a tiny mistake wiped an entire directory, causing hours of recovery work. If you've ever seen a Bash command go horribly wrong, you're not alone.
New Bash users often fall into avoidable traps—from missing quotes to unexpected variable expansions. Some mistakes cause silent failures, while others can delete important files in seconds.
Here are five common Bash mistakes, how they break scripts, and the right way to fix them.
1. Forgetting to Quote Variables (Breaks Scripts in Unexpected Ways)
Skill Level: Beginner 🔹
The Problem
Leaving variables unquoted leads to word splitting and glob expansion, causing unpredictable results.
What Can Go Wrong?
```bash
filename="my file.txt"
rm $filename   # ❌ Breaks if filename contains spaces
```
If `filename="my file.txt"`, Bash interprets this as:
```bash
rm my file.txt   # ❌ Tries to delete "my" and "file.txt" separately
```
The Fix
Always quote variables to prevent unwanted splitting:
```bash
rm "$filename"   # ✅ Safe, deletes "my file.txt" as expected
```
💡 Real-World Debugging Example:
A friend once wrote a backup script that copied files based on a user-provided variable:
```bash
cp $source_dir /backup/
```
It worked fine until someone entered a directory name with spaces, and nothing got copied.
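You can see the splitting directly by counting arguments. This is a minimal sketch; `count_args` is a throwaway helper, not part of any script above:

```bash
#!/usr/bin/env bash
# Print how many arguments the function actually receives.
count_args() { echo $#; }

filename="my file.txt"
count_args $filename     # unquoted: word splitting produces two arguments → 2
count_args "$filename"   # quoted: the value stays one argument → 1
```

The unquoted form is what turns `rm $filename` into `rm my file.txt`, two separate delete targets.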
2. Using `rm -rf *` Without Thinking (A Disaster Waiting to Happen)
Skill Level: Beginner 🔹
The Problem
New Bash users sometimes run dangerous commands without safeguards.
What Can Go Wrong?
```bash
cd /var/www
rm -rf *   # ❌ Destroys everything in the directory, no confirmation
```
Accidentally running this in the wrong directory can delete critical files.
The Fix
- Use `rm -i` for an interactive prompt:
```bash
rm -i *
```
- Enable `noclobber` mode, which stops `>` redirection from silently overwriting existing files (note: it does not protect against `rm`):
```bash
set -o noclobber
```
- Instead of deleting files permanently, use a trash command:
```bash
sudo apt install trash-cli   # Install trash-cli (Debian/Ubuntu)
trash-put important_file.txt
```
💡 Real-World Debugging Example:
An engineer at a startup once ran:
```bash
rm -rf / var/log
```
An accidental space turned `/var/log` into two arguments, `/` and `var/log`, so instead of just clearing logs the command wiped the entire filesystem.
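One way to reduce this risk is to wrap bulk deletions in a guard that refuses to run outside an allowed location. This is a hedged sketch, not a standard tool: `safe_clean` and its allowed prefixes are made up for illustration, and `--` ends option parsing so a path starting with `-` is not misread as a flag:

```bash
#!/usr/bin/env bash
# Hypothetical guard: only delete under explicitly whitelisted prefixes.
safe_clean() {
  local target="$1"
  case "$target" in
    /tmp/*|/var/www/*) ;;   # allowed prefixes (an assumption for this sketch)
    *) echo "Refusing to clean '$target'" >&2; return 1 ;;
  esac
  rm -rf -- "$target"
}
```

Calling `safe_clean /etc` would fail loudly instead of deleting anything, while `safe_clean /tmp/build-cache` proceeds.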
3. Looping Over `ls` Output (Why This Breaks Scripts)
Skill Level: Intermediate 🔹🔹
The Problem
Parsing `ls` output in a loop causes unexpected failures, especially with filenames that contain spaces or special characters.
What Can Go Wrong?
```bash
for file in $(ls *.txt); do
  echo "Processing $file"
done
```
Why This Fails:
- The `$(ls ...)` output is word-split, so a filename with spaces becomes multiple loop items.
- Filenames containing newlines or glob characters get mangled further.
The Fix
Use `find` or plain globbing instead:
```bash
for file in *.txt; do
  echo "Processing $file"
done
```
Or, using `find`:
```bash
find . -name "*.txt" -print0 | while IFS= read -r -d '' file; do
  echo "Processing $file"
done
```
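One caveat with the glob version: if no `.txt` files exist, the pattern `*.txt` is passed through literally and the loop runs once with `file="*.txt"`. Bash's `nullglob` option makes an unmatched glob expand to nothing instead:

```bash
#!/usr/bin/env bash
# With nullglob set, an unmatched *.txt expands to zero words,
# so the loop body simply never runs in an empty directory.
shopt -s nullglob
for file in *.txt; do
  echo "Processing $file"
done
```

Remember that `shopt` settings apply to the current shell, so set it near the top of the script rather than inside a subshell.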
💡 Real-World Debugging Example:
A DevOps engineer once wrote a script to bulk rename logs but found that files with spaces were being mangled in the process. Switching to `find -print0` solved it.
4. Assuming `[` and `[[` Work the Same Way (They Don’t)
Skill Level: Intermediate 🔹🔹
The Problem
Many beginners assume `[` and `[[` are identical, but they have critical differences.
What Can Go Wrong?
```bash
file="myfile.txt"
if [ -f $file ]; then   # ❌ Breaks if $file is empty or contains spaces
  echo "File exists"
fi
```
The Fix
Use `[[ ... ]]` instead of `[ ... ]` whenever possible:
```bash
if [[ -f "$file" ]]; then   # ✅ Safe, avoids expansion issues
  echo "File exists"
fi
```
💡 Real-World Debugging Example:
A bug in a backup script resulted in deleting the wrong files because a test failed silently due to `[ ... ]` behaving unexpectedly.
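The failure mode is worth seeing once. With an empty variable, the unquoted `[ -f $file ]` collapses to the one-argument test `[ -f ]`, which just checks that the string `-f` is non-empty and is therefore true. `[[ ... ]]` does not word-split, so it behaves as intended:

```bash
#!/usr/bin/env bash
file=""

# Unquoted single bracket: expands to [ -f ], which is TRUE.
if [ -f $file ]; then echo "single-bracket: claims the file exists"; fi

# Double bracket: no word splitting, the -f test runs as written.
if [[ -f $file ]]; then
  echo "double-bracket: file exists"
else
  echo "double-bracket: no such file"
fi
```

This is exactly how a condition can "pass" silently and send a script down the wrong branch.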
5. Not Handling Errors Properly (`set -e` Can Save You Hours of Debugging)
Skill Level: Advanced 🔹🔹🔹
The Problem
By default, Bash scripts keep running even if a command fails, leading to unexpected behavior.
What Can Go Wrong?
```bash
#!/bin/bash
mkdir /backup
cp important_file.txt /backup
echo "Backup complete!"
```
If `cp` fails, the script still prints "Backup complete!", misleading the user.
The Fix
Use `set -e` to exit on errors:
```bash
#!/bin/bash
set -e
mkdir /backup
cp important_file.txt /backup
echo "Backup complete!"
```
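Many scripts go a step further than plain `set -e`. A common convention (a sketch, not the article's script; the backup path here is a hypothetical default) also fails on unset variables and broken pipelines, and reports where the script died:

```bash
#!/usr/bin/env bash
set -euo pipefail                              # exit on error, unset vars, pipeline failures
trap 'echo "Error on line $LINENO" >&2' ERR    # report the failing line before exiting

backup_dir="${1:-./backup}"   # hypothetical argument; defaults to ./backup
mkdir -p "$backup_dir"
echo "Backup directory ready: $backup_dir"
```

With this preamble, a typo like `cp important_file.txt "$bakup_dir"` aborts immediately on the unset variable instead of copying to the wrong place.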
💡 Real-World Debugging Example:
A CI/CD pipeline once deployed broken code because a failed command was ignored, causing a major production outage.
Final Thoughts: Avoid These Bash Pitfalls and Write Better Scripts
These mistakes waste hours of debugging time, but once you learn to avoid them, your scripts will be more reliable and secure.
Quick Recap:
✅ Quote variables to prevent unexpected splitting
✅ Use `[[ ... ]]` instead of `[ ... ]` for safer conditionals
✅ Avoid `rm -rf *` without safeguards
✅ Use `set -e` to catch errors early
✅ Never loop over `ls` output; use `find` or globbing instead
🚀 Master Bash Faster with This Cheat Book!
Want to boost your productivity and avoid Googling the same Bash commands over and over? My Bash Scripting Cheat Book is the ultimate quick-reference guide for everyday tasks like:
- File handling, process management, and networking
- Regex, text manipulation, and troubleshooting techniques
- Essential Bash utilities (`jq`, `find`, `grep`, `awk`) explained concisely
👉 Get the Bash Cheat Sheet for just $3.99
Discussion: What’s the Worst Bash Mistake You’ve Ever Made?
Drop a comment below and share your biggest Bash scripting fail—so others can learn from it!