Table of Contents
- Not Checking Command Exit Statuses
- Incorrect Variable Handling and Unquoted Expansions
- Misusing set -e (Exit on Error)
- Poor Loop and File Iteration Practices
- Hardcoding Paths and Environment Assumptions
- Silent Failures: Lack of Error Messaging
- Ignoring Edge Cases (e.g., Empty Inputs, Undefined Variables)
- Security Risks: Command Injection and Unsanitized Inputs
- Neglecting Testing and Validation
- Conclusion
1. Not Checking Command Exit Statuses
The Pitfall
Bash scripts often execute commands sequentially, but by default, Bash ignores the success or failure of individual commands. A single failed command (e.g., a missing file, permission denied) can cause subsequent commands to run with invalid data, leading to silent failures or data corruption.
Why It Happens
Many users assume commands “just work” or forget to explicitly check if a command succeeded. For example:
#!/bin/bash
cp importantfile /backup/ # If this fails, the script proceeds anyway
rm importantfile # Now we’ve lost the file!
The Fix
Always check the exit status of critical commands. Bash uses $? to store the exit status of the last command (0 = success, non-zero = failure). For stricter control:
- Use set -e (or set -o errexit) to make the script exit immediately if any command fails.
- Use set -o pipefail to ensure a pipeline fails if any command in the pipeline fails (not just the last one).
- Explicitly check exit statuses with if statements for granular control.
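If you only need a quick check after a single command, the $? variable mentioned above can be tested directly. A minimal sketch (the explicit if ! form shown further below is usually cleaner):
cp importantfile /backup/
status=$?             # Capture the exit status immediately, before another command overwrites it
if [ "$status" -ne 0 ]; then
  echo "Error: cp exited with status $status" >&2
  exit "$status"
fi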
Example: Using set -euo pipefail
#!/bin/bash
set -euo pipefail # Exit on error, undefined variable, or pipeline failure
cp importantfile /backup/ # Script exits here if cp fails
rm importantfile # Only runs if cp succeeded
Example: Explicit Check with if
#!/bin/bash
if ! cp importantfile /backup/; then
echo "Error: Failed to copy importantfile to /backup/" >&2 # Log to stderr
exit 1 # Exit with non-zero status to indicate failure
fi
rm importantfile
2. Incorrect Variable Handling and Unquoted Expansions
The Pitfall
Bash variables are prone to unexpected behavior when mishandled, especially with spaces, special characters, or empty values. Unquoted variables undergo word splitting and glob expansion, leading to broken commands.
Common Mistakes
- Unquoted variables with spaces:
FILES="file 1.txt file 2.txt"
rm $FILES # Word splitting turns this into "rm file 1.txt file 2.txt" (tries to delete "file", "1.txt", "file", and "2.txt")
This deletes unintended files!
- Glob expansion:
PATTERN="*.txt"
rm $PATTERN # Expands to every .txt file in the current directory, even if PATTERN was meant to be literal
The Fix
Always quote variables with double quotes ("$VAR") to preserve spaces and prevent glob expansion. Use single quotes ('$VAR') only when you want the literal string $VAR, with no expansion at all.
Corrected Example
FILES="file 1.txt file 2.txt"
rm "$FILES" # Expands to "rm 'file 1.txt' 'file 2.txt'" (correctly handles spaces)
When to Avoid Quoting:
Only omit quotes if you intentionally want word splitting (e.g., iterating over a list of space-separated values):
ARGS="--verbose --force"
command $ARGS # Expands to "command --verbose --force" (desired here)
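If you would rather not rely on word splitting at all, a Bash array keeps each argument as its own element and stays safe to quote; a minimal sketch of the same idea (command stands in for whatever tool you are invoking):
ARGS=(--verbose --force)   # Each option is a separate array element
command "${ARGS[@]}"       # Expands to: command --verbose --force, with no word splitting needed
This also works when individual arguments contain spaces, which plain word splitting cannot handle.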
3. Misusing set -e (Exit on Error)
The Pitfall
While set -e (exit on error) is a powerful safety net, it has edge cases that can cause unexpected exits or let failures slip through. For example:
- Commands that fail inside if conditions (e.g., if grep "pattern" file; then ...) are ignored by set -e. This is intentional, but beginners may forget it.
- Commands like grep (which exits with 1 if no match is found) can cause unintended exits.
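To see the second point in action, here is a minimal sketch (app.log is a hypothetical file) where a harmless "no match" stops the whole script:
#!/bin/bash
set -e
grep "WARN" app.log    # If app.log contains no "WARN" lines, grep exits with 1 and set -e stops the script here
echo "Scan complete"   # Never reached when the pattern is absent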
The Fix
- Use set -e judiciously, and pair it with set -o pipefail and set -u (discussed later).
- For commands where failure is expected, append || true to suppress the error:
grep "optional-pattern" file.txt || true # Script won’t exit if grep finds nothing
- Use explicit if checks for commands where failure requires action:
if ! grep "critical-pattern" file.txt; then
  echo "Critical pattern missing!" >&2
  exit 1
fi
4. Poor Loop and File Iteration Practices
The Pitfall
Looping over files with for loops and ls is a common anti-pattern. The unquoted command substitution $(ls *.txt) is split on whitespace, so the loop breaks if filenames contain spaces, newlines, or special characters.
Bad Practice:
for file in $(ls *.txt); do # Breaks if filenames have spaces (e.g., "my file.txt")
echo "Processing $file"
done
The Fix
Use glob patterns or find with while loops for safe iteration:
1. Glob Patterns (Simplest)
for file in *.txt; do # Correctly handles spaces in filenames
[ -f "$file" ] || continue # Skip if no files match the glob
echo "Processing $file"
done
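Alternatively, enabling Bash's nullglob option makes an unmatched pattern expand to nothing instead of the literal *.txt, so the guard line above becomes unnecessary; a small sketch:
shopt -s nullglob        # An unmatched glob now expands to an empty list
for file in *.txt; do
  echo "Processing $file"
done
shopt -u nullglob        # Restore the default behavior afterwards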
2. find with while Loop (For Recursive/Complex Cases)
find ./docs -name "*.md" -print0 | while IFS= read -r -d '' file; do
echo "Processing $file" # Handles newlines/spaces in filenames
done
-print0 and -d '' use null bytes to separate filenames, avoiding splitting issues.
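One caveat: because the while loop above sits on the right-hand side of a pipe, it runs in a subshell, so variables it modifies are lost when the loop ends. If you need to accumulate state, process substitution keeps the loop in the current shell; a sketch using the same ./docs example:
count=0
while IFS= read -r -d '' file; do
  echo "Processing $file"
  count=$((count + 1))
done < <(find ./docs -name "*.md" -print0)
echo "Processed $count files"   # count survives because the loop ran in the current shell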
5. Hardcoding Paths and Environment Assumptions
The Pitfall
Scripts often assume paths like /usr/local/bin exist, or that tools like python are in the PATH. Hardcoding paths or relying on unvalidated environment variables leads to portability issues.
Examples of Bad Assumptions:
- ~/scripts/backup.sh (the tilde ~ may not expand in all contexts; use "$HOME/scripts/backup.sh" instead).
- python3 myscript.py (assumes python3 is in PATH; check with command -v python3 first).
The Fix
- Use variables for paths and validate them:
BACKUP_DIR="$HOME/backups"
if [ ! -d "$BACKUP_DIR" ]; then
  echo "Error: $BACKUP_DIR does not exist!" >&2
  exit 1
fi
- Check whether dependencies exist with command -v:
if ! command -v ffmpeg &> /dev/null; then
  echo "Error: ffmpeg is not installed!" >&2
  exit 1
fi
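Relatedly, when a script needs companion files that live alongside it, resolving its own directory at runtime avoids hardcoding an absolute path; a common sketch (config.env is a hypothetical companion file):
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"   # Absolute path of the directory containing this script
CONFIG_FILE="$SCRIPT_DIR/config.env"                         # Resolved relative to the script, not the caller's cwd
if [ ! -f "$CONFIG_FILE" ]; then
  echo "Error: $CONFIG_FILE not found!" >&2
  exit 1
fi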
6. Silent Failures: Lack of Error Messaging
The Pitfall
Scripts that fail without explanation waste debugging time. For example:
cp source.txt dest.txt # If this fails (e.g., dest.txt is read-only), the script carries on without reporting the problem
The Fix
- Log errors to stderr (file descriptor 2) with >&2 to separate them from regular output:
if ! cp source.txt dest.txt; then
  echo "Error: Failed to copy source.txt to dest.txt" >&2
  exit 1
fi
- Use set -x (or set -o xtrace) to debug by printing commands as they execute (add it at the top of the script or run with bash -x script.sh).
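To keep error messages consistent, many scripts wrap this pattern in a small helper function; a minimal sketch (the name die is just a convention):
die() {
  echo "Error: $*" >&2   # Send the message to stderr
  exit 1
}

cp source.txt dest.txt || die "Failed to copy source.txt to dest.txt"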
7. Ignoring Edge Cases (e.g., Empty Inputs, Undefined Variables)
The Pitfall
Scripts often fail when given unexpected inputs: empty files, zero arguments, or undefined variables.
Common Edge Cases:
- Undefined variables: echo "Hello $USERNAME" silently expands to "Hello " if USERNAME is unset (or aborts under set -u).
- No input files: process_files.sh called with no arguments.
The Fix
- Use set -u (or set -o nounset) to exit on undefined variables:
set -u
echo "Hello $USERNAME" # Exits with "script.sh: line 2: USERNAME: unbound variable"
- Validate inputs explicitly:
if [ $# -eq 0 ]; then # Check if no arguments were provided
  echo "Usage: $0 <input-file>" >&2
  exit 1
fi
INPUT_FILE="$1"
if [ ! -f "$INPUT_FILE" ]; then
  echo "Error: $INPUT_FILE is not a valid file!" >&2
  exit 1
fi
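When a variable is genuinely optional, a default supplied via parameter expansion works well alongside set -u; a small sketch (the guest fallback is just an illustration):
set -u
NAME="${USERNAME:-guest}"   # Falls back to "guest" if USERNAME is unset or empty; no unbound-variable error
echo "Hello $NAME"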
8. Security Risks: Command Injection and Unsanitized Inputs
The Pitfall
Scripts that accept user input (e.g., CLI arguments, environment variables) are vulnerable to command injection if inputs are not sanitized.
Example of a Vulnerable Script:
read -p "Enter a filename: " FILENAME
eval "rm $FILENAME" # Vulnerable: eval re-parses the user's input as shell code
If the user enters file.txt; rm -rf /, eval executes rm file.txt followed by rm -rf /. A plainly quoted rm "$FILENAME" would only try to delete one oddly named file; the danger comes from anything that re-parses input as code, such as eval, bash -c, or hand-built command strings.
The Fix
- Sanitize inputs with parameter expansion or validation:
read -p "Enter a filename: " FILENAME
# Allow only letters, numbers, underscores, and dashes
if [[ ! "$FILENAME" =~ ^[a-zA-Z0-9_-]+\.txt$ ]]; then
  echo "Error: Invalid filename!" >&2
  exit 1
fi
rm "$FILENAME" # Now safe
- Avoid eval unless absolutely necessary (it executes arbitrary input as code).
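As an extra guard, passing -- before user-supplied arguments tells most commands to stop option parsing, so a value like -rf cannot be misread as flags; a one-line sketch:
rm -- "$FILENAME"   # "--" marks the end of options; a name beginning with "-" is treated as a filename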
9. Neglecting Testing and Validation
The Pitfall
Even well-written scripts fail without proper testing. Skipping validation (e.g., checking file permissions) or testing with only “happy path” inputs leads to unexpected failures.
The Fix
- Use shellcheck: a static analysis tool for Bash scripts that catches syntax errors, undefined variables, and bad practices.
shellcheck myscript.sh # Install with "sudo apt install shellcheck" (Debian/Ubuntu)
- Test with set -x: run bash -x myscript.sh to print commands as they execute (debugging).
- Dry runs: add a --dry-run flag to simulate actions without making changes:
DRY_RUN=0
if [[ "${1:-}" == "--dry-run" ]]; then # "${1:-}" avoids an unbound-variable error under set -u
  DRY_RUN=1
  shift
fi
if [ "$DRY_RUN" -eq 1 ]; then
  echo "[DRY RUN] rm $FILE" # Show what would be removed without doing it
else
  rm "$FILE"
fi
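If a project contains more than one script, shellcheck can be run over all of them in a single pass; a small sketch:
find . -name "*.sh" -exec shellcheck {} +   # Lint every .sh file under the current directory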
Conclusion
Bash automation is powerful, but its pitfalls can turn scripts into sources of frustration (or worse). By adopting proactive habits—validating inputs, quoting variables, checking exit statuses, and testing rigorously—you can write scripts that are reliable, secure, and easy to maintain.
Remember: The best Bash scripts are those that fail loudly, handle edge cases, and assume nothing about their environment. Pair these practices with tools like shellcheck and thorough testing, and you’ll avoid most common pitfalls.