Table of Contents
- Introduction
- Script Structure: The Foundation of Readability
  - 1.1 The Shebang Line
  - 1.2 Clear Comments and Documentation
  - 1.3 Logical Organization
- Error Handling: Avoid Silent Failures
  - 2.1 Exit on Errors with set -e
  - 2.2 Catch Undefined Variables with set -u
  - 2.3 Fail Fast with set -o pipefail
  - 2.4 Clean Up with Traps
- Variables: Quoting, Naming, and Scope
  - 3.1 Always Quote Variables
  - 3.2 Use Consistent Naming Conventions
  - 3.3 Limit Scope with Local Variables
- Input Handling: Validate and Sanitize
  - 4.1 Check for Required Arguments
  - 4.2 Parse Options with getopts
  - 4.3 Validate Input Early
- Loops and Conditionals: Write Clean, Safe Logic
  - 5.1 Prefer [[ ]] Over [ ] for Conditionals
  - 5.2 Avoid Word Splitting in Loops
- Security: Protect Against Common Risks
  - 6.1 Avoid eval and Unsafe Commands
  - 6.2 Sanitize User Input
  - 6.3 Restrict File Permissions
- Testing and Debugging: Catch Issues Early
  - 7.1 Use set -x for Tracing
  - 7.2 Lint with shellcheck
  - 7.3 Test with Dry Runs
- Performance: Optimize for Speed and Efficiency
  - 8.1 Minimize Subshells
  - 8.2 Use Built-in Commands
- Documentation: Make Your Scripts User-Friendly
  - 9.1 Add a --help Option
  - 9.2 Write a README
- Conclusion
- References
1. Script Structure: The Foundation of Readability
A well-structured script is easier to understand, debug, and maintain. Start with these basics:
1.1 The Shebang Line
Always begin your script with a shebang line to specify the interpreter. For bash scripts, use:
#!/bin/bash
Avoid #!/bin/sh, as sh may point to a minimal POSIX shell (e.g., dash on Debian/Ubuntu) that lacks bash-specific features (e.g., arrays, [[ ]] conditionals).
Tip: Verify the path to bash with which bash (typically /bin/bash or /usr/bin/bash).
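When bash may live somewhere other than /bin/bash (e.g., a Homebrew-installed bash on macOS), a common alternative is to resolve it through env:

```shell
#!/usr/bin/env bash
# `env` searches $PATH for bash, so the script also works on systems
# where bash is not installed at /bin/bash.
echo "Running under bash $BASH_VERSION"
```

The trade-off: env picks up whichever bash appears first in the caller's PATH, so use the fixed path when you need a specific interpreter.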
1.2 Clear Comments
Comments explain why (not just what) the code does. Include a header comment with:
- Script purpose
- Author/contact
- Dependencies (e.g., jq, curl)
- Usage examples
Example Header:
#!/bin/bash
# Purpose: Automates daily backups of /var/www to an S3 bucket
# Author: Jane Doe ([email protected])
# Dependencies: aws-cli, gzip
# Usage: ./backup.sh [--force] [--bucket my-bucket]
Add inline comments for complex logic, but avoid redundant comments (e.g., # increment counter above i=$((i+1))).
1.3 Logical Organization
Structure scripts into sections:
- Configuration: Define constants (e.g., BACKUP_DIR="/var/www").
- Functions: Encapsulate reusable logic (e.g., log_error(), validate_input()).
- Main Execution: Orchestrate the workflow by calling functions.
Example Structure:
#!/bin/bash
set -euo pipefail
# Configuration
BACKUP_DIR="/var/www"
S3_BUCKET="my-backups"
# Functions
log() { echo "[$(date +'%Y-%m-%d %H:%M:%S')] $*"; }
# Main
log "Starting backup..."
tar -czf /tmp/backup.tar.gz "$BACKUP_DIR"
aws s3 cp /tmp/backup.tar.gz "s3://$S3_BUCKET/"
log "Backup complete."
2. Error Handling: Avoid Silent Failures
Bash scripts often fail silently by default (e.g., a command fails, but the script continues). Use these practices to catch errors early.
2.1 Exit on Errors with set -e
The set -e flag makes the script exit immediately if any command fails (returns a non-zero exit code).
Example:
#!/bin/bash
set -e # Exit on first error
echo "Creating file..."
touch /root/secret.txt # Fails if run as non-root
echo "File created." # This line won’t run if `touch` fails
2.2 Catch Undefined Variables with set -u
set -u (or set -o nounset) treats undefined variables as errors, preventing bugs from typos (e.g., $BAKUP_DIR instead of $BACKUP_DIR).
Example:
#!/bin/bash
set -u
echo "Backup dir: $BACKUP_DIR" # Fails if BACKUP_DIR is undefined
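When a variable may legitimately be unset, combine set -u with a default-value expansion so the script aborts only on genuine typos, not on optional settings (the /var/www default here is illustrative):

```shell
#!/bin/bash
set -u
# ${VAR:-default} substitutes a fallback when VAR is unset or empty,
# so set -u does not treat this as an error.
BACKUP_DIR="${BACKUP_DIR:-/var/www}"
echo "Backup dir: $BACKUP_DIR"
```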
2.3 Fail Fast with set -o pipefail
By default, a pipeline (e.g., cmd1 | cmd2 | cmd3) returns the exit code of the last command only, so earlier failures are masked. set -o pipefail makes the pipeline return the exit code of the rightmost command that failed (or zero if every command succeeds), ensuring failures in earlier commands are not ignored.
Example:
#!/bin/bash
set -o pipefail
# Without pipefail: exits 0 (the last command, `grep`, matches and succeeds)
# With pipefail: exits 1 (the first command, `false`, fails)
false | echo "Hello" | grep "Hello"
2.4 Clean Up with Traps
Use trap to run commands on script exit (e.g., clean up temporary files, release resources).
Example: Clean Up Temp Files
#!/bin/bash
set -euo pipefail
TMP_FILE=$(mktemp)
trap 'rm -f "$TMP_FILE"' EXIT # Delete TMP_FILE on exit (success or failure)
# Script logic that uses $TMP_FILE...
echo "Data" > "$TMP_FILE"
Common traps:
- EXIT: Run on script exit (success or failure).
- ERR: Run when a command fails (add set -E so functions and subshells inherit the trap).
- SIGINT: Run on Ctrl+C (interrupt).
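A sketch combining these traps in one script (the handler names are illustrative):

```shell
#!/bin/bash
set -euo pipefail

TMP_FILE=$(mktemp)
cleanup()  { rm -f "$TMP_FILE"; }
on_error() { echo "Error near line $1" >&2; }

trap cleanup EXIT                            # always runs, success or failure
trap 'on_error $LINENO' ERR                  # runs when a command fails
trap 'echo "Interrupted" >&2; exit 130' INT  # runs on Ctrl+C

echo "data" > "$TMP_FILE"
```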
3. Variables: Quoting, Naming, and Scope
3.1 Always Quote Variables
Unquoted variables are subject to word splitting (spaces split into multiple arguments) and globbing (wildcards expand to filenames). Quote variables with "$VAR" to avoid this.
Example: Word Splitting
FILE="my file.txt"
cat $FILE # Fails: `cat` tries to open "my" and "file.txt"
cat "$FILE" # Works: `cat` opens "my file.txt"
Example: Globbing
PATTERN="*.txt"
echo $PATTERN # Expands to all .txt files in the current directory
echo "$PATTERN" # Prints "*.txt" literally
3.2 Use Consistent Naming Conventions
- UPPERCASE: Environment variables (e.g., PATH, HOME).
- lowercase: Local variables and function names (e.g., backup_dir, log_message).
- snake_case: Multi-word names (e.g., tmp_file_path).
Avoid reserved names (e.g., PATH, USER) and single-letter variables (except loop counters like i).
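Constants defined in the configuration section can additionally be marked readonly, so any later attempt to overwrite them fails loudly:

```shell
readonly BACKUP_DIR="/var/www"
BACKUP_DIR="/tmp"   # Error: "BACKUP_DIR: readonly variable" (non-zero status)
```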
3.3 Limit Scope with Local Variables
Use local in functions to restrict variable scope, preventing accidental overwrites of global variables.
Example:
#!/bin/bash
count=0
increment() {
local count=100 # Local to the function
count=$((count + 1))
echo "Inside function: $count" # Output: 101
}
increment
echo "Outside function: $count" # Output: 0 (global unchanged)
4. Input Handling: Validate and Sanitize
Scripts often rely on user input (arguments, options, or data from files). Validate and sanitize input to avoid crashes or security risks.
4.1 Check for Required Arguments
Use positional parameters ($1, $2, etc.) for inputs, and check that they exist.
Example: Require a Filename Argument
#!/bin/bash
set -euo pipefail
if [ $# -eq 0 ]; then
echo "Error: No filename provided."
echo "Usage: $0 <filename>"
exit 1
fi
FILE="$1"
if [ ! -f "$FILE" ]; then
echo "Error: File $FILE does not exist."
exit 1
fi
echo "Processing $FILE..."
4.2 Parse Options with getopts
For scripts with flags/options (e.g., --force, -v), use getopts to parse them cleanly.
Example: Parse -f (force) and -b (bucket) Options
#!/bin/bash
set -euo pipefail
FORCE=0
BUCKET="default-bucket"
# Parse options: -f (no arg), -b (with arg). The leading ":" enables silent
# error handling, so the \? and : cases below receive the option in $OPTARG.
while getopts ":fb:" opt; do
case $opt in
f) FORCE=1 ;;
b) BUCKET="$OPTARG" ;;
\?) echo "Invalid option: -$OPTARG" >&2; exit 1 ;;
:) echo "Option -$OPTARG requires an argument." >&2; exit 1 ;;
esac
done
echo "Force mode: $FORCE"
echo "Bucket: $BUCKET"
Run with: ./script.sh -f -b my-bucket
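Note that getopts handles only short options. If you want long options like --force, a common pattern is a manual while/case loop (the option names here mirror the getopts example above):

```shell
#!/bin/bash
set -euo pipefail
FORCE=0
BUCKET="default-bucket"
while [[ $# -gt 0 ]]; do
    case "$1" in
        --force)  FORCE=1; shift ;;
        --bucket) BUCKET="$2"; shift 2 ;;   # consume the flag and its value
        *) echo "Unknown option: $1" >&2; exit 1 ;;
    esac
done
echo "Force mode: $FORCE"
echo "Bucket: $BUCKET"
```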
4.3 Validate Input Early
Check input types (e.g., numbers, paths) and ranges (e.g., “port must be 1-65535”) before using them.
Example: Validate a Port Number
validate_port() {
local port=$1
if ! [[ "$port" =~ ^[0-9]+$ ]]; then
echo "Error: Port must be a number." >&2
return 1
elif [ "$port" -lt 1 ] || [ "$port" -gt 65535 ]; then
echo "Error: Port must be between 1 and 65535." >&2
return 1
fi
}
PORT=8080
if validate_port "$PORT"; then
echo "Valid port: $PORT"
fi
5. Loops and Conditionals: Write Clean, Safe Logic
5.1 Prefer [[ ]] Over [ ] for Conditionals
The [[ ]] construct (bash-specific) is more powerful and safer than the POSIX [ ] (or test). It supports:
- Pattern matching (e.g., [[ "$str" == *"substring"* ]]).
- Logical operators (&&, || instead of -a, -o).
- No need to quote variables (though quoting is still good practice).
Example: Check if a String Contains a Substring
str="hello world"
if [[ "$str" == *"world"* ]]; then
echo "Contains 'world'"
fi
Example: Compare Numbers
age=20
if [[ "$age" -ge 18 ]]; then # -ge = greater than or equal
echo "Adult"
fi
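For purely numeric comparisons, bash's arithmetic conditional (( )) is an equally safe alternative that reads like ordinary math:

```shell
age=20
# (( )) evaluates an arithmetic expression; variables need no $ inside it.
if (( age >= 18 )); then
    echo "Adult"
fi
```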
5.2 Avoid Word Splitting in Loops
Loop over arrays or use find -print0 | xargs -0 to handle filenames with spaces.
Bad: Word Splitting
# Fails if files have spaces (e.g., "my file.txt")
for file in $(ls *.txt); do
cat "$file"
done
Good: Array Loop
files=(*.txt) # Store filenames in an array
for file in "${files[@]}"; do
cat "$file"
done
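For nested directory trees, the find -print0 approach mentioned above can also feed a null-delimited read loop directly (the path and pattern are illustrative; < <(...) is bash process substitution):

```shell
# -print0 and read -d '' use NUL delimiters, so filenames containing
# spaces or even newlines are handled safely.
while IFS= read -r -d '' file; do
    echo "Processing: $file"
done < <(find . -type f -name '*.txt' -print0)
```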
6. Security: Protect Against Common Risks
6.1 Avoid eval and Unsafe Commands
eval executes strings as code, which is dangerous if the string contains untrusted input (e.g., user input with rm -rf /).
Unsafe:
user_input="rm -rf /"
eval "$user_input" # Executes the malicious command!
Avoid setuid scripts (files with chmod u+s), as they run with the owner’s privileges and are prone to abuse.
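A safer pattern than eval is to build the command as an array: each element is passed as a separate argument and is never re-parsed as shell code, so hostile input stays inert data (printf here stands in for any real command):

```shell
user_input='file.txt; rm -rf /'      # hostile input
cmd=(printf '%s\n' "$user_input")    # command and arguments as array elements
"${cmd[@]}"                          # prints the string literally; nothing is executed
```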
6.2 Sanitize User Input
If using user input in commands (e.g., ssh "$user@host"), sanitize it to remove dangerous characters (e.g., ;, &, |).
Example: Sanitize a Username
sanitize_user() {
local user="$1"
# Allow only letters, numbers, and underscores
if [[ "$user" =~ ^[a-zA-Z0-9_]+$ ]]; then
echo "$user"
else
echo "Invalid username: $user" >&2
exit 1
fi
}
user_input="alice; rm -rf /"
safe_user=$(sanitize_user "$user_input") # Rejected with an error; note `exit` inside $(...) only ends the subshell, so check the result or use set -e
6.3 Restrict File Permissions
Set scripts to chmod 700 (read/write/execute for owner only) to prevent unauthorized modification or execution.
chmod 700 ./backup.sh
7. Testing and Debugging: Catch Issues Early
7.1 Use set -x for Tracing
set -x (or set -o xtrace) prints each command before execution, making it easy to trace errors.
Example:
#!/bin/bash
set -x # Enable tracing
echo "Step 1: Create dir"
mkdir /tmp/mydir
echo "Step 2: Write file"
echo "data" > /tmp/mydir/file.txt
Output includes commands with + prefixes:
+ echo 'Step 1: Create dir'
Step 1: Create dir
+ mkdir /tmp/mydir
+ echo 'Step 2: Write file'
Step 2: Write file
+ echo data
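The + prefix comes from the PS4 variable, which you can customize so each trace line shows the script name and line number:

```shell
#!/bin/bash
# PS4 is expanded before every traced command; BASH_SOURCE and LINENO
# make it easy to see exactly where each command runs.
export PS4='+ ${BASH_SOURCE:-$0}:${LINENO}: '
set -x
echo "Step 1"
```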
7.2 Lint with shellcheck
ShellCheck is a linter that identifies bugs, syntax errors, and bad practices. Install it with apt install shellcheck (Debian/Ubuntu) or brew install shellcheck (macOS).
Example: Run ShellCheck
shellcheck ./backup.sh
It will flag issues like unquoted variables, undefined variables, or incorrect conditional syntax.
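When a flagged construct is actually intentional, a directive comment on the line above suppresses that specific warning (SC2086 is ShellCheck's unquoted-expansion check):

```shell
flags="-l -a"   # multiple flags stored in one string (illustrative)
# shellcheck disable=SC2086  # word splitting of $flags is intentional here
ls $flags
```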
7.3 Test with Dry Runs
Add a --dry-run option to preview actions without making changes (e.g., print aws s3 cp ... instead of running it).
Example: Dry Run Flag
dry_run=0
if [[ "${1:-}" == "--dry-run" ]]; then   # ${1:-} stays safe even under set -u
dry_run=1
shift
fi
cmd=(aws s3 cp /tmp/file s3://bucket/)   # store the command as an array
if [ "$dry_run" -eq 1 ]; then
echo "[Dry Run] ${cmd[*]}"
else
"${cmd[@]}"   # each element expands as one argument; no word splitting
fi
8. Performance: Optimize for Speed and Efficiency
8.1 Minimize Subshells
Subshells (created with ( ... ), $( ... ), or pipes) fork a new process, which adds up quickly in loops. Use { ...; } for command grouping (no subshell), and prefer parameter expansion over command substitution where possible.
Slow (Subshell + External Command):
result=$(echo "hello" | tr '[:lower:]' '[:upper:]') # Forks a subshell and runs `tr`
Faster (No Subshell):
str="hello"
result="${str^^}" # Bash 4+ uppercase expansion; no fork, no external command
8.2 Use Built-in Commands
Bash built-ins (e.g., echo, read, [[ ]]) are faster than external commands (e.g., ls, grep).
Slow (External grep):
if echo "$str" | grep -q "pattern"; then ...
Faster (Built-in [[ ]]):
if [[ "$str" == *"pattern"* ]]; then ... # No external command
9. Documentation: Make Your Scripts User-Friendly
9.1 Add a --help Option
Include a --help flag to print usage instructions, options, and examples.
Example: Help Message
usage() {
echo "Usage: $0 [OPTIONS] SOURCE DEST"
echo "Sync files from SOURCE to DEST."
echo
echo "Options:"
echo " -f, --force Overwrite existing files"
echo " -v, --verbose Show detailed output"
echo " -h, --help Show this help message"
}
if [[ "${1:-}" == "--help" || "${1:-}" == "-h" ]]; then
usage
exit 0
fi
9.2 Write a README
Include a README.md with:
- Purpose of the script
- Installation steps (dependencies)
- Usage examples
- Troubleshooting tips
Conclusion
Bash scripting is a powerful tool for Linux automation, but its flexibility can lead to messy, error-prone code. By following these best practices—structured organization, strict error handling, secure input validation, and thorough documentation—you’ll write scripts that are reliable, maintainable, and secure.
Start small: adopt set -euo pipefail in new scripts, quote variables, and use shellcheck to lint. Over time, these habits will simplify your automation workflow and reduce debugging headaches.