The One Line That Matters

set -euo pipefail -- put this at the top of every script. If you remember nothing else from this article, remember that.

It catches unset variables. It stops execution on errors. It fails pipelines properly. Without it, your script keeps running after something breaks, silently corrupting whatever comes next.

Everything below is organized by what you are trying to do. Making scripts that will not silently break. Processing files without losing data. Automating deploys. And knowing when to stop using Bash entirely.

Scripts That Will Not Silently Break

Before quoting rules, before variables, before anything clever: the shebang line. #!/usr/bin/env bash. It tells the kernel which interpreter to run, and routing through env finds bash on your PATH instead of hardcoding /bin/bash -- which matters on systems where the bash you actually want (a newer one from Homebrew, say) lives somewhere else.

Quoting is where scripts go wrong first. Here is the full picture before you see any code. Double quotes expand variables but prevent word splitting -- use them around every variable reference, always. Single quotes treat everything as a literal string. $(...) does command substitution. Backticks do the same thing but are harder to read and do not nest. Forget they exist.

basics.sh
#!/usr/bin/env bash
# A simple script demonstrating variables and quoting

# Variable assignment (no spaces around the equals sign)
APP_NAME="my-web-app"
VERSION="2.4.1"
DEPLOY_DIR="/opt/apps/${APP_NAME}"

# Double quotes preserve spaces and expand variables
echo "Deploying ${APP_NAME} version ${VERSION}..."
echo "Target directory: ${DEPLOY_DIR}"

# Single quotes treat everything literally
echo 'This will print ${APP_NAME} literally, no expansion.'

# Command substitution with $()
CURRENT_DATE=$(date +%Y-%m-%d)
GIT_HASH=$(git rev-parse --short HEAD 2>/dev/null || echo "unknown")
echo "Build date: ${CURRENT_DATE}"
echo "Git commit: ${GIT_HASH}"

# Default values with parameter expansion
PORT="${1:-8080}"          # Use first arg, default to 8080
ENV="${2:-development}"    # Use second arg, default to development
echo "Starting on port ${PORT} in ${ENV} mode"

${VAR} vs $VAR? Same thing when the variable stands alone. But $APP_NAME_backup.tar.gz looks for a variable called APP_NAME_backup. Braces make boundaries explicit. Just always use them.

${VAR:-default} gives you a fallback when the variable is unset. ${VAR:=default} does the same but also assigns it. Memorize both. They show up in every script that takes optional arguments, and they replace five lines of if/else checking.
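A quick sketch of the difference, using made-up variable names -- the key point is that :- leaves the variable untouched while := assigns it:

```shell
#!/usr/bin/env bash
# ${VAR:-default}: use the fallback, but do NOT assign it
unset TIMEOUT
echo "timeout is ${TIMEOUT:-30}"        # falls back to 30
echo "TIMEOUT is still: '${TIMEOUT:-}'" # still empty -- nothing was assigned

# ${VAR:=default}: use the fallback AND assign it
echo "retries is ${RETRIES:=3}"         # falls back to 3 and assigns
echo "RETRIES is now: '${RETRIES}'"     # '3' from here on

# Braces also make variable boundaries explicit
APP_NAME="my-web-app"
echo "${APP_NAME}_backup.tar.gz"        # without braces, bash would look
                                        # for a variable named APP_NAME_backup
```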

Conditionals and Test Expressions

[[ ... ]] for new scripts. Always. The old [ ... ] (literally the test command) has quoting landmines that [[ ]] eliminates. Only use single brackets if you need strict POSIX portability on some ancient system.

Pre-deployment check script showing several patterns at once:

conditionals.sh
#!/usr/bin/env bash
CONFIG_FILE="./config/deploy.yml"
MIN_DISK_MB=500

# File existence check
if [[ ! -f "${CONFIG_FILE}" ]]; then
  echo "Error: Config file not found at ${CONFIG_FILE}" >&2
  exit 1
fi

# String comparison
BRANCH=$(git branch --show-current)
if [[ "${BRANCH}" != "main" && "${BRANCH}" != "master" ]]; then
  echo "Warning: You are deploying from branch '${BRANCH}', not main."
  read -p "Continue? (y/n): " confirm
  if [[ "${confirm}" != "y" ]]; then
    echo "Deployment cancelled."
    exit 0
  fi
fi

# Numeric comparison
AVAIL_DISK=$(df -m / | awk 'NR==2 {print $4}')
if (( AVAIL_DISK < MIN_DISK_MB )); then
  echo "Error: Only ${AVAIL_DISK}MB free. Need at least ${MIN_DISK_MB}MB." >&2
  exit 1
fi

# Pattern matching with [[ ]]
FILENAME="release-2.4.1.tar.gz"
if [[ "${FILENAME}" == *.tar.gz ]]; then
  echo "Extracting gzipped tarball..."
  tar -xzf "${FILENAME}"
elif [[ "${FILENAME}" == *.zip ]]; then
  echo "Extracting zip archive..."
  unzip "${FILENAME}"
else
  echo "Unknown archive format." >&2
  exit 1
fi

echo "All pre-flight checks passed. Proceeding with deployment."

Error messages go to stderr via >&2. Non-negotiable. Callers need clean stdout.
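Here is why clean stdout matters, in miniature. The function name is invented for illustration; the pattern is not -- command substitution captures stdout only, so diagnostics sent to stderr never pollute the captured value:

```shell
#!/usr/bin/env bash
get_version() {
  echo "warning: cache is stale" >&2   # diagnostic -> stderr
  echo "2.4.1"                         # actual result -> stdout
}

# $(...) captures only stdout; the warning goes to the terminal (or /dev/null)
VERSION=$(get_version 2>/dev/null)
echo "Captured: ${VERSION}"
```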

(( ... )) for arithmetic. Readable. The alternative -- [[ $AVAIL_DISK -lt $MIN_DISK_MB ]] -- works but looks like it was designed to annoy you. And glob patterns like == *.tar.gz inside double brackets do not need quoting on the right side. That pattern matching is the entire reason [[ ]] exists.

Processing Files Without Losing Data

for when you have a list. while when you are waiting.

That covers 90% of it. until exists too but while ! ... does the same thing and reads better. The script below hits a list of servers, processes log files by glob, and waits for a database to start. Three real patterns.

loops.sh
#!/usr/bin/env bash

# For loop: iterate over a list of servers
SERVERS=("web01.example.com" "web02.example.com" "web03.example.com")
for server in "${SERVERS[@]}"; do
  echo "Checking health of ${server}..."
  if curl -sf "http://${server}/health" > /dev/null 2>&1; then
    echo "  [OK] ${server} is healthy"
  else
    echo "  [FAIL] ${server} is not responding"
  fi
done

# For loop: iterate over files matching a pattern
echo ""
echo "Processing log files..."
for logfile in /var/log/app/*.log; do
  if [[ ! -f "${logfile}" ]]; then
    continue  # skip if glob did not match anything
  fi
  LINE_COUNT=$(wc -l < "${logfile}")
  echo "  ${logfile}: ${LINE_COUNT} lines"
done

# While loop: wait for a service to become available
echo ""
echo "Waiting for database to accept connections..."
MAX_RETRIES=30
RETRY_COUNT=0
while ! pg_isready -h localhost -p 5432 > /dev/null 2>&1; do
  (( RETRY_COUNT++ ))
  if (( RETRY_COUNT >= MAX_RETRIES )); then
    echo "Database did not start within ${MAX_RETRIES} seconds." >&2
    exit 1
  fi
  sleep 1
done
echo "Database is ready after ${RETRY_COUNT} seconds."

# C-style for loop: useful for numeric ranges
echo ""
echo "Creating worker processes..."
for (( i=1; i<=4; i++ )); do
  echo "  Starting worker-${i}"
done

File globs in for loops are a trap. No matches? Bash hands you the literal glob string. The script checks [[ ! -f "${logfile}" ]] before processing because of this. Or put shopt -s nullglob at the top so unmatched globs expand to nothing.
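A minimal demonstration of the trap and the nullglob fix, run in a throwaway empty directory:

```shell
#!/usr/bin/env bash
cd "$(mktemp -d)"   # empty directory: no .log files exist here

# Default behavior: an unmatched glob stays as the literal string
for f in *.log; do
  echo "default behavior got: '${f}'"   # prints the literal '*.log'
done

# With nullglob, an unmatched glob expands to nothing -- loop body never runs
shopt -s nullglob
count=0
for f in *.log; do
  (( count++ ))
done
echo "iterations with nullglob: ${count}"
```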

while IFS= read -r line; do ... done < file.txt for reading files line by line. The IFS= preserves leading whitespace, -r prevents backslash mangling. Skip either and you get corrupted output on lines with special characters. This has bitten me on config files with indentation that mattered.
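To see what each flag protects, feed the loop a file with leading indentation and a literal backslash (the file path here is a scratch file for the demo):

```shell
#!/usr/bin/env bash
# Leading spaces and a literal backslash-t, written with printf
printf '    indented config line\nliteral \\t stays as-is\n' > /tmp/sample.txt

# IFS= keeps the leading whitespace; -r keeps the backslash intact
lines=()
while IFS= read -r line; do
  lines+=("${line}")
done < /tmp/sample.txt

echo "first:  [${lines[0]}]"
echo "second: [${lines[1]}]"
```

Drop the IFS= and the indentation is stripped; drop the -r and the backslash gets interpreted as an escape.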

Functions and Return Values

Past fifty lines without functions? Good luck maintaining that.

return sets exit status (0-255). echo sends text to stdout. They do completely different things and people mix them up constantly. Need a computed value back from a function? Echo it and capture with $(...). Need pass/fail? Use return 0 or return 1.
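Both patterns side by side, with hypothetical function names:

```shell
#!/usr/bin/env bash
# Computed value: echo it, capture with $(...)
half() {
  echo $(( $1 / 2 ))
}

# Pass/fail: let the exit status carry the answer
is_even() {
  (( $1 % 2 == 0 ))   # the last command's status becomes the return value
}

RESULT=$(half 10)
echo "half of 10 is ${RESULT}"

if is_even 4; then
  echo "4 is even"
fi
```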

functions.sh
#!/usr/bin/env bash

# A logging function with severity levels
log() {
  local level="${1}"
  local message="${2}"
  local timestamp
  timestamp=$(date +"%Y-%m-%d %H:%M:%S")
  echo "[${timestamp}] [${level}] ${message}"
}

# A function that returns a value via echo
get_app_version() {
  local package_file="${1:-package.json}"
  if [[ ! -f "${package_file}" ]]; then
    echo "0.0.0"
    return 1
  fi
  grep -o '"version": "[^"]*"' "${package_file}" | head -1 | cut -d'"' -f4
}

# A validation function that uses return for pass/fail
validate_environment() {
  local errors=0
  if ! command -v docker > /dev/null 2>&1; then
    log "ERROR" "Docker is not installed"
    (( errors++ ))
  fi
  if ! command -v node > /dev/null 2>&1; then
    log "ERROR" "Node.js is not installed"
    (( errors++ ))
  fi
  if [[ ! -f ".env" ]]; then
    log "WARN" ".env file not found, using defaults"
  fi
  return "${errors}"
}

# Using the functions
log "INFO" "Starting build pipeline..."
if validate_environment; then
  log "INFO" "Environment validation passed"
else
  log "ERROR" "Environment validation failed"
  exit 1
fi
VERSION=$(get_app_version)
log "INFO" "Building version ${VERSION}"

local on every variable inside functions. Without it, variables leak into global scope. Two functions sharing a variable called result will stomp on each other. Maddening to debug.
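The leak in action, with throwaway function names:

```shell
#!/usr/bin/env bash
without_local() {
  result="from without_local"       # no local: writes to global scope
}

with_local() {
  local result="from with_local"    # confined to this function
}

result="original"
with_local
echo "after with_local:    ${result}"   # still 'original'
without_local
echo "after without_local: ${result}"   # stomped
```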

command -v to check whether a program exists. Not which. which behaves differently on different systems and sometimes prints noise to stdout. command -v is a shell built-in. Works everywhere.
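A common guard pattern built on it -- the require helper is a name I made up, but the command -v check inside is the standard idiom:

```shell
#!/usr/bin/env bash
# Fail early and loudly if a required tool is missing
require() {
  if ! command -v "$1" > /dev/null 2>&1; then
    echo "Error: required command '$1' not found" >&2
    return 1
  fi
}

require ls && echo "ls is available"
require definitely-not-a-real-tool 2>/dev/null || echo "missing tool detected"
```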

Text Processing: grep, sed, awk

grep finds lines. sed replaces text in streams. awk pulls columns out of structured text. Three tools, and they cover most of what you will need.

If you can't read the pipeline below, break it into separate commands. No shame in that.

text-processing.sh
#!/usr/bin/env bash

# grep: find all error lines in a log file
echo "=== Recent Errors ==="
grep -i "error" /var/log/app/application.log | tail -20

# grep: count how many 500 errors occurred today
TODAY=$(date +%Y-%m-%d)
ERROR_COUNT=$(grep -c "${TODAY}.*HTTP 500" /var/log/nginx/access.log)
echo "500 errors today: ${ERROR_COUNT}"

# sed: replace a configuration value in-place
sed -i 's/^DB_HOST=.*/DB_HOST=db.production.internal/' .env
sed -i 's/^DEBUG=true/DEBUG=false/' .env

# sed: remove all comment lines and blank lines from a config
sed '/^#/d; /^$/d' config.ini > config.clean.ini

# awk: parse disk usage and flag partitions over 80%
echo ""
echo "=== Disk Usage Warnings ==="
df -h | awk 'NR>1 {
  usage = int($5)
  if (usage > 80) {
    printf "  WARNING: %s is at %s (%s used)\n", $6, $5, $3
  }
}'

# awk: generate a summary of response times from access logs
echo ""
echo "=== Response Time Summary ==="
awk '{
  sum += $NF; count++
  if ($NF > max) max = $NF
  if (min == 0 || $NF < min) min = $NF
}
END {
  if (count > 0)
    printf "  Avg: %.2fms | Min: %.2fms | Max: %.2fms | Samples: %d\n",
      sum/count, min, max, count
}' /var/log/app/response_times.log

Do not go deep on awk. It is a full programming language and some people write entire programs in it. Bad idea. If your awk gets hard to read, pipe into Python instead. Or use jq for JSON. I have seen awk one-liners nobody on the team could parse, including the person who wrote them a week later. Column extraction and simple field math. That is awk's sweet spot. Everything beyond that belongs in a language with actual error messages.
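For calibration, this is about as much awk as one line should carry -- grab a column, sum it, done. The input is inlined here so the example is self-contained; in practice it would be a log file:

```shell
#!/usr/bin/env bash
# awk at its sweet spot: one column, one accumulator, one print
TOTAL=$(printf 'alice 120\nbob 340\ncarol 95\n' \
  | awk '{ sum += $2 } END { print sum }')
echo "total requests: ${TOTAL}"
```

If you find yourself adding a second accumulator, a function, or an if/else ladder, that is the signal to switch tools.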

Cross-platform trap: sed -i on macOS needs sed -i '' 's/old/new/' file. GNU sed on Linux does not require the empty string argument. If your scripts run on both, detect the OS or install GNU sed via Homebrew. Or just stop writing scripts that need to run on both. Pick one.
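If you must support both, one way to paper over the difference is a small wrapper that branches on uname -- a sketch, assuming GNU sed on anything that is not Darwin:

```shell
#!/usr/bin/env bash
# Portable-ish in-place sed: BSD/macOS sed requires an explicit
# (here empty) backup suffix after -i; GNU sed must not get one.
sed_inplace() {
  if [[ "$(uname)" == "Darwin" ]]; then
    sed -i '' "$@"
  else
    sed -i "$@"
  fi
}

# Demo on a throwaway file
CONF=$(mktemp)
echo "DEBUG=true" > "${CONF}"
sed_inplace 's/^DEBUG=true/DEBUG=false/' "${CONF}"
cat "${CONF}"
```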

Error Handling and Debugging

Your pg_dump fails because somebody rotated the password. Script moves on. tar packages up an empty directory. You find out six weeks later when you need to restore from backup and every archive is empty. This happens more often than people admit.

set -euo pipefail again. -e exits on any command failure. -u treats unset variables as errors. -o pipefail means a pipeline fails if any command in the chain fails, not just the last one. Three flags. Most silent failures gone.
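The pipefail half is the least obvious of the three, so here it is isolated (no -e in this demo, so the script survives to print both results):

```shell
#!/usr/bin/env bash
# Without pipefail: a pipeline reports only the LAST command's status
set +o pipefail
false | true
echo "without pipefail, status: $?"   # 0 -- the failure vanished

# With pipefail: any failing stage fails the whole pipeline
set -o pipefail
false | true
status=$?
echo "with pipefail, status: ${status}"
```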

For cleanup, trap with the EXIT signal:

error-handling.sh
#!/usr/bin/env bash
set -euo pipefail

# Create a temporary directory and ensure it gets cleaned up
WORK_DIR=$(mktemp -d)
echo "Working in temporary directory: ${WORK_DIR}"

# trap ensures cleanup runs even if the script fails
cleanup() {
  local exit_code=$?
  echo "Cleaning up temporary files..."
  rm -rf "${WORK_DIR}"
  if (( exit_code != 0 )); then
    echo "Script failed with exit code ${exit_code}" >&2
  fi
  exit "${exit_code}"
}
trap cleanup EXIT INT TERM

# A retry function with exponential backoff
retry() {
  local max_attempts="${1}"
  local delay=1
  shift
  for (( attempt=1; attempt<=max_attempts; attempt++ )); do
    if "$@"; then
      return 0
    fi
    echo "  Attempt ${attempt}/${max_attempts} failed. Retrying in ${delay}s..." >&2
    sleep "${delay}"
    (( delay *= 2 ))
  done
  echo "  All ${max_attempts} attempts failed." >&2
  return 1
}

# Download an artifact with retry logic
echo "Downloading build artifact..."
retry 3 curl -fSL -o "${WORK_DIR}/app.tar.gz" \
  "https://releases.example.com/app-latest.tar.gz"

# Extract and verify
tar -xzf "${WORK_DIR}/app.tar.gz" -C "${WORK_DIR}"
echo "Build artifact extracted successfully."

# Debugging tip: use set -x to trace execution
# set -x  # Uncomment to see every command before it runs
# set +x  # Turn tracing back off

The EXIT trap fires no matter what. Normal exit, error exit, kill signal. Like finally in any other language.
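You can verify that in one line by running a deliberately failing child script and watching its trap fire anyway:

```shell
#!/usr/bin/env bash
# The child script fails mid-way under set -e, yet its EXIT trap still runs
OUTPUT=$(bash -c '
  set -euo pipefail
  trap "echo CLEANUP_RAN" EXIT
  echo "doing work"
  false                 # simulated failure -- set -e aborts here
  echo "never reached"
' 2>&1 || true)
echo "${OUTPUT}"
```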

set -x for debugging. Prints every command before it runs, with variables expanded. Put it before the suspicious section, set +x after. Targeted trace.

And quote your variables. Always. Unquoted variable with a space? Word splitting. Containing *? Filesystem glob expansion. Both can be destructive. Dead simple rule: if it is a variable, double-quote it.

Automating Deploys and Backups

Copy these. Modify them. They work.

Automated Backup Script

Timestamped backup of a directory and a PostgreSQL database, compressed, with automatic cleanup of old archives.

backup.sh
#!/usr/bin/env bash
set -euo pipefail

# Configuration
APP_DIR="/opt/apps/my-web-app"
BACKUP_DIR="/backups"
DB_NAME="app_production"
RETENTION_DAYS=7
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_NAME="backup_${TIMESTAMP}"

log() { echo "[$(date +%H:%M:%S)] $1"; }

# Create backup directory
mkdir -p "${BACKUP_DIR}/${BACKUP_NAME}"

# Backup application files
log "Backing up application files..."
tar -czf "${BACKUP_DIR}/${BACKUP_NAME}/app_files.tar.gz" \
  -C "$(dirname "${APP_DIR}")" \
  "$(basename "${APP_DIR}")"

# Backup database
log "Dumping database..."
pg_dump "${DB_NAME}" | gzip > "${BACKUP_DIR}/${BACKUP_NAME}/db_dump.sql.gz"

# Create a final archive and remove the directory
log "Compressing final archive..."
tar -czf "${BACKUP_DIR}/${BACKUP_NAME}.tar.gz" \
  -C "${BACKUP_DIR}" "${BACKUP_NAME}"
rm -rf "${BACKUP_DIR}/${BACKUP_NAME}"

# Clean up old backups
log "Removing backups older than ${RETENTION_DAYS} days..."
find "${BACKUP_DIR}" -name "backup_*.tar.gz" -mtime +"${RETENTION_DAYS}" -delete

# Report final size
SIZE=$(du -sh "${BACKUP_DIR}/${BACKUP_NAME}.tar.gz" | cut -f1)
log "Backup complete: ${BACKUP_NAME}.tar.gz (${SIZE})"

Deployment Script

deploy.sh
#!/usr/bin/env bash
set -euo pipefail

APP_NAME="my-web-app"
DEPLOY_ROOT="/opt/apps/${APP_NAME}"
RELEASES_DIR="${DEPLOY_ROOT}/releases"
CURRENT_LINK="${DEPLOY_ROOT}/current"
RELEASE_ID=$(date +%Y%m%d%H%M%S)
RELEASE_DIR="${RELEASES_DIR}/${RELEASE_ID}"
KEEP_RELEASES=5

log() { echo "[deploy] $1"; }

# Create release directory and extract new code
mkdir -p "${RELEASE_DIR}"
log "Extracting release ${RELEASE_ID}..."
tar -xzf "./build/${APP_NAME}.tar.gz" -C "${RELEASE_DIR}"

# Install dependencies and build
log "Installing dependencies..."
cd "${RELEASE_DIR}" && npm ci --production

# Swap the symlink atomically
log "Switching to new release..."
ln -sfn "${RELEASE_DIR}" "${CURRENT_LINK}"

# Restart the application
log "Restarting service..."
systemctl restart "${APP_NAME}"

# Health check
sleep 3
if curl -sf http://localhost:3000/health > /dev/null; then
  log "Health check passed. Deployment successful!"
else
  log "Health check FAILED. Rolling back..."
  PREVIOUS=$(ls -1t "${RELEASES_DIR}" | sed -n '2p')
  ln -sfn "${RELEASES_DIR}/${PREVIOUS}" "${CURRENT_LINK}"
  systemctl restart "${APP_NAME}"
  exit 1
fi

# Prune old releases, keeping only the latest N
log "Pruning old releases (keeping ${KEEP_RELEASES})..."
ls -1t "${RELEASES_DIR}" | tail -n +$(( KEEP_RELEASES + 1 )) | \
  while IFS= read -r old_release; do
    rm -rf "${RELEASES_DIR}/${old_release}"
    log "  Removed release: ${old_release}"
  done

log "Deployment complete."

System Monitoring Script

Lightweight check of CPU, memory, disk, and key services. For small deployments where Prometheus would be overkill.

Schedule with crontab: 0 * * * * /opt/scripts/monitor.sh >> /var/log/monitor.log 2>&1 for hourly checks.
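A minimal sketch of what such a monitor could look like. The threshold and the service list are placeholders to adapt; the disk check uses df plus awk, and the service check is guarded so the script degrades gracefully on hosts without systemd:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Placeholder configuration -- adjust for your environment
DISK_THRESHOLD=80
SERVICES=("nginx" "postgresql")

echo "=== $(date +%Y-%m-%dT%H:%M:%S) system check ==="

# Disk: warn on any partition above the threshold
df -P | awk -v limit="${DISK_THRESHOLD}" \
  'NR>1 { if (int($5) > limit) printf "  DISK WARN: %s at %s\n", $6, $5 }'

# Load average, if uptime is available
if command -v uptime > /dev/null 2>&1; then
  echo "  load: $(uptime | awk -F'load average: ' '{print $2}')"
fi

# Services: only meaningful where systemctl exists
if command -v systemctl > /dev/null 2>&1; then
  for svc in "${SERVICES[@]}"; do
    if systemctl is-active --quiet "${svc}"; then
      echo "  [OK]   ${svc}"
    else
      echo "  [DOWN] ${svc}"
    fi
  done
fi

echo "=== check complete ==="
```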


When to Stop Using Bash

Past 100 lines, switch to Python. Past the point where you need data structures more complex than an array, switch to Python. Past the point where you are parsing JSON without jq, definitely switch to Python.

Bash is glue. Good glue. But still glue.

One more thing. Install ShellCheck and run it on every script. It catches quoting bugs, portability issues, and subtle mistakes that would take hours to track down manually. Free. Fast. It will teach you more about Bash than reading documentation ever will.

Anurag Sinha

Full Stack Developer & Technical Writer

Anurag is a full stack developer and technical writer. He covers web technologies, backend systems, and developer tools for the Codertronix community.