Shell Commands

About Shell Commands

This Shell Script Reference is a searchable cheat sheet for Bash shell scripting, organized into six categories: Variables, Conditionals, Loops, Functions, File Handling, and Processes. Each entry includes a working code snippet you can paste directly into a terminal or script file. Whether you are writing deployment scripts, log analysis one-liners, or system administration automation, this page provides the syntax patterns you reach for daily.

Bash remains the default shell on most Linux distributions (and was the default login shell on macOS until Catalina switched to zsh), making shell scripting an essential skill for developers, system administrators, and DevOps engineers. The reference covers the constructs that appear in virtually every script: variable expansion with ${VAR:-default}, conditional tests with [ ] and [[ ]], file existence checks (-f, -d, -x), while-read loops for line-by-line processing, and the strict-mode trio set -euo pipefail that turns implicit failures into explicit errors.
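A minimal sketch of those variable patterns (the variable and function names here are illustrative, not part of the reference itself):

```shell
#!/usr/bin/env bash
# Strict mode: exit on errors, flag unset variables, fail broken pipelines.
set -euo pipefail

# ${VAR:-default} falls back when EXAMPLE_GREETING is unset or empty,
# which also keeps set -u from aborting on optional configuration.
greeting="${EXAMPLE_GREETING:-hello}"
readonly greeting        # any later assignment is now an error
export greeting          # visible to child processes

show_args() {
  echo "count: $#"       # number of positional parameters
  echo "first: $1"       # first argument
}
show_args one two three  # prints "count: 3" then "first: one"
```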

The File Handling section brings together the Unix text-processing power tools: grep for pattern searching, sed for stream editing and in-place replacement, awk for column extraction and field-based computation, find for recursive file discovery, and xargs for feeding results into other commands. The Processes section covers background execution, signal trapping with trap, pipe chaining, command substitution, and AND/OR logic chains — the glue that ties individual commands into robust automation workflows.

Key Features

  • Variable assignment, expansion, default values (${VAR:-default}), special variables ($1, $#, $@, $?), export, and readonly
  • Conditional expressions with [ ], extended tests [[ ]] for regex and glob matching, and numeric comparison operators
  • File test operators (-f, -d, -e, -r, -w, -x, -s) for checking existence, type, and permissions
  • Loop constructs — for-in, C-style for(()), while-read for file processing, until, and break/continue
  • Function definitions with local variables, return codes, and value capture via command substitution
  • Text processing pipeline — grep (pattern search), sed (stream edit), awk (column processing), find (file search), xargs (stdin-to-args)
  • Process management with background execution (&), trap for signal handling, and set -euo pipefail strict mode
  • Pipe chaining and command substitution patterns for composing multi-step data processing pipelines
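As a small illustration of the last two bullets, a sketch of command substitution and AND/OR chaining (the report name and directory are placeholders):

```shell
#!/usr/bin/env bash
# Command substitution: a command's stdout becomes a value.
today="$(date +%Y-%m-%d)"
report="report-$today.txt"
echo "$report"

# && runs the next command only on success; || only on failure.
mkdir -p /tmp/reports && echo "directory ready"
```

One caution when composing them: in a && b || c, the fallback c also runs if b itself fails, so an explicit if statement is safer for real branching.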

Frequently Asked Questions

Which shell does this reference target?

The examples target Bash (Bourne Again Shell), which is the default shell on most Linux distributions and macOS prior to Catalina. The syntax is largely compatible with sh (POSIX shell) except for Bash-specific features like [[ ]], arrays, and brace expansion {1..10}.

What does set -euo pipefail do and should I always use it?

set -e exits the script immediately if any command returns a non-zero exit code. set -u treats unset variables as errors instead of silently expanding them to empty strings. set -o pipefail makes a pipeline return the exit code of the rightmost command that failed, rather than the exit code of the final command regardless of earlier failures. Together they catch common scripting bugs early and are recommended at the top of most production scripts.
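The pipefail difference is easy to see in isolation; a minimal sketch run in throwaway subshells so the failures stay contained:

```shell
#!/usr/bin/env bash
# false fails, true succeeds; the pipeline's status differs with pipefail.

bash -c 'false | true'
echo "default:  $?"    # prints "default:  0" (status of the final command)

bash -c 'set -o pipefail; false | true'
echo "pipefail: $?"    # prints "pipefail: 1" (status of the failed command)
```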

What is the difference between [ ] and [[ ]]?

[ ] is a POSIX-compatible test command — it works in sh and all Bourne-compatible shells. [[ ]] is a Bash built-in that adds pattern matching (== glob*), regex matching (=~ regex), and logical operators (&&, ||) without needing to quote variables as carefully. Use [[ ]] in Bash scripts for safety and readability; use [ ] when POSIX portability is required.
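A short sketch contrasting the two (the filename is arbitrary):

```shell
#!/usr/bin/env bash
file="app.log"

[[ "$file" == *.log ]] && echo "glob matched"           # Bash glob match
[[ "$file" =~ ^app\.[a-z]+$ ]] && echo "regex matched"  # Bash ERE match

# POSIX-portable alternative: case handles globs where [ ] cannot.
case "$file" in
  *.log) echo "case matched" ;;
esac
```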

How do I read a file line by line in Bash?

Use while IFS= read -r line; do ... done < file.txt. The IFS= prevents leading and trailing whitespace from being trimmed, and -r prevents backslash interpretation. This is the standard idiom for processing a file line by line; note that redirecting the file into the loop (rather than piping cat file into it) keeps the loop in the current shell, so variables set inside the loop remain visible after it finishes.
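A self-contained sketch of the idiom, using a here-document in place of file.txt:

```shell
#!/usr/bin/env bash
count=0
while IFS= read -r line; do
  count=$((count + 1))
  printf '%d: %s\n' "$count" "$line"   # leading whitespace survives intact
done <<'EOF'
first line
  indented line
EOF

echo "total: $count"   # prints "total: 2" -- the loop ran in this shell
```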

When should I use awk vs sed for text processing?

Use sed for simple find-and-replace operations (s/old/new/g) and line-range extraction. Use awk when you need to split lines into fields and perform column-based operations, arithmetic, or conditional logic. awk is essentially a small programming language for structured text data, while sed is a stream editor optimized for pattern-based transformations.
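The division of labor in miniature, on a two-line sample file (the path and data are made up for the sketch):

```shell
#!/usr/bin/env bash
printf 'alice 42\nbob 7\n' > /tmp/scores.txt

# sed: pattern-based substitution on a stream
sed 's/alice/ALICE/' /tmp/scores.txt    # prints "ALICE 42" then "bob 7"

# awk: split each line into fields and do arithmetic on column 2
awk '{ sum += $2 } END { print "sum:", sum }' /tmp/scores.txt   # prints "sum: 49"
```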

How do I pass the output of one command as arguments to another?

Use xargs to convert standard input into command arguments. For example, find . -name "*.log" | xargs grep "ERROR" searches all log files for the word ERROR. Use -n 1 to pass one argument at a time, or -0 with find -print0 to handle filenames containing spaces.
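A sketch of the NUL-delimited variant, which is the safe default (the directory is a throwaway created by mktemp):

```shell
#!/usr/bin/env bash
dir="$(mktemp -d)"
touch "$dir/a.log" "$dir/with space.log"

# -print0 and -0 delimit names with NUL bytes, so spaces never split an
# argument; -n 1 invokes basename once per file.
find "$dir" -name '*.log' -print0 | xargs -0 -n 1 basename
```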

What is the difference between $@ and $* in a function?

When quoted, "$@" expands each positional parameter as a separate word, preserving arguments with spaces. "$*" expands all parameters as a single word, joined by the first character of IFS (usually a space). In virtually all cases, "$@" is what you want when passing arguments to another command.
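Counting words makes the difference visible; a minimal sketch with hypothetical helper names:

```shell
#!/usr/bin/env bash
count_words() { echo $#; }   # reports how many words it received

demo() {
  count_words "$@"   # each parameter stays a separate word
  count_words "$*"   # all parameters joined into one word
}

demo "one arg" two   # prints 2, then 1
```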

How do I handle errors gracefully with trap?

trap allows you to run a cleanup function when the script receives a signal or exits. trap cleanup EXIT runs cleanup on normal exit and most termination signals. trap "echo Interrupted" INT catches Ctrl+C. This is essential for removing temporary files, releasing locks, or printing error messages before the script terminates.
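A sketch of the cleanup pattern (the temp-file workflow is illustrative):

```shell
#!/usr/bin/env bash
workfile="$(mktemp)"

cleanup() {
  rm -f "$workfile"                       # remove the temp file no matter what
}
trap cleanup EXIT                         # runs on normal exit and fatal signals
trap 'echo "Interrupted"; exit 130' INT   # Ctrl+C: report, then exit

echo "working with $workfile"
# ... real work here; cleanup fires automatically when the script ends ...
```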