Tools: Linux Command Line: The Complete Cheat Sheet
Navigation & Directory Management
File & Directory Operations
Viewing & Editing Files
Text Search & Processing
File Permissions & Ownership
Permission Reference
Process Management
Signal Reference
Disk & Filesystem
Networking
Users & Groups
Package Management
apt (Debian/Ubuntu)
yum / dnf (RHEL/CentOS/Fedora)
Compression & Archives
Redirection & Pipes
Scheduling & Automation (Cron)
Terminal Shortcuts
SSH Key Management
System Services (systemd)
Common Gotchas
Pro Tips

If you work in tech, you will use the Linux command line; it's not optional. Whether you're SSH-ing into a server, debugging a Docker container, or just trying to find that one log file buried three directories deep, these commands are the bedrock of your workflow. I've distilled 100+ essential commands into this single reference. Print it, tape it to your monitor, and watch your Stack Overflow searches drop by half.
Navigation & Directory Management

pwd # Print working directory
cd ~ # Go to home directory
cd - # Go to previous directory
cd ../.. # Go up two levels
ls # List files
ls -la # List all files (including hidden) with details
ls -lh # Human-readable file sizes
ls -lt # Sort by modification time (newest first)
tree # Visual directory tree
tree -L 2 # Tree limited to 2 levels deep
pushd /var/log # Push directory onto stack & cd into it
popd # Pop directory off stack & return
File & Directory Operations

touch file.txt # Create empty file / update timestamp
mkdir -p a/b/c # Create nested directories
cp file.txt backup.txt # Copy file
cp -r dir/ newdir/ # Copy directory recursively
mv file.txt /tmp/ # Move file
mv old.txt new.txt # Rename file
rm file.txt # Delete file
rm -rf dir/ # Force-delete directory recursively
ln -s /path/to/file link # Create symbolic link
find . -name "*.log" # Find files by name
find . -mtime -7 # Files modified in last 7 days
find . -size +100M # Files larger than 100MB
find . -name "*.tmp" -delete # Find and delete
locate filename # Fast file search (uses DB)
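A caveat on piping `find` into other tools: filenames containing spaces break word-splitting. GNU find's `-print0` paired with `xargs -0` delimits names with NUL bytes instead, which is safe. A minimal sketch using a scratch directory (paths are illustrative):

```shell
# Scratch directory with an awkwardly named file
tmp=$(mktemp -d)
touch "$tmp/app log.tmp" "$tmp/plain.tmp"

# NUL-delimited pipeline: safe for spaces and newlines in names
find "$tmp" -name "*.tmp" -print0 | xargs -0 rm -f

ls -A "$tmp"      # prints nothing: both files were deleted
rmdir "$tmp"
```

When all you need is deletion, `find ... -delete` (shown above) sidesteps the pipeline entirely.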
Viewing & Editing Files

cat file.txt # Print file to stdout
cat -n file.txt # Print with line numbers
less file.txt # Paginate file (q to quit)
head -n 20 file.txt # First 20 lines
tail -n 50 file.txt # Last 50 lines
tail -f /var/log/syslog # Follow file in real-time
wc -l file.txt # Count lines
diff -u file1 file2 # Unified diff (patch-friendly)
Text Search & Processing

# grep
grep "pattern" file.txt # Search for pattern
grep -i "pattern" file.txt # Case-insensitive
grep -r "pattern" ./dir/ # Recursive search
grep -n "pattern" file.txt # Show line numbers
grep -v "pattern" file.txt # Invert match (exclude)
grep -c "pattern" file.txt # Count matching lines
grep -A 3 "pattern" file.txt # 3 lines after match
grep -E "pat1|pat2" file.txt # Extended regex (OR)

# sed (Stream Editor)
sed 's/old/new/' file.txt # Replace first occurrence per line
sed 's/old/new/g' file.txt # Replace all occurrences
sed -i 's/old/new/g' file.txt # Edit file in-place
sed -n '5,10p' file.txt # Print lines 5-10
sed '/pattern/d' file.txt # Delete lines matching pattern

# awk
awk '{print $1}' file.txt # Print first column
awk -F: '{print $1}' /etc/passwd # Use : as delimiter
awk '{sum+=$1} END{print sum}' # Sum first column

# Other text tools
cut -d: -f1 /etc/passwd # Cut first field using : delimiter
sort file.txt # Sort alphabetically
sort -n numbers.txt # Sort numerically
sort -u file.txt # Sort and remove duplicates
uniq -c file.txt # Count duplicate lines
tr 'a-z' 'A-Z' < file.txt # Translate lowercase to uppercase
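The text tools above compose into quick frequency reports. Note that `uniq -c` only collapses adjacent duplicates, which is why a `sort` must come first; the input below is illustrative:

```shell
# Count occurrences of each line, most frequent first.
# sort groups identical lines so uniq -c can count them;
# sort -rn then orders numerically by count, descending.
printf 'error\ninfo\nerror\nwarn\nerror\n' | sort | uniq -c | sort -rn
# top line: "3 error"
```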
File Permissions & Ownership

chmod 755 file # rwxr-xr-x
chmod 644 file # rw-r--r--
chmod +x script.sh # Add execute permission
chmod -R 755 dir/ # Apply recursively
chown user:group file # Change owner and group
chown -R user:group dir/ # Recursive ownership change
umask 022 # Default mask (files 644, dirs 755)
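The `umask 022` comment can be verified directly: the mask clears bits from the creation defaults (666 for files, 777 for directories), so 022 yields 644 and 755. A sketch using a scratch directory (GNU `stat` flags assumed; macOS uses `stat -f '%Lp'`):

```shell
tmp=$(mktemp -d)

# Subshell keeps the umask change from leaking into your session
( umask 022; touch "$tmp/f"; mkdir "$tmp/d" )

stat -c '%a %n' "$tmp/f" "$tmp/d"   # 644 for the file, 755 for the directory
rm -r "$tmp"
```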
Process Management

ps aux # All running processes
ps aux | grep nginx # Find process by name
pgrep nginx # Get PID by process name
top # Interactive process viewer
htop # Better interactive viewer
kill PID # Send SIGTERM (graceful stop)
kill -9 PID # Send SIGKILL (force stop)
killall nginx # Kill all processes named nginx
bg # Resume job in background
fg # Bring background job to foreground
jobs # List background jobs
nohup command & # Run immune to hangups
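The SIGTERM/SIGKILL distinction is easy to demonstrate in a shell: `kill` with no flag sends SIGTERM, and `kill -0` probes whether a PID still exists without sending any signal. A sketch:

```shell
sleep 300 &                        # long-running background job
pid=$!

kill "$pid"                        # default signal is SIGTERM (15)
wait "$pid" 2>/dev/null || true    # reap it; wait reports the signal as a nonzero status

# kill -0 delivers no signal; it only checks that the PID exists
kill -0 "$pid" 2>/dev/null && echo "still running" || echo "gone"
# prints: gone
```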
Disk & Filesystem

df -h # Disk space usage (human-readable)
du -sh dir/ # Size of directory
du -sh * | sort -h # All items sorted by size
lsblk # List block devices as tree
mount # Show all mounted filesystems
stat file.txt # Detailed file metadata
file mystery.bin # Identify file type
Networking

# Connectivity
ping -c 4 google.com # Ping exactly 4 times
traceroute google.com # Trace network path
curl https://example.com # Fetch URL content
curl -I https://example.com # Fetch headers only
curl -o file.zip URL # Download to file
wget URL # Download file

# Network info
ip addr # Show IP addresses (modern)
ip route # Show routing table
ss -tuln # Show listening ports (modern)
ss -tulnp # Include process names

# DNS
nslookup domain.com # DNS lookup
dig domain.com # Detailed DNS lookup

# SSH & Remote
ssh user@host # Connect to host
ssh -p 2222 user@host # Connect on custom port
scp file.txt user@host:/path # Copy file to remote
rsync -avz src/ user@host:dst/ # Sync files (fast, resumable)
Users & Groups

whoami # Current username
id # User ID, group ID, groups
useradd -m -s /bin/bash username # Create user with home dir
usermod -aG docker username # Add user to group
passwd username # Change user password
groupadd developers # Create group
sudo command # Run as root
sudo -i # Interactive root shell
Package Management

# apt (Debian/Ubuntu)
apt update # Refresh package index
apt upgrade # Upgrade all packages
apt install nginx # Install package
apt remove nginx # Remove package
apt search keyword # Search for package
# yum / dnf (RHEL/CentOS/Fedora)
dnf update # Update all packages
dnf install nginx # Install package
dnf remove nginx # Remove package
Compression & Archives

tar -czvf archive.tar.gz dir/ # Create gzip-compressed tar
tar -xzvf archive.tar.gz # Extract gzip tar
tar -tf archive.tar # List contents without extracting
tar -xzvf archive.tar.gz -C /tmp/ # Extract to specific directory
zip -r archive.zip dir/ # Zip directory
unzip archive.zip # Extract zip
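A quick way to convince yourself the tar flags are right is a round trip: archive a directory, extract it somewhere else, and diff the two trees (paths are illustrative):

```shell
src=$(mktemp -d); dst=$(mktemp -d)
echo "hello" > "$src/a.txt"

tar -czf "$src.tar.gz" -C "$src" .   # -C archives contents relative to $src
tar -xzf "$src.tar.gz" -C "$dst"     # extract into a different directory

diff -r "$src" "$dst" && echo "identical"
rm -r "$src" "$dst" "$src.tar.gz"
```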
Redirection & Pipes

command > file.txt # Redirect stdout (overwrite)
command >> file.txt # Redirect stdout (append)
command 2> error.log # Redirect stderr
command 2>&1 # Merge stderr into stdout
command &> file.txt # Redirect both stdout + stderr
command1 | command2 # Pipe stdout to next command
tee file.txt # Write to file AND stdout

# Useful pipe combos
ps aux | grep nginx | grep -v grep
find . -name "*.log" | xargs rm -f
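Order matters when combining redirections: `> file 2>&1` duplicates stderr after stdout already points at the file, so both streams land there; `2>&1 > file` duplicates stderr to the terminal first and only then moves stdout. A sketch with a hypothetical helper function:

```shell
both() { echo "out"; echo "err" >&2; }   # writes one line to each stream

both > log.txt 2>&1    # stdout -> file, then stderr -> (the file)
cat log.txt            # contains both "out" and "err"

both 2>&1 > log2.txt   # stderr -> (terminal), then stdout -> file
cat log2.txt           # contains only "out"; "err" went to the terminal
rm log.txt log2.txt
```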
Scheduling & Automation (Cron)

crontab -e # Edit crontab
crontab -l # List crontab

# Cron syntax: minute hour day month weekday command
0 2 * * * /backup.sh # Daily at 2:00 AM
*/15 * * * * /check.sh # Every 15 minutes
0 9 * * 1-5 /standup.sh # Weekdays at 9 AM
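Cron runs jobs with a minimal environment and, by default, mails any output instead of logging it; redirecting inside the entry keeps the output findable. A sketch (script and log paths are illustrative):

```
# m h dom mon dow  command
0 2 * * * /backup.sh >> /var/log/backup.log 2>&1
```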
SSH Key Management

ssh-keygen -t ed25519 -C "[email protected]" # Generate key (recommended)
ssh-copy-id user@host # Copy public key to remote
eval "$(ssh-agent -s)" # Start agent
ssh-add ~/.ssh/id_ed25519 # Add key to agent
System Services (systemd)

systemctl start nginx # Start service
systemctl stop nginx # Stop service
systemctl restart nginx # Restart service
systemctl enable nginx # Start on boot
systemctl status nginx # Show service status
journalctl -u nginx -f # Follow service logs
Permission Reference

- 7 (rwx) -- Read + Write + Execute
- 6 (rw-) -- Read + Write
- 5 (r-x) -- Read + Execute
- 4 (r--) -- Read only
- 0 (---) -- No permissions

Signal Reference

- SIGTERM (15) -- Graceful shutdown
- SIGKILL (9) -- Force kill (can't catch)
- SIGHUP (1) -- Reload config
- SIGINT (2) -- Ctrl+C

Terminal Shortcuts

- Ctrl+C -- Kill current process
- Ctrl+Z -- Suspend current process
- Ctrl+D -- Exit shell / EOF
- Ctrl+L -- Clear screen
- Ctrl+R -- Search command history (reverse)
- !! -- Repeat last command
- sudo !! -- Re-run last command with sudo

Common Gotchas

- rm -rf has no recycle bin -- it's permanent. Always double-check before running.
- > overwrites, >> appends -- easy to lose data with the wrong one.
- Spaces in filenames need quoting -- rm -rf My Documents removes "My" AND "Documents".
- chmod -R 777 is almost always wrong -- massive security risk.
- kill -9 should be a last resort -- data may not be flushed. Try kill -15 (SIGTERM) first.

Pro Tips

- sudo !! -- Run last command as sudo
- touch file{1..5}.txt -- Create multiple files at once
- cmd1 && cmd2 -- Run cmd2 only if cmd1 succeeds
- python3 -m http.server 8080 -- Quick HTTP server in current directory
- watch -n 2 df -h -- Watch a command run every 2 seconds
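The `cmd1 && cmd2` tip hinges on exit status: `&&` runs the second command only if the first succeeds, `||` only if it fails, and `;` runs it unconditionally. A sketch, reusing the brace-expansion tip (brace expansion needs bash or zsh):

```shell
false && echo "on success"    # prints nothing: false failed
false || echo "on failure"    # prints: on failure
false ; echo "regardless"     # prints: regardless

touch file{1..3}.txt && ls file?.txt   # file1.txt file2.txt file3.txt
rm file{1..3}.txt
```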