13.1. Network Commands: Ping, Curl, and Wget
13.1.1. Common Pitfalls
1. Not handling network timeouts
# Bad: Hangs indefinitely on network issue
curl https://slow-server.com
# Good: Set explicit timeouts
curl --max-time 10 https://slow-server.com
curl --connect-timeout 5 --max-time 10 https://slow-server.com
2. Assuming success without checking exit codes
# Bad: Proceeds even if download fails
curl -O https://example.com/file.tar.gz
tar -xzf file.tar.gz # May extract invalid file
# Good: Check for errors
if curl -f -O https://example.com/file.tar.gz; then
    tar -xzf file.tar.gz
else
    echo "Download failed" >&2
    exit 1
fi
3. Not handling URLs with special characters
# Bad: URL with spaces or special chars breaks
curl https://example.com/search?q=hello world
# Good: URL encode or quote properly
curl "https://example.com/search?q=hello%20world"
curl -G --data-urlencode "q=hello world" https://example.com/search
# -G sends the encoded data as a GET query string; without it,
# --data-urlencode turns the request into a POST
4. Exposing sensitive data in commands
# Bad: the shell expands $API_KEY into the argument list, so the key
# is visible in the process list (ps)
curl -H "Authorization: $API_KEY" https://api.example.com
# Better: read headers from a file, keeping the key off the command line
# (curl 7.55.0 or later)
curl -H @headers.txt https://api.example.com
# An environment variable keeps the literal key out of shell history,
# but once expanded it still appears in the process list:
export AUTH_HEADER="Authorization: Bearer $API_KEY"
curl -H "$AUTH_HEADER" https://api.example.com
5. Not handling partial downloads
# Bad: Restarting an interrupted download from scratch
wget https://example.com/file.iso # Interrupted
wget https://example.com/file.iso # Starts over from the beginning
# Good: Resume with -c flag
wget -c https://example.com/file.iso # Resumes from the last byte received
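Pitfalls 1 and 2 (timeouts and exit-code checks) can be rolled into one defensive wrapper. This is a sketch; the timeout values are arbitrary examples, not recommendations:

```shell
#!/bin/bash
# Sketch: a download wrapper applying explicit timeouts and error checks.
# Timeout values here are arbitrary example choices.
safe_download() {
    local url=$1
    local out=$2
    # -f: fail on HTTP errors; -sS: silent, but keep error messages
    if curl -f -sS --connect-timeout 5 --max-time 60 -o "$out" "$url"; then
        echo "Downloaded: $out"
    else
        echo "Download failed: $url" >&2
        return 1
    fi
}
# Example: safe_download https://example.com/file.tar.gz file.tar.gz
```

For resumable large-file downloads (pitfall 5), curl's counterpart to wget -c is -C -.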
13.1.2. Bulk Downloads with Wget
The wget command is designed for recursive downloads and batch file retrieval.
13.1.2.1. Basic Wget Usage
#!/bin/bash
# Simple download
wget https://example.com/file.tar.gz
# Save with different name
wget -O archive.tar.gz https://example.com/file.tar.gz
# Resume partial download
wget -c https://example.com/large-file.iso
# Download to specific directory
wget -P /tmp/downloads https://example.com/file.zip
# Set timeout
wget --timeout=10 https://example.com/file.tar.gz
# Limit bandwidth
wget --limit-rate=100k https://example.com/large-file.iso
# Download multiple files
wget https://example.com/file1.txt https://example.com/file2.txt
13.1.2.2. Recursive and Batch Downloads
#!/bin/bash
# Recursive download (mirror part of a site)
wget -r https://example.com/docs/
# Useful companions to -r:
# -l N: limit recursion depth (-l 1 for one level only)
# -p: also fetch page requisites (CSS, images, scripts)
# Mirror with restrictions
wget -m -l 2 -p -E -k https://example.com
# -m: mirror mode
# -l 2: max depth 2
# -p: page requisites
# -E: adjust filenames for HTML
# -k: convert links for offline browsing
# Download list from file
wget -i urls.txt
# Background download
wget -b https://example.com/large-file.iso
# Check progress:
tail -f wget-log
# Reject certain file types
wget -r --reject=.jpg,.png https://example.com/docs/
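The --reject filter applies while wget crawls; for list-driven downloads with -i, the same effect can be had by filtering the list up front. A sketch with placeholder URLs and extensions:

```shell
#!/bin/bash
# Sketch: pre-filter a URL list before handing it to wget -i.
# The URLs and extensions below are placeholder examples.
printf '%s\n' \
    https://example.com/docs/index.html \
    https://example.com/docs/logo.png \
    https://example.com/docs/guide.html > urls.txt
grep -Ev '\.(jpg|png)$' urls.txt > urls.filtered.txt
# Then: wget -i urls.filtered.txt
```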
13.1.2.3. Batch Download Script
#!/bin/bash
# Download multiple versions of a file
download_versions() {
    local base_url=$1
    local pattern=$2
    local output_dir=${3:-.}

    mkdir -p "$output_dir"

    for version in {1..10}; do
        local url="${base_url}/${pattern/VERSION/$version}"
        local filename="$output_dir/file_v$version.tar.gz"
        echo "Downloading: $url"
        if wget -q -O "$filename" "$url"; then
            echo "✓ Downloaded: $filename"
        else
            echo "✗ Failed: $url"
            rm -f "$filename" # wget -O creates the file even when the download fails
        fi
    done
}

download_versions "https://releases.example.com" "app-VERSION.tar.gz" "/tmp/releases"
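An alternative to looping over wget is to generate the URL list once and hand it to wget -i. A sketch using the same hypothetical release URLs as above:

```shell
#!/bin/bash
# Sketch: build a version URL list for wget -i.
# The base URL and version range are the hypothetical examples from above.
printf 'https://releases.example.com/app-%s.tar.gz\n' {1..10} > urls.txt
# Then download them all in one invocation:
# wget -i urls.txt -P /tmp/releases
```

printf repeats its format string for every argument, so the brace expansion yields one line per version.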
13.1.3. Data Transfer with Curl
The curl command is a versatile tool for transferring data using URLs, supporting HTTP, HTTPS, FTP, and more.
13.1.3.1. Basic Curl Operations
#!/bin/bash
# Simple GET request
curl https://api.github.com/users/torvalds
# Save output to file
curl https://example.com > page.html
curl -o page.html https://example.com
curl -O https://example.com/file.tar.gz # Keep original filename
# Follow redirects
curl -L https://short.url
# POST request with form data (-d implies POST, so -X POST is redundant)
curl -d "name=value&key=data" https://example.com/api
# JSON POST request
curl -H "Content-Type: application/json" \
    -d '{"name":"John","age":30}' \
    https://api.example.com/users
# Include response headers
curl -i https://example.com # Show headers and body
curl -I https://example.com # Headers only
# Set custom headers
curl -H "Authorization: Bearer TOKEN" https://api.example.com
curl -H "User-Agent: MyScript/1.0" https://example.com
# Set timeout
curl --max-time 5 https://example.com
curl --connect-timeout 3 https://example.com
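Beyond the response itself, curl's -w option prints transfer metadata after the request completes; %{http_code} and %{time_total} are two of its built-in variables. A sketch (the probe name is ours, not curl's):

```shell
#!/bin/bash
# Sketch: report status code and total transfer time for a URL.
# %{http_code} and %{time_total} are curl's built-in -w variables;
# %{http_code} reads 000 for failed or non-HTTP transfers.
probe() {
    # -s: silent; -o /dev/null: discard body; -w: print metadata
    curl -s -o /dev/null -w 'code=%{http_code} time=%{time_total}s\n' "$1"
}
# Example: probe https://example.com
```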
13.1.3.2. Practical API Usage
#!/bin/bash
# Check API health
check_api_health() {
    local url=$1
    local expected_code=${2:-200}

    response=$(curl -s -o /dev/null -w "%{http_code}" "$url")

    if [[ $response -eq $expected_code ]]; then
        echo "✓ API healthy (HTTP $response)"
        return 0
    else
        echo "✗ API unhealthy (HTTP $response)"
        return 1
    fi
}

check_api_health "https://api.github.com"

# Fetch and parse JSON
get_github_user_info() {
    local username=$1
    curl -s "https://api.github.com/users/$username" | \
        jq '{name: .name, repos: .public_repos, followers: .followers}'
}

get_github_user_info "torvalds"
# Retry with exponential backoff
curl_with_retry() {
    local url=$1
    local max_attempts=3
    local attempt=1
    local output

    while [[ $attempt -le $max_attempts ]]; do
        # Capture the body from the same request used for the check,
        # rather than fetching the URL a second time on success
        if output=$(curl -s -f "$url"); then
            printf '%s\n' "$output"
            return 0
        fi
        echo "Attempt $attempt failed, retrying..." >&2
        sleep $((2 ** (attempt - 1)))
        ((attempt++))
    done
    return 1
}
13.1.4. Testing Network Connectivity with Ping
The ping command tests reachability and measures round-trip time to a host.
13.1.4.1. Basic Ping Usage
#!/bin/bash
# Simple ping test
ping -c 4 google.com # -c: count (stop after 4 packets)
# Ping with timeout
ping -c 1 -W 2 example.com # -W: reply timeout in seconds (Linux iputils; the BSD/macOS flag differs)
# Check if host is reachable
if ping -c 1 -W 2 8.8.8.8 > /dev/null 2>&1; then
    echo "Host is reachable"
else
    echo "Host is unreachable"
fi
# Measure latency
response_time=$(ping -c 1 google.com | grep 'time=' | awk -F'time=' '{print $2}' | awk '{print $1}')
echo "Response time: $response_time ms"
# Continuous ping (stop with Ctrl+C)
ping google.com
# Ping with packet size
ping -c 4 -s 1024 google.com # -s: packet size in bytes
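The summary line ping prints on exit can be parsed the same way as the latency above; packet loss is the other metric worth extracting. A sketch assuming the Linux iputils summary format ("N packets transmitted, N received, X% packet loss, ..."):

```shell
#!/bin/bash
# Sketch: extract the packet-loss percentage from ping's summary line.
# Assumes the Linux iputils wording; macOS/BSD phrase the line differently.
packet_loss() {
    grep -o '[0-9.]*% packet loss' | cut -d'%' -f1
}
# Example: ping -c 4 google.com | packet_loss
```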
13.1.4.2. Practical Network Diagnostics
#!/bin/bash
# Check multiple hosts
declare -a hosts=("8.8.8.8" "1.1.1.1" "208.67.222.222")
for host in "${hosts[@]}"; do
    if ping -c 1 -W 2 "$host" > /dev/null 2>&1; then
        echo "✓ $host - reachable"
    else
        echo "✗ $host - unreachable"
    fi
done
# Monitor connectivity with retry
retry_ping() {
    local host=$1
    local max_retries=3
    local retry=0

    while [[ $retry -lt $max_retries ]]; do
        if ping -c 1 -W 2 "$host" > /dev/null 2>&1; then
            echo "$host is reachable"
            return 0
        fi
        echo "Attempt $((retry + 1))/$max_retries failed"
        ((retry++))
        sleep 2
    done

    echo "$host is unreachable after $max_retries attempts"
    return 1
}

retry_ping google.com