The Script Vault

Production-ready snippets. Don't write it if you can steal it.

Find Large Files (>100MB) BASH
# Scan the whole filesystem; 2>/dev/null hides "Permission denied" noise when not running as root
find / -type f -size +100M -exec ls -lh {} \; 2>/dev/null | awk '{ print $9 ": " $5 }'
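If you want the biggest offenders first, a sorted variant works too. A sketch using GNU du and sort; the 20-line cutoff is arbitrary, adjust to taste.
# Largest files first, human-readable sizes, top 20
find / -type f -size +100M -exec du -h {} + 2>/dev/null | sort -rh | head -n 20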
Simple HTTP Server PYTHON
# Run in any directory to serve files
python3 -m http.server 8000
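If you need to serve a specific directory or keep the server off the network, the module also takes --bind and --directory (the latter needs Python 3.7+). The path below is a placeholder.
# Serve /srv/files on localhost only (replace the path with your own)
python3 -m http.server 8000 --bind 127.0.0.1 --directory /srv/files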
List All S3 Buckets and Sizes AWS CLI
# Recursive listing is slow (and billed per LIST request) on buckets with many objects
aws s3 ls | awk '{print $3}' | xargs -I {} sh -c 'echo {}; aws s3 ls --summarize --human-readable --recursive s3://{} | grep "Total Size"'
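A plain loop over aws s3api list-buckets does the same job without the awk column-picking and also keeps the object count. Treat it as a sketch with the same caveat: recursive listing is slow on big buckets.
# Print name, total size, and object count for each bucket
for bucket in $(aws s3api list-buckets --query 'Buckets[].Name' --output text); do
  echo "== $bucket"
  aws s3 ls --summarize --human-readable --recursive "s3://$bucket" | tail -n 2
done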
Find Duplicate Rows SQL
SELECT email, COUNT(*)
FROM users
GROUP BY email
HAVING COUNT(*) > 1;
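Once the duplicates are found, the usual next step is to keep one row per email and delete the rest. The sketch below assumes users has a unique id column (not shown above); the derived-table wrapper is there so it also runs on MySQL, which otherwise refuses to read from the table it is deleting from.
-- Keep the lowest id per email, delete the other copies (assumes a unique "id" column)
DELETE FROM users
WHERE id NOT IN (
    SELECT keep_id FROM (
        SELECT MIN(id) AS keep_id
        FROM users
        GROUP BY email
    ) AS keepers
);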
Monitor Log File (Real-time with Search) BASH
# Watch log and highlight "ERROR"
tail -f /var/log/syslog | grep --line-buffered --color=auto "ERROR"
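Two variants worth keeping nearby, assuming GNU grep and, for the second, a systemd host: match several severities at once, or follow the journal instead of the file.
# Case-insensitive match on several severities
tail -f /var/log/syslog | grep --line-buffered --color=auto -Ei "error|warn|crit"
# On systemd hosts, skip the file and follow the journal at warning priority or above
journalctl -f -p warning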