Most SEO tools stop when a website times out. Link Hunter goes deeper. This Python tool is designed for developers who want to build a powerful expired domain strategy. It recursively scans resource lists (like "Awesome" lists), handles connection timeouts intelligently, and verifies DNS records to find domains that are truly available.
- Smart Timeout Analysis: Distinguishes between "Server Down" (useless) and "NXDOMAIN" (No IP = Potentially Free).
- Deep Scan Mode: Automatically detects Hubs (like GitHub Repositories) and dives one level deeper to find hidden links in READMEs.
- Anti-Blocking: Rotates User-Agents and uses exponential backoff retries to mimic real browser behavior.
- Internal Filter: Automatically ignores internal links to keep your results clean.
- GitHub Actions Ready: Runs on a schedule for free, alerting you via Issues when gems are found.
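The anti-blocking behaviour above can be sketched with the standard library alone. This is a minimal illustration, not the tool's actual implementation: `USER_AGENTS`, `backoff_delays`, and `fetch_with_backoff` are hypothetical names, and the User-Agent strings are placeholder values.

```python
import random
import time
import urllib.request

# Illustrative pool of desktop User-Agent strings to rotate through.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
    "Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/125.0",
]

def backoff_delays(base=1.0, retries=3):
    """Exponential backoff: base * 2**attempt seconds between tries."""
    return [base * 2 ** attempt for attempt in range(retries)]

def fetch_with_backoff(url, retries=3, base=1.0):
    """Fetch a URL, rotating User-Agents and backing off on failure."""
    for attempt, delay in enumerate(backoff_delays(base, retries)):
        req = urllib.request.Request(
            url, headers={"User-Agent": random.choice(USER_AGENTS)}
        )
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.read()
        except OSError:
            if attempt < retries - 1:
                time.sleep(delay)  # wait before retrying
    return None  # all retries exhausted
```

With `base=1.0` and three retries, the waits between attempts are 1 s, then 2 s (no sleep after the final failure).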
Getting started:

- Add Targets: put the URLs you want to scan into `targets.txt`.
- Run: `python main.py`
- Check Results: findings are written to `results.md`, each tagged with a reason code:
  - `DNS_MISSING`: the domain has no IP address (high probability it is available).
  - `WHOIS_FREE`: the domain still has DNS records but no active WHOIS registration.
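The core of the timeout analysis is a plain DNS lookup: a host that is merely down still resolves to an IP, while a truly gone domain raises a resolver error. A minimal sketch (the function name `classify_domain` is illustrative, and a production check would also distinguish NXDOMAIN from temporary resolver failures):

```python
import socket

def classify_domain(domain):
    """Return a reason-code-style verdict from a bare DNS lookup."""
    try:
        socket.getaddrinfo(domain, None)
        return "RESOLVES"      # has an IP; a WHOIS check decides WHOIS_FREE
    except socket.gaierror:
        return "DNS_MISSING"   # no A/AAAA record: likely NXDOMAIN, maybe free
```

The reserved `.invalid` TLD, for example, never resolves, so `classify_domain("example.invalid")` yields `"DNS_MISSING"`.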
Install the dependencies first:

```
pip install -r requirements.txt
```

```python
# Enable debug output to log every request
hunter = ExpiredLinkHunter(debug=True)

# Set the Deep Scan depth (1 = scan the target plus linked GitHub repos)
hunter = ExpiredLinkHunter(max_depth=1)
```

This project is open source and available under the MIT License.
If you use this script in a commercial project or content, please include a link back to the original article:
Disclaimer: This tool is for educational purposes only. Use it responsibly. The author is not responsible for blocked IP addresses or misuse.