InnerLink Spider is a focused internal link analysis tool for SEO professionals, growth teams, and technical marketers. It analyzes a website URL or sitemap to uncover linking gaps that strengthen site architecture, reinforce topical clusters, and improve crawl paths. The output is execution-ready: source pages, target pages, suggested anchor text, and exportable CSV recommendations.
Most internal linking workflows are still buried inside technical crawlers, scripts, or general-purpose SEO platforms. InnerLink Spider stands out by delivering a dedicated, browser-based experience for internal link analysis, making it easier to demo, review, and act on opportunities without relying on terminal-heavy workflows.
- Built for the web — not just for power users comfortable with CLI tools
- Single high-value workflow — focused entirely on finding internal links worth adding
- Action-ready output — produces recommendations instead of raw crawl data alone
- Fast to evaluate — ideal for client work, content audits, and portfolio demonstrations
- URL and Sitemap Analysis — Start with a domain homepage or sitemap to kick off internal link discovery
- Link Opportunity Detection — Surfaces pages that should link to one another based on relevance and site structure
- Anchor Text Suggestions — Recommends source pages, target pages, and suggested anchor phrases
- CSV Export — Downloads opportunities in a handoff-friendly format for implementation
- Demo Mode — Makes the product easy to evaluate with realistic sample data and polished UX
- Clean Web UI — Presents findings in an approachable interface designed for marketers and SEO operators
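To make the CSV Export feature concrete, here is a minimal sketch of how link opportunities could be serialized into a handoff-friendly CSV string. The `Opportunity` shape, `toCsv` name, and column headers are illustrative assumptions, not the tool's actual API:

```typescript
// Illustrative shape for a single recommendation (assumed, not the real API).
interface Opportunity {
  sourcePage: string;
  targetPage: string;
  anchorText: string;
}

// Serialize opportunities to CSV, quoting fields and escaping embedded
// quotes by doubling them (per RFC 4180).
function toCsv(rows: Opportunity[]): string {
  const escape = (v: string) => `"${v.replace(/"/g, '""')}"`;
  const header = ["source_page", "target_page", "anchor_text"].join(",");
  const lines = rows.map((r) =>
    [r.sourcePage, r.targetPage, r.anchorText].map(escape).join(",")
  );
  return [header, ...lines].join("\n");
}
```

Quoting every field keeps the output safe for anchor text that contains commas or quotation marks, which is common in suggested anchors.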
The current public demo returns representative sample opportunities, but the intended internal link discovery workflow follows a straightforward analysis pipeline:
- Crawl and collect pages — Start from a homepage or sitemap, discover crawlable internal URLs, and normalize them to remove duplicates caused by protocol, trailing slash, or path variations.
- Build the internal link graph — Parse each page, extract its outgoing internal links, and map the site as a graph so the tool can understand which pages are connected, isolated, or underlinked.
- Identify gaps and opportunities — Compare page connectivity with page topics and on-page mentions to flag orphan-page candidates, weak content clusters, and source pages that mention concepts already covered elsewhere on the site but do not link to them.
- Recommend implementation-ready links — Generate suggested source-target pairs, anchor text, and contextual snippets so teams can prioritize the highest-value internal links and export the recommendations as CSV.
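The normalization and gap-detection steps above can be sketched in TypeScript. All names here (`normalizeUrl`, `findLinkGaps`, the `Page` shape) are assumptions for illustration, not the project's actual exports:

```typescript
// Step 1 (normalization): collapse protocol and trailing-slash variants
// into one canonical key so duplicates are not counted as separate pages.
function normalizeUrl(raw: string): string {
  const u = new URL(raw);
  const path = u.pathname.replace(/\/+$/, "") || "/";
  return `${u.hostname}${path}`;
}

// Assumed page model: normalized URL, outgoing internal links,
// concepts mentioned on the page, and the primary concept it covers.
interface Page {
  url: string;
  outgoingLinks: Set<string>;
  mentions: Set<string>;
  topic: string;
}

interface LinkOpportunity {
  sourceUrl: string;
  targetUrl: string;
  suggestedAnchor: string;
}

// Step 3 (gap detection): flag source pages that mention a concept
// another page covers but never link to that page.
function findLinkGaps(pages: Page[]): LinkOpportunity[] {
  const byTopic = new Map<string, Page>();
  for (const p of pages) byTopic.set(p.topic, p);

  const opportunities: LinkOpportunity[] = [];
  for (const source of pages) {
    for (const mention of source.mentions) {
      const target = byTopic.get(mention);
      if (
        target &&
        target.url !== source.url &&
        !source.outgoingLinks.has(target.url)
      ) {
        opportunities.push({
          sourceUrl: source.url,
          targetUrl: target.url,
          suggestedAnchor: mention,
        });
      }
    }
  }
  return opportunities;
}
```

The mention itself doubles as the suggested anchor text, which mirrors the tool's goal of producing implementation-ready source–target pairs rather than raw crawl data.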
| Layer | Technology |
|---|---|
| Framework | Next.js 14 (App Router) |
| Language | TypeScript 5 |
| Styling | Tailwind CSS 3 |
- Node.js 18.17 or later
- npm 9 or later
```bash
git clone https://github.com/seankrux/innerlink-spider.git
cd innerlink-spider
npm install
npm run dev
```

Open http://localhost:3000 in your browser.
```bash
npm run build
npm start
```

```
src/
  app/
    layout.tsx        # Root layout with metadata
    page.tsx          # Main page
    globals.css       # Global styles
  components/
    LinkFinder.tsx    # Core analysis UI
  lib/
    linkAnalyzer.ts   # Link analysis logic
```
```bash
vercel deploy
```

Contributions are welcome, especially around crawl accuracy, graph analysis, UX polish, and export workflows.
- Fork the repository and create a focused feature branch.
- Install dependencies with `npm install`.
- Run the app locally with `npm run dev`.
- Validate production readiness with `npm run build`.
- Open a pull request with a concise summary of the problem solved and the approach taken.