
Conversation

@samcunliffe
Member

It might mask problems, but it's a balance.
When several builds run in quick succession, I've noticed a few pages return 429 (Too Many Requests).

@samcunliffe samcunliffe requested a review from Copilot December 17, 2025 21:25
@samcunliffe samcunliffe added the housekeeping Tidying up, fixing typos, and general maintenance label Dec 17, 2025
Contributor

Copilot AI left a comment


Pull request overview

This PR modifies the lychee link checker configuration in the GitHub Actions workflow to accept HTTP 429 (Too Many Requests) responses as valid, in addition to 200 (OK) responses. The author notes this is a trade-off to prevent build failures during periods of high activity when some pages return 429 errors.

Key Changes:

  • Added --accept 200,429 flag to the lychee link checker arguments to treat rate-limited responses as successful


  # nice to check a few specific root-level markdown files (README.md etc) for valid links,
  # but the main thing to check is the rendered website (in the `open-source` directory)
- args: "--no-progress --verbose --github-token ${{ secrets.GITHUB_TOKEN }} -- README.md CONTRIBUTING.md open-source/"
+ args: "--accept 200,429 --no-progress --verbose --github-token ${{ secrets.GITHUB_TOKEN }} -- README.md CONTRIBUTING.md open-source/"
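For context, the changed `args` line sits inside a lychee-action step in the workflow. A sketch of how the full step might look (the step name and action version are assumptions; only the `args` value comes from this diff):

```yaml
# Sketch only: step name and lycheeverse/lychee-action version are assumed,
# not taken from this repository's workflow file.
- name: Check links
  uses: lycheeverse/lychee-action@v2
  with:
    # nice to check a few specific root-level markdown files (README.md etc) for valid links,
    # but the main thing to check is the rendered website (in the `open-source` directory)
    args: "--accept 200,429 --no-progress --verbose --github-token ${{ secrets.GITHUB_TOKEN }} -- README.md CONTRIBUTING.md open-source/"
```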

Copilot AI Dec 17, 2025


Accepting 429 (Too Many Requests) as a valid response defeats the purpose of link checking. A 429 status indicates rate limiting, not that the link is valid. This approach masks cases where links are actually broken or the site is down; the check passes, so you never find out.

Consider these alternatives instead:

  1. Use the lychee retry functionality with appropriate delays (e.g., --max-retries 3 --retry-wait-time 30) to handle temporary rate limiting
  2. Add specific URLs that are known to rate limit to the .lycheeignore file if they are persistent false positives
  3. Use the --max-concurrency flag to reduce the number of concurrent requests and avoid triggering rate limits
  4. Implement conditional logic to only run link checking on main branch pushes (not PRs) to reduce frequency

These approaches address rate limiting without compromising the integrity of the link validation.
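The retry-based alternative (option 1, combined with the concurrency cap from option 3) could look like this in the same workflow, keeping the existing arguments. The flags are real lychee options, but the specific values here are illustrative, not tested recommendations:

```yaml
# Sketch: retry and throttle instead of accepting 429.
# --max-retries, --retry-wait-time, and --max-concurrency are lychee flags;
# the values 3, 30, and 8 are illustrative assumptions only.
args: "--max-retries 3 --retry-wait-time 30 --max-concurrency 8 --no-progress --verbose --github-token ${{ secrets.GITHUB_TOKEN }} -- README.md CONTRIBUTING.md open-source/"
```

With this shape, a genuinely broken link still fails the check after the retries are exhausted, while transient rate limiting gets a chance to clear.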



2 participants