Bad link report: fixing broken links and redirected URLs #2407
Conversation
Signed-off-by: Lynette Miles <[email protected]>
Caution: Review failed. The pull request is closed.

ℹ️ Recent review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
📝 Walkthrough
This PR updates and normalizes external documentation links and formatting across many markdown files—adjusting domains, trailing slashes, GitHub repo references, and some table formatting; no code, API, or control-flow changes were made.

Sequence Diagram(s): omitted — changes are documentation-only and do not introduce new multi-component control flow.

Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~12 minutes
🚥 Pre-merge checks: ✅ Passed checks (3 passed)

✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
@esmerel caught red handed by the linter!!!! :-P
Signed-off-by: Lynette Miles <[email protected]>
in my defense, that error was there before and if folks don't know what a CVE is... they probably aren't using Fluent Bit :D
Actionable comments posted: 5
🧹 Nitpick comments (2)
pipeline/outputs/slack.md (1)
5-5: Optional wording trim for readability. Line 5 can be tightened by replacing "in conjunction with" with "with" to keep the sentence more direct.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@pipeline/outputs/slack.md` at line 5, Replace the phrase "in conjunction with the Stream Processor" with "with the Stream Processor" in the sentence that currently reads "Using this plugin in conjunction with the Stream Processor is a good combination for alerting." so it becomes more direct and reads "Using this plugin with the Stream Processor is a good combination for alerting."

pipeline/outputs/stackdriver.md (1)

258-258: Consider standardizing Google Cloud domain usage. Line 258 uses both `cloud.google.com` (for log-based metrics) and `docs.cloud.google.com` in the same section. While both may be valid, consider whether all documentation references should consistently use `docs.cloud.google.com` to minimize redirect chains.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@pipeline/outputs/stackdriver.md` at line 258, The section referencing "Log Router rules" and "log-based metrics" uses two different Google domains (docs.cloud.google.com and cloud.google.com); update the `log-based metrics` link to use the docs.cloud.google.com domain (matching the "Log Router rules" link) so both references consistently use docs.cloud.google.com and avoid unnecessary redirects—locate the links in the paragraph that mentions "integration with Log Router rules" and "controlling cardinality of log based metrics" and replace the cloud.google.com URL with its docs.cloud.google.com equivalent.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@pipeline/outputs/dynatrace.md`:
- Line 86: Rename the Markdown link label that currently reads "[Blog: Easily
stream logs from Fluent Bit to Dynatrace]" to a neutral technical label like
"Documentation: Easily stream logs from Fluent Bit to Dynatrace" so it correctly
reflects that the target URL points to product documentation; update the link
text in the same line (the visible label portion before the URL) to remove the
"Blog" prefix and replace with "Documentation" (or "Docs") to keep the content
technical and non-promotional.
In `@pipeline/outputs/prometheus-remote-write.md`:
- Line 239: Replace the promotional Levitate URL in the markdown link "With
[Levitate](https://last9.io/)" so it points to the official Prometheus
integration docs; update the link target to
https://docs.last9.io/docs/integrations-prometheus/ while keeping the link text
"Levitate" and the surrounding sentence intact.
In `@pipeline/outputs/stackdriver.md`:
- Around line 272-273: The troubleshooting section currently displays clickable
links that point to documentation instead of the actual API endpoints; update
the two link entries for "https://www.googleapis.com" and
"https://logging.googleapis.com" so that their hrefs point to the actual
endpoints (https://www.googleapis.com and https://logging.googleapis.com) or
alternatively change the link text to indicate they are documentation (e.g.,
"Docs: https://cloud.google.com/apis" and "Docs:
https://docs.cloud.google.com/logging/docs/apis") and add a clarifying sentence
telling users to whitelist the real endpoints (https://www.googleapis.com and
https://logging.googleapis.com) in their firewall rules.
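Following that suggestion, the troubleshooting section could read something like this sketch (illustrative wording only, not the actual committed text):

```markdown
Make sure the following endpoints are reachable, and whitelist them in your
firewall rules:

- `https://www.googleapis.com`
- `https://logging.googleapis.com`

Docs: [Google APIs](https://cloud.google.com/apis) and
[Cloud Logging API](https://docs.cloud.google.com/logging/docs/apis)
```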
In `@pipeline/outputs/treasure-data.md`:
- Line 11: The Treasure Data console URL in the API key description is wrong;
update the URL in the table cell that mentions `API` (the line containing "`API`
| The Treasure Data API key... | _none_") to use the correct US console URL
`https://console.treasuredata.com/users/sign_in` instead of
`https://console.us01.treasuredata.com/users/sign_in`.
In `@pipeline/outputs/vivo-exporter.md`:
- Line 3: In the Vivo Exporter README intro sentence ("Vivo Exporter is an
output plugin... [Vivo project](https://github.com/chronosphereio/calyptia-vivo)
."), remove the stray space before the period so the link is immediately
followed by the period, and adjust the wording to keep a neutral, technical tone
(e.g., retain "is an output plugin that exposes logs, metrics, and traces
through an HTTP endpoint" without promotional phrasing); update the sentence in
the file content for "Vivo Exporter" accordingly.
---
Nitpick comments:
In `@pipeline/outputs/slack.md`:
- Line 5: Replace the phrase "in conjunction with the Stream Processor" with
"with the Stream Processor" in the sentence that currently reads "Using this
plugin in conjunction with the Stream Processor is a good combination for
alerting." so it becomes more direct and reads "Using this plugin with the
Stream Processor is a good combination for alerting."
In `@pipeline/outputs/stackdriver.md`:
- Line 258: The section referencing "Log Router rules" and "log-based metrics"
uses two different Google domains (docs.cloud.google.com and cloud.google.com);
update the `log-based metrics` link to use the docs.cloud.google.com domain
(matching the "Log Router rules" link) so both references consistently use
docs.cloud.google.com and avoid unnecessary redirects—locate the links in the
paragraph that mentions "integration with Log Router rules" and "controlling
cardinality of log based metrics" and replace the cloud.google.com URL with its
docs.cloud.google.com equivalent.
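A domain-normalization pass like the one suggested above can be scripted. The following is a hedged sketch (the `normalize_google_links` helper and the target domain come from the review comment, not from the repo's actual tooling) that rewrites `cloud.google.com` markdown link targets to `docs.cloud.google.com`:

```python
import re

# Hypothetical helper based on the review suggestion: rewrite markdown
# link targets on cloud.google.com to docs.cloud.google.com so all
# Google Cloud references use a single domain. Links already on
# docs.cloud.google.com are left untouched.
def normalize_google_links(markdown: str) -> str:
    return re.sub(
        r"\(https://cloud\.google\.com/",  # matches the link-target opening
        "(https://docs.cloud.google.com/",
        markdown,
    )

text = "[log-based metrics](https://cloud.google.com/logging/docs/logs-based-metrics)"
print(normalize_google_links(text))
```

A pass like this would still need manual review, since not every `cloud.google.com` URL has a `docs.` equivalent.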
ℹ️ Review info
Configuration used: defaults
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (34)
- `CONTRIBUTING.md`
- `README.md`
- `about/fluentd-and-fluent-bit.md`
- `administration/monitoring.md`
- `installation/downloads.md`
- `pipeline/filters/aws-metadata.md`
- `pipeline/filters/kubernetes.md`
- `pipeline/inputs/cpu-metrics.md`
- `pipeline/inputs/kafka.md`
- `pipeline/inputs/windows-event-log-winevtlog.md`
- `pipeline/outputs/bigquery.md`
- `pipeline/outputs/chronicle.md`
- `pipeline/outputs/cloudwatch.md`
- `pipeline/outputs/datadog.md`
- `pipeline/outputs/dynatrace.md`
- `pipeline/outputs/elasticsearch.md`
- `pipeline/outputs/firehose.md`
- `pipeline/outputs/gelf.md`
- `pipeline/outputs/influxdb.md`
- `pipeline/outputs/kafka.md`
- `pipeline/outputs/loki.md`
- `pipeline/outputs/observe.md`
- `pipeline/outputs/openobserve.md`
- `pipeline/outputs/parseable.md`
- `pipeline/outputs/prometheus-remote-write.md`
- `pipeline/outputs/s3.md`
- `pipeline/outputs/slack.md`
- `pipeline/outputs/stackdriver.md`
- `pipeline/outputs/stackdriver_special_fields.md`
- `pipeline/outputs/tcp-and-tls.md`
- `pipeline/outputs/treasure-data.md`
- `pipeline/outputs/vivo-exporter.md`
- `pipeline/parsers/configuring-parser.md`
- `pipeline/parsers/regular-expression.md`
### Levitate

```diff
- With [Levitate](https://last9.io/levitate-tsdb), you must use the Levitate cluster-specific write URL and specify the HTTP username and password for the token created for your Levitate cluster.
+ With [Levitate](https://last9.io/), you must use the Levitate cluster-specific write URL and specify the HTTP username and password for the token created for your Levitate cluster.
```
🧩 Analysis chain
🌐 Web query:
Last9 Levitate Prometheus remote_write documentation setup
💡 Result:
To set up Prometheus remote_write to Last9 Levitate (TSDB), configure Prometheus to send samples to Last9’s remote-write endpoint and restart Prometheus.
1) Get your Last9 remote_write URL
A commonly documented URL template for Last9 is:
https://<user>:<password>@read-app-tsdb.last9.io/hot/v1/metrics/<PID>/sender/<account> [1]
(You’ll replace <PID>, <account>, and credentials with values from your Last9 workspace/org.)
2) Add remote_write to prometheus.yml
Use either inline user/pass in the URL (as above) or (recommended) basic_auth:
```yaml
remote_write:
  - name: last9-levitate
    url: https://read-app-tsdb.last9.io/hot/v1/metrics/<PID>/sender/<account>
    basic_auth:
      username: <user>
      password: <password>
```

Prometheus supports `remote_write.url`, `basic_auth`, custom headers, `write_relabel_configs`, and other settings in this stanza. [2]
3) Restart Prometheus and verify
After restart, verify remote-write health via Prometheus logs and/or remote-write metrics (and tune queue_config if you’re back-pressuring or dropping data). [3]
Sources: Last9 endpoint template [1], Prometheus remote_write config reference [2], Prometheus remote_write behavior/tuning guidance [3].
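Since the page under review is Fluent Bit's `prometheus_remote_write` output documentation, the equivalent of that Prometheus stanza in Fluent Bit would be an `[OUTPUT]` section. A minimal sketch, assuming the Last9 endpoint template above (the host, URI path, and credentials are placeholders, not verified values):

```
[OUTPUT]
    name        prometheus_remote_write
    match       *
    host        read-app-tsdb.last9.io
    port        443
    uri         /hot/v1/metrics/<PID>/sender/<account>
    tls         on
    http_user   <user>
    http_passwd <password>
```

Here `http_user`/`http_passwd` carry the basic-auth credentials that the Prometheus example passes via `basic_auth`.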
🌐 Web query:
Last9 official documentation portal Levitate TSDB docs
💡 Result:
Official Last9 (Levitate TSDB) documentation portal and relevant Levitate/TSDB docs:
https://docs.last9.io/docs/
https://docs.last9.io/docs/integrations-prometheus/ (Prometheus remote_write → Last9/Levitate)
https://docs.last9.io/docs/change-events (Change Events, often used alongside Levitate)
https://last9.io/levitate-tsdb/ (Levitate TSDB product page)
[1] (docs.last9.io)
[2] (docs.last9.io)
[3] (last9.io)
Citations:
- 1: https://docs.last9.io/docs/integrations-prometheus/
- 2: https://docs.last9.io/docs/change-events
- 3: https://last9.io/levitate-tsdb/?utm_source=openai
Update Levitate link to official documentation.
Line 239 currently links to the company homepage (https://last9.io/), which is promotional rather than technical. Replace it with the official Prometheus integration documentation: https://docs.last9.io/docs/integrations-prometheus/
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@pipeline/outputs/prometheus-remote-write.md` at line 239, Replace the
promotional Levitate URL in the markdown link "With
[Levitate](https://last9.io/)" so it points to the official Prometheus
integration docs; update the link target to
https://docs.last9.io/docs/integrations-prometheus/ while keeping the link text
"Levitate" and the surrounding sentence intact.
Signed-off-by: Lynette Miles <[email protected]>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Signed-off-by: Lynette Miles <[email protected]>
This PR fixes most of the issues in #2403 by correcting the broken links (aside from what @alexakreizinger fixed in #2406). It also updates a whole bunch of links which were going through up to four (!!) redirects (mostly due to one company).
Next month's link report should look much cleaner.
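The first step behind a link report like #2403 can be sketched as a small script. This is an illustrative sketch only (the function name and shape are assumptions, not the actual report tooling): it extracts every `[text](url)` pair from a markdown document so each URL can then be checked for 404s and redirect chains.

```python
import re

# Illustrative markdown link extractor: find [text](url) pairs so the
# URLs can later be probed for broken links and redirect chains.
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")

def extract_links(markdown: str) -> list[tuple[str, str]]:
    """Return (link text, URL) pairs for every inline markdown link."""
    return LINK_RE.findall(markdown)

doc = "See [Levitate](https://last9.io/) and [docs](https://docs.last9.io/docs/)."
print(extract_links(doc))
```

A real checker would then issue HEAD requests for each URL and flag non-200 responses and multi-hop redirects; that part is omitted here since it needs network access.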