spidermedic

Crawl your site and catch broken links before they reach production.

What is spidermedic?

spidermedic is a website crawler that validates every page it finds. Give it a URL and it walks the entire site, reporting any page that returns a 4xx or 5xx status code. It exits with code 1 if errors are found — making it a natural fit for CI pipelines.
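Because failures surface through the exit code, a deploy script can gate on the crawl with ordinary shell control flow. A minimal sketch (the staging URL is a placeholder; only the `--url` flag is confirmed on this page):

```shell
# Run the crawl; a non-zero exit code means broken links were found.
if spidermedic --url=https://staging.example.com; then
  echo "No broken links, promoting to production"
else
  echo "Broken links found, aborting deploy" >&2
  exit 1
fi
```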

GitHub Action

Drop one step into any workflow and automatically verify your site after every deploy.

CLI tool

Run spidermedic locally during development or integrate it into any shell script.

Multiple output formats

Results in your terminal, as JSON, or as CSV — pipe them wherever they need to go.
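For example, machine-readable output can be redirected or piped to other tools. The `--format` flag name below is an assumption based on the formats listed above; check `spidermedic --help` for the exact spelling.

```shell
# Hypothetical flag spelling: emit JSON for later processing.
spidermedic --url=https://example.com --format=json > report.json

# Or stream CSV straight into a spreadsheet-friendly file.
spidermedic --url=https://example.com --format=csv > broken-links.csv
```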

Fast & configurable

Tune concurrency, request interval, crawl depth, and path prefix to match your site.
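A sketch of a tuned invocation covering each knob named above. Every flag except `--url` is an illustrative assumption, not a spelling confirmed by this page:

```shell
# Assumed flag names for concurrency, request interval, depth, and prefix.
spidermedic \
  --url=https://example.com \
  --concurrency=8 \
  --interval=250ms \
  --depth=3 \
  --prefix=/docs
```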

Quick start

As a GitHub Action

.github/workflows/check-links.yml
name: Link checker
on:
  push:
    branches: [main]
jobs:
  check-links:
    runs-on: ubuntu-latest
    steps:
      - uses: beekmanlabs/spidermedic@main
        with:
          url: 'https://example.com'

As a CLI

cargo install spidermedic
spidermedic --url=https://example.com