CLI

Installation

```sh
cargo install spidermedic
```

Usage

```sh
spidermedic --url=<URL> [OPTIONS]
```

Flags

| Flag | Default | Description |
| --- | --- | --- |
| `--url` | (required) | Starting URL to crawl |
| `--path` | `/` | Restrict crawl to this URL path prefix |
| `--port` | `80` | Port override |
| `--interval` | `300` | Milliseconds between requests |
| `--concurrency` | `10` | Max parallel in-flight requests |
| `--max-depth` | `0` | Max depth (`0` = unlimited) |
| `--output` | `terminal` | Output format: `terminal`, `json`, or `csv` |

Examples

```sh
# Crawl a site with default settings
spidermedic --url=https://example.com

# Faster crawl with more concurrency, limited to 3 levels deep
spidermedic --url=https://example.com --concurrency=20 --max-depth=3

# Only crawl the /docs section
spidermedic --url=https://example.com/docs --path=/docs

# JSON output piped to jq
spidermedic --url=https://example.com --output=json | jq '.[] | select(.status >= 400)'

# CSV output saved to a file
spidermedic --url=https://example.com --output=csv > results.csv
```
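A saved CSV report can be post-processed with standard Unix tools. A minimal sketch, assuming a `url,status` header row with the status code in the second column (the actual column layout your spidermedic version emits may differ):

```sh
# Sample report standing in for real spidermedic output; the
# url,status layout is an assumption, not confirmed CLI output.
cat > results.csv <<'EOF'
url,status
https://example.com/,200
https://example.com/missing,404
https://example.com/broken,500
EOF

# Skip the header, count rows whose status field is 400 or above.
awk -F, 'NR > 1 && $2 >= 400 { n++ } END { print n+0 }' results.csv
```

The `n+0` forces a numeric `0` when no error rows were seen, so the count is always printed.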

Exit codes

| Code | Meaning |
| --- | --- |
| `0` | All crawled pages returned non-error responses |
| `1` | One or more 4xx / 5xx responses were found |

Use the exit code in shell scripts to gate deployments:

```sh
spidermedic --url=https://staging.example.com && deploy_to_production.sh
```
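When the pipeline needs to log or clean up before aborting, the exit code can be handled explicitly rather than through a bare `&&` chain. A sketch; spidermedic is stubbed with `false` here to simulate a crawl that exited with code 1 (errors found):

```sh
# Stub for demonstration only; in a real pipeline this would be:
#   crawl() { spidermedic --url=https://staging.example.com; }
crawl() { false; }

# Branch on the crawl's exit code (0 = clean, 1 = 4xx/5xx found).
if crawl; then
  result="deploy"
  echo "crawl clean: deploying"
  # ./deploy_to_production.sh goes here
else
  result="skip"
  echo "crawl found 4xx/5xx responses: skipping deploy" >&2
fi
```

Because the gate is an `if`, the script can report the failure (or notify CI) instead of silently stopping at the failed command.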