The Crawler Logs feature allows you to monitor and debug your crawler’s activity by recording detailed logs for each crawl run. Logs are useful for troubleshooting crawl issues, verifying site coverage, and monitoring crawler performance over time.
Basic authentication header of the form Basic <encoded-value>, where <encoded-value> is the base64-encoded string username:password.
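For reference, here is a minimal sketch of how such a header can be built in Node.js with TypeScript; the username and password values are placeholders for your own credentials.

```typescript
// Build a Basic authentication header from placeholder credentials.
const username = "YOUR_USERNAME"; // placeholder
const password = "YOUR_PASSWORD"; // placeholder

// Base64-encode "username:password" and prefix it with "Basic ".
const encoded = Buffer.from(`${username}:${password}`).toString("base64");
const authorizationHeader = `Basic ${encoded}`;
```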
Crawler ID. Universally unique identifier (UUID) of the crawler. Example: e0f6db8a-24f5-4092-83a4-1b2c6cb6d809
Date 'from' filter, as a Unix timestamp. Example: 1762264044
Date 'until' filter, as a Unix timestamp. Example: 1762264044
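The date filters expect Unix timestamps in seconds. A quick sketch of producing them from JavaScript dates, using arbitrary example dates:

```typescript
// Convert dates to Unix timestamps (seconds) for the 'from' and 'until' filters.
const from = Math.floor(new Date("2025-11-01T00:00:00Z").getTime() / 1000);
const until = Math.floor(Date.now() / 1000); // current time
```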
Crawled URL status to filter by: DONE, SKIPPED, or FAILED.
For more information, see Troubleshooting by crawl status.
Limit of the query results.
Allowed range: 1 <= x <= 1000. Example: 10
Offset of the query results. Example: 11
Order of the query results: ASC or DESC.
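Putting the filters together, here is a sketch of one way to request crawler logs, reusing the authorizationHeader, from, and until values from the snippets above. The endpoint URL and query parameter names (from, until, status, limit, offset, order) are assumptions for illustration only; check the API reference for the exact request format.

```typescript
// Sketch only: the endpoint path and parameter names are assumptions,
// not the documented API surface.
const crawlerId = "e0f6db8a-24f5-4092-83a4-1b2c6cb6d809"; // example crawler UUID

const params = new URLSearchParams({
  from: String(from),   // Unix timestamp 'from' filter
  until: String(until), // Unix timestamp 'until' filter
  status: "FAILED",     // DONE, SKIPPED, or FAILED
  limit: "10",          // number of results per page
  offset: "0",          // pagination offset
  order: "DESC",        // ASC or DESC
});

// Hypothetical base URL; substitute the real Crawler API host and path.
const response = await fetch(
  `https://crawler.example.com/crawlers/${crawlerId}/logs?${params}`,
  { headers: { Authorization: authorizationHeader } }
);
const logs = await response.json();
console.log(logs);
```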