Crawler logs help you troubleshoot crawl issues, verify site coverage, and monitor crawler performance over time. When enabled, the crawler records detailed logs for every crawl.

Enable logs

  1. Go to the Algolia dashboard and select your Algolia application.
  2. On the left sidebar, select Data sources > Crawler.
  3. Select the Crawler you want to configure.
  4. Go to Setup > Configuration and open the Security & Data Management tab.
  5. Set the Crawler Logs option to Enabled and click Save.
After you enable logs, the crawler starts generating logs for new crawl runs. Logs aren't available for crawls that finished before you enabled them.

View and download logs

After your next crawl completes, view or download its logs in the Logs Explorer.
  1. In the sidebar, select Status > Logs Explorer.
  2. Each row represents a crawl run. For each crawl, you can view:
    • URLs Crawled – total number of pages successfully processed.
    • URLs Ignored – pages skipped due to configuration or filters.
    • URLs Failed – pages that returned errors or couldn't be accessed.
    • Crawl Duration – how long the crawl took.
    • Log Size – file size of the generated log.
  3. Click the Download icon to export the full crawl log.
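
Once you've downloaded a crawl log, you can process it with a short script, for example to tally URL outcomes and cross-check the counts shown in the Logs Explorer. The sketch below assumes a JSON Lines log with one record per URL and a `status` field; the actual log format isn't specified here, so treat the schema, field names, and sample records as hypothetical.

```python
import json
from collections import Counter

def summarize_crawl_log(lines):
    """Tally per-status URL counts from a crawl log.

    Assumes (hypothetically) a JSON Lines format: one object per URL
    with a "status" field such as "crawled", "ignored", or "failed".
    """
    counts = Counter()
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        record = json.loads(line)
        counts[record.get("status", "unknown")] += 1
    return dict(counts)

# Hypothetical sample records standing in for a downloaded log file.
sample = [
    '{"url": "https://example.com/", "status": "crawled"}',
    '{"url": "https://example.com/private", "status": "ignored"}',
    '{"url": "https://example.com/broken", "status": "failed"}',
    '{"url": "https://example.com/docs", "status": "crawled"}',
]

print(summarize_crawl_log(sample))
```

To run it against a real download, replace `sample` with `open("crawl.log")` (adjusting the parsing if the actual format differs).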