
Potential suggestions
Suggestion | Solution |
---|---|
Automatic crawl schedule is not set | To keep your index up to date, schedule automatic crawls |
Robots.txt file is missing or Algolia Crawler is disallowed | The crawler encountered issues with your site's robots.txt file: either the file is missing, or it doesn't allow the Algolia Crawler. Add a robots.txt file, or update it to allow the crawler |
Sitemap not found | Ensure efficient crawling by adding sitemaps to the crawler configuration |
Some HTML pages or documents are too big | Some extracted records exceed the maximum record size for your Algolia plan. Reduce the page size or split large pages into smaller records |
URLs ignored or failed | Review the crawl status to find out why these URLs were ignored or failed |
Website HTML contains a `<meta name="robots" content="NOINDEX,NOFOLLOW">` tag | Remove these meta tags from the pages, or ignore them in the crawler configuration |
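Several of the suggestions above map to crawler configuration options. The sketch below shows roughly what that could look like; the option names (`schedule`, `sitemaps`, `ignoreRobotsTxtRules`, `ignoreNoIndex`, `ignoreNoFollowTo`) and the exact schedule syntax are assumptions about the Algolia Crawler configuration format, so verify them against the configuration editor in your dashboard before using them.

```javascript
// Hedged sketch of a crawler configuration addressing the suggestions
// above. All option names are assumptions; placeholder credentials and
// URLs must be replaced with your own.
new Crawler({
  appId: 'YOUR_APP_ID',   // placeholder
  apiKey: 'YOUR_API_KEY', // placeholder
  startUrls: ['https://www.example.com'],

  // "Automatic crawl schedule is not set": run crawls on a schedule
  // so the index stays up to date.
  schedule: 'every weekday at 3:00 pm',

  // "Sitemap not found": list sitemaps explicitly so the crawler can
  // discover pages efficiently.
  sitemaps: ['https://www.example.com/sitemap.xml'],

  // "Robots.txt" and "NOINDEX,NOFOLLOW" suggestions: if you can't
  // change the pages themselves, these flags tell the crawler to
  // ignore robots.txt rules and robots meta tags.
  ignoreRobotsTxtRules: true,
  ignoreNoIndex: true,
  ignoreNoFollowTo: true,

  actions: [
    /* ... extraction actions ... */
  ],
});
```

Prefer fixing the site itself (adding a sitemap, allowing the crawler in robots.txt, removing stray robots meta tags) over the `ignore*` overrides, which silence the suggestions without addressing their cause.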