
Records exceed the maximum for your Algolia plan

Your Algolia plan limits the number and size of the records you can import. If you exceed these limits, the Crawler generates one or more of these error messages:
  • Algolia error: Record too big
  • Algolia's record quota exceeded
  • Extractors returned too many records
  • Records extracted are too big

Solution

Reduce the number or size of the records the crawler extracts (for example, with the helpers.splitContentIntoRecords helper), or upgrade your Algolia plan.
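As a sketch of the first option, a crawler action can use helpers.splitContentIntoRecords to split a long page into several smaller records. The index name, selectors, and the 10 KB cap below are illustrative values, not required settings:

```javascript
// Hypothetical action from a crawler configuration: splits long pages
// into several records so each stays under the plan's record-size limit.
{
  indexName: "my_index",
  pathsToMatch: ["https://example.com/**"],
  recordExtractor: ({ url, $, helpers }) =>
    helpers.splitContentIntoRecords({
      // Elements whose text is distributed across the generated records
      $elements: $("article p"),
      // Attributes copied into every generated record
      baseRecord: { url: url.href, title: $("title").text() },
      // Approximate maximum size of each record, in bytes
      maxRecordBytes: 10000,
    }),
}
```

Splitting also tends to improve relevance, since each record stays focused on one section of the page.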

Data isn’t sent to Algolia

If you notice that some information isn’t showing up as it should, first check that your data extraction actions are set up correctly.
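One way to check is to read through the recordExtractor of the relevant action and confirm each selector actually matches something on the page. A minimal sketch (the selectors and attribute names are examples, not required names):

```javascript
// Hypothetical recordExtractor for sanity-checking extraction.
recordExtractor: ({ url, $ }) => {
  const title = $("head > title").text();
  // If a selector matches nothing, text()/attr() silently return an
  // empty value, and the record is indexed without the data you expect.
  const description = $('meta[name="description"]').attr("content") || "";
  return [{ url: url.href, title, description }];
}
```

Running a test crawl on a single URL and inspecting the extracted records is a quick way to spot selectors that return empty values.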

Algolia access issues

Some indexing issues may be due to your Algolia permissions.

Solution

  • Ensure that the appId in the crawler configuration matches the ID of the Algolia application that has the Crawler add-on.
  • Ensure that the apiKey in the crawler configuration has the necessary ACL and no additional index restrictions that would prevent it from accessing the crawler indices.
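To inspect a key's ACL and index restrictions, you can fetch it with the algoliasearch JavaScript client (v4); the credentials below are placeholders you would replace with your own:

```javascript
// Sketch: look up an API key's permissions. Requires admin credentials.
const algoliasearch = require("algoliasearch");

const client = algoliasearch("YOUR_APP_ID", "YOUR_ADMIN_API_KEY");

client.getApiKey("THE_CRAWLER_API_KEY").then((key) => {
  // The ACL must include the operations the crawler performs
  console.log("acl:", key.acl);
  // If `indexes` is set, it must cover the crawler's index names
  console.log("index restrictions:", key.indexes);
});
```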