Imagine you need to index a million records and you send each one with its own call to the saveObjects method.
The problem is that you would perform a million individual network calls,
which would take way too long and saturate your Algolia cluster with indexing jobs.
A better approach is to split your collection of records into smaller chunks, then send each chunk one by one.
For optimal indexing performance, aim for a batch size of about 10 MB,
representing between 1,000 and 10,000 records, depending on the average record size.
- Batching records doesn’t reduce your operations count. Algolia counts indexing operations per record, not per method call, so from a pricing perspective, batching records is the same as indexing records individually.
- Be careful when approaching your plan’s maximum number of records. If you’re close to the record limit, batch operations may fail. The error message “You have exceeded your Record quota” means the engine doesn’t know if the batch operation will update records or add new ones. If this happens, upgrade to a plan with a higher record limit or reduce your batch size.
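With those caveats in mind, here's a minimal sketch of batching on the client side, assuming the JavaScript API client (v4). The application ID, API key, index name, and the 10,000-record chunk size are placeholders to adapt to your own setup.

```ts
import algoliasearch from "algoliasearch";

// Placeholder credentials and index name: replace with your own.
const client = algoliasearch("YourApplicationID", "YourWriteAPIKey");
const index = client.initIndex("your_index_name");

// Split a large array of records into chunks of `size` records each.
function chunk<T>(records: T[], size: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < records.length; i += size) {
    chunks.push(records.slice(i, i + size));
  }
  return chunks;
}

// Send the records chunk by chunk instead of all at once.
async function indexInBatches(records: Record<string, unknown>[]): Promise<void> {
  for (const batch of chunk(records, 10_000)) {
    await index.saveObjects(batch, { autoGenerateObjectIDIfNotExist: true });
  }
}
```

Sending the chunks sequentially keeps memory usage predictable and avoids flooding your cluster with parallel indexing jobs.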
Using the API
When using the saveObjects method, the API client automatically chunks your records into batches of 1,000 objects.
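In other words, you can pass the full array and let the client handle the chunking. A minimal sketch, again assuming the v4 JavaScript client and placeholder credentials:

```ts
import algoliasearch from "algoliasearch";

// Placeholder credentials and index name: replace with your own.
const client = algoliasearch("YourApplicationID", "YourWriteAPIKey");
const index = client.initIndex("your_index_name");

// Example records: the client splits them into batches of 1,000 objects
// behind the scenes, so a single saveObjects call is enough.
const records = [
  { objectID: "1", name: "First record" },
  { objectID: "2", name: "Second record" },
  // ...the rest of your records
];

index
  .saveObjects(records)
  .then(({ objectIDs }) => console.log(`Indexed ${objectIDs.length} records`))
  .catch(console.error);
```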
Using the dashboard
You can also add your records from the Algolia dashboard.
Add records manually
- Go to the Algolia dashboard and select your Algolia application.
- On the left sidebar, select Search.
- Select your Algolia index.
- Open the Add records menu and select Add manually.
- Paste your records in the JSON editor and click Save.
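The editor accepts a JSON object or an array of objects. A hypothetical example of what you might paste (you can also omit objectID and let Algolia generate one):

```json
[
  { "objectID": "1", "name": "First record" },
  { "objectID": "2", "name": "Second record" }
]
```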
Upload a file
- Go to your dashboard and select your index.
- Click the Add records tab and select Upload file.
- Select the file you want to upload and click Upload.