Crawls the specified URLs, extracts records from them, and adds them to the index.
If a crawl is currently running (the crawler’s reindexing property is true),
the records are added to a temporary index.
This operation is rate-limited to 500 requests every 24 hours.
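For illustration, here is a minimal sketch of calling this endpoint from Python with the requests library. The base URL and endpoint path are assumptions based on common Algolia Crawler API conventions, not confirmed by this reference, and the credentials are placeholders:

```python
import requests

CRAWLER_ID = "e0f6db8a-24f5-4092-83a4-1b2c6cb6d809"  # your crawler's UUID
BASE_URL = "https://crawler.algolia.com/api/1"  # assumed base URL

response = requests.post(
    f"{BASE_URL}/crawlers/{CRAWLER_ID}/urls/crawl",  # assumed endpoint path
    json={"urls": ["https://www.algolia.com/products/crawler/"]},
    auth=("username", "password"),  # sent as a Basic authentication header
    timeout=30,
)
response.raise_for_status()
print(response.json())
```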
Basic authentication header of the form Basic <encoded-value>, where <encoded-value> is the base64-encoded string username:password.
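For example, the header value can be constructed by hand with standard base64 encoding; a short sketch with placeholder credentials:

```python
import base64

credentials = "username:password"  # placeholder username and password
encoded = base64.b64encode(credentials.encode()).decode()
headers = {"Authorization": f"Basic {encoded}"}
# -> {"Authorization": "Basic dXNlcm5hbWU6cGFzc3dvcmQ="}
```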
Crawler ID: the universally unique identifier (UUID) of the crawler.
"e0f6db8a-24f5-4092-83a4-1b2c6cb6d809"
URLs to crawl.
["https://www.algolia.com/products/crawler/"]Whether the specified URLs should be added to the extraURLs property of the crawler configuration.
If unspecified, the URLs are added to the extraUrls field only if they haven't been indexed during the last reindex.
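A request body combining the URLs with this flag might look as follows; the field name `save` is an assumption for illustration and is not confirmed by this reference:

```python
# "save" is an assumed field name for the flag described above.
body = {
    "urls": ["https://www.algolia.com/products/crawler/"],
    "save": True,  # also add these URLs to the configuration's extraUrls
}
```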
OK
Universally unique identifier (UUID) of the task.
"98458796-b7bb-4703-8b1b-785c1080b110"