algolia crawler crawl <crawler_id> --urls <url>... [flags]
Examples
Crawl the URLs `https://www.example.com` and `https://www.example2.com/` for the crawler with the ID `my-crawler`:
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/
Crawl the URLs `https://www.example.com` and `https://www.example2.com/` for the crawler with the ID `my-crawler` and save them in the configuration:
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/ --save
Crawl the URLs `https://www.example.com` and `https://www.example2.com/` for the crawler with the ID `my-crawler` and don't save them in the configuration:
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/ --save=false
Flags
--save
When true, the URLs are added to your extraUrls (unless present in startUrls or sitemaps).
When false, the URLs aren't added.
When unspecified, the URLs are added to your extraUrls (unless present in startUrls or sitemaps, or they weren't indexed during the preceding reindex).
--urls
The URLs to crawl (maximum 50).
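The three `--save` cases above can be sketched as a small decision helper. This is a hypothetical illustration of the documented rule, not part of the CLI; the `start_urls`, `sitemaps`, and `previously_indexed` inputs are assumptions standing in for the crawler's configuration state:

```python
def urls_added_to_extra_urls(urls, start_urls, sitemaps, save=None, previously_indexed=None):
    """Return which crawled URLs would be added to extraUrls.

    save=True    -> add, unless present in startUrls or sitemaps.
    save=False   -> add nothing.
    save=None    -> like True, but also skip URLs that weren't indexed
                    during the preceding reindex (the "unspecified" case).
    """
    if save is False:
        return []
    added = []
    for url in urls:
        if url in start_urls or url in sitemaps:
            continue  # already covered by startUrls or sitemaps
        if save is None and previously_indexed is not None and url not in previously_indexed:
            continue  # unspecified --save: URL wasn't indexed in the preceding reindex
        added.append(url)
    return added
```

For example, with `save=True` a URL already listed in `startUrls` is skipped, while with `save` unspecified a URL absent from the previous reindex is skipped as well.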