Usage
algolia crawler crawl <crawler_id> --urls <url>... [flags]

Examples

Crawl the URLs “https://www.example.com” and “https://www.example2.com/” for the crawler with the ID “my-crawler”:
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/
Crawl the URLs “https://www.example.com” and “https://www.example2.com/” for the crawler with the ID “my-crawler” and save them in the configuration:
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/ --save
Crawl the URLs “https://www.example.com” and “https://www.example2.com/” for the crawler with the ID “my-crawler” and don’t save them in the configuration:
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/ --save=false

Flags

-s, --save
When true, the URLs are added to your crawler's extraUrls (unless they're already present in startUrls or sitemaps). When false, the URLs aren't added. When unspecified, the URLs are added to extraUrls, unless they're already present in startUrls or sitemaps, or they were indexed during the preceding reindex.
-u, --urls
The URLs to crawl (maximum 50).
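The flags above also accept the single-letter shorthands listed (-u for --urls, -s for --save). As a sketch, assuming the same crawler ID "my-crawler" from the examples, the following is equivalent to the --save example:

```shell
# Equivalent to the long-form --save example, using the -u and -s shorthands.
# Multiple URLs are passed as a single comma-separated value.
algolia crawler crawl my-crawler \
  -u https://www.example.com,https://www.example2.com/ \
  -s
```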