Usage
algolia crawler crawl <crawler_id> --urls <url>... [flags]

Examples

Crawl the URLs "https://www.example.com" and "https://www.example2.com/" for the crawler with the ID "my-crawler":
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/
Crawl the URLs "https://www.example.com" and "https://www.example2.com/" for the crawler with the ID "my-crawler" and save them in the configuration:
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/ --save
Crawl the URLs "https://www.example.com" and "https://www.example2.com/" for the crawler with the ID "my-crawler" and don't save them in the configuration:
algolia crawler crawl my-crawler --urls https://www.example.com,https://www.example2.com/ --save=false

Flags

-s, --save
When true, the URLs are added to your extraUrls (unless present in startUrls or sitemaps). When false, the URLs aren't added. When unspecified, the URLs are added to your extraUrls, unless they're present in startUrls or sitemaps, or they weren't indexed during the preceding reindex.
-u, --urls
The URLs to crawl (maximum 50).
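
Because --urls accepts at most 50 URLs per invocation, a longer list can be split into batches with standard shell tooling. A minimal sketch, assuming a file urls.txt with one URL per line (here generated for illustration) and using "my-crawler" as a placeholder crawler ID; the echo makes this a dry run that prints each command instead of executing it:

```shell
# Example input: one URL per line (generated here for illustration).
printf 'https://www.example.com/page%d\n' $(seq 1 120) > urls.txt

# Group the URLs into batches of at most 50, join each batch with
# commas (the separator --urls expects), and print one crawl command
# per batch. Remove the leading "echo" to actually run the crawls.
xargs -n 50 < urls.txt | tr ' ' ',' | while read -r batch; do
  echo algolia crawler crawl my-crawler --urls "$batch"
done
```

With 120 input URLs this produces three commands (two batches of 50 and one of 20), each within the documented limit.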