mirror of
https://github.com/webrecorder/browsertrix-crawler.git
synced 2025-10-19 06:23:16 +00:00
Make numRetries configurable (#754)
Add --numRetries param, default to 1 instead of 5.
parent f379da19be
commit 2e46140c3f
6 changed files with 30 additions and 12 deletions
```
@@ -240,6 +240,9 @@ Options:
                            s [boolean] [default: false]
      --writePagesToRedis   If set, write page objects to redis
                                                  [boolean] [default: false]
      --numRetries          If set, number of times to retry a page that failed to load
                                                  [number] [default: 1]
      --failOnFailedSeed    If set, crawler will fail with exit code 1 if any seed fails. When combined with --failOnInvalidStatus, will
```
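With this change, the retry count for failed pages can be set on the command line via the new `--numRetries` flag. A minimal sketch of an invocation, assuming the Docker-based usage from the project README (the URL, collection name, and mount path here are illustrative, not from this commit):

```shell
# Hypothetical crawl invocation: retry each failed page up to 3 times
# instead of the new default of 1 (previously 5).
docker run -v $PWD/crawls:/crawls/ -it webrecorder/browsertrix-crawler \
  crawl --url https://example.com/ --collection test --numRetries 3
```

Leaving `--numRetries` unset now gives a single retry per failed page, per the new default shown in the help text above.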