* [ ] Saves application state on CTRL-C for later pickup
# Options
```console
USAGE:
    suckit [FLAGS] [OPTIONS] <url>

FLAGS:
    -c, --continue-on-error                  Flag to enable or disable exit on error
        --dry-run                            Do everything without saving the files to the disk
    -h, --help                               Prints help information
    -V, --version                            Prints version information
    -v, --verbose                            Enable more information regarding the scraping process
        --visit-filter-is-download-filter    Use the download filter in/exclude regexes for visiting as well

OPTIONS:
    -a, --auth <auth>...
            HTTP basic authentication credentials space-separated as "username password host". Can be
            repeated for multiple credentials as "u1 p1 h1 u2 p2 h2"

        --delay <delay>
            Add a delay in seconds between downloads to reduce the likelihood of getting banned [default: 0]

    -d, --depth <depth>
            Maximum recursion depth to reach when visiting. Default is -1 (infinity) [default: -1]

    -e, --exclude-download <exclude-download>
            Regex filter to exclude saving pages that match this expression [default: $^]

        --exclude-visit <exclude-visit>
            Regex filter to exclude visiting pages that match this expression [default: $^]

        --ext-depth <ext-depth>
            Maximum recursion depth to reach when visiting external domains. Default is 0. -1 means
            infinity [default: 0]

    -i, --include-download <include-download>
            Regex filter to limit to only saving pages that match this expression [default: .*]

        --include-visit <include-visit>
            Regex filter to limit to only visiting pages that match this expression [default: .*]

    -j, --jobs <jobs>                Maximum number of threads to use concurrently [default: 1]
    -o, --output <output>            Output directory
        --random-range <random-range>
            Generate an extra random delay between downloads, from 0 to this number. This is added to
            the base delay seconds [default: 0]

    -t, --tries <tries>              Maximum amount of retries on download failure [default: 20]
    -u, --user-agent <user-agent>    User agent to be used for sending requests [default: suckit]
```
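For example, a typical invocation combining several of the options above might look like the following (the URL, output directory, and credentials here are placeholders, not values from the project):

```console
$ suckit https://example.com -o downloaded_site -j 4 -d 2 --delay 1 -a "user pass example.com"
```

This crawls `https://example.com` two levels deep with four threads, waits one second between downloads, and supplies basic-auth credentials for that host.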
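A note on the filter defaults above: `$^` is a deliberately impossible pattern (end-of-string followed by start-of-string), so by default nothing is excluded, while `.*` matches everything, so everything is included. The sketch below demonstrates this with Python's `re` module; suckit itself uses Rust's regex engine, but these two patterns behave the same way against non-empty URLs in both.

```python
import re

# Sentinel patterns mirroring suckit's filter defaults (illustrative only):
#   exclude default "$^" can never match a non-empty string,
#   include default ".*" matches every string.
exclude_default = re.compile(r"$^")
include_default = re.compile(r".*")

urls = ["https://example.com/index.html", "https://example.com/img/logo.png"]

excluded = [u for u in urls if exclude_default.search(u)]
included = [u for u in urls if include_default.search(u)]

print(excluded)  # [] -> no page is excluded by default
print(included)  # both URLs -> every page is saved by default
```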