
Commit fb124a5

misc: Prepare for release
Bump crate version
Update README (MSRV and usage)
1 parent b467ee8 commit fb124a5

3 files changed: +66 −32 lines


Cargo.lock

Lines changed: 14 additions & 13 deletions
(Generated file; diff not shown.)

Cargo.toml

Lines changed: 3 additions & 3 deletions
```diff
@@ -1,9 +1,9 @@
 [package]
 name = "suckit"
-version = "0.1.2"
+version = "0.2.0"
 edition = "2018"
 authors = ["Esteban \"Skallwar\" Blanc <estblcsk@gmail.com>",
-           "Arthur \"CohenArthur\" Cohen <arthur.cohen@epita.fr>"]
+           "Arthur \"CohenArthur\" Cohen <cohenarthur.dev@gmail.com>"]
 license = "MIT OR Apache-2.0"
 homepage = "https://github.com/skallwar/suckit"
 repository = "https://github.com/skallwar/suckit"
@@ -20,7 +20,7 @@ include = [
 ]

 [package.metadata]
-msrv = "1.44.1"
+msrv = "1.49.0"

 [lib]
 name = "suckit"
```

README.md

Lines changed: 49 additions & 16 deletions
````diff
@@ -5,7 +5,7 @@
 [![Deps](https://deps.rs/repo/github/Skallwar/suckit/status.svg)](https://deps.rs/repo/github/Skallwar/suckit)
 [![License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
 [![License](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
-![MSRV](https://img.shields.io/badge/MSRV-1.46.0-blue)
+![MSRV](https://img.shields.io/badge/MSRV-1.49.0-blue)

 # SuckIT

@@ -24,21 +24,54 @@ your disk.
 * [ ] Saves application state on CTRL-C for later pickup

 # Options
-
-|Option|Behavior|
-|---|---|
-|`-h, --help`|Displays help information|
-|`-v, --verbose`|Activate Verbose output|
-|`-d, --depth`|Specify the level of depth to go to when visiting the website. Default is -1 (infinity)|
-|`--ext-depth`|Specify the level of depth to go to when visiting websites that have a different domain name. Default is 0 (ignore external links), -1 is infinity|
-|`-j, --jobs`|Number of threads to use|
-|`-o, --output`|Output directory where the downloaded files are written|
-|`-t, --tries`|Number of times to retry when the downloading of a page fails|
-|`-u, --user-agent`|User agent to be used for sending requests|
-|`-i, --include`|Specify a regex to include pages that match this pattern|
-|`-e, --exclude`|Specify a regex to exclude pages that match this pattern|
-|`-a, --auth`|Provide usernames and passwords for the downloader to use|
-|`--dry-run`|Do everything without saving the files to the disk|
+```console
+USAGE:
+    suckit [FLAGS] [OPTIONS] <url>
+
+FLAGS:
+    -c, --continue-on-error                  Flag to enable or disable exit on error
+        --dry-run                            Do everything without saving the files to the disk
+    -h, --help                               Prints help information
+    -V, --version                            Prints version information
+    -v, --verbose                            Enable more information regarding the scraping process
+        --visit-filter-is-download-filter    Use the download filter in/exclude regexes for visiting as well
+
+OPTIONS:
+    -a, --auth <auth>...
+            HTTP basic authentication credentials space-separated as "username password host". Can be repeated for
+            multiple credentials as "u1 p1 h1 u2 p2 h2"
+        --delay <delay>
+            Add a delay in seconds between downloads to reduce the likelihood of getting banned [default: 0]
+
+    -d, --depth <depth>
+            Maximum recursion depth to reach when visiting. Default is -1 (infinity) [default: -1]
+
+    -e, --exclude-download <exclude-download>
+            Regex filter to exclude saving pages that match this expression [default: $^]
+
+        --exclude-visit <exclude-visit>
+            Regex filter to exclude visiting pages that match this expression [default: $^]
+
+        --ext-depth <ext-depth>
+            Maximum recursion depth to reach when visiting external domains. Default is 0. -1 means infinity [default:
+            0]
+    -i, --include-download <include-download>
+            Regex filter to limit to only saving pages that match this expression [default: .*]
+
+        --include-visit <include-visit>
+            Regex filter to limit to only visiting pages that match this expression [default: .*]
+
+    -j, --jobs <jobs>                        Maximum number of threads to use concurrently [default: 1]
+    -o, --output <output>                    Output directory
+        --random-range <random-range>
+            Generate an extra random delay between downloads, from 0 to this number. This is added to the base delay
+            seconds [default: 0]
+    -t, --tries <tries>                      Maximum amount of retries on download failure [default: 20]
+    -u, --user-agent <user-agent>            User agent to be used for sending requests [default: suckit]
+
+ARGS:
+    <url>    Entry point of the scraping
+```

 # Example

````
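For context, the new help text above can be exercised with an invocation along these lines. This is a hypothetical example composed from the flags documented in the diff, not part of the commit; the URL and all values are placeholders:

```console
$ suckit https://example.com -o mirror -j 4 -d 2 --delay 1 -u "suckit-mirror"
```

This would mirror `https://example.com` into `./mirror` using four worker threads, a recursion depth of 2, a one-second delay between downloads, and a custom user agent.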