Commit a0cd9f9 (parent: 99cd40a)

update readme

Signed-off-by: abzcoding <[email protected]>

File tree: 1 file changed, 8 additions(+), 11 deletions(-)


README.md (8 additions, 11 deletions)

@@ -27,8 +27,7 @@ Binary file will be built at ./bin/hget, you can copy to /usr/bin or /usr/local/
 
 ```bash
 hget [-n parallel] [-skip-tls false] [-rate bwRate] [-proxy proxy_server] [-file filename] [URL] # to download url, with n connections, and not skip tls certificate
-hget tasks # get interrupted tasks
-hget resume [TaskName | URL] # to resume task
+hget -resume TaskName # to resume task
 hget -proxy "127.0.0.1:12345" URL # to download using socks5 proxy
 hget -proxy "http://sample-proxy.com:8080" URL # to download using http proxy
 hget -file sample.txt # to download a list of urls
@@ -45,19 +44,17 @@ hget -resume "ubuntu-24.04.1-desktop-amd64.iso"
 [I] ➜ hget -h
 Usage of hget:
   -file string
-        filepath that contains links in each line
+        path to a file that contains one URL per line
   -n int
-        connection (default 16)
+        number of connections (default 12)
   -proxy string
-        proxy for downloading, ex
-        -proxy '127.0.0.1:12345' for socks5 proxy
-        -proxy 'http://proxy.com:8080' for http proxy
+        proxy for downloading, e.g. -proxy '127.0.0.1:12345' for socks5 or -proxy 'http://proxy.com:8080' for http proxy
   -rate string
-        bandwidth limit to use while downloading, ex
-        -rate 10kB
-        -rate 10MiB
+        bandwidth limit during download, e.g. -rate 10kB or -rate 10MiB
+  -resume string
+        resume download task with given task name (or URL)
   -skip-tls
-        skip verify certificate for https (default true)
+        skip certificate verification for https (default true)
 ```
 
 To interrupt any on-downloading process, just ctrl-c or ctrl-d at the middle of the download, hget will safely save your data and you will be able to resume later
