Commit 558675c

Update README.md
1 parent: c88b8b9

1 file changed: 3 additions, 1 deletion

README.md

Lines changed: 3 additions & 1 deletion
@@ -49,11 +49,13 @@ Makes requests to `urls` and saves all files found with `sources` to `directory`
 - `filenameGenerator`: name of one of the bundled filenameGenerators, or a custom filenameGenerator function *(optional, default: 'byType')*
 - `defaultFilename`: filename for index page *(optional, default: 'index.html')*
 - `prettifyUrls`: whether urls should be 'prettified', by having the `defaultFilename` removed *(optional, default: false)*
-- `sources`: array of objects to load, specifies selectors and attribute values to select files for loading *(optional, see default value in `lib/config/defaults.js`)*
+- `sources`: array of objects to load, specifies selectors and attribute values to select files for loading *(optional, see example below)*
 - `subdirectories`: array of objects, specifies subdirectories for file extensions. If `null` all files will be saved to `directory` *(optional, see example below)*
 - `request`: object, custom options for [request](https://github.com/request/request#requestoptions-callback) *(optional, see example below)*
 - `recursive`: boolean, if `true` scraper will follow anchors in html files. Don't forget to set `maxDepth` to avoid infinite downloading *(optional, see example below)*
 - `maxDepth`: positive number, maximum allowed depth for dependencies *(optional, see example below)*
+
+Default options you can find in [lib/config/defaults.js](https://github.com/s0ph1e/node-website-scraper/blob/master/lib/config/defaults.js).
 
 
 **callback** - callback function *(optional)*, includes following parameters:
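For reference, here is a minimal usage sketch that combines the options documented in the hunk above; it is not part of this commit. It assumes the package is required as `website-scraper` and exposes a `scrape(options, callback)` entry point (the exact entry point may differ between versions), and every URL, directory, selector, and extension below is a placeholder value, not the actual defaults from `lib/config/defaults.js`.

```javascript
// Hypothetical sketch of the options documented above.
// All concrete values (urls, directory, selectors, extensions) are placeholders.
var scraper = require('website-scraper');

var options = {
  urls: ['http://example.com/'],          // pages to request
  directory: '/path/to/save/',            // where downloaded files are written
  defaultFilename: 'index.html',          // filename used for index pages
  prettifyUrls: false,                    // keep defaultFilename in generated links
  sources: [                              // selectors and attributes to load files from
    { selector: 'img', attr: 'src' },
    { selector: 'link[rel="stylesheet"]', attr: 'href' },
    { selector: 'script', attr: 'src' }
  ],
  subdirectories: [                       // route saved files into subdirectories by extension
    { directory: 'img', extensions: ['.jpg', '.png', '.svg'] },
    { directory: 'js',  extensions: ['.js'] },
    { directory: 'css', extensions: ['.css'] }
  ],
  request: {                              // custom options passed through to the request module
    headers: { 'User-Agent': 'Mozilla/5.0' }
  },
  recursive: true,                        // follow anchors found in downloaded html files
  maxDepth: 2                             // cap recursion depth to avoid infinite downloading
};

scraper.scrape(options, function (error, result) {
  if (error) {
    return console.error(error);
  }
  console.log(result);
});
```

With `recursive: true` the scraper follows anchors in the HTML it downloads, so `maxDepth` is set here as well, matching the README's warning about infinite downloading.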
