With the current setup, the crawler reaches a maximum of three levels of depth in the website tree. How can I set up the crawler to reach all pages on the website, at every level? My router is the following:

```js
router.addDefaultHandler(async ({ request, enqueueLinks, log }) => {
    const domain = 'http://domain.com/';
    if (domain) {
        await enqueueLinks({
            globs: [`${domain}**`],
            label: 'detail',
        });
    }
});

router.addHandler('detail', async ({ request, response, page, log, parseWithCheerio }) => {
});
```

Thank you in advance!
Replies: 1 comment 1 reply
Depends on what you want, maybe you don't need multiple routes at all. In the example code you have, you'd just call `enqueueLinks` again in your `detail` handler.
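A minimal sketch of that suggestion, assuming Crawlee's `PlaywrightCrawler` (the `page` and `parseWithCheerio` parameters in the question's handler suggest a browser-based crawler); `http://domain.com/` is the placeholder domain from the question, not a real target:

```javascript
import { PlaywrightCrawler, createPlaywrightRouter } from 'crawlee';

const router = createPlaywrightRouter();
const domain = 'http://domain.com/'; // placeholder from the question

router.addDefaultHandler(async ({ enqueueLinks }) => {
    // Enqueue every same-domain link found on the start page.
    await enqueueLinks({
        globs: [`${domain}**`],
        label: 'detail',
    });
});

router.addHandler('detail', async ({ request, enqueueLinks, log }) => {
    log.info(`Processing ${request.url}`);
    // Enqueue links from every detail page as well, so the crawl keeps
    // descending level by level until no new URLs are discovered. The
    // request queue deduplicates URLs, so pages are not visited twice.
    await enqueueLinks({
        globs: [`${domain}**`],
        label: 'detail',
    });
});

const crawler = new PlaywrightCrawler({
    requestHandler: router,
    // No maxRequestsPerCrawl is set, so the crawler runs until the
    // request queue is exhausted, i.e. all reachable pages are visited.
});

await crawler.run([domain]);
```

The depth limit in the original code came from never enqueueing links past the detail pages, not from a crawler setting: only the default handler (the start URL) added new requests, so nothing deeper than its direct links ever entered the queue.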