The current stop functionality via `Crawler.Stop` is insufficient for several reasons:

- Calling `Stop` twice results in a panic (because it would close `c.stop` twice).
- Inside the functions `Extender.Visit` and `Extender.Error` there is currently no way to terminate the crawler. Often there is a real need to do so, for example when some limit is reached (such as a maximum number of visited URLs, tracked in a custom `Extender.Visit` function).
- In `Crawler.collectUrls`, when `res.idleDeath` is true, workers may be deleted. However, if there are 0 workers left, the crawler is NOT terminated, leaving it in limbo (a memory leak). This I consider a bug.
I'll create a pull request with a fix. Since it affects the Extender functions, it will break compatibility, but that's a good thing here.