@jhg jhg commented Sep 20, 2024

I created this to avoid memory-limit errors when crawling websites with many links. The downloader middleware takes much longer to drop a request, but this spider middleware drops it earlier, so I guess it uses less memory. I would like to share it in case it is useful for someone else.
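The idea above can be illustrated with a small, self-contained sketch (this is an analogy, not the PR's actual code; the class and function names here are invented, and the pattern is shown in Python rather than the project's own language): a spider-middleware-style hook filters requests as soon as the spider yields them, so duplicate or unwanted requests never occupy memory in the scheduler queue, whereas a downloader middleware would only drop them much later, at download time.

```python
# Hypothetical sketch: drop unwanted requests at the spider-output stage,
# before they ever enter the scheduler queue. All names here are invented
# for illustration; this is not the middleware from the pull request.

class Request:
    """Minimal stand-in for a crawler's request object."""
    def __init__(self, url):
        self.url = url


class DropEarlySpiderMiddleware:
    """Filters requests whose URL was already seen, as the spider yields them."""
    def __init__(self):
        self.seen = set()

    def process_spider_output(self, results):
        for item in results:
            if isinstance(item, Request):
                if item.url in self.seen:
                    continue  # dropped early: never queued, so no memory held
                self.seen.add(item.url)
            yield item


def crawl(links, middleware):
    # Simulate a spider yielding one request per discovered link,
    # passed through the middleware before scheduling.
    queued = list(middleware.process_spider_output(Request(u) for u in links))
    return [r.url for r in queued]


mw = DropEarlySpiderMiddleware()
print(crawl(["a", "b", "a", "c", "b"], mw))  # → ['a', 'b', 'c']
```

The design point is where the filtering happens: a downloader middleware sees a request only after it has been queued and dequeued, so on a site with many links the queue itself becomes the memory problem; filtering at spider output keeps the queue small from the start.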

@jhg jhg requested a review from ksassnowski as a code owner September 20, 2024 09:52

jhg commented Sep 20, 2024

I see now that I made a mistake writing the branch name, sorry.
