I keep trying to scan some very large websites with this tool, but the scans keep failing.
Is there a maximum number of links the tool can crawl on a single site? I'd love to know whether I should just cap the crawl at 5000 links, or whether I can crawl an entire sub-domain and gather the results.
Right now the scans simply fail once I cross some link-count threshold.
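For context, here's the kind of cap I have in mind, sketched in plain Python with requests and BeautifulSoup rather than this tool's actual API (every name below is my own placeholder): a breadth-first crawl of one sub-domain that stops once it has collected a fixed number of links.

```python
# Minimal sketch of a capped crawl -- NOT this tool's API; all names
# here are placeholders I made up to illustrate the question.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

MAX_LINKS = 5000  # the cap I'm asking about


def crawl(start_url: str, max_links: int = MAX_LINKS) -> set[str]:
    """Breadth-first crawl of one sub-domain, stopping at max_links URLs."""
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    while queue and len(seen) < max_links:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable pages instead of failing the whole scan
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            # stay on the same sub-domain and respect the cap
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
                if len(seen) >= max_links:
                    break
    return seen
```

If the tool already has a built-in limit along these lines, knowing its value (and whether it's configurable) would answer my question.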