@@ -130,6 +130,16 @@ start_requests
 behavior. If this argument is present, the API will execute the
 spider's start_requests method.
 
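+For instance, a minimal sketch with the ``requests`` library (assuming
+ScrapyRT runs on its default port 9080; the spider name "foo" and the
+url are placeholders)::
+
+    import requests
+
+    # Ask the API to also run the spider's start_requests method in
+    # addition to crawling the given url.
+    response = requests.get(
+        "http://localhost:9080/crawl.json",
+        params={
+            "spider_name": "foo",
+            "url": "http://example.com/page",
+            "start_requests": "true",
+        },
+    )
+    print(response.json())
+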
+crawl_args
+  - type: urlencoded JSON string
+  - optional
+
+Optional arguments for the spider. These are the same as the arguments
+you pass with -a when running a spider from the command line. For example,
+if you run a spider like this: "scrapy crawl spider -a zipcode=14100",
+you can send crawl_args={"zipcode":"14100"} (urlencoded:
+crawl_args=%7B%22zipcode%22%3A%2214100%22%7D) and the spider will get
+the zipcode argument.
+
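+For instance, a minimal sketch that sends the same crawl_args with the
+``requests`` library (again assuming the default port 9080; the spider
+name and url are placeholders)::
+
+    import json
+
+    import requests
+
+    # crawl_args is a JSON object in the query string; requests takes
+    # care of the urlencoding shown above.
+    params = {
+        "spider_name": "foo",
+        "url": "http://example.com/page",
+        "crawl_args": json.dumps({"zipcode": "14100"}),
+    }
+    response = requests.get("http://localhost:9080/crawl.json", params=params)
+    print(response.json())
+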
 If required parameters are missing, the API will return 400 Bad Request
 with a hopefully helpful error message.
 
@@ -558,6 +568,18 @@ But if you still want to save all stdout to some file - you can create custom
 approach described in the `Python Logging HOWTO`_ or redirect stdout to a file
 using `bash redirection syntax`_, `supervisord logging`_, etc.
 
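+For instance, a minimal sketch of the logging-based approach (the file
+name and format are arbitrary)::
+
+    import logging
+
+    # Attach a FileHandler to the root logger so log records are also
+    # written to a file, as described in the Python Logging HOWTO.
+    handler = logging.FileHandler("scrapyrt.log")
+    handler.setFormatter(
+        logging.Formatter("%(asctime)s %(levelname)s %(message)s")
+    )
+    logging.getLogger().addHandler(handler)
+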
+Releases
+========
+
+ScrapyRT 0.12 (2021-03-08)
+--------------------------
+- added crawl arguments for API
+- removed Python 2 support
+- added Python 3.9 support
+- docs cleanup
+- removed superfluous requirements (demjson, six)
+- fixed API crash when spider returns bytes in items output
+- updated unit tests
+- development improvements, moved from Travis to GitHub Workflows
 
 .. _toscrape-css spider: https://github.com/scrapy/quotesbot/blob/master/quotesbot/spiders/toscrape-css.py
 .. _Scrapy educational quotesbot project: https://github.com/scrapy/quotesbot