Support more secure ways to declare the APIKEY #88

@BurnzZ

Description

BACKGROUND:

As of version 1.6.0, there are two (2) ways of adding the API keys:

  1. via `settings.py`:

```python
CRAWLERA_APIKEY = 'apikey'
```

  2. via a spider attribute:

```python
class SampleSpider(scrapy.Spider):
    crawlera_apikey = 'apikey'
```

When using Scrapy Cloud, we can also declare it via:

  3. the Spider/Project settings

  4. the Scrapy Cloud Crawlera add-on

PROBLEM

In practice, however, the API keys end up being written inside the code and committed to the repo.

Best practice is to keep sensitive keys decoupled from the code. Options #3 and #4 above already fix this problem, since they let us declare the keys only inside Scrapy Cloud.

However, this becomes a problem when trying to run the spider locally during development, since the keys might not be available there.

OBJECTIVES

This issue aims to be a discussion ground for exploring better ways to handle this.

For starters, here are a couple of ways to approach it:

  • A. Set and retrieve the keys via environment variables.

  • B. Set and retrieve the keys via a local file that is not committed to the repo. Examples would be similar to how SSH keys are stored in `~/.ssh` and AWS keys in `~/.aws`.
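As a starting point for option A, here is a minimal sketch of reading the key from the environment in `settings.py`. The `CRAWLERA_APIKEY` environment variable name and the helper function are assumptions for illustration, not an existing API of the middleware:

```python
import os


def get_crawlera_apikey(default=''):
    """Look up the Crawlera API key from the environment (option A).

    The CRAWLERA_APIKEY variable name is an assumed convention,
    not something the middleware currently reads on its own.
    """
    return os.environ.get('CRAWLERA_APIKEY', default)


# settings.py would then carry no hardcoded key:
CRAWLERA_APIKEY = get_crawlera_apikey()
```

Locally, the key would be exported once per shell (`export CRAWLERA_APIKEY=...`), while on Scrapy Cloud the existing settings UI could keep overriding it.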

Either way, it should support different API keys per spider.
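To sketch option B together with the per-spider requirement: a small helper could read an ini-style file such as `~/.crawlera/credentials` (the path and format are assumptions here, modeled after `~/.aws/credentials`), with one section per spider and a `[default]` fallback. An environment variable still takes precedence, so option A would keep working:

```python
import configparser
import os


def load_crawlera_apikey(spider_name, path='~/.crawlera/credentials'):
    """Resolve the Crawlera API key for a given spider (option B sketch).

    Precedence: CRAWLERA_APIKEY environment variable, then the
    [spider_name] section of the credentials file, then [default].
    The file location and layout are assumptions for this sketch.
    """
    env_key = os.environ.get('CRAWLERA_APIKEY')
    if env_key:
        return env_key

    parser = configparser.ConfigParser()
    parser.read(os.path.expanduser(path))  # silently skips a missing file
    for section in (spider_name, 'default'):
        if section in parser and 'apikey' in parser[section]:
            return parser[section]['apikey']
    return None
```

A spider could then do `self.crawlera_apikey = load_crawlera_apikey(self.name)` in its constructor, so each spider picks up its own key without anything being committed to the repo.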
