Need:

- [ ] A `robots.txt` file to control legitimate crawling of the site. See:
  - http://www.robotstxt.org/robotstxt.html
  - https://www.ordinarycoders.com/blog/article/robots-text-file-django
- [ ] Control of annoying [referrer spam](https://en.wikipedia.org/wiki/Referrer_spam) (currently being pushed from `binance.com`)
  - Usually this would be done via `.htaccess` files, but it seems [PythonAnywhere don't expose that functionality to customers](https://www.pythonanywhere.com/forums/topic/2154/)
  - Might need to add application-level blocking of certain source hosts/IPs instead.
  - Would be good to understand whether PythonAnywhere access logs are public or not.