
Commit c1610e8: Update README.md
Parent: ff281b7

1 file changed: 3 additions, 1 deletion

README.md

Lines changed: 3 additions & 1 deletion
```diff
@@ -2,8 +2,9 @@
 [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
 [![Updates](https://pyup.io/repos/github/source-nerd/twitter-scraper/shield.svg)](https://pyup.io/repos/github/source-nerd/twitter-scraper/)
 
-A simple Python-based Twitter scraper with the ability of scripng tweets either by username or by a search query (supports other search params as well). This program supports `STOP / RESUME` operation, as it keeps a log of all the previous position IDs.
+A simple Python-based Twitter scraper with the ability of scraping tweets either by username or by a search query (supports other search params as well). This program supports `STOP / RESUME` operation, as it keeps a log of all the previous position IDs.
 This project was created in order to bypass Twitter's 7-day policy, which doesn't allow fetching tweets more than 7 days old, as I needed some data for my research project.
+
 **Please Note: This is not an alternative to the official APIs provided by Twitter.**
 
 This project is intended for students, researchers & all those who abide by Twitter's data terms and conditions.
@@ -43,6 +44,7 @@ python main.py --searchquery github --since 2018-06-15 --until 2018-06-20 --op t
 ## Output
 The output of the scraper is saved in the output file provided in the parameters. By default, the output file is `op.csv`.
 The program also keeps a log of all the previous search positions and writes it to the logger file provided in the params. By default, the log file is `def_log.log`. This file is required in order to resume the scraping operation if it is interrupted.
+
 **Note: If you want to `RESUME` your previous incomplete scrape operation, make sure to provide the same log file as you did in the first instance.**
 
 ## Feedback & Final Thoughts
```
