This step is optional, since the processed data are already part of the source tree.
- Run download_datasets.py from the root folder.
- Then run make_ds/sf-fire-build-ds.py
Run vanilla_rnn/rnn_predict_week.py. This performs a "creative" prediction for one week
on the test data. To retrain, change the constant at the top of the script:
set 'rebuild_artifacts' to True.
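A minimal sketch of how such a retrain toggle typically works; the function and return values here are hypothetical placeholders, not the script's actual internals:

```python
# Hypothetical illustration of the 'rebuild_artifacts' toggle:
# when True, the script retrains from scratch; when False, it
# reuses previously saved artifacts.
REBUILD_ARTIFACTS = False

def load_or_train(rebuild: bool) -> str:
    if rebuild:
        return "trained from scratch"   # placeholder for the training path
    return "loaded saved artifacts"     # placeholder for the cached path

print(load_or_train(REBUILD_ARTIFACTS))
```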
Run lstm_rnn/sf-fire-lstm.py to execute the RNN and LSTM time series prediction.
The script runs training, performs test-error validation, and draws a target-vs-prediction graph.
Optional arguments:
- -n - number of neurons (default = 100)
- -l - number of layers (default = 3)
- -m - type of network, rnn or lstm (default = rnn)
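The options above can be sketched with argparse; this is an assumption about how the script might parse them, not its actual argument-handling code:

```python
# Hypothetical sketch of the CLI described above.
import argparse

parser = argparse.ArgumentParser(description="RNN/LSTM time series prediction")
parser.add_argument("-n", type=int, default=100, help="number of neurons")
parser.add_argument("-l", type=int, default=3, help="number of layers")
parser.add_argument("-m", choices=["rnn", "lstm"], default="rnn",
                    help="type of network")

# e.g. 200 neurons, LSTM network, default layer count
args = parser.parse_args(["-n", "200", "-m", "lstm"])
print(args.n, args.l, args.m)
```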
Run clustering via clustering/location_clustering.py.
Optional parameter '-n' - number of clusters (default = 10)
Example: python location_clustering.py -n 15
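To illustrate what k-means-style location clustering does, here is a toy 1-D Lloyd's-algorithm sketch; the real script presumably clusters 2-D incident coordinates (likely with a library such as scikit-learn), so this is only an assumption-laden illustration:

```python
# Toy 1-D k-means: repeatedly assign points to the nearest center,
# then move each center to the mean of its assigned points.
def kmeans_1d(points, k, iters=20):
    centers = sorted(points)[:: max(1, len(points) // k)][:k]  # crude init
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            groups[i].append(p)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers

pts = [1.0, 1.2, 0.8, 10.0, 10.5, 9.5]
print(sorted(round(c, 1) for c in kmeans_1d(pts, 2)))  # two cluster centers
```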
For classification, go to the classification folder and run python classification.py.
The ML model is trained on a small portion of the data.
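Training on "a small portion" usually means subsampling the rows before fitting; a minimal sketch of that idea (the fraction and the stand-in dataset here are assumptions, not values from classification.py):

```python
# Subsample a fraction of the dataset before training, so the model
# fits quickly on a small portion of the rows.
import random

random.seed(0)
records = list(range(1000))              # stand-in for the full dataset
sample = random.sample(records, k=100)   # use ~10% of the rows for training
print(len(sample), len(records))
```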