I'm training the LSTM on files of roughly 80 MB with these hyperparameters:

```bash
python train.py --data_dir=./data --rnn_size 2048 --num_layers 2 --seq_length 256 --batch_size 128 --output_keep_prob 0.25
```

but after a few minutes the job gets killed. Is the file too big?
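
For context, here is a rough back-of-the-envelope estimate of what these settings imply memory-wise. This is just a sketch: it assumes standard LSTM cells, float32 values, a first-layer input of size `rnn_size`, and counts only the weights and the activations kept for backprop-through-time, which may not match exactly what `train.py` allocates:

```python
# Rough memory estimate for the hyperparameters above (assumptions, not
# measured values): standard LSTM cells, float32 (4 bytes per value).

rnn_size, num_layers = 2048, 2
seq_length, batch_size = 256, 128
bytes_per_float = 4

# LSTM weights: 4 gates, each mapping (input + hidden + bias) -> hidden.
# Assuming each layer sees an input of size rnn_size:
params_per_layer = 4 * rnn_size * (2 * rnn_size + 1)
total_params = num_layers * params_per_layer
weight_mb = total_params * bytes_per_float / 1e6          # ~270 MB

# Activations kept for backprop-through-time: hidden state, cell state,
# and 4 gate activations per layer per time step (a coarse approximation).
acts_per_step = num_layers * 6 * batch_size * rnn_size
activation_mb = acts_per_step * seq_length * bytes_per_float / 1e6   # ~3,200 MB

print(f"weights:          ~{weight_mb:,.0f} MB")
print(f"BPTT activations: ~{activation_mb:,.0f} MB")
print("plus gradients and optimizer state on top of the weights")
```

Even this rough count comes to several GB before gradients and optimizer state are included, so I'm not sure whether the 80 MB input itself is the problem or whether these settings simply need more memory than I have.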