I followed the README.md and ran
python train.py --data_path ./data
but it failed with the following error:
{'dropout': 0.0, 'lr_ae': 1, 'load_vocab': '', 'nlayers': 1, 'batch_size': 64, 'beta1': 0.5, 'gan_gp_lambda': 0.1, 'nhidden': 128, 'vocab_size': 30000, 'niters_gan_schedule': '', 'niters_gan_d': 5, 'lr_gan_d': 0.0001, 'grad_lambda': 0.01, 'sample': False, 'arch_classify': '128-128', 'clip': 1, 'hidden_init': False, 'cuda': True, 'log_interval': 200, 'device_id': '0', 'temp': 1, 'seed': 1111, 'maxlen': 25, 'lowercase': True, 'data_path': './data', 'lambda_class': 1, 'lr_classify': 0.0001, 'outf': 'yelp_example', 'noise_r': 0.1, 'noise_anneal': 0.9995, 'lr_gan_g': 0.0001, 'niters_gan_g': 1, 'arch_g': '128-128', 'z_size': 32, 'epochs': 25, 'niters_ae': 1, 'arch_d': '128-128', 'emsize': 128, 'niters_gan_ae': 1}
Original vocab 9599; Pruned to 9603
Number of sentences dropped from ./data/valid1.txt: 0 out of 38205 total
Number of sentences dropped from ./data/valid2.txt: 0 out of 25278 total
Number of sentences dropped from ./data/train1.txt: 0 out of 267314 total
Number of sentences dropped from ./data/train2.txt: 0 out of 176787 total
Vocabulary Size: 9603
382 batches
252 batches
4176 batches
2762 batches
Loaded data!
Seq2Seq2Decoder(
(embedding): Embedding(9603, 128)
(embedding_decoder1): Embedding(9603, 128)
(embedding_decoder2): Embedding(9603, 128)
(encoder): LSTM(128, 128, batch_first=True)
(decoder1): LSTM(256, 128, batch_first=True)
(decoder2): LSTM(256, 128, batch_first=True)
(linear): Linear(in_features=128, out_features=9603, bias=True)
)
MLP_G(
(layer1): Linear(in_features=32, out_features=128, bias=True)
(bn1): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activation1): ReLU()
(layer2): Linear(in_features=128, out_features=128, bias=True)
(bn2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activation2): ReLU()
(layer7): Linear(in_features=128, out_features=128, bias=True)
)
MLP_D(
(layer1): Linear(in_features=128, out_features=128, bias=True)
(activation1): LeakyReLU(negative_slope=0.2)
(layer2): Linear(in_features=128, out_features=128, bias=True)
(bn2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activation2): LeakyReLU(negative_slope=0.2)
(layer6): Linear(in_features=128, out_features=1, bias=True)
)
MLP_Classify(
(layer1): Linear(in_features=128, out_features=128, bias=True)
(activation1): ReLU()
(layer2): Linear(in_features=128, out_features=128, bias=True)
(bn2): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activation2): ReLU()
(layer6): Linear(in_features=128, out_features=1, bias=True)
)
Training...
Traceback (most recent call last):
File "train.py", line 574, in <module>
train_ae(1, train1_data[niter], total_loss_ae1, start_time, niter)
File "train.py", line 400, in train_ae
output = autoencoder(whichdecoder, source, lengths, noise=True)
File "/localhome/imd/anaconda2/envs/Pytorch/lib/python3.5/site-packages/torch/nn/modules/module.py", line 491, in __call__
result = self.forward(*input, **kwargs)
File "/groups/branson/home/imd/Documents/project/ARAE/yelp/models.py", line 143, in forward
hidden = self.encode(indices, lengths, noise)
File "/groups/branson/home/imd/Documents/project/ARAE/yelp/models.py", line 160, in encode
batch_first=True)
File "/localhome/imd/anaconda2/envs/Pytorch/lib/python3.5/site-packages/torch/onnx/__init__.py", line 56, in wrapper
if not might_trace(args):
File "/localhome/imd/anaconda2/envs/Pytorch/lib/python3.5/site-packages/torch/onnx/__init__.py", line 130, in might_trace
first_arg = args[0]
IndexError: tuple index out of range
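For what it's worth, the traceback ends inside torch/onnx/__init__.py, where might_trace does first_arg = args[0] on the positional-argument tuple. That suggests the PyTorch 0.4.0 behavior where the ONNX wrapper around pack_padded_sequence fails if the function is called with keyword arguments only, so args is an empty tuple. A minimal pure-Python sketch (no torch needed; onnx_style_wrapper below is a hypothetical stand-in for the real wrapper, not PyTorch code) reproduces that failure mode:

```python
def onnx_style_wrapper(fn):
    """Stand-in for the PyTorch 0.4.0 torch.onnx wrapper, which assumes
    at least one positional argument (might_trace reads args[0])."""
    def inner(*args, **kwargs):
        first_arg = args[0]  # IndexError when everything is passed by keyword
        return fn(*args, **kwargs)
    return inner

@onnx_style_wrapper
def pack_padded_sequence(sequence, lengths, batch_first=False):
    # Dummy body; only the calling convention matters for this sketch.
    return (sequence, lengths, batch_first)

# Positional call: the wrapper sees a non-empty args tuple, so it succeeds.
print(pack_padded_sequence([1, 2, 3], [3], batch_first=True))

# Keyword-only call: args == (), so args[0] raises IndexError.
try:
    pack_padded_sequence(sequence=[1, 2, 3], lengths=[3], batch_first=True)
except IndexError as e:
    print("reproduced:", e)  # reproduced: tuple index out of range
```

If that is the cause here, passing the tensors positionally in models.py's encode(), i.e. pack_padded_sequence(embeddings, lengths, batch_first=True), or upgrading PyTorch beyond 0.4.0 should avoid the wrapper's args[0] lookup; both are guesses worth checking against the installed version.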