Evaluation #2

Description

@leepyone

Hi, you are doing a great job. I would like to reproduce your work; where can I find the relevant code for evaluating the generated text?
I ran
python test.py --length=50 --num_iterations=1 --temperature=1 --sample --gamma=1 --gm_scale=0.875 --kl_scale=0.01 --num_reviews=70
(I changed num_reviews from 5 to 70) and hit the following error:

Written 70 records in the csv containing conditional sentences.
Traceback (most recent call last):
  File "test.py", line 612, in <module>
    run_pplm_example(**vars(args))
  File "test.py", line 305, in run_pplm_example
    kl_scale=kl_scale
  File "test.py", line 413, in full_text_generation
    kl_scale=kl_scale
  File "test.py", line 500, in generate_text_pplm
    device=device
  File "test.py", line 152, in perturb_past
    inputs_embeds=inputs_embeds
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 593, in forward
    inputs_embeds=inputs_embeds,
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 476, in forward
    hidden_states, layer_past=layer_past, attention_mask=attention_mask, head_mask=head_mask[i]
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 226, in forward
    self.ln_1(x), layer_past=layer_past, attention_mask=attention_mask, head_mask=head_mask
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 189, in forward
    attn_outputs = self._attn(query, key, value, attention_mask, head_mask)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 146, in _attn
    w = w * b - 1e4 * (1 - b)
RuntimeError: The size of tensor a (1025) must match the size of tensor b (1024) at non-singleton dimension 3

Do you know what causes this error?
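
For context, the size mismatch in the traceback (1025 vs. 1024 at the last dimension) looks like the combined context fed to GPT-2 exceeding its 1024-token position limit (n_positions). Below is a minimal sketch of how one might check whether any conditional sentence plus the requested --length=50 overflows that window; it assumes the standard Hugging Face GPT2Tokenizer, and the CSV filename and column index are hypothetical placeholders, not names taken from test.py.

# Sketch: flag prompts that, together with the generated tokens, would
# exceed GPT-2's 1024-token position limit.
import csv
from transformers import GPT2Tokenizer

MAX_POSITIONS = 1024   # GPT-2 n_positions
GEN_LENGTH = 50        # value passed via --length

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# "conditional_sentences.csv" and column 0 are assumptions for illustration.
with open("conditional_sentences.csv", newline="") as f:
    for i, row in enumerate(csv.reader(f)):
        prompt = row[0]
        n_tokens = len(tokenizer.encode(prompt))
        if n_tokens + GEN_LENGTH > MAX_POSITIONS:
            print(f"row {i}: {n_tokens} prompt tokens + {GEN_LENGTH} generated "
                  f"tokens would exceed {MAX_POSITIONS}")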
