Evaluation #2

Open
leepyone opened this issue Sep 20, 2022 · 1 comment

Comments


leepyone commented Sep 20, 2022

Hi, you are doing a great job. I would like to reproduce your work; where can I find the relevant code for evaluating the generated text?
I ran

python test.py --length=50 --num_iterations=1 --temperature=1 --sample --gamma=1 --gm_scale=0.875 --kl_scale=0.01 --num_reviews=70

(I changed num_reviews from 5 to 70) and got the following error:

Written 70 records in the csv containing conditional sentences.
Traceback (most recent call last):
  File "test.py", line 612, in <module>
    run_pplm_example(**vars(args))
  File "test.py", line 305, in run_pplm_example
    kl_scale=kl_scale
  File "test.py", line 413, in full_text_generation
    kl_scale=kl_scale
  File "test.py", line 500, in generate_text_pplm
    device=device
  File "test.py", line 152, in perturb_past
    inputs_embeds=inputs_embeds
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 593, in forward
    inputs_embeds=inputs_embeds,
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 476, in forward
    hidden_states, layer_past=layer_past, attention_mask=attention_mask, head_mask=head_mask[i]
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 226, in forward
    self.ln_1(x), layer_past=layer_past, attention_mask=attention_mask, head_mask=head_mask
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 189, in forward
    attn_outputs = self._attn(query, key, value, attention_mask, head_mask)
  File "/home/minhao/anaconda3/envs/RexPlug/lib/python3.7/site-packages/transformers/modeling_gpt2.py", line 146, in _attn
    w = w * b - 1e4 * (1 - b)
RuntimeError: The size of tensor a (1025) must match the size of tensor b (1024) at non-singleton dimension 3

Do you know what causes this error?

@zyrmj0212

Hi, I have encountered the same issue. I think it is probably caused by GPT-2's limited context length (1024 tokens). After I limited the candidate review and true review lengths in the create_cond_df function in test_utils.py, the problem was solved.

[screenshot of the code change]

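For reference, the size mismatch (1025 vs. 1024) means the conditioning prompt plus the requested continuation exceeds GPT-2's 1024-position context window. Below is a minimal sketch of the kind of length limiting described above, using the Hugging Face GPT-2 tokenizer; the helper name truncate_review and the exact token budget are illustrative assumptions, not the repository's actual create_cond_df code.

```python
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

MAX_POSITIONS = 1024   # GPT-2's maximum context length
GEN_LENGTH = 50        # matches the --length=50 flag used above
MAX_PROMPT_TOKENS = MAX_POSITIONS - GEN_LENGTH - 1  # leave room for the generated tokens


def truncate_review(text: str, max_tokens: int = MAX_PROMPT_TOKENS) -> str:
    """Clip a review so its tokenized length plus the generated
    continuation stays within GPT-2's context window."""
    token_ids = tokenizer.encode(text)
    if len(token_ids) <= max_tokens:
        return text
    return tokenizer.decode(token_ids[:max_tokens])


# Hypothetical usage: truncate the candidate and true reviews before they
# are written to the CSV of conditional sentences.
candidate = truncate_review("This product exceeded my expectations ... " * 200)
```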