sg.sample_sequence returns context after pre-trained model #12
Comments
same problem
also getting weird output like this.
First of all, thank you for sharing your code! It helped me a lot getting started with gpt2. The output will only append zeros: if my sequence length is 512, I get 512 zeros (plus 3 non-zero numbers from my context).
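A hypothetical sketch (not the repository's actual code, and `greedy_generate`/`logits_fn` are made-up names) of how the all-zeros output above can arise: if an untrained or collapsed model always assigns the highest logit to token id 0 (commonly the pad token), a greedy decoding loop will preserve the context and then append nothing but zeros up to the sequence length.

```python
# Assumption: a degenerate model whose logits always favour token id 0.
# This reproduces the symptom: context survives, the rest is zeros.
import numpy as np

def greedy_generate(context, logits_fn, seq_len):
    """Append argmax tokens until the sequence reaches seq_len."""
    tokens = list(context)
    while len(tokens) < seq_len:
        tokens.append(int(np.argmax(logits_fn(tokens))))
    return tokens

# Toy 3-token vocabulary; token 0 always wins the argmax.
collapsed = lambda toks: np.array([10.0, 0.1, 0.1])

print(greedy_generate([5, 7, 9], collapsed, 10))
# → [5, 7, 9, 0, 0, 0, 0, 0, 0, 0]
```

With a real checkpoint the same loop would produce varied tokens; seeing only zeros therefore points at the model weights or the sampling step rather than the decoding loop itself.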
First of all, thanks for providing this amazing repository and making this possible with TF2!
Secondly, I was following the Readme to pre-train my model and eventually used sequence_generator.py to pass some context to the model.
However, the response is always 1:1 the same as the context, except that the capital letters are replaced with ??s. What am I doing wrong? Have I maybe forgotten something? Is there perhaps an edge case leading to this that could be prevented?
Please let me know if you need any additional information! Thanks a lot!
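One possible explanation for the ?? symptom, sketched here with a made-up character-level tokenizer (`round_trip` and the toy `vocab` are illustrative, not the repository's code): if the vocabulary was built from lowercased pre-training text, every capital letter falls outside it and round-trips through the unknown token, while the rest of the context survives unchanged.

```python
# Assumption: a toy lowercase-only vocabulary; anything outside it decodes
# as the unknown marker "?", so capitals come back as "?" while the rest
# of the text is returned 1:1 -- matching the behaviour described above.
vocab = set("abcdefghijklmnopqrstuvwxyz .,!")

def round_trip(text, unk="?"):
    """Encode then decode: in-vocab chars survive, everything else becomes unk."""
    return "".join(ch if ch in vocab else unk for ch in text)

print(round_trip("Hello World"))  # → "?ello ?orld"
```

If this is the cause, checking whether the training corpus and the generation context go through the same casing/encoding step would be the first thing to verify.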