Inf log-probs when using score.py #45
Comments
Just to let you know that this repo isn't maintained anymore; see the warning at https://github.com/lisa-groundhog/GroundHog. I'm not sure someone else will reply. There is another project that you can use instead.
Thank you, I noticed that. But has an attention-based structure been employed in the Blocks translation system?
Yes, the implementation in Blocks uses attention.
I met this issue when running score.py: all the log-probs are inf.
I tested this on the training data, where during training most target sentences can be recovered.
In my opinion, a probability of exactly zero should not occur, since the softmax is computed from an inner product between two vectors.
Any suggestions?
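For reference, a minimal NumPy sketch of one way infinite log-probs can appear even though the softmax is mathematically nonzero: in float32 a softmax probability can underflow to exactly zero, so taking its log yields -inf, whereas computing the log-probabilities directly in log space (log-softmax via log-sum-exp) keeps them finite. The logit values below are made up for illustration, not taken from score.py.

```python
import numpy as np

# Hypothetical logits with a large spread, chosen only to trigger float32
# underflow; they are not taken from the actual model in this repo.
logits = np.array([30.0, 10.0, -90.0], dtype=np.float32)

# Naive route: softmax first, then log. exp(-120) underflows to 0 in
# float32, so the corresponding log-probability becomes -inf
# (NumPy also emits a divide-by-zero warning here).
shifted = logits - logits.max()
probs = np.exp(shifted)
probs /= probs.sum()
print(np.log(probs))        # approx [  0. -20. -inf]

# Stable route: stay in log space the whole way.
def log_softmax(x):
    x = x - x.max()
    return x - np.log(np.exp(x).sum())

print(log_softmax(logits))  # approx [  0. -20. -120.], all finite
```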