
A small question about GlobalPointer, hoping for an answer #14

Open
liyunhan opened this issue Jul 20, 2022 · 1 comment
@liyunhan

Hello! I was fortunate to come across your blog posts on CSDN. After reading through your relation-extraction series, I have great admiration for your work (especially for providing both the PyTorch and TF implementations).

However, I have a small question about GlobalPointer. I believe your implementation also references Su Jianlin's (苏神's) source code. I noticed that his GlobalPointer implementation applies a sequence mask:

logits = sequence_masking(logits, mask, '-inf', 2)
logits = sequence_masking(logits, mask, '-inf', 3)

Yet when GlobalPointer is called, no mask matrix is passed in:

model = build_transformer_model(config_path, checkpoint_path)
output = GlobalPointer(len(categories), 64)(model.output)

Why is that?
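For context, the two sequence_masking calls quoted above fill padded positions of the span-score tensor with -inf along the start axis (2) and end axis (3), so padding tokens can never be predicted as entity boundaries. A minimal NumPy re-creation of that behavior (this is a sketch for illustration, not bert4keras's actual implementation; the function body here is my own approximation):

```python
import numpy as np

def sequence_masking(logits, mask, value, axis):
    """Set positions where mask == 0 along `axis` to `value` (sketch)."""
    if value == '-inf':
        value = -np.inf
    # Reshape the (batch, seq_len) mask so it broadcasts along the target axis.
    shape = [1] * logits.ndim
    shape[0] = mask.shape[0]
    shape[axis] = mask.shape[1]
    mask = mask.reshape(shape)
    return np.where(mask.astype(bool), logits, value)

# logits: (batch, heads, seq_len, seq_len) span scores, as in GlobalPointer
logits = np.zeros((1, 2, 4, 4))
mask = np.array([[1, 1, 1, 0]])                     # last token is padding
logits = sequence_masking(logits, mask, '-inf', 2)  # mask span-start positions
logits = sequence_masking(logits, mask, '-inf', 3)  # mask span-end positions
```

After both calls, any span that starts or ends on the padding token scores -inf, while spans over real tokens are untouched.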

@zhengyanzhao1997
Owner

My understanding is that with model = build_transformer_model(config_path, checkpoint_path), model.output already carries both the output embedding and the mask.
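In other words, Keras tensors carry an implicit mask alongside the data: the embedding layer attaches a padding mask to its output, each layer's call receives it via the mask argument and propagates it downstream, so GlobalPointer can read the mask off model.output without it being passed explicitly. A toy pure-Python sketch of that plumbing (the class and attribute names here only mimic Keras conventions; this is not the real framework code):

```python
class Tensor:
    """Toy tensor that carries an implicit mask, like a Keras tensor."""
    def __init__(self, data, mask=None):
        self.data = data
        self._keras_mask = mask

class Layer:
    """Stand-in for the implicit-mask plumbing in keras.layers.Layer."""
    def __call__(self, inputs):
        mask = inputs._keras_mask          # mask travels with the tensor
        out = self.call(inputs, mask=mask) # layer sees it without an explicit arg
        out._keras_mask = mask             # and passes it downstream
        return out

    def call(self, inputs, mask=None):
        raise NotImplementedError

class ToyGlobalPointer(Layer):
    """Uses the implicitly received mask, as bert4keras's GlobalPointer does."""
    def call(self, inputs, mask=None):
        scores = [[x if m else float('-inf') for x, m in zip(row, mask)]
                  for row in inputs.data]
        return Tensor(scores)

# Analogue of model.output: hidden states with a padding mask attached.
hidden = Tensor([[0.5, 0.2, 0.1], [0.3, 0.4, 0.7]], mask=[True, True, False])
out = ToyGlobalPointer()(hidden)           # no mask passed at the call site
```

Even though the call site never mentions a mask, the padded position's scores come out as -inf, which matches why GlobalPointer(len(categories), 64)(model.output) needs no explicit mask argument.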
