Hello! I was very fortunate to come across your blog posts on CSDN. After reading through your relation-extraction series, I have great admiration for your work (especially since both PyTorch and TF implementations are provided).

I do have a small question about GlobalPointer. I believe your GlobalPointer implementation also referenced Su Jianlin's (苏神's) source code. I noticed that his GlobalPointer implementation uses `sequence_masking`:

```python
logits = sequence_masking(logits, mask, '-inf', 2)
logits = sequence_masking(logits, mask, '-inf', 3)
```

Yet when GlobalPointer is called, no mask matrix is passed in:

```python
model = build_transformer_model(config_path, checkpoint_path)
output = GlobalPointer(len(categories), 64)(model.output)
```

Why is that?
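For context, here is a minimal NumPy sketch (my own hypothetical re-implementation, not the bert4keras code) of what `sequence_masking` does with those two calls: it fills padded positions of the `(batch, heads, start, end)` score tensor with a large negative value along the start axis (2) and the end axis (3), so spans touching padding can never score highest.

```python
import numpy as np

def sequence_masking(logits, mask, value, axis):
    """Toy version: broadcast a (batch, seq_len) 0/1 mask along `axis`
    of `logits` and overwrite masked positions with `value`."""
    if value == '-inf':
        value = -1e12  # large negative stand-in for -inf
    shape = [1] * logits.ndim
    shape[0] = mask.shape[0]
    shape[axis] = mask.shape[1]
    m = mask.reshape(shape).astype(logits.dtype)
    return logits * m + value * (1 - m)

# logits: (batch, heads, seq_len, seq_len); mask: 1 = real token, 0 = padding
logits = np.zeros((2, 1, 4, 4))
mask = np.array([[1, 1, 1, 0],
                 [1, 1, 0, 0]])
out = sequence_masking(logits, mask, '-inf', 2)  # mask span-start positions
out = sequence_masking(out, mask, '-inf', 3)     # mask span-end positions
```

After both calls, any entry whose start or end index falls on padding holds a huge negative score, while entries over real tokens are untouched.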
My understanding is that with `model = build_transformer_model(config_path, checkpoint_path)`, `model.output` already carries both the output embedding and the mask.
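To make that point concrete: in Keras, a layer such as `Embedding(..., mask_zero=True)` attaches a mask to its output tensor, and downstream layers can read it without it being passed explicitly. The following is a toy simulation of that convention (plain Python, not real Keras; all class names here are illustrative):

```python
class Tensor:
    """Stand-in for a Keras tensor: carries data plus an attached mask."""
    def __init__(self, data, mask=None):
        self.data = data
        self._keras_mask = mask  # Keras attaches the mask to the tensor itself

class MaskedEmbedding:
    """Like Embedding(mask_zero=True): derives a mask from token ids."""
    def __call__(self, x):
        mask = [tok != 0 for tok in x.data]  # 0 is the padding id
        return Tensor([float(tok) for tok in x.data], mask=mask)

class GlobalPointerLike:
    """A downstream layer reads the propagated mask instead of
    requiring it as an explicit call argument."""
    def __call__(self, x):
        mask = x._keras_mask  # retrieved from the incoming tensor
        return [v if m else float('-inf') for v, m in zip(x.data, mask)]

scores = GlobalPointerLike()(MaskedEmbedding()(Tensor([3, 5, 0])))
```

So in the real code, `GlobalPointer(...)(model.output)` can recover the mask from the transformer's output tensor, which is why no explicit mask argument is needed.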