Paper Notes – Inductive Relation Prediction by BERT

Hanwen Zha, Zhiyu Chen, and Xifeng Yan, “Inductive Relation Prediction by BERT”, arXiv 2103.07102v1

1 Overview

  • Model: BERTRL
  • Affiliation: UCSB
  • Task: inductive logical reasoning for KG completion
  • Approach: BERT-based
  • Motivation: GraIL uses only the relation information on subgraph edges; this paper uses BERT instead, which brings in textual information and the prior knowledge acquired during pre-training
  • Method: concatenate [reasoning path with prompt; entity pair] as the BERT input and classify with the [CLS] token. Take Figure 1 as an example: the input could be "[CLS] Question: Franklin Roosevelt work at what ? Is the correct answer Washington D.C. ? [SEP] Context: Franklin Roosevelt president of USA; Washington D.C. capital of USA;" Each individual path forms a separate training/inference instance.
  • Results: substantially outperforms the SOTA on FB15k-237 & NELL-995, and is comparable to the SOTA on WN18RR (the authors argue that FB15k-237 & NELL-995 have more relations and are associated with open-world knowledge, which BERT has learned, compared with WN18RR)
  • Short take: BERT reigns supreme; GCNs, get out of NLP!
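To make the input format in the Method bullet concrete, here is a minimal sketch of how one training/inference instance could be assembled from a query triple and a textualized reasoning path. The helper name and argument names are my own, not from the paper, and in practice the literal [CLS]/[SEP] markers would be inserted by the BERT tokenizer rather than written by hand:

```python
def build_instance(head, relation_text, candidate_tail, path_triples):
    """Assemble one BERTRL-style input string: a question prompt for the
    (head, relation) query with a candidate tail, followed by one
    textualized reasoning path as context. Hypothetical helper; the paper
    only shows the resulting string format."""
    question = (f"Question: {head} {relation_text} what ? "
                f"Is the correct answer {candidate_tail} ?")
    # Each path triple is verbalized as "head relation tail;".
    context = "Context: " + " ".join(f"{h} {r} {t};" for h, r, t in path_triples)
    return f"[CLS] {question} [SEP] {context}"

# Figure 1 example from the paper:
instance = build_instance(
    "Franklin Roosevelt", "work at", "Washington D.C.",
    [("Franklin Roosevelt", "president of", "USA"),
     ("Washington D.C.", "capital of", "USA")],
)
print(instance)
```

Each reasoning path between the entity pair would yield one such string, which is then scored by a BERT binary classifier on the [CLS] representation.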
