Paper Reading Notes 2021 (109)

11.02 ⎮ Language Models As or For Knowledge Bases
11.02 ⎮ Understanding Pooling in Graph Neural Networks
11.02 ⎮ GNN-LM: Language Modeling based on Global Contexts via GNN
11.02 ⎮ Template-free Prompt Tuning for Few-shot NER
11.02 ⎮ Generated Knowledge Prompting for Commonsense Reasoning
11.02 ⎮ A Simple, Strong and Robust Baseline for Distantly Supervised Relation Extraction
11.02 ⎮ RTJTN: Relational Triplet Joint Tagging Network for Joint Entity and Relation Extraction
11.02 ⎮ Asymmetric Graph Representation Learning
11.02 ⎮ Generating Disentangled Arguments with Prompts: A Simple Event Extraction Framework that Works
11.02 ⎮ Stop just recalling memorized relations: Extracting Unseen Relational Triples from the context
11.02 ⎮ Enhanced Few-Shot Learning with Multiple-Pattern-Exploiting Training
11.02 ⎮ A General Method for Transferring Explicit Knowledge into Language Model Pretraining
10.27 ⎮ HAIN: Hierarchical Aggregation and Inference Network for Document-Level Relation Extraction
10.16 ⎮ Asking Effective and Diverse Questions: A Machine Reading Comprehension based Framework for Joint Entity-Relation Extraction
10.16 ⎮ A Unified Multi-Task Learning Framework for Joint Extraction of Entities and Relations
10.15 ⎮ Separating Retention from Extraction in the Evaluation of End-to-end Relation Extraction
10.06 ⎮ Position Enhanced Mention Graph Attention Network for Dialogue Relation Extraction
10.06 ⎮ Knowledge Base Completion Meets Transfer Learning
10.06 ⎮ HacRED: A Large-Scale Relation Extraction Dataset Toward Hard Cases in Practical Applications
10.06 ⎮ Modular Self-Supervision for Document-Level Relation Extraction
09.16 ⎮ A Novel Global Feature-Oriented Relational Triple Extraction Model based on Table Filling
09.14 ⎮ Factual Probing Is [MASK]: Learning vs. Learning to Recall
09.14 ⎮ Language Models as Knowledge Bases?
09.11 ⎮ Do Prompt-Based Models Really Understand the Meaning of their Prompts?
09.03 ⎮ TREND: Trigger-Enhanced Relation-Extraction Network for Dialogues
09.03 ⎮ A Conditional Cascade Model for Relational Triple Extraction
08.31 ⎮ Rethinking Why Intermediate-Task Fine-Tuning Works
08.31 ⎮ A Partition Filter Network for Joint Entity and Relation Extraction
08.27 ⎮ Higher-order Coreference Resolution with Coarse-to-fine Inference
08.27 ⎮ Coreference Resolution without Span Representations
08.27 ⎮ CorefQA: Coreference Resolution as Query-based Span Prediction
08.26 ⎮ End-to-end Neural Coreference Resolution
08.26 ⎮ Coreference Reasoning in Machine Reading Comprehension
08.25 ⎮ Contrastive Triple Extraction with Generative Transformer
08.25 ⎮ StereoRel: Relational Triple Extraction from a Stereoscopic Perspective
08.25 ⎮ Zero-shot Event Extraction via Transfer Learning: Challenges and Insights
08.25 ⎮ Systematic Analysis of Joint Entity and Relation Extraction Models in Identifying Overlapping Relations
08.24 ⎮ Consistent Inference for Dialogue Relation Extraction
08.24 ⎮ Knowledgeable Prompt-tuning: Incorporating Knowledge into Prompt Verbalizer for Text Classification
08.16 ⎮ Deeper Task-Specificity Improves Joint Entity and Relation Extraction
08.05 ⎮ Calibrate Before Use: Improving Few-Shot Performance of Language Models
08.04 ⎮ Frustratingly simple few-shot slot tagging
08.03 ⎮ Template-Based Named Entity Recognition Using BART
08.02 ⎮ Are Missing Links Predictable? An Inferential Benchmark for Knowledge Graph Completion
08.02 ⎮ Just Train Twice: Improving Group Robustness without Training Group Information
08.01 ⎮ NA-Aware Machine Reading Comprehension for Document-Level Relation Extraction
07.15 ⎮ UNIRE: A Unified Label Space for Entity Relation Extraction
07.15 ⎮ SENT: Sentence-level Distant Relation Extraction via Negative Training
07.09 ⎮ Joint Entity and Relation Extraction with Set Prediction Networks
07.09 ⎮ Bride Price Payment Practices Where Tradition and Modernity Interlock (传统与现代铆合下的彩礼给付实践)
07.04 ⎮ Cutting Down on Prompts and Parameters: Simple Few-Shot Learning with Language Models
07.03 ⎮ HiddenCut: Simple Data Augmentation for Natural Language Understanding with Better Generalization
07.03 ⎮ Relation Extraction in Dialogues: A Deep Learning Model Based on the Generality and Specialty of Dialogue Text
07.03 ⎮ Entailment as Few-Shot Learner
06.30 ⎮ Do Models Learn the Directionality of Relations? A New Evaluation Task: Relation Direction Recognition
06.30 ⎮ How Attentive are Graph Attention Networks?
06.30 ⎮ Dynamic Knowledge Graph Context Selection for Relation Extraction
06.29 ⎮ SPANNER: Named Entity Re-/Recognition as Span Prediction
06.27 ⎮ PRGC: Potential Relation and Global Correspondence Based Joint Relational Triple Extraction
06.26 ⎮ Injecting Knowledge Base Information into End-to-End Joint Entity and Relation Extraction and Coreference Resolution
06.20 ⎮ Open Hierarchical Relation Extraction
06.19 ⎮ PTR: Prompt Tuning with Rules for Text Classification
06.19 ⎮ EIDER: Evidence-enhanced Document-level Relation Extraction
06.18 ⎮ Progressive Generation of Long Text with Pretrained Language Models
06.12 ⎮ Incorporating Syntax and Semantics in Coreference Resolution with Heterogeneous Graph Attention Network
06.09 ⎮ Breadth First Reasoning Graph for Multi-hop Question Answering
06.07 ⎮ Entity Concept-enhanced Few-shot Relation Extraction
06.05 ⎮ Learning from Context or Names? An Empirical Study on Neural Relation Extraction
06.05 ⎮ Revisiting the Negative Data of Distantly Supervised Relation Extraction
06.04 ⎮ Discriminative Reasoning for Document-level Relation Extraction
06.04 ⎮ SIRE: Separate Intra- and Inter-sentential Reasoning for Document-level Relation Extraction
05.27 ⎮ The Power of Scale for Parameter-Efficient Prompt Tuning
05.25 ⎮ Multi-Entity Collaborative Relation Extraction
05.25 ⎮ Improving Document-level Relation Extraction via Contextualizing Mention Representations and Weighting Mention Pairs
05.24 ⎮ SaGCN: Structure-Aware Graph Convolution Network for Document-Level Relation Extraction
05.24 ⎮ Densely Connected Graph Attention Network Based on Iterative Path Reasoning for Document-Level Relation Extraction
05.21 ⎮ Improving Event Detection by Exploiting Label Hierarchy
05.21 ⎮ Fantastically Ordered Prompts and Where to Find Them: Overcoming Few-Shot Prompt Order Sensitivity
05.21 ⎮ Locate and Label: A Two-stage Identifier for Nested Named Entity Recognition
05.18 ⎮ FEW-NERD: A Few-shot Named Entity Recognition Dataset
05.18 ⎮ Three Sentences Are All You Need: Local Path Enhanced Document Relation Extraction
05.05 ⎮ ENPAR: Enhancing Entity and Entity Pair Representations for Joint Entity Relation Extraction
05.05 ⎮ Is the Understanding of Explicit Discourse Relations Required in Machine Reading Comprehension?
04.29 ⎮ Variational Reasoning for Question Answering with Knowledge Graph
04.21 ⎮ QA-GNN: Reasoning with Language Models and Knowledge Graphs for Question Answering
04.20 ⎮ Revisiting Few-shot Relation Classification: Evaluation Data and Classification Schemes
04.19 ⎮ On the Inductive Bias of Masked Language Modeling: From Statistical to Syntactic Dependencies
04.16 ⎮ Is Multi-Hop Reasoning Really Explainable? Towards Benchmarking Reasoning Interpretability
04.13 ⎮ Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning
04.12 ⎮ Larger-Context Tagging: When and Why Does It Work?
04.09 ⎮ Multi-hop Reading Comprehension across Multiple Documents by Reasoning over Heterogeneous Graphs
04.08 ⎮ A Question-answering Based Framework for Relation Extraction Validation
04.08 ⎮ Unsupervised relation extraction using sentence encoding
04.07 ⎮ Cognitive Graph for Multi-Hop Reading Comprehension at Scale
04.07 ⎮ Discrete Reasoning Templates for Natural Language Understanding
04.07 ⎮ What Will it Take to Fix Benchmarking in Natural Language Understanding?
04.06 ⎮ WhiteningBERT: An Easy Unsupervised Sentence Embedding Approach
04.03 ⎮ Integrating Subgraph-aware Relation and Direction Reasoning for Question Answering
04.02 ⎮ Fact Distribution in Information Extraction
03.31 ⎮ Topology-Aware Correlations Between Relations for Inductive Link Prediction in Knowledge Graphs
03.31 ⎮ On Position Embeddings in BERT
03.30 ⎮ Proxy Synthesis: Learning with Synthetic Classes for Deep Metric Learning
03.30 ⎮ You Can Do Better! If You Elaborate the Reason When Making Prediction
03.29 ⎮ DAGN: Discourse-Aware Graph Network for Logical Reasoning
03.29 ⎮ Heterogeneous Graph Neural Networks for Multi-label Text Classification
03.22 ⎮ N-ary Relation Extraction using Graph State LSTM
03.21 ⎮ Composition-based Multi-Relational Graph Convolutional Networks
01.25 ⎮ Simplifying Graph Convolutional Networks
01.25 ⎮ Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View