TY - JOUR
T1 - Attention Retrieval Model for Entity Relation Extraction From Biological Literature
AU - Srivastava, Prashant
AU - Bej, Saptarshi
AU - Schultz, Kristian
AU - Yordanova, Kristina
AU - Wolkenhauer, Olaf
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2022
Y1 - 2022
N2 - Natural Language Processing (NLP) has contributed to extracting relationships among biological entities, such as genes, their mutations, proteins, diseases, processes, phenotypes, and drugs, for a comprehensive and concise understanding of information in the literature. Self-attention-based models for Relationship Extraction (RE) have played an increasingly important role in NLP. However, self-attention models for RE are framed as a classification problem, which limits their practical usability in several ways. We present an alternative framework called the Attention Retrieval Model (ARM), which enhances the applicability of attention-based models for RE compared to the regular classification approach. Given a text sequence containing related entities/keywords, ARM learns the association between a chosen entity/keyword and the other entities present in the sequence, using an underlying self-attention mechanism. ARM provides a flexible framework for a modeller to customise their model, facilitate data integration, and incorporate expert knowledge, offering a more practical approach for RE. ARM can extract unseen relationships that are not annotated in the training data, analogous to zero-shot learning. To sum up, ARM provides an alternative self-attention-based deep learning framework for RE that can capture directed entity relationships.
AB - Natural Language Processing (NLP) has contributed to extracting relationships among biological entities, such as genes, their mutations, proteins, diseases, processes, phenotypes, and drugs, for a comprehensive and concise understanding of information in the literature. Self-attention-based models for Relationship Extraction (RE) have played an increasingly important role in NLP. However, self-attention models for RE are framed as a classification problem, which limits their practical usability in several ways. We present an alternative framework called the Attention Retrieval Model (ARM), which enhances the applicability of attention-based models for RE compared to the regular classification approach. Given a text sequence containing related entities/keywords, ARM learns the association between a chosen entity/keyword and the other entities present in the sequence, using an underlying self-attention mechanism. ARM provides a flexible framework for a modeller to customise their model, facilitate data integration, and incorporate expert knowledge, offering a more practical approach for RE. ARM can extract unseen relationships that are not annotated in the training data, analogous to zero-shot learning. To sum up, ARM provides an alternative self-attention-based deep learning framework for RE that can capture directed entity relationships.
KW - Attention models
KW - biological literature mining
KW - deep learning
KW - knowledge graphs
UR - https://www.scopus.com/pages/publications/85125725156
U2 - 10.1109/ACCESS.2022.3154820
DO - 10.1109/ACCESS.2022.3154820
M3 - Article
AN - SCOPUS:85125725156
SN - 2169-3536
VL - 10
SP - 22429
EP - 22440
JO - IEEE Access
JF - IEEE Access
ER -