KnowlyBERT
Hybrid Query Answering over Language Models and Knowledge Graphs
- Authors
- Jan-Christoph Kalo, Leandra Fichtel, Philipp Ehler, Wolf-Tilo Balke
- Abstract
Providing a plethora of entity-centric information, knowledge graphs have become a vital building block for a variety of intelligent applications. Indeed, modern knowledge graphs like Wikidata already capture several billion RDF triples, yet they still lack good coverage for most relations. On the other hand, recent developments in NLP research show that neural language models can easily be queried for relational knowledge without requiring massive amounts of training data. In this work, we leverage this idea by creating a hybrid query answering system that combines knowledge graphs with the masked language model BERT to complete query results. We thus integrate valuable structural and semantic information from knowledge graphs with textual knowledge from language models to achieve high-precision query results. Standard techniques for dealing with incomplete knowledge graphs are either (1) relation extraction, which requires massive amounts of training data, or (2) knowledge graph embeddings, which struggle to succeed beyond simple baseline datasets. Our hybrid system KnowlyBERT requires only small amounts of training data, while outperforming state-of-the-art techniques by boosting their precision by over 30% in our large Wikidata experiment.
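The abstract's key NLP ingredient, cloze-style querying of a masked language model for relational knowledge, can be sketched in a few lines of Python. This is a minimal illustration using the Hugging Face transformers library, not the authors' implementation; the model choice (bert-base-cased) and the relation template are assumptions for the example.

```python
from transformers import pipeline

# Minimal sketch (not the authors' code): probe a masked language model
# for relational knowledge via a cloze-style template, as described in
# the abstract. Model and template are illustrative assumptions.
fill_mask = pipeline("fill-mask", model="bert-base-cased")

# Template for the Wikidata relation "capital" (P36); the model is asked
# to predict the missing object of the triple (France, capital, ?).
for candidate in fill_mask("The capital of France is [MASK].")[:3]:
    print(candidate["token_str"], round(candidate["score"], 4))
```

In the hybrid setting the paper describes, such language-model candidates would be combined with partial results retrieved from the knowledge graph; the snippet shows only the language-model side.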
- External organisation(s)
- Technische Universität Braunschweig
- Type
- Paper in conference proceedings
- Pages
- 294-310
- Number of pages
- 17
- Publication date
- 2020
- Publication status
- Published
- Peer-reviewed
- Yes
- ASJC Scopus subject areas
- Theoretical Computer Science, Computer Science (all)
- Electronic version(s)
- https://doi.org/10.1007/978-3-030-62419-4_17 (Access: Closed)