Paper Title

Contextual Semantic Embeddings for Ontology Subsumption Prediction

Paper Authors

Jiaoyan Chen, Yuan He, Yuxia Geng, Ernesto Jimenez-Ruiz, Hang Dong, Ian Horrocks

Paper Abstract

Automating ontology construction and curation is an important but challenging task in knowledge engineering and artificial intelligence. Prediction by machine learning techniques such as contextual semantic embedding is a promising direction, but the relevant research is still preliminary, especially for expressive ontologies in the Web Ontology Language (OWL). In this paper, we present a new subsumption prediction method named BERTSubs for classes of OWL ontologies. It exploits the pre-trained language model BERT to compute contextual embeddings of a class, where customized templates are proposed to incorporate the class context (e.g., neighbouring classes) and logical existential restrictions. BERTSubs is able to predict multiple kinds of subsumers, including named classes from the same ontology or another ontology, and existential restrictions from the same ontology. Extensive evaluation on five real-world ontologies for three different subsumption tasks has shown the effectiveness of the templates, and that BERTSubs can dramatically outperform baselines that use (literal-aware) knowledge graph embeddings, non-contextual word embeddings, and state-of-the-art OWL ontology embeddings.
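
To make the described pipeline concrete, below is a minimal sketch of template-based subsumption scoring with BERT, using Hugging Face Transformers. The verbalization function, template wording, and class labels are illustrative assumptions rather than the paper's exact templates, and the classifier shown is untrained; BERTSubs fine-tunes such a model on subsumption pairs extracted from the ontology.

```python
# Minimal sketch of template-based subsumption scoring in the spirit of
# BERTSubs. Templates and class labels here are illustrative assumptions.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# Binary classification head: label 1 = "is subsumed by", label 0 = "is not".
# In practice this model is fine-tuned on subsumption pairs from the
# ontology; here it is loaded untrained for illustration only.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
model.eval()

def verbalize(label, context):
    """Turn a class label plus neighbouring-class labels into one sentence
    (a stand-in for the paper's class-context templates)."""
    if context:
        return f"{label}, which is related to {', '.join(context)}"
    return label

def subsumption_score(sub, sup, sub_ctx=(), sup_ctx=()):
    """Score how likely class `sub` is a subclass of class `sup`."""
    text_a = verbalize(sub, list(sub_ctx))
    text_b = verbalize(sup, list(sup_ctx))
    # BERT consumes the two verbalized classes as a sentence pair.
    inputs = tokenizer(text_a, text_b, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability of the "subsumed" label.
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Hypothetical class labels from a food ontology.
print(subsumption_score("granny smith apple", "fruit",
                        sub_ctx=["apple", "green fruit"]))
```

In a setup like this, the fine-tuned model scores every candidate subsumer for a given class, and candidates are ranked by the predicted probability.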
