Data Analysis and Knowledge Discovery  2020, Vol. 4 Issue (11): 15-25    DOI: 10.11925/infotech.2096-3467.2020.0299
Automatically Identifying Hypernym-Hyponym Relations of Domain Concepts with Patterns and Projection Learning
Wang Sili1,2(),Zhu Zhongming1,2,Yang Heng1,Liu Wei1
1Literature and Information Center of Northwest Institute of Eco-Environment and Resources, Chinese Academy of Sciences, Lanzhou 730000, China
2University of Chinese Academy of Sciences, Beijing 100049, China

[Objective] This paper aims to automatically identify hypernym-hyponym relations among domain concepts and to build a domain ontology from them. [Methods] First, we combined the traditional unsupervised pattern-based method with a supervised projection-learning method to automatically extract domain concepts. Then, we evaluated the new method with an empirical study. [Results] The proposed method identified the hypernym sets of domain concepts with an accuracy of 0.88 in the medical field, 0.83 in the general field, and 0.85 on the benchmark dataset BLESS. [Limitations] More research is needed to reduce the weight of high-frequency top-level words and to improve corpus quality; some relations are still misidentified. [Conclusions] The proposed model can find hypernyms for different senses of the same concept, and it also handles low-frequency words and named entities.

Key words: Hearst Pattern; Projection Learning; Word Embedding; Hypernym-Hyponym Relations; Domain Concept
Received: 09 April 2020      Published: 04 December 2020
ZTFLH:  TP391  
Corresponding Author: Wang Sili

Cite this article:

Wang Sili,Zhu Zhongming,Yang Heng,Liu Wei. Automatically Identifying Hypernym-Hyponym Relations of Domain Concepts with Patterns and Projection Learning. Data Analysis and Knowledge Discovery, 2020, 4(11): 15-25.


Framework of Automatic Recognition of Hypernym-Hyponym Relationship Based on Pattern and Projection Learning
English pattern | Chinese pattern
Y such as X | Y例如/比如X
Y other than X | 除了Y之外的X / Y不仅是X
Y including X | Y包含X
Y especially X | Y尤其/特别是X
not all Y are X | 不全是/并不是所有的Y都是X
Y like X | Y类似X
Y for example X | Y例如/比如/示例X
Y which includes X | Y是那些包含X
X are also Y | X也是Y
X are all Y | X都是Y
not Y so much as X | 没有Y而是X
Y is a X | Y是一种/个/只…X
Hypernym Recognition Patterns Based on Extended Hearst Patterns
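The extended Hearst patterns above can be operationalized as regular expressions over raw text. The sketch below is a hypothetical Python fragment: the two compiled patterns and the naive noun-phrase expression are illustrative assumptions, not the paper's exact extraction rules.

```python
import re

# Two of the extended Hearst patterns from the table above, compiled as
# regular expressions; "\w[\w ]*" is a naive stand-in for real noun-phrase
# chunking.
PATTERNS = [
    re.compile(r"(?P<hyper>\w[\w ]*?)\s+such as\s+(?P<hypo>\w[\w ]*)"),  # "Y such as X"
    re.compile(r"(?P<hypo>\w[\w ]*?)\s+is a\s+(?P<hyper>\w[\w ]*)"),     # "X is a Y"
]

def extract_pairs(sentence):
    """Return (hyponym, hypernym) candidate pairs matched by any pattern."""
    pairs = []
    for pattern in PATTERNS:
        for m in pattern.finditer(sentence.lower()):
            pairs.append((m.group("hypo").strip(), m.group("hyper").strip()))
    return pairs

print(extract_pairs("diseases such as aneurysm"))  # → [('aneurysm', 'diseases')]
```

In practice such pattern matches are noisy, which is why the paper combines them with projection learning rather than using them alone.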
Method | Settings | Result (general field)
① Pattern | Extended Hearst patterns: distributional hypothesis + co-hyponym recognition patterns | 0.38
② Projection learning | Word2Vec, 100 dimensions; 10 training iterations; single projection; no negative sampling; no high-frequency-word subsampling | 0.54
③ Projection learning | Word2Vec, 200 dimensions; 20 training iterations; 24 projections; negative sampling 15; high-frequency-word subsampling threshold 1e-5 | 0.66
④ Pattern + projection learning | Extended Hearst patterns + Word2Vec, 200 dimensions; 20 training iterations; 24 projections; negative sampling 15; high-frequency-word subsampling threshold 1e-5 | 0.83
Tests on Recognition of Hypernym-Hyponym Relationship
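Settings ② to ④ revolve around projection learning: a matrix is trained so that a hyponym's embedding, once projected, lands near its hypernym's embedding. The fragment below is a minimal single-projection sketch using synthetic vectors and a closed-form least-squares fit; the paper's stronger settings instead train multiple projections (24) with negative sampling (15) by iterative optimization.

```python
import numpy as np

# Single-projection sketch: fit one matrix W so that a hyponym embedding x,
# projected as x @ W, lands near its hypernym embedding y.  The synthetic
# embeddings below are assumptions for illustration only.
rng = np.random.default_rng(0)
dim, n_pairs = 20, 200
X = rng.normal(size=(n_pairs, dim))   # hyponym vectors
T = rng.normal(size=(dim, dim))       # hidden "true" projection
Y = X @ T                             # hypernym vectors generated from it

W, *_ = np.linalg.lstsq(X, Y, rcond=None)  # least-squares fit of X @ W ≈ Y
err = float(np.mean((X @ W - Y) ** 2))     # near zero: W recovers T
```

With real embeddings a single linear map is too coarse (one hyponym can have hypernyms in several semantic regions), which motivates the multi-projection and negative-sampling settings in rows ③ and ④.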
Medical-domain concept | Hypernym set (Top 5)
Aneurysm | procedure; clinical finding; soft tissue lesion; anatomical structure; disease
Diagnostic lumbar puncture | clinical finding; disease; procedure; sickness; illness
Vertebra | body region; bone; body structure; fracture; anatomical structure
Thymosin | protein; biopolymer; enzyme; hydrolase; lyase
Pain assessment | pain; sickness; disease; illness; practice of medicine
Recognition Results of Hypernym-Hyponym Relationship in Medical Field
General-domain concept | Hypernym set (Top 5)
Miscreant | person; bad person; wrongdoer; actor; politician
Queen Elizabeth | person; king; monarch; aristocrat; patrician
Microcontroller | electronic circuit; circuitry; pc board; computer chip; electrical device
Business concern | corporation; business organization; government agency; business firm; written agreement
Vegetarian | dessert; dish; recipe; food product; person
Recognition Results of Hypernym-Hyponym Relationship in General Field
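Top-k hypernym sets like those in the two tables above can be produced by ranking candidate hypernyms by cosine similarity to the projected concept vector. The toy sketch below is illustrative only: the identity projection, the vocabulary, and the vectors are invented for the demo.

```python
import numpy as np

def top_k_hypernyms(concept_vec, W, candidates, k=5):
    """Rank candidate hypernyms by cosine similarity to W @ concept_vec."""
    proj = W @ concept_vec
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    ranked = sorted(candidates, key=lambda w: cos(proj, candidates[w]), reverse=True)
    return ranked[:k]

# Toy demo: identity projection and three hand-made candidate vectors.
W = np.eye(4)
concept = np.array([1.0, 0.0, 0.0, 0.0])
candidates = {
    "disease":   np.array([0.9, 0.1, 0.0, 0.0]),
    "procedure": np.array([0.5, 0.5, 0.0, 0.0]),
    "person":    np.array([0.0, 1.0, 0.0, 0.0]),
}
print(top_k_hypernyms(concept, W, candidates, k=2))  # → ['disease', 'procedure']
```

Ranking by similarity, rather than taking a single nearest neighbor, is what lets the model return hypernyms covering different senses of a polysemous concept such as Vegetarian (both the person and the food senses appear in its Top-5 set).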