New Technology of Library and Information Service  2016, Vol. 32 Issue (1): 17-23    DOI: 10.11925/infotech.1003-3513.2016.01.04
A Study on Correlation-based Cross-Modal Information Retrieval
Heng Ding1, Wei Lu1,2
1School of Information Management, Wuhan University, Wuhan 430072, China
2Center for the Studies of Information Resources, Wuhan University, Wuhan 430072, China

[Objective] This paper summarizes the fundamental strategies and core issues of correlation-based Cross-Modal Information Retrieval (CMIR), and examines the pros and cons of using partial least squares for feature subspace projection to improve retrieval effectiveness. [Methods] On the Wikipedia CMIR dataset, the LDA and Bag-of-Words models are used to represent text and image resources respectively, cosine distance serves as the similarity measure, and the partial least squares method replaces canonical correlation analysis for learning the subspace projection functions. [Results] Using three retrieval evaluation metrics (P@K, MAP and NDCG), we compare the influence of three feature subspace projection methods, namely canonical correlation analysis, partial least squares regression and partial least squares correlation, on CMIR results; partial least squares correlation obtains the best results. [Limitations] The partial least squares method assumes a linear relationship within the data and orthogonality among the basis vectors, so non-linear and non-orthogonal cases cannot be handled. [Conclusions] The feature subspace learned by partial least squares correlation is more consistent with the original spatial information, and the CMIR results are more stable.
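A minimal sketch (not the authors' code) of the partial-least-squares-correlation pipeline described above: the two modality-specific projection bases come from an SVD of the cross-covariance between the centered views, and cross-modal retrieval ranks the projected image collection against a projected text query by cosine similarity. Synthetic random features stand in for the Wikipedia dataset's LDA text vectors and BoW image vectors; all names here are illustrative.

```python
import numpy as np

def plsc_projections(X, Y, k):
    """Partial least squares correlation: SVD of the cross-covariance
    between the centered views yields one projection basis per modality."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc.T @ Yc, full_matrices=False)
    return U[:, :k], Vt[:k].T          # (p, k) and (q, k) bases

def cosine_scores(query, collection):
    """Cosine similarity of one projected query against all projected items."""
    q = query / (np.linalg.norm(query) + 1e-12)
    C = collection / (np.linalg.norm(collection, axis=1, keepdims=True) + 1e-12)
    return C @ q

# Toy stand-ins: paired "text" (n x p) and "image" (n x q) features
# sharing a latent semantic factor, mimicking LDA/BoW representations.
rng = np.random.default_rng(0)
Z = rng.normal(size=(100, 5))          # shared latent semantics
X = Z @ rng.normal(size=(5, 20))       # text view
Y = Z @ rng.normal(size=(5, 30))       # image view

Wx, Wy = plsc_projections(X, Y, k=5)
text_query = (X[0] - X.mean(axis=0)) @ Wx      # project a text query
image_coll = (Y - Y.mean(axis=0)) @ Wy         # project the image collection
ranking = np.argsort(-cosine_scores(text_query, image_coll))
```

Swapping `plsc_projections` for canonical correlation analysis or PLS regression changes only how the two bases are learned; the cosine-ranking step stays the same, which is the comparison the paper carries out.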

Key words: Cross-Modal Information Retrieval; Partial least squares; Subspace projection
Received: 06 July 2015      Published: 04 February 2016

Cite this article:

Heng Ding, Wei Lu. A Study on Correlation-based Cross-Modal Information Retrieval. New Technology of Library and Information Service, 2016, 32(1): 17-23.

