Abstract
[Objective] Narrative murals depict rich story scenes that are essential for users to understand mural content. However, traditional search methods focus on retrieving associative semantics and visually similar images, ignoring contextual information. To enable convenient retrieval and scene perception for narrative murals, this paper constructs a context-aware mobile visual search model for narrative mural scenes.
[Methods] Drawing on context awareness theory and information foraging theory, this paper builds a context graph of murals with contexts as its elements. Global and local visual features of murals are extracted by multiple models, and feature matching is performed via dot product. Within a given context, the context graph is used to associate related contexts, achieving scene retrieval that users can readily perceive and understand.
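The dot-product matching step described above can be sketched as follows. This is an illustrative example with toy feature vectors, not the paper's actual pipeline: the real features would come from the multiple global and local extraction models, and `dot_product_match` is a hypothetical helper name.

```python
import numpy as np

def dot_product_match(query, gallery):
    """Rank gallery mural features by dot-product similarity to a query."""
    sims = gallery @ query           # one dot product per gallery vector
    return np.argsort(-sims), sims   # indices in descending similarity

# Toy 4-dimensional features for three murals (hypothetical values).
gallery = np.array([
    [0.9, 0.1, 0.0, 0.2],
    [0.1, 0.8, 0.3, 0.0],
    [0.2, 0.2, 0.9, 0.1],
])
query = np.array([1.0, 0.0, 0.0, 0.1])

order, sims = dot_product_match(query, gallery)
print(order[0])  # index of the best-matching mural → 0
```

In a full system, the top-ranked murals would then be expanded through the context graph to surface scenes linked by time, place, person, or event.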
[Results] When retrieving murals associated by time, place, person, and event, the proposed model achieves a mean mAP of 0.840, outperforming baseline models such as VGG16, BOW_KAZE, and HOG.
[Limitations] The influence of the user's current scenario on search intent is not considered. [Conclusions] A context-aware mobile visual search model oriented to narrative murals was developed, achieving scene-aware retrieval of mural resources and exploring a development path for scene retrieval.