Associate Professor 刘遹菡

Ph.D., supervisor of master's students. Member of the News Information Standardization Branch of the China News Technology Workers' Federation; member of the Hubei Editors' Society; member of the Tibet Autonomous Region Thangka Art and Culture Association. Principal investigator of a Ministry of Education Humanities and Social Sciences fund project, a sub-project of a National Key R&D Program, a sub-project of a national science and technology special project, a Hubei Province...

Publications

Mixture of Experts Residual Learning for Hamming Hashing

Posted: 2024-06-13
Paper type: Journal article
First author: 徐锦宇
Corresponding author: 解庆
Co-authors: 马艳春, 李佳琛, 刘遹菡
Journal: Neural Processing Letters
Indexed in: SCI
Discipline category: Engineering
First-level discipline: Computer Science and Technology
Document type: J
Keywords: Image retrieval · Hamming hashing · Mixture of experts · Attentional mechanism
Abstract: Image retrieval has drawn growing attention due to the rapid emergence of images on the Internet. Owing to their high storage and computation efficiency, hashing methods are widely employed in approximate nearest neighbor search for large-scale image retrieval. Existing deep supervised hashing methods mainly utilize labels to analyze semantic similarity and preserve it in hash codes, but the collected label information may be incomplete or unreliable in the real world. Meanwhile, the features extracted from complex images by a single convolutional neural network (CNN) often fail to express latent information and may miss certain semantic information. Thus, this work further exploits existing knowledge from higher-quality pre-trained semantic features and proposes mixture of experts residual learning for Hamming hashing (MoE-hash), which brings experts into image hashing in Hamming space. Specifically, we separately extract basic visual features with a CNN and different semantic features with existing expert models. To better preserve the semantic information in compact hash codes, we learn the hash codes with a mixture of experts (MoE) residual learning block and a max-margin t-distribution-based loss. Extensive experiments on MS-COCO and NUS-WIDE demonstrate that our model achieves clear improvements in retrieval performance, and validate the role of mixture of experts residual learning in the image hashing task.
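
The abstract describes a two-branch design: a CNN supplies basic visual features, pre-trained expert models supply semantic features, and an MoE residual block fuses them before hashing under a max-margin t-distribution-based loss. Below is a minimal PyTorch-style sketch of that idea; the module names, dimensions, and the exact loss form are illustrative assumptions, not the paper's published implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEResidualHashHead(nn.Module):
    # Hypothetical fusion head: a gating network mixes projected expert
    # features and adds the mixture as a residual onto the CNN features
    # before projecting to relaxed hash codes in (-1, 1).
    def __init__(self, cnn_dim=2048, expert_dims=(768, 512), code_len=64):
        super().__init__()
        self.expert_proj = nn.ModuleList(
            nn.Linear(d, cnn_dim) for d in expert_dims)
        self.gate = nn.Linear(cnn_dim, len(expert_dims))
        self.hash_layer = nn.Sequential(nn.Linear(cnn_dim, code_len), nn.Tanh())

    def forward(self, cnn_feat, expert_feats):
        # cnn_feat: (B, cnn_dim); expert_feats: one (B, d_i) tensor per expert.
        projected = torch.stack(
            [proj(f) for proj, f in zip(self.expert_proj, expert_feats)], dim=1)
        weights = F.softmax(self.gate(cnn_feat), dim=-1)      # (B, E)
        mixture = (weights.unsqueeze(-1) * projected).sum(1)  # (B, cnn_dim)
        return self.hash_layer(cnn_feat + mixture)            # residual fusion

def max_margin_t_loss(codes, sim, margin=8.0, nu=1.0, eps=1e-8):
    # Illustrative max-margin pairwise loss with a heavy-tailed
    # t-distribution kernel; the published loss may differ in detail.
    # codes: (B, K) relaxed codes in (-1, 1); sim: (B, B) 0/1 similarity.
    K = codes.size(1)
    ham = 0.5 * (K - codes @ codes.t())  # inner-product proxy for Hamming distance
    q_pos = nu / (nu + ham)                                 # similar pairs: pull distance to 0
    q_neg = nu / (nu + torch.clamp(margin - ham, min=0.0))  # dissimilar: push past the margin
    return (-(sim * torch.log(q_pos + eps)
              + (1.0 - sim) * torch.log(q_neg + eps))).mean()

At retrieval time the binary codes would be obtained with torch.sign on the relaxed outputs; the per-image gating weights let different experts dominate for different images, which is the intuition behind mixing experts rather than concatenating their features.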