Research on a Multi-View Cross-Modal Person Re-identification Network Architecture for Electric Power Field Operations
Abstract:

    Visible-to-infrared cross-modal person re-identification aims to identify pedestrians across both daytime and nighttime environments, and is of significant research value in video surveillance. Because visible-light and infrared imaging rely on different physical principles, the resulting modality gap poses a major challenge for cross-modal re-identification. A new network structure is designed to reduce the discrepancy between modalities and improve the accuracy of the re-identification model. The network consists of two parts: an attention-based modality transfer module, embedded at the input stage of the feature network, which narrows the cross-modal gap; and a block-based multi-granularity feature decomposition module, which exploits both global and local information and improves the utilization of discriminative features. On the public SYSU-MM01 dataset, the proposed method reaches a rank-1 accuracy of 56.45% on the cumulative matching characteristic (CMC) metric and a mean average precision (mAP) of 53.52%, improvements of 6.53% and 2.79%, respectively, over the previous best method (XIV, AAAI-2020), effectively improving the performance of visible-to-infrared cross-modal person re-identification.
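The abstract names two components but gives no implementation details. The following numpy-only sketch is therefore purely illustrative, not the paper's actual architecture: it assumes a squeeze-and-excitation-style channel gate for the "attention-based modality transfer" idea, and horizontal-stripe average pooling at three granularities (whole body, halves, quarters) for the "block-based multi-granularity feature decomposition" idea. All function names, the `reduction` bottleneck, and the random (untrained) weights are assumptions for the sketch.

```python
import numpy as np

def channel_attention(x, reduction=4):
    """Squeeze-and-excitation-style channel gating on a (C, H, W) feature map.
    Weights are random placeholders; a real module would learn them."""
    c = x.shape[0]
    squeeze = x.reshape(c, -1).mean(axis=1)          # global average pool -> (C,)
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ squeeze, 0)             # ReLU bottleneck
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))      # sigmoid gate -> (C,)
    return x * gate[:, None, None]                   # reweight channels

def multi_granularity_pool(x, parts=(1, 2, 4)):
    """Split a (C, H, W) map height-wise into 1, 2 and 4 horizontal stripes,
    average-pool each stripe, and concatenate all part descriptors."""
    c, h, w = x.shape
    feats = []
    for p in parts:
        for i in range(p):
            stripe = x[:, i * h // p:(i + 1) * h // p, :]
            feats.append(stripe.mean(axis=(1, 2)))   # (C,) per stripe
    return np.concatenate(feats)                     # (C * sum(parts),)

x = np.random.default_rng(1).standard_normal((8, 12, 4))  # toy feature map
f = multi_granularity_pool(channel_attention(x))
print(f.shape)  # (56,) = 8 channels * (1 + 2 + 4) stripes
```

The stripe scheme mirrors the abstract's point that global (1-part) and local (2- and 4-part) descriptors are concatenated so coarse identity cues and fine part cues are both retained.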

Online publication date: 2022-04-01