Photovoltaic Panel Crack Detection Method Based on Dual-Channel Multi-Scale Attention Mechanism
DOI:
Authors: 强浩, 叶波, 唐文祺
Affiliation:

School of Mechanical Engineering and Rail Transit / School of Intelligent Manufacturing Industry, Changzhou University

Author Biography:

Corresponding Author:

CLC Number:

391.41

Fund Project:

Postgraduate Research & Practice Innovation Program of Jiangsu Province (SJCX21_1272)


    Abstract:

    Aiming at the blurred and discontinuous edge contours produced by traditional edge detection methods, a photovoltaic panel crack detection method based on a dual-channel multi-scale attention mechanism is proposed to detect low-level edges, boundaries, and object contours in images. First, a dual-channel backbone network is constructed, consisting of a semantic branch and a spatial-detail branch. Second, following the multi-scale principle, a multi-scale attention module is built; it transforms the feature map along its height, width, and channel dimensions and assigns feature weights, capturing cross-channel information together with direction-aware and position-aware information. Finally, an atrous (dilated) fusion module is integrated into the semantic branch to strengthen the network's feature extraction. Experimental results show that the proposed algorithm improves edge detection on photovoltaic panel images: compared with the HED, RCF, and FCN algorithms, the F1 score rises by 2.83%, 0.37%, and 1.54%, respectively, and noticeably clearer crack images are obtained.
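    The abstract's attention module (cross-channel weighting combined with direction-aware and position-aware information) resembles coordinate-style attention. As an illustration only, not the authors' implementation, the following minimal pure-Python sketch shows the direction-aware pooling idea: the feature map is pooled along the width axis (keeping height) and along the height axis (keeping width), each pooled descriptor is gated with a sigmoid, and the input is reweighted by both gates. The function name and the fixed sigmoid gating (in place of the learned transformations a real module would use) are assumptions.

    ```python
    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def coordinate_attention(feat):
        """Direction-aware attention sketch (illustrative, not the paper's module).

        feat: nested list of shape [C][H][W].
        Pools along the width axis (keeping height) and along the height axis
        (keeping width), gates each direction with a sigmoid, and reweights:
            out[c][i][j] = feat[c][i][j] * g_h[c][i] * g_w[c][j]
        """
        C, H, W = len(feat), len(feat[0]), len(feat[0][0])
        # Height-direction descriptor: average over the width axis -> [C][H]
        pool_h = [[sum(feat[c][i]) / W for i in range(H)] for c in range(C)]
        # Width-direction descriptor: average over the height axis -> [C][W]
        pool_w = [[sum(feat[c][i][j] for i in range(H)) / H for j in range(W)]
                  for c in range(C)]
        # Position-aware gates along each direction
        g_h = [[sigmoid(v) for v in row] for row in pool_h]
        g_w = [[sigmoid(v) for v in row] for row in pool_w]
        # Reweight the input by both directional gates
        return [[[feat[c][i][j] * g_h[c][i] * g_w[c][j]
                  for j in range(W)] for i in range(H)] for c in range(C)]
    ```

    In the actual module the pooled descriptors would typically pass through learned 1×1 convolutions before gating; this sketch only demonstrates how pooling along one spatial axis at a time preserves positional information in the other axis.
    
    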

Cite this article:

强浩, 叶波, 唐文祺. Photovoltaic Panel Crack Detection Method Based on Dual-Channel Multi-Scale Attention Mechanism [J]. Computer Measurement & Control, 2023, 31(12): 84-89.
History
  • Received: 2023-02-20
  • Revised: 2023-03-17
  • Accepted: 2023-03-17
  • Published online: 2023-12-27
  • Publication date: