Hard Example Mining Based Adversarial Autoencoder Recommender System
DOI:
Author:
Affiliation:

College of Software, Taiyuan University of Technology

Author biography:

Corresponding author:

CLC number:

Fund project:

Key R&D Program of the Shanxi Provincial Department of Science and Technology (201803D31226); Shanxi Postgraduate Education Innovation Project (2019SY117)



Abstract:

To address the unbalanced data distribution, high sparsity, and differing user rating preferences of the datasets commonly used in recommender systems, an adversarial autoencoder recommendation model based on hard example mining is proposed. To account for differences in user preferences, a mean model is first used to extract features from the dataset, preserving the statistical characteristics of the data while reducing computational complexity. A triplet-loss algorithm is then applied to the mean-model-processed data to mine hard examples; classifying the samples as positive or negative improves the quality of the training data. The positively and negatively classified samples are fed separately into an adversarial autoencoder, and the rating prediction model is trained jointly from the reconstruction and adversarial objectives, with the Adam optimization algorithm computing update gradients separately for each parameter. Experimental results show that the model significantly improves recommendation performance and outperforms the baseline models on several metrics, indicating that the hard example mining based adversarial autoencoder recommender system has practical value.
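The mean-model preprocessing and triplet-loss-style hard example mining summarized above can be illustrated with a minimal Python sketch. Everything in it is an assumption made for illustration only, not the authors' implementation: the function names mean_center and mine_hard_samples, the dictionary data layout, and the rule that a rating counts as "hard" when it falls within a margin of the user's mean rating.

import numpy as np

def mean_center(ratings):
    # Hypothetical "mean model": summarize each user by the mean of their
    # observed ratings, so per-user rating preferences (lenient vs. strict
    # raters) are normalized out.
    return {u: float(np.mean(list(r.values()))) for u, r in ratings.items()}

def mine_hard_samples(ratings, user_means, margin=0.5):
    # Triplet-style hard example mining (illustrative only).
    # The user mean acts as the anchor; ratings at or above the mean are
    # positive samples, ratings below it are negative samples, and a sample
    # is "hard" when its distance to the anchor is smaller than the margin,
    # i.e. its positive/negative label is ambiguous.
    positives, negatives, hard = {}, {}, {}
    for u, items in ratings.items():
        anchor = user_means[u]
        positives[u] = {i: r for i, r in items.items() if r >= anchor}
        negatives[u] = {i: r for i, r in items.items() if r < anchor}
        hard[u] = {i: r for i, r in items.items() if abs(r - anchor) < margin}
    return positives, negatives, hard

# Toy usage on a tiny rating dictionary.
ratings = {"u1": {"i1": 5.0, "i2": 3.0, "i3": 4.0},
           "u2": {"i1": 2.0, "i4": 4.0}}
means = mean_center(ratings)
pos, neg, hard = mine_hard_samples(ratings, means)
print(means)
print(pos, neg, hard, sep="\n")

Concentrating training on the hard (boundary) samples reflects the usual triplet-loss intuition: the examples whose label is least certain are the most informative ones to train on.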

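The abstract further states that the positively and negatively classified samples feed an adversarial autoencoder trained jointly from a reconstruction objective and an adversarial objective, with Adam maintaining per-parameter update statistics. The PyTorch sketch below is a minimal, assumed rendering of one such training step; the layer sizes, the standard Gaussian prior on the latent code, and all names are illustrative and do not come from the paper.

import torch
import torch.nn as nn

latent_dim, n_items = 32, 100

# Encoder/decoder reconstruct a user's rating vector; the discriminator
# judges whether a latent code looks drawn from the chosen prior.
encoder = nn.Sequential(nn.Linear(n_items, 64), nn.ReLU(), nn.Linear(64, latent_dim))
decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, n_items))
disc = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

# Adam keeps first- and second-moment estimates for every parameter, so each
# parameter effectively receives its own adaptive update step.
opt_ae = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
opt_disc = torch.optim.Adam(disc.parameters(), lr=1e-3)
mse, bce = nn.MSELoss(), nn.BCELoss()

def train_step(batch):
    # Reconstruction phase: encoder + decoder minimize rating reconstruction error.
    recon = decoder(encoder(batch))
    loss_recon = mse(recon, batch)
    opt_ae.zero_grad(); loss_recon.backward(); opt_ae.step()

    # Adversarial phase 1: the discriminator separates prior samples from
    # encoded (detached) latent codes.
    z_fake = encoder(batch).detach()
    z_real = torch.randn_like(z_fake)
    ones, zeros = torch.ones(len(batch), 1), torch.zeros(len(batch), 1)
    loss_d = bce(disc(z_real), ones) + bce(disc(z_fake), zeros)
    opt_disc.zero_grad(); loss_d.backward(); opt_disc.step()

    # Adversarial phase 2: the encoder tries to fool the discriminator.
    loss_g = bce(disc(encoder(batch)), ones)
    opt_ae.zero_grad(); loss_g.backward(); opt_ae.step()
    return loss_recon.item(), loss_d.item(), loss_g.item()

# Toy usage: a batch of (possibly mean-centred) user rating vectors.
batch = torch.rand(8, n_items)
print(train_step(batch))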

Cite this article:

Wei Dong, Sun Jingyu, Hai Yang. Hard Example Mining Based Adversarial Autoencoder Recommender System [J]. Computer Measurement & Control, 2020, 28(12): 161-165.

History
  • Received: 2020-04-25
  • Revised: 2020-05-21
  • Accepted: 2020-05-21
  • Published online: 2020-12-15
  • Publication date: