Jo-SRC: A Contrastive Approach for Combating Noisy Labels

  • Abstract
  • Introduction
  • Method
  • Summary

These notes organize the key points of the paper Jo-SRC: A Contrastive Approach for Combating Noisy Labels.

Abstract

  1. we train the network in a contrastive learning manner
    The network is trained with contrastive learning.
  2. Predictions from two different views of each sample are used to estimate its “likelihood” of being clean or out-of-distribution.
    Each sample's predictions under two different views are used to estimate whether it is noisy.
  3. we propose a joint loss to advance the model generalization performance by introducing consistency regularization
    Joint loss: improves model generalization by introducing consistency regularization.
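Points 2 and 3 can be made concrete with a small sketch. Assuming (hypothetically) that cleanness is scored via the Jensen-Shannon divergence between a sample's prediction and its given label, and that consistency regularization penalizes disagreement between the predictions of two augmented views, the core computations might look like:

```python
import numpy as np

def js_divergence(p, q, eps=1e-8):
    """Jensen-Shannon divergence between rows of two batches of
    probability distributions; bounded above by log(2)."""
    m = 0.5 * (p + q)
    kl_pm = np.sum(p * (np.log(p + eps) - np.log(m + eps)), axis=1)
    kl_qm = np.sum(q * (np.log(q + eps) - np.log(m + eps)), axis=1)
    return 0.5 * (kl_pm + kl_qm)

def clean_likelihood(pred, label_onehot):
    """Score in [0, 1]: a prediction close to the given label
    suggests the label is clean."""
    return 1.0 - js_divergence(pred, label_onehot) / np.log(2.0)

def consistency_penalty(pred_view1, pred_view2):
    """Consistency regularization term: the two augmented views of a
    sample should yield similar predictions."""
    return js_divergence(pred_view1, pred_view2).mean()
```

A sample whose two-view predictions disagree strongly, or whose prediction stays far from its label, would be treated as likely noisy or out-of-distribution; a joint training loss would then add a term like `consistency_penalty` to the classification loss on the selected clean samples. Function names here are illustrative, not the paper's.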

Introduction

  1. Background on deep learning and noisy labels, motivating the need to handle noisy labels.

  2. Prior approaches:

    • Loss correction: e.g. noise transition matrices, robust loss functions
    • Sample Selection: e.g. co-teaching

    Both tend to fail when the noise rate is high.

  3. Studies show that DNNs memorize clean, simple patterns first before overfitting; existing methods mainly target ID (in-distribution) noise.
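The two prior approaches above can each be sketched in a few lines (names are illustrative, not from the paper):

```python
import numpy as np

def small_loss_selection(losses, forget_rate):
    """Co-teaching-style selection: keep the (1 - forget_rate) fraction of
    samples with the smallest loss, treating them as probably clean."""
    num_keep = int(len(losses) * (1.0 - forget_rate))
    return np.argsort(losses)[:num_keep]

def forward_corrected_probs(clean_probs, T):
    """Forward loss correction with a known noise transition matrix T,
    where T[i, j] = P(noisy label = j | true label = i): map the model's
    clean-class probabilities to noisy-label probabilities before
    computing cross-entropy against the observed (noisy) labels."""
    return clean_probs @ T
```

In co-teaching, each of two networks selects its small-loss samples and feeds them to its peer for the update. Both families rely on the noise being in-distribution, which is the gap Jo-SRC targets.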

