Jo-SRC: A Contrastive Approach for Combating Noisy Labels
- Abstract
- Introduction
- Method
- Summary
The following summarizes the key content of the paper Jo-SRC: A Contrastive Approach for Combating Noisy Labels.
Abstract
- "we train the network in a contrastive learning manner"
  The network is trained with contrastive learning.
- "Predictions from two different views of each sample are used to estimate its 'likelihood' of being clean or out-of-distribution."
  Each sample is predicted under two different views, and the two predictions are used to estimate whether the sample is noisy.
- "we propose a joint loss to advance the model generalization performance by introducing consistency regularization"
  Joint loss: consistency regularization is introduced to improve the model's generalization performance.
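The two-view idea in the abstract can be sketched in code. This is an illustrative simplification, not the authors' implementation: it scores each sample from the predictions on two augmented views, using Jensen-Shannon divergence as an assumed divergence measure (agreement between the mean prediction and the given label suggests a clean sample; disagreement between the two views suggests an out-of-distribution sample).

```python
import torch


def js_divergence(p, q, eps=1e-8):
    # Jensen-Shannon divergence between two categorical distributions
    # (rows sum to 1); symmetric and bounded by log 2.
    m = 0.5 * (p + q)
    kl = lambda a, b: (a * (a.add(eps).log() - b.add(eps).log())).sum(dim=-1)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)


def cleanliness_scores(probs_v1, probs_v2, labels_onehot):
    # probs_v1 / probs_v2: softmax outputs for two augmented views
    # of the same batch; labels_onehot: the (possibly noisy) labels.
    # Low divergence between the mean prediction and the label
    # -> the sample is likely clean.
    mean_pred = 0.5 * (probs_v1 + probs_v2)
    clean_score = 1.0 - js_divergence(mean_pred, labels_onehot)
    # High divergence between the two views' predictions
    # -> the sample is likely out-of-distribution.
    ood_score = js_divergence(probs_v1, probs_v2)
    return clean_score, ood_score
```

In this sketch a threshold on `clean_score` would split the batch into clean/noisy subsets, and a threshold on `ood_score` would flag out-of-distribution samples; the actual paper's scoring and thresholds differ in detail.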
Introduction
- Gives the background on deep learning and noisy labels, explaining why handling noisy labels is necessary.
- Prior methods:
  - Loss correction: e.g., noise transition matrices, robust loss functions
  - Sample selection: e.g., co-teaching
  These approaches tend to fail when the noise ratio is high.
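The sample-selection family above typically relies on the "small-loss" trick. A minimal sketch (illustrative, not the paper's or co-teaching's actual code): keep the fraction of each batch with the smallest loss as the presumed-clean subset, since DNNs fit clean patterns before noisy ones.

```python
import torch
import torch.nn.functional as F


def select_small_loss(logits, labels, keep_ratio=0.7):
    # Per-sample cross-entropy losses (no reduction).
    losses = F.cross_entropy(logits, labels, reduction="none")
    # Keep the keep_ratio fraction with the smallest loss; these are
    # treated as probably-clean samples for the update step.
    num_keep = max(1, int(keep_ratio * len(losses)))
    return torch.argsort(losses)[:num_keep]
```

In co-teaching, two networks each perform this selection and feed their chosen samples to the other network. The weakness noted above is visible here: `keep_ratio` must roughly track the (unknown) noise rate, and at high noise levels the small-loss subset itself becomes contaminated.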
- Research shows that DNNs first memorize clean, simple patterns before overfitting to noise; existing methods mainly target ID (in-distribution) noise.