Hybrid Methods
This paradigm combines ideas from earlier approaches, chiefly self-training (pseudo-labeling) and consistency regularization, with additional components such as mixup, label sharpening, distribution alignment, and confidence thresholding to improve performance.
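As a concrete illustration, here is a minimal sketch of a FixMatch-style hybrid objective (function and variable names are hypothetical; PyTorch is assumed), combining a supervised loss with a pseudo-labeled consistency term on unlabeled data:

import torch
import torch.nn.functional as F

def hybrid_ssl_loss(model, x_lab, y_lab, x_weak, x_strong,
                    threshold=0.95, lambda_u=1.0):
    # Supervised cross-entropy on the labeled batch.
    loss_sup = F.cross_entropy(model(x_lab), y_lab)

    # Self-training: pseudo-label from the weakly augmented view,
    # kept only where the model is confident (no gradient flows here).
    with torch.no_grad():
        probs = torch.softmax(model(x_weak), dim=-1)
        conf, pseudo = probs.max(dim=-1)
        mask = (conf >= threshold).float()

    # Consistency regularization: the strongly augmented view must
    # reproduce the pseudo-label obtained from the weak view.
    per_example = F.cross_entropy(model(x_strong), pseudo, reduction="none")
    loss_unsup = (per_example * mask).mean()

    return loss_sup + lambda_u * loss_unsup

Here threshold and lambda_u mirror FixMatch's confidence cutoff and unlabeled-loss weight; MixMatch and ReMixMatch instead sharpen averaged predictions over augmentations and mix labeled and unlabeled examples with mixup.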
0) Overview:
Dataset: CIFAR-10, CIFAR-100, SVHN, and STL-10, the standard SSL image-classification benchmarks used in the papers below (FixMatch also reports ImageNet).
Metrics: classification error rate (equivalently, accuracy) at varying numbers of labeled examples.
1) Papers:
Hybrid Methods:
[MixMatch] D. Berthelot, N. Carlini, I. Goodfellow, N. Papernot, A. Oliver, and C. Raffel, “MixMatch: A holistic approach to semi-supervised learning”, in NeurIPS, 2019.
[ReMixMatch] D. Berthelot, N. Carlini, E. D. Cubuk, A. Kurakin, K. Sohn, H. Zhang, and C. Raffel, “ReMixMatch: Semi-supervised learning with distribution matching and augmentation anchoring”, in ICLR, 2020.
[FixMatch] K. Sohn, D. Berthelot, N. Carlini, Z. Zhang, H. Zhang, C. A. Raffel, E. D. Cubuk, A. Kurakin, and C.-L. Li, “FixMatch: Simplifying semi-supervised learning with consistency and confidence”, in NeurIPS, vol. 33, 2020, pp. 596–608.
Others:
M. Sajjadi, M. Javanmardi, and T. Tasdizen, “Regularization with stochastic transformations and perturbations for deep semi-supervised learning”, in NeurIPS, 2016.
Y. Grandvalet and Y. Bengio, “Semi-supervised learning by entropy minimization”, in NeurIPS, vol. 17, 2005.
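For reference, the entropy-minimization regularizer from Grandvalet and Bengio, which several hybrid methods inherit (e.g., MixMatch's sharpening step encourages low-entropy predictions), reduces to a single penalty on unlabeled predictions; a minimal sketch (hypothetical names, PyTorch assumed):

import torch

def entropy_regularizer(unlabeled_logits):
    # Penalize high-entropy (uncertain) predictions on unlabeled data,
    # pushing decision boundaries toward low-density regions.
    log_p = torch.log_softmax(unlabeled_logits, dim=-1)
    return -(log_p.exp() * log_p).sum(dim=-1).mean()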