Published in ECCV Workshop on Learning from Limited and Imperfect Data (L2ID), 2022


tl;dr: A contrastive-learning-based framework that enhances semi-supervised learning (SSL) methods so they can also utilize “out-of-class” unlabeled samples.


Abstract

Semi-supervised learning (SSL) has been a powerful strategy for learning better representations from a small number of labels. In this paper, we focus on a practical scenario in which one aims to apply SSL when the unlabeled data may contain out-of-class samples, i.e., samples that cannot be assigned a one-hot label from the closed set of classes in the labeled data; in other words, the unlabeled data is an open set. Specifically, we introduce OpenCoS, a simple framework for handling this realistic semi-supervised learning scenario, built upon a recent framework of self-supervised visual representation learning. We first observe that out-of-class samples in an open-set unlabeled dataset can be identified effectively via self-supervised contrastive learning. OpenCoS then utilizes this information to overcome the failure modes of existing state-of-the-art semi-supervised methods, assigning one-hot pseudo-labels to the identified in-class unlabeled data and soft labels to the identified out-of-class data. Our extensive experimental results demonstrate the effectiveness of OpenCoS in the presence of out-of-class samples, repairing state-of-the-art semi-supervised methods so that they remain suitable for diverse scenarios involving open-set unlabeled data.
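To make the detect-then-label idea concrete, below is a minimal PyTorch sketch of one plausible reading of the abstract: score each open-set unlabeled sample by its cosine similarity to per-class prototypes built from contrastively pre-trained features, threshold the score to separate in- from out-of-class samples, and assign one-hot pseudo-labels to the former and softmax-based soft labels to the latter. The prototype-based scoring, the threshold `tau`, and the temperature `temp` are illustrative assumptions for this sketch, not the paper's exact procedure.

```python
import torch
import torch.nn.functional as F

def opencos_style_split(f_labeled, y_labeled, f_unlabeled,
                        num_classes, tau=0.5, temp=0.1):
    """Split open-set unlabeled features into in-/out-of-class and label them.

    NOTE: a simplified sketch of the abstract's idea, not the official OpenCoS code.
    f_labeled:   (N_l, d) features from a contrastively pre-trained encoder
    y_labeled:   (N_l,)   integer labels for the labeled set
    f_unlabeled: (N_u, d) features for the open-set unlabeled data
    """
    z_l = F.normalize(f_labeled, dim=1)
    z_u = F.normalize(f_unlabeled, dim=1)

    # Class prototypes: mean normalized feature per labeled class (assumption).
    protos = torch.stack([z_l[y_labeled == c].mean(dim=0)
                          for c in range(num_classes)])
    protos = F.normalize(protos, dim=1)

    # Cosine similarity of each unlabeled sample to every class prototype.
    sim = z_u @ protos.t()                    # (N_u, C)
    score, pred = sim.max(dim=1)              # detection score / nearest class

    in_mask = score >= tau                                # likely in-class
    hard_labels = F.one_hot(pred, num_classes).float()    # one-hot pseudo-labels
    soft_labels = F.softmax(sim / temp, dim=1)            # soft labels for the rest
    return in_mask, hard_labels, soft_labels

# Toy usage with random stand-in features.
if __name__ == "__main__":
    torch.manual_seed(0)
    f_l, y_l = torch.randn(100, 128), torch.randint(0, 10, (100,))
    f_u = torch.randn(400, 128)
    in_mask, hard, soft = opencos_style_split(f_l, y_l, f_u, num_classes=10)
    print(f"in-class: {in_mask.sum().item()} / {len(f_u)}")
```

In a full SSL pipeline, the in-class samples and their one-hot pseudo-labels would feed the underlying semi-supervised method as usual, while the out-of-class samples would be trained with their soft labels instead of being discarded.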

BibTeX

@InProceedings{park2022opencos,
  author="Park, Jongjin and Yun, Sukmin and Jeong, Jongheon and Shin, Jinwoo",
  title="Open{CoS}: Contrastive Semi-supervised Learning for Handling Open-Set Unlabeled Data",
  booktitle="Computer Vision -- ECCV 2022 Workshops",
  year="2023",
  publisher="Springer Nature Switzerland",
  pages="134--149",
  isbn="978-3-031-25063-7"
}
