A Meta Learning-Based Approach for Zero-Shot Co-Training

Guy Zaks, Gilad Katz

Research output: Contribution to journal › Article › peer-review


The lack of labeled data is one of the main obstacles to the application of machine learning algorithms in a variety of domains. Semi-supervised learning, in which additional samples are automatically labeled, is a common and cost-effective approach to addressing this challenge. A popular semi-supervised labeling approach is co-training, where two views of the data (obtained by training two learning models on different feature subsets) iteratively provide each other with additional newly-labeled samples. Despite being effective in many cases, existing co-training algorithms often suffer from low labeling accuracy and a heuristic sample-selection strategy, both of which hurt their performance. We propose Co-Training using Meta-learning (CoMet), a novel approach that addresses many of the shortcomings of existing co-training methods. Instead of greedily labeling individual samples, CoMet evaluates batches of samples and can therefore select samples that complement each other. Additionally, CoMet employs meta-learning, which enables it to leverage insights from previously-evaluated datasets and apply them to new ones. An extensive evaluation on 35 datasets shows that CoMet significantly outperforms other leading co-training approaches, particularly when the amount of available labeled data is very small. Moreover, our analysis shows that CoMet's labeling accuracy and consistency of performance are superior to those of existing approaches.
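The classic co-training loop that the abstract builds on can be sketched as follows. This is a minimal, illustrative sketch of standard co-training, not the CoMet algorithm itself: the views, the confidence threshold, and the per-round batch size (here, 5 samples per view) are arbitrary choices made for the example, and a greedy most-confident selection is used, precisely the heuristic that CoMet replaces with batch evaluation and meta-learning.

```python
# Minimal co-training sketch (illustrative only; NOT the CoMet algorithm).
# Two logistic-regression "views" are trained on disjoint feature subsets;
# each round, every view pseudo-labels the unlabeled samples it is most
# confident about and adds them to the shared labeled pool.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
view_a, view_b = np.arange(10), np.arange(10, 20)  # two disjoint feature views

labeled = rng.choice(len(y), size=20, replace=False)       # tiny seed label set
unlabeled = np.setdiff1d(np.arange(len(y)), labeled)
pseudo_y = y.copy()  # true labels are only ever read at indices in `labeled`

for _ in range(5):  # co-training rounds
    for view in (view_a, view_b):
        clf = LogisticRegression(max_iter=1000)
        clf.fit(X[np.ix_(labeled, view)], pseudo_y[labeled])
        proba = clf.predict_proba(X[np.ix_(unlabeled, view)])
        top = np.argsort(proba.max(axis=1))[-5:]           # 5 most confident
        pseudo_y[unlabeled[top]] = proba[top].argmax(axis=1)  # pseudo-label them
        labeled = np.concatenate([labeled, unlabeled[top]])
        unlabeled = np.delete(unlabeled, top)

# Final model trained on the enlarged (partly pseudo-labeled) pool of one view.
final = LogisticRegression(max_iter=1000)
final.fit(X[np.ix_(labeled, view_a)], pseudo_y[labeled])
acc = final.score(X[:, view_a], y)
```

Because each view's most confident predictions may still be wrong, errors can propagate through the pseudo-labels; this is the low labeling accuracy the abstract attributes to greedy per-sample selection.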

Original language: English
Pages (from-to): 146653-146666
Number of pages: 14
Journal: IEEE Access
State: Published - 1 Jan 2021


Keywords

  • Co-Training
  • meta-learning
  • semi-supervised learning

ASJC Scopus subject areas

  • Computer Science (all)
  • Materials Science (all)
  • Engineering (all)


