What role does contrastive learning play in unsupervised representation learning, and how does it ensure that representations of positive pairs are closer in the latent space than those of negative pairs?
Tuesday, 11 June 2024
by EITCA Academy
Contrastive learning has emerged as a pivotal technique in unsupervised representation learning, fundamentally changing how models learn to encode data without explicit supervision. At its core, contrastive learning learns representations by contrasting positive pairs against negative pairs, ensuring that similar instances end up closer together in the latent space while dissimilar ones are pushed farther apart.
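To make this concrete, below is a minimal sketch of an InfoNCE-style objective of the kind used by SimCLR-like methods, written in PyTorch. It is a simplified, one-directional form, and names such as `info_nce_loss` and `temperature` are illustrative rather than taken from any particular library: the cross-entropy term rewards a high similarity between each embedding and its positive partner (the diagonal of the similarity matrix) relative to all negatives in the batch.

```python
# Minimal one-directional InfoNCE sketch (illustrative, not a library API).
import torch
import torch.nn.functional as F


def info_nce_loss(z_a: torch.Tensor, z_b: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Contrast two batches of embeddings where (z_a[i], z_b[i]) form the positive pairs.

    Every other cross-pairing in the batch serves as a negative.
    """
    # Normalize so dot products equal cosine similarities.
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)

    # Similarity matrix: entry (i, j) compares view i from batch A with view j from batch B.
    logits = z_a @ z_b.t() / temperature

    # The positive for row i sits on the diagonal; all off-diagonal entries are negatives.
    targets = torch.arange(z_a.size(0), device=z_a.device)

    # Cross-entropy pushes the positive similarity above the negative similarities.
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Two augmented "views" of the same batch would normally produce these paired embeddings.
    batch_size, dim = 8, 128
    z_view1 = torch.randn(batch_size, dim)
    z_view2 = torch.randn(batch_size, dim)
    print(f"InfoNCE loss: {info_nce_loss(z_view1, z_view2).item():.4f}")
```

Minimizing this loss is what geometrically enforces the property asked about: representations of positive pairs are pulled together, while representations of negative pairs are pushed apart, relative to a temperature-scaled cosine similarity.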

