Principal Component Masking for Self-Supervised Representation Learning
Self-supervised learning (SSL) has emerged as a cornerstone of representation learning in recent years. Models such as OpenAI's CLIP demonstrate how SSL approaches can produce expressive representations applicable to a broad spectrum of downstream tasks. This paradigm relies on paired observations—whether paired views or modalities sharing the same content—to extract meaningful features.
Broadly, SSL methods fall into two categories: discriminative and generative (or reconstruction-based). Discriminative SSL aims to ensure that representations of paired observations are closer in latent space than those of randomly sampled observations. In contrast, reconstruction-based SSL involves reconstructing one observation from its pair.
In multi-view settings, data augmentation techniques, such as image cropping and color jittering, are commonly used to artificially create paired observations from single ones. Among these augmentations, image cropping has proven especially impactful, driving advancements in visual learning models like Meta's DINO.
Recent studies [1] suggest that in the image domain, masking—conceptually similar to cropping—principal components rather than individual image pixels can generate image pairs that foster the learning of expressive features in reconstruction-based SSL. In this project, we aim to investigate whether applying a similar approach to discriminative SSL can yield comparable benefits, focusing specifically on methods like DINO, JEPA, and SigLIP.
[1] https://alicebizeul.github.io/assets/pdf/mae.pdf
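To make the idea concrete, here is a minimal NumPy sketch of principal-component masking: project (toy, randomly generated stand-in) images onto their principal basis, then zero out a random subset of components to form two complementary views. This is an illustrative assumption of the general recipe, not the exact procedure from [1]; all data and the 50% masking ratio are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 256 flattened 8x8 "images" (a stand-in for real image data).
X = rng.standard_normal((256, 64))
X_mean = X.mean(axis=0)
X_centered = X - X_mean

# Principal components via SVD of the centered data matrix.
# Rows of Vt are the principal directions.
_, _, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Coefficients of each image in the principal basis.
coeffs = X_centered @ Vt.T  # shape (256, 64)

# Randomly mask half the components to create two complementary views,
# analogous to masking pixels or patches, but in PCA space.
mask = rng.random(64) < 0.5
view_a = (coeffs * mask) @ Vt + X_mean   # reconstruct from kept components
view_b = (coeffs * ~mask) @ Vt + X_mean  # reconstruct from the complement

# The two views together recover the original image (up to the mean).
recon = view_a + view_b - X_mean
assert np.allclose(recon, X)
```

In a reconstruction-based setup, one view would be the model input and the other the target; for discriminative SSL (the focus of this project), the two views would instead be fed to the paired branches of a method like DINO.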
Keywords: Machine Learning, Deep Learning, Visual Representation Learning, Multi-view learning, Contrastive Learning, Core AI
Please see the attached project proposal.
If you are interested, please contact Alice Bizeul at alice.bizeul@inf.ethz.ch and include your grades, resume, and a brief introduction explaining your motivation.