
positive sample (3)
[Barlow Twins paper review] - Barlow Twins: Self-Supervised Learning via Redundancy Reduction *This post analyzes the Barlow Twins paper alongside its code! I hope it helps those getting started with SSL; please leave any questions in the comments. Barlow Twins paper: https://arxiv.org/abs/2103.03230 Barlow Twins: Self-Supervised Learning via Redundancy Reduction Self-supervised learning (SSL) is rapidly closing the gap with supervised methods on large computer vision benchmarks. A successful approach to SSL is to learn embeddings which are invariant to distor..
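For context, the "redundancy reduction" idea in the Barlow Twins abstract can be sketched in a few lines of NumPy: cross-correlate the batch-normalized embeddings of two augmented views, then pull the diagonal of that matrix toward 1 (invariance) and the off-diagonal toward 0 (decorrelation). This is a toy sketch; `barlow_twins_loss` is an illustrative name, and in the paper the loss is applied to projector outputs inside a training loop.

```python
import numpy as np

def barlow_twins_loss(z_a, z_b, lam=5e-3):
    """Toy Barlow Twins objective on two batches of embeddings (n x d)."""
    n, d = z_a.shape
    # batch-normalize each embedding dimension
    z_a = (z_a - z_a.mean(0)) / z_a.std(0)
    z_b = (z_b - z_b.mean(0)) / z_b.std(0)
    c = (z_a.T @ z_b) / n  # d x d cross-correlation matrix
    on_diag = ((np.diagonal(c) - 1.0) ** 2).sum()              # invariance term
    off_diag = (c ** 2).sum() - (np.diagonal(c) ** 2).sum()    # redundancy-reduction term
    return on_diag + lam * off_diag

rng = np.random.default_rng(0)
z = rng.normal(size=(256, 32))
# identical "views" make the diagonal of c exactly 1, so only the
# (lambda-weighted) off-diagonal term remains
loss = barlow_twins_loss(z, z)
```

With uncorrelated random views instead, the diagonal collapses toward 0 and the invariance term dominates, so the loss is much larger.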
[Simsiam paper review] - Exploring Simple Siamese Representation Learning *This post analyzes the Simsiam paper alongside its code! I hope it helps those getting started with SSL; please leave any questions in the comments. *Simsiam is a non-contrastive learning method. Simsiam paper: https://arxiv.org/abs/2011.10566 Exploring Simple Siamese Representation Learning Siamese networks have become a common structure in various recent models for unsupervised visual representation learning. These models maximize the similarity between two augmentations o..
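The "maximize the similarity between two augmentations" part of the SimSiam abstract boils down to a symmetric negative-cosine loss, where the predictor output of one view is compared against the projection of the other view under a stop-gradient. A minimal NumPy sketch (function names are illustrative; in the real model the stop-gradient is a `detach()` in the autograd graph, which plain NumPy cannot express, so it is only noted in the comments):

```python
import numpy as np

def neg_cosine(p, z):
    """Negative cosine similarity, averaged over the batch.
    In SimSiam, z would be detached (stop-gradient) during backprop."""
    p = p / np.linalg.norm(p, axis=1, keepdims=True)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    return -(p * z).sum(axis=1).mean()

def simsiam_loss(p1, p2, z1, z2):
    """Symmetric SimSiam loss: each view's predictor output p is pulled
    toward the other view's (detached) projection z."""
    return 0.5 * neg_cosine(p1, z2) + 0.5 * neg_cosine(p2, z1)

# perfectly aligned views/predictions reach the minimum value of -1
v = np.ones((4, 8))
perfect = simsiam_loss(v, v, v, v)
```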
[BYOL paper review] - Bootstrap your own latent: A new approach to self-supervised Learning *This post analyzes the BYOL paper alongside its code! I hope it helps those getting started with SSL; please leave any questions in the comments. *BYOL is a non-contrastive learning method. BYOL paper: https://arxiv.org/abs/2006.07733 Bootstrap your own latent: A new approach to self-supervised Learning We introduce Bootstrap Your Own Latent (BYOL), a new approach to self-supervised image representation learning. BYOL relies on two neural networks, referred to as online and ..
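The "two neural networks" the BYOL abstract refers to are the online network (trained by gradients) and the target network, whose weights are an exponential moving average (EMA) of the online weights. That update can be sketched in pure Python (the flat weight lists and `tau=0.99` are illustrative; the paper anneals the decay toward 1 over training):

```python
def ema_update(target, online, tau=0.99):
    """BYOL-style target update: target <- tau * target + (1 - tau) * online."""
    return [tau * t + (1 - tau) * o for t, o in zip(target, online)]

# one EMA step: the target moves a small fraction toward the online weights
target_w = [0.0, 1.0]
online_w = [1.0, 1.0]
target_w = ema_update(target_w, online_w)
```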
