Sameer Ambekar
Deep learning | Computer vision
I am a Ph.D. candidate (Doctoral Researcher) in deep learning at the Technical University of Munich (TU Munich) and Helmholtz Munich, supervised by Prof. Dr. Julia Schnabel and mentored by Prof. Dr. Stefan Bauer. I hold a Research Master's in Artificial Intelligence (MSc AI) from the University of Amsterdam (UvA), where I also worked as a Research Intern at the AIM Lab. For my MSc AI thesis, supervised by Prof. Dr. Cees Snoek, Prof. Xiantong Zhen, and Zehao Xiao, I addressed test-time adaptation of classifiers for domain generalization, using meta-learning and variational inference for efficient multi-source training and transferable features.
Before my MSc AI, I worked as a Research Assistant (RA) at IIT Delhi, India, under Prof. Prathosh A.P., on domain adaptation through generative latent search in the VAE latent space. Prior to that, I worked at the Indian Council of Medical Research (ICMR) as a Researcher under Dr. Subarna Roy (Scientist G) and Mr. Pramod Kumar (Scientist C).
Additionally, I serve as a reviewer for top-tier conferences and journals such as NeurIPS, CVPR, ECCV, ICCV, WACV, IEEE Transactions on Neural Networks and Learning Systems, and Elsevier Applied Soft Computing. I also serve as a mentor for the Neuromatch deep learning course.
I'm always interested in new collaborations, so feel free to email me to chat!
If you're at TU Munich and interested in pursuing a guided research project or a Master's thesis, please email me.
Latest News
- Attended the ICVSS Computer Vision Summer School in Sicily, Italy.
- Attended the EEML 2023 Machine Learning Summer School by Google DeepMind.
- Attended the OxML 2023 and 2020 schools by the University of Oxford.
Research Goal
I am interested in test-time adaptation and unsupervised learning: utilizing learned representations and patterns to transfer to unseen domains, even for high-dimensional, streaming batches of data with distribution shifts.
My efforts have focused on problems in test-time adaptation, domain generalization, and domain adaptation, through variational inference, meta-learning, surrogate model updates, and model predictions.
Precise Test-time detection
Sameer Ambekar,
Cosmin I. Bercea,
Daniel Rueckert,
Julia A. Schnabel
Preprint soon
Test-Time Adaptation: Non-Parametric, Backprop-Free and Entirely Feedforward
Sameer Ambekar,
Daniel M. Lang,
Julia A. Schnabel
Preprint soon
Test-time adaptation for different distribution shifts without backpropagation
Sameer Ambekar,
Zehao Xiao,
Xiantong Zhen,
Cees G. M. Snoek
Preprint soon
Probabilistic Test-Time Generalization by Variational Neighbor-Labeling
Sameer Ambekar,
Zehao Xiao,
Jiayi Shen,
Xiantong Zhen,
Cees G. M. Snoek
Accepted at CoLLAs 2024. Also a spotlight paper at the ICLR 2023 DG workshop.
arXiv Link
First, we propose probabilistic pseudo-labeling of target samples to generalize the source-trained model to the target domain at test time. We formulate generalization at test time as a variational inference problem by modeling pseudo labels as distributions, to account for uncertainty during generalization and to alleviate the misleading signal of inaccurate pseudo labels. Second, we learn variational neighbor labels that incorporate the information of neighboring target samples to generate more robust pseudo labels. Third, to learn the ability to incorporate more representative target information and generate more precise and robust variational neighbor labels, we introduce a meta-generalization stage during training to simulate the generalization procedure.
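For intuition, here is a minimal PyTorch sketch of a neighbor-labeled test-time step, using hypothetical encoder/classifier modules. The neighbor-weighted labels below are a point-estimate simplification for illustration, not the paper's variational objective or its meta-generalization stage.

```python
import torch
import torch.nn.functional as F

def neighbor_pseudo_labels(features, logits, temperature=0.1):
    """Soften each target sample's pseudo label with those of its neighbors.

    features: (B, D) encoder features for a test batch
    logits:   (B, C) classifier outputs for the same batch
    Returns:  (B, C) neighbor-averaged label distributions
    """
    probs = logits.softmax(dim=-1)                      # per-sample pseudo-label distributions
    sims = F.cosine_similarity(features.unsqueeze(1),   # (B, B) pairwise feature similarities
                               features.unsqueeze(0), dim=-1)
    weights = (sims / temperature).softmax(dim=-1)      # neighbor weights per sample
    return weights @ probs                              # mix neighboring pseudo labels


def test_time_step(encoder, classifier, x, optimizer):
    """One adaptation step: fit the classifier to its own neighbor labels."""
    feats = encoder(x)
    logits = classifier(feats)
    with torch.no_grad():
        targets = neighbor_pseudo_labels(feats, logits)
    loss = F.kl_div(logits.log_softmax(dim=-1), targets, reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return classifier(encoder(x))                       # predictions after adaptation
```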
Unsupervised Domain Adaptation for Semantic Segmentation of NIR Images through Generative Latent Search
Prashant Pandey,
Aayush Kumar Tyagi,
Sameer Ambekar,
Prathosh AP,
ECCV 2020 [Spotlight - Top 5% of the accepted papers]
project page
/
arXiv
/
ECCV Website
/
code
We cast the skin segmentation problem as target-independent Unsupervised Domain Adaptation (UDA), where we use data from the red channel of the visible range to develop a skin segmentation algorithm for NIR images. We propose a method for target-independent segmentation in which the 'nearest clone' of a target image in the source domain is searched for and used as a proxy in the segmentation network trained only on the source domain. We prove the existence of the 'nearest clone' and propose a method to find it through an optimization algorithm over the latent space of a deep generative model based on variational inference.
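The core idea is a latent-space search, sketched below under simplifying assumptions: the decoder, segmentation network, and latent dimension are hypothetical names, and the plain pixel-space MSE stands in for whatever distance and latent-prior terms the paper actually optimizes.

```python
import torch
import torch.nn.functional as F

def nearest_clone(decoder, target_image, latent_dim=128, steps=500, lr=0.05):
    """Search the source VAE latent space for a 'nearest clone' of a
    target-domain image; the decoded clone then serves as a proxy input
    to the source-only segmentation network.
    """
    z = torch.randn(1, latent_dim, requires_grad=True)   # start from a random latent code
    optimizer = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        clone = decoder(z)                                # decode a source-domain image
        loss = F.mse_loss(clone, target_image)            # distance to the target image
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return decoder(z).detach()

# Hypothetical usage: segment the proxy with the source-trained network.
# proxy = nearest_clone(vae_decoder, nir_image)
# mask = segmentation_net(proxy)
```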
SKDCGN: Source-free Knowledge Distillation of Counterfactual Generative Networks using cGANs
Sameer Ambekar,
Ankit Ankit,
Diego van der Mast,
Mark Alence,
Matteo Tafuro,
Christos Athanasiadis
ECCV 2022 VIPriors workshop - UvA DL2 course project, accepted with 'no edits'
arXiv /
ECCV Website /
code
We propose SKDCGN, which performs knowledge transfer using Knowledge Distillation (KD). In our proposed architecture, each independent mechanism (shape, texture, background) is represented by a student 'TinyGAN' that learns from the pretrained teacher 'BigGAN'. We demonstrate the efficacy of the proposed method on datasets such as ImageNet and MNIST, using KD and appropriate loss functions.
Moreover, as an additional contribution, we conduct a thorough study of the composition mechanism of CGNs to gain a better understanding of how each mechanism influences the classification accuracy of an invariant classifier.
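A minimal sketch of the basic imitation term is shown below, assuming generators callable as generator(z, labels); the full SKDCGN objective combines additional losses, so this is illustrative only.

```python
import torch
import torch.nn.functional as F

def distill_step(teacher, student, z, labels, optimizer):
    """One distillation step for a single mechanism (shape, texture, or
    background): the TinyGAN student imitates the frozen BigGAN teacher's
    output for the same latent code and class label.
    """
    with torch.no_grad():
        teacher_out = teacher(z, labels)            # frozen teacher generation
    student_out = student(z, labels)
    loss = F.l1_loss(student_out, teacher_out)      # pixel-level imitation term
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```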
[Re] Counterfactual Generative Networks
Ankit,
Sameer Ambekar,
Mark Alence,
Baradwaj Varadharajan
MLRC 2021
arXiv
Academic Project, MSc AI, UvA.
Twin Augmented Architectures for Robust Classification of COVID-19 Chest X-Ray Images
Kartikeya Badola,
Sameer Ambekar,
Himanshu Pant,
Sumit Soman,
Anuradha Sura,
Rajiv Narang,
Suresh Chandra,
Jayadeva,
arXiv, 2022
arXiv
We introduce a state-of-the-art technique, termed Twin Augmentation, for modifying popular pre-trained deep learning models. Twin Augmentation boosts the performance of a pre-trained deep neural network without requiring re-training. Experiments show that, across a multitude of classifiers, Twin Augmentation is very effective in boosting the performance of a given pre-trained model for classification in imbalanced settings.
University of Amsterdam, Research Masters in Artificial Intelligence (MSc AI)
September 2021 - June 2023
My MSc thesis (48 ECTS, Grade: Excellent) addressed test-time adaptation for domain generalization with two proposed methods: (i) formulating it as a variational inference problem and meta-learning variational neighbor labels in a probabilistic framework that exploits neighborhood information, and (ii) a surrogate update of the model without backpropagation. Courses enrolled: curriculum-based and research-based.
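To illustrate what "without backpropagation" can look like in practice, here is a minimal sketch of one common backprop-free adaptation strategy: recalibrating normalization statistics on the test batch. This is a generic illustration, not the surrogate-update method proposed in the thesis.

```python
import torch

@torch.no_grad()
def recalibrate_batchnorm(model, test_batch):
    """Backprop-free test-time update: re-estimate BatchNorm statistics
    from the current test batch in a single forward pass; no gradients
    or optimizer are involved.
    """
    was_training = model.training
    model.train()                          # BatchNorm layers use and update batch statistics
    for m in model.modules():
        if isinstance(m, torch.nn.modules.batchnorm._BatchNorm):
            m.reset_running_stats()        # forget the source-domain statistics
            m.momentum = None              # accumulate a plain running average instead
    model(test_batch)                      # forward pass refreshes the running statistics
    model.train(was_training)
    return model
```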
University of Amsterdam, AIM Lab - Research Intern (Deep learning, Computer Vision)
June 2022 - June 2023
Indian Institute of Technology Delhi (IITD), Delhi, India - Research Assistant (Deep learning, Computer Vision)
January 2019 - July 2021
Indian Council of Medical Research (ICMR) NITM Bioinformatics Division, Belgaum, India - Research Trainee
October 2017 - December 2018
DbCom Inc., New Jersey, USA - Remote Intern
June 2015 - December 2016
Recipient of DigiCosme Full Master Scholarship, Université Paris-Saclay, France.
Attended Oxford Machine Learning Summer School (OxML 2020 & OxML 2022), Deep Learning - University of Oxford
Attended PRAIRIE/MIAI PAISS 2021 Machine Learning Summer School - INRIA, Naver Labs
Attended Regularization Methods for Machine Learning 2021 (RegML 2021) - University of Genoa
As a recreational activity, I enjoy playing the Indian flute.
Positions of Responsibility
Served as Charter Secretary & President of the Rotaract Club of GIT