Udbhav Bamba

Amazon Lab126, India

ubamba98@gmail.com

I am an Applied Scientist II at Amazon, where I work at the intersection of large language models and low-latency on-device inference. Beyond my industry role, I actively mentor students and collaborate with researchers. Prior to joining Amazon, I focused on resource-efficient machine learning as an intern at Mila - Quebec AI Institute, and I co-founded Transmute AI Labs to mentor underrepresented students at IIT Dhanbad.

These days I am working on building better mixture-of-experts systems and helping LLMs reason efficiently under strict compute budgets. Beyond academic research, I am an active competitor on Kaggle, where I hold the Competitions Master title with five gold and eight silver medals. Apart from work, I enjoy gaming, binge-watching shows, and playing badminton.

For updated details, please see my Google Scholar / LinkedIn pages.

Selected publications

  1. Partial Binarization of Neural Networks for Budget-Aware Efficient Learning
    Udbhav Bamba, Neeraj Anand, Saksham Aggarwal, and 2 more authors
    In IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2024
  2. ChipNet: Budget-Aware Pruning with Heaviside Continuous Approximations
    Udbhav Bamba, Rishabh Tiwari, Arnav Chavan, and 1 more author
    In International Conference on Learning Representations (ICLR), 2021
  3. An UltraMNIST classification benchmark to train CNNs for very large images
    Udbhav Bamba, Deepak K. Gupta, Abhishek Thakur, and 6 more authors
    In Scientific Data, 2024
  4. Dynamic Kernel Selection for Improved Generalization and Memory Efficiency in Meta-learning
    Arnav Chavan, Rishabh Tiwari, Udbhav Bamba, and 1 more author
    In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022