News
August 2025: Paper on a new KV cache compression technique accepted at NeurIPS 2025.
April 2025: Work on ColBERT.jl accepted as a main talk at JuliaCon 2025.
January 2025: Paper on PruneNet, a novel structured model compression technique, accepted at ICLR 2025.
October 2024: Completed GSoC 2024 project and released v0.1.0 of ColBERT.jl.
June 2024: Paper on fair contextual bandits accepted at FoRLaC@ICML2024.
Research Experience
Worked as a research assistant at the Laboratory for Computational Social Systems (LCS2) at IIT Delhi, led by Dr. Tanmoy Chakraborty, on problems related to LLMs, particularly model and inference efficiency. During his master's, he worked as a research assistant at the Networks and Learning Group at TIFR Mumbai, led by Dr. Abhishek Sinha, on problems at the intersection of learning theory, convex optimization, and online algorithms.
Education
Currently a first-year CS PhD student at the Kahlert School of Computing, University of Utah, advised by Prof. Aditya Bhaskara.
Background
Interested in building the theoretical foundations of large language models, including long-term information retention, reasoning capabilities (e.g., in-context learning), and model compression. Also interested in high-dimensional statistics, online and statistical learning, and privacy-related problems in these areas.
Miscellany
Loves contributing to and exploring open-source software, follows the NBA, and enjoys listening to and playing metal music (one of his favorite bands is Periphery).