Generative AI Engineer in the making || LLMs • Transformers • RAG || CS @ BITS Pilani
Reach out: [email protected] ~ happy to connect on research, projects, or collaborations.
🔭 I’m currently working on:
Exploring the research side of Generative AI, with a focus on transformer architectures, representation learning through embeddings, and retrieval-augmented generation (RAG) pipelines. I work primarily in Python, using PyTorch and the Hugging Face ecosystem for model experimentation and analysis.
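For a flavor of what that experimentation looks like, here is a minimal embedding sketch; the checkpoint name and mean-pooling choice are illustrative assumptions, not a fixed recipe:

```python
# A minimal embedding sketch, assuming a sentence-transformers checkpoint
# (the model name and mean pooling are illustrative choices, not a fixed setup).
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "sentence-transformers/all-MiniLM-L6-v2"  # assumed example model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

def embed(texts):
    """Mean-pool the last hidden state into one vector per input text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (batch, tokens, dim)
    mask = batch["attention_mask"].unsqueeze(-1)       # zero out padding tokens
    summed = (hidden * mask).sum(dim=1)
    return summed / mask.sum(dim=1).clamp(min=1)

vectors = embed(["retrieval-augmented generation", "transformer embeddings"])
print(vectors.shape)  # torch.Size([2, 384]) for this model
```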
👯 I’m looking to collaborate on:
Research-oriented projects in LLMs, generative modeling, and applied ML, especially those involving model behavior analysis, retrieval mechanisms, fine-tuning strategies, or empirical evaluation of generative systems.
🤝 I’m looking for help with:
Deepening my understanding of LLM fine-tuning, rigorous evaluation methodologies, representation quality in embeddings, and research-informed practices for scaling and deploying generative models.
🌱 I’m currently learning:
Advanced transformer internals, fine-tuning and adaptation techniques, RAG system design, and the use of vector indexes and databases (e.g., FAISS) for efficient retrieval and knowledge grounding.
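As a concrete example of the retrieval step, here is a small FAISS sketch with placeholder vectors; a real pipeline would plug in model-generated embeddings and add metadata, persistence, and reranking:

```python
# A minimal retrieval sketch with FAISS (toy data; vectors would normally
# come from an embedding model like the one sketched above).
import faiss
import numpy as np

dim = 384
corpus = np.random.rand(1000, dim).astype("float32")  # stand-in document embeddings
faiss.normalize_L2(corpus)           # normalize so inner product == cosine similarity

index = faiss.IndexFlatIP(dim)       # exact inner-product search
index.add(corpus)

query = np.random.rand(1, dim).astype("float32")
faiss.normalize_L2(query)
scores, ids = index.search(query, 5) # top-5 nearest documents
print(ids[0], scores[0])             # ids of passages to ground the generator on
```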
💬 Ask me about:
Generative AI research, LLMs, transformers, embeddings, RAG, PyTorch, Hugging Face, LangChain, probability and statistics, or algorithmic thinking for ML systems.
⚡ Fun fact:
I enjoy reading research papers end-to-end and writing concise explanations to clarify both the intuition and the limitations of a model.
📍 Background:
Computer Science senior at BITS Pilani with a strong grounding in mathematics, probability, algorithms, and systems; currently working toward a research-driven career in Generative AI.

