I'm completing my MS in Industrial Engineering at the University of Illinois Urbana-Champaign. I have a strong background and interest in optimization, learning theory, reinforcement learning, and data-driven control.
After specializing in the Advanced Analytics concentration, I thought, "Why not teach machines to learn so I can take longer coffee breaks?"
During my studies, I dove into projects involving neural networks and AI—turns out, machines appreciate data even more than I appreciate a good pizza.
Previously, I was at Suraj Industries, collaborating with industry veterans and sharpening my technical skills (and my wit). I worked on diverse projects that strengthened my logical abilities and out-of-the-box thinking (though sometimes the box was just a cleverly disguised bug ;)
• Programming Languages: Python, Java, SQL, R, Bash
• Core Frameworks: PyTorch, Keras, TensorFlow, Transformers
• Transformers & NLP: Hugging Face Transformers, spaCy, NLTK
• Cloud & CI/CD: AWS, Terraform, Docker, Kubernetes, Jenkins
• Distributed Systems: Spark, Hadoop, Hive, Kafka, Flink, ETL, HDFS
• Data Processing & Analysis: Pandas, NumPy, Dask, Apache Arrow, Matplotlib, Seaborn
• Continuous Improvement: Lean Six Sigma, 5S, Root Cause Analysis, FMEA
• Supply Chain & Operations: Supply Chain Management, Inventory & Demand Planning
• Currently working on my Master’s thesis with Prof. Gökçe Dayanikli at UIUC on minimizing systemic risk using multi-population mean field game models.
• Developing a hierarchical multi-population mean field game model aimed at minimizing systemic risk across interconnected banks.
• Implementing major-minor player dynamics and Stackelberg equilibria to reflect the hierarchical influence of local banks, central banks, and the World Bank.
• Using forward-backward stochastic differential equations (FBSDEs) to analyze both Nash and Stackelberg equilibria, identifying conditions that lead to potential failures.
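For readers unfamiliar with the FBSDE approach, here is a generic illustration (the standard single-population mean field game in Pontryagin form, not the thesis's multi-population model): the forward equation tracks a representative agent's state, the backward adjoint equation characterizes the optimal control, and equilibrium is the fixed point where the population distribution coincides with the law of the state.

```latex
% Generic MFG FBSDE sketch (illustrative only; the thesis model adds
% multiple populations and a major-minor/Stackelberg hierarchy)
\begin{aligned}
dX_t &= b\bigl(t, X_t, \mu_t, \alpha_t\bigr)\,dt + \sigma\,dW_t,
  & X_0 &\sim \mu_0, \\
dY_t &= -\,\partial_x H\bigl(t, X_t, \mu_t, Y_t, \alpha_t\bigr)\,dt + Z_t\,dW_t,
  & Y_T &= \partial_x g\bigl(X_T, \mu_T\bigr), \\
\alpha_t &= \operatorname*{arg\,min}_{a}\; H\bigl(t, X_t, \mu_t, Y_t, a\bigr),
  & \mu_t &= \mathcal{L}(X_t).
\end{aligned}
```

Here $H$ is the Hamiltonian, $g$ the terminal cost, and the consistency condition $\mu_t = \mathcal{L}(X_t)$ is what distinguishes a mean field (Nash) equilibrium from a plain stochastic control problem.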
• Supported 450+ undergraduate students across Calculus, Linear Algebra, and Engineering Design courses from Fall 2023 to Spring 2025.
• Led weekly discussion sessions and office hours for MATH 231, MATH 234, MATH 257, and SE 101 — emphasizing conceptual clarity and problem-solving.
• Received an ICES evaluation rating of 4/5, reflecting strong student engagement and support in STEM-heavy sections.
• Clarified complex topics in interactive sessions, including calculus, linear algebra with Python workflows, and 3D modeling for engineering design.
I build systems that don't just work—they learn and evolve. With a solid grasp of software, data, and infrastructure, I love turning complex problems into innovative, scalable solutions that actually make a difference.
I love turning data into actionable insights and intelligent systems. Using Python and frameworks like TensorFlow and PyTorch, I build machine learning models that don't just work—they make an impact. My knack for data processing and algorithms helps me create solutions that handle big data without breaking a sweat. Whether it's training models or deploying them into the wild, I enjoy the whole journey of bringing AI projects to life.
Data doesn't organize itself—but that's where I come in. I design and maintain data pipelines using Python, Java, and Spark that keep the information flowing smoothly. Extracting real-time insights from massive datasets is kind of my thing, and it helps businesses make smarter, data-driven choices.
I take pride in simplifying cloud technologies like AWS, Terraform, Docker, and Kubernetes. Whether it's deploying new services or scaling existing ones, I focus on making complex infrastructures accessible and efficient. Automating tasks and optimizing resources? That's where the magic happens.
I enjoy diving into distributed systems like Hadoop, Spark, Hive, and Kafka to process huge amounts of data. Designing efficient and reliable solutions that keep performance up and handle faults gracefully? That's what I do. Big data in a distributed environment doesn't have to be daunting, and I make sure it isn't.
I love making deployments smooth and hassle-free using CI/CD tools like Jenkins, Ansible, and Git. Keeping everything automated and under version control means fewer headaches and more time for the fun stuff. When it comes to data pipelines, Airflow is my trusty sidekick for keeping things efficient and reliable.
Project implemented as part of a collaboration between OneTick and the UIUC HFT Lab.
Developed a scalable healthcare database system to improve data access and storage efficiency.