Research
My research lies at the intersection of Immersive Visualization, HCI, and AI. I aim to design scalable visualization frameworks that extend the capabilities of reconfigurable display facilities and XR environments. A central focus of my work is integrating AI-driven intelligence to better model and support user behavior, making immersive analytics adaptive, explainable, and collaborative. Alongside systems design, I explore applications of AR for situated visualization, bringing data insights directly into the environments where they matter most.
Publications & Featured Projects
Agentic AI: A Case Study on the Visualization Publications Dataset
Introduces an AI agent that automatically analyzes the VIS publications dataset and generates visualizations.
Explainable XR: Understanding User Behaviors of XR Environments using LLM-assisted Analytics Framework
Presents an LLM-assisted framework for interpreting user behavior and interactions in XR applications.
Improving Developers’ Understanding of Regex Denial of Service Tools through Anti-Patterns and Fix Strategies
Examines how developers interact with Regex DoS detection and mitigation tools, highlighting anti-patterns and effective fix strategies. Published in IEEE S&P 2023.
News & Milestones
Sep 2025: Agentic AI poster accepted at IEEE VIS 2025. Poster
Jul 2025: Submitted a patent on adaptive immersive visualization facilities.
Jun 2025: FlexiCAVE featured in Stony Brook University News. Press Release
Dec 2024: Our paper on leveraging LLMs to understand user behavior and interactions in XR applications was accepted to IEEE TVCG. Stay tuned for the presentation at IEEE VR 2025!
Nov 2024: I passed my Research Proficiency Exam!
May 2024: I switched labs and now work under Arie Kaufman.
May 2023: I presented a poster on our S&P paper at the Computing Research Association's Grad Cohort for Women 2023.
Nov 2022: Our paper on Regex Denial of Service tools was accepted at IEEE S&P 2023.