RESEARCH PROJECTS
Our work is grounded in the learning sciences and draws on theories of collaboration, embodied cognition, and multimodal learning analytics. We approach this work through three interconnected strands: Technology-Mediated STEM Learning, Computer-Supported Collaborative Learning (CSCL), and Applications of Learning Analytics.
Current Projects
Our research is organized around several core projects that span immersive simulations, collaborative learning environments, and learning analytics.
HoloOrbits
HoloOrbits (Campus Research Board #RB22100) is an immersive simulation in which students work together to explore celestial mechanics through embodied interaction and shared reference points. We study how learners coordinate attention, share insights, and solve problems in virtual space, focusing on how features such as synchronized annotations and gesture tracking shape collaborative inquiry. This project also contributes to the NSF-funded INVITE AI Institute, where we examine how group dynamics unfold through spontaneous collaborative initiatives.
CEASAR
Connections of Earth and Sky with Augmented Reality (CEASAR; National Science Foundation #1822796) supports astronomy education through an immersive, interactive learning experience. The system provides access to a star and constellation database and lets learners change location and time to observe celestial objects from three different perspectives. The cross-device system, spanning Microsoft HoloLens headsets and touch-based tablets, synchronizes all user interactions in real time so that annotations, highlighted objects, and navigation inputs are visible on every group member's device. Our team analyzes how multimodal learning behaviors such as gaze, gesture, and annotation align over time to support joint attention and knowledge construction.
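To illustrate the kind of analysis this involves, the minimal sketch below estimates joint visual attention from synchronized, time-stamped gaze logs. The data format, field names, and one-second windowing are assumptions made for the example and do not reflect the actual CEASAR pipeline.

```python
"""Illustrative sketch (not the CEASAR codebase): estimating joint visual
attention from synchronized gaze-target logs."""

from collections import defaultdict

# Each record: (participant_id, timestamp_seconds, gaze_target_label)
gaze_log = [
    ("A", 0.2, "Orion"), ("B", 0.3, "Orion"), ("C", 0.4, "Lyra"),
    ("A", 1.1, "Orion"), ("B", 1.2, "Orion"), ("C", 1.3, "Orion"),
]

def joint_attention_ratio(log, window=1.0):
    """Fraction of time windows in which every participant who produced a
    gaze sample looked at the same target (hypothetical measure)."""
    windows = defaultdict(dict)  # window index -> {participant: last target seen}
    for pid, t, target in log:
        windows[int(t // window)][pid] = target
    shared = sum(
        1 for w in windows.values()
        if len(w) > 1 and len(set(w.values())) == 1
    )
    return shared / len(windows) if windows else 0.0

print(joint_attention_ratio(gaze_log))  # 0.5 for the toy log above
```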
INVITE
As part of the INVITE AI Institute (NSF/IES #222961), we investigate how to detect and support productive collaboration in digital learning environments. Our team develops analytics pipelines that capture how students initiate, respond to, and build on peer ideas. We focus on identifying key interaction patterns, such as initiative uptake and re-engagement, that predict group success, with the goal of designing intelligent supports that enhance collaborative learning.
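As a concrete (and deliberately simplified) illustration, the sketch below flags possible initiative uptake in a small turn-based transcript using a lexical-overlap heuristic; the cue phrases, window size, and threshold are assumptions for the example, not our production pipeline.

```python
"""Illustrative sketch: flagging possible "initiative uptake" when a peer turn
reuses content words from an earlier proposal within a short window."""

PROPOSAL_CUES = ("let's", "we could", "what if", "how about")
WINDOW = 3          # how many following turns to scan for uptake
MIN_OVERLAP = 2     # shared content words required to count as uptake

def content_words(text):
    stop = {"the", "a", "to", "we", "it", "and", "of", "that"}
    return {w.strip(".,?!").lower() for w in text.split()} - stop

def find_uptake(turns):
    """turns: list of (speaker, utterance). Returns (proposal_idx, uptake_idx) pairs."""
    events = []
    for i, (speaker, text) in enumerate(turns):
        if not any(cue in text.lower() for cue in PROPOSAL_CUES):
            continue  # not a proposal-like turn
        proposal = content_words(text)
        for j in range(i + 1, min(i + 1 + WINDOW, len(turns))):
            peer, reply = turns[j]
            if peer != speaker and len(proposal & content_words(reply)) >= MIN_OVERLAP:
                events.append((i, j))
                break
    return events

turns = [
    ("S1", "What if we align the telescope with the celestial equator first?"),
    ("S2", "Good idea, aligning with the celestial equator makes tracking easier."),
    ("S3", "I can annotate the equator on the shared view."),
]
print(find_uptake(turns))  # [(0, 1)]
```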
Inclusive Design for XR in STEM
Our newest project recently received seed funding from the College of Education. It brings together Extended Reality (XR) and Human-Centered Design (HCD) to explore how students, especially women of color, experience and contribute to collaborative learning in immersive environments. Grounded in a design justice framework, the project involves high school students and teachers as co-designers and examines how XR can support inclusive, meaningful engagement with complex STEM concepts. We also explore how multimodal learning analytics and large language models can help analyze collaborative processes while mitigating bias.