Immersive virtual reality (VR) is revolutionising education by offering highly personalised learning experiences. To fully leverage this transformative potential, it is essential to deepen our understanding of the self-regulatory processes that underpin VR learning. Self-regulated learning (SRL) is critical for successful learning: it involves monitoring and controlling one's behaviour, emotions and thoughts, and adapting them to different situations according to one's goals (Winne & Hadwin, 1998). SRL is particularly important in VR environments, as they are often more complex, dynamic and immersive than traditional learning settings, requiring learners to actively oversee their progress and make strategic adjustments as needed (Azevedo & Gašević, 2019). However, identifying the need for self-regulation and effectively regulating one's learning can be a challenge (Winne & Azevedo, 2022). It is crucial, therefore, to detect the moments when the need for regulation is recognised through metacognitive monitoring, the core component of the SRL process.
Our recent study set out to pinpoint these pivotal moments using multimodal data collected during VR learning (Sobocinski et al., 2023). In immersive VR environments, learners physically engage with virtual learning materials and tasks, a unique form of interaction that offers insights into self-regulation not easily observed elsewhere. While research on embodied cognition has explored how bodily movement influences cognition (Johnson-Glenberg, 2018), little is known about the role of movement in self-regulated learning within VR environments.
We harnessed various data sources: bird's-eye-view videos to track movement, screen recordings to capture learners' actions in the virtual environment, physiological metrics such as heart rate variability to index cognitive load, and learners' verbalisations from think-aloud protocols to capture metacognitive monitoring.
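To give a concrete sense of how a heart rate variability indicator can be derived, the sketch below computes RMSSD (root mean square of successive differences), a standard time-domain HRV metric, from a window of inter-beat intervals. This is a minimal illustration only: the study's actual physiological processing pipeline is not detailed here, and the sample intervals and windowing are assumptions made for the example.

```python
import numpy as np

def rmssd(ibi_ms: np.ndarray) -> float:
    """Root mean square of successive differences (RMSSD), a common
    time-domain HRV metric, computed from inter-beat intervals in ms."""
    diffs = np.diff(ibi_ms)              # successive differences between beats
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical inter-beat intervals (ms) from one short recording window
ibi = np.array([812, 790, 805, 830, 798, 815, 802], dtype=float)
print(f"RMSSD: {rmssd(ibi):.1f} ms")
```

In practice, lower HRV values within a window are often interpreted as reflecting higher cognitive load, which is why such metrics are useful for tracing moment-to-moment demands during VR learning.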