Using a 360° virtual reality or 2D video to learn history taking and physical examination skills for undergraduate medical students: Pilot randomized controlled trial

Yi Ping Chao, Hai Hua Chuang, Li Jen Hsin, Chung Jan Kang, Tuan Jen Fang, Hsueh Yu Li, Chung Guei Huang, Terry B.J. Kuo, Cheryl C.H. Yang, Hsin Yih Shyu, Shu Ling Wang, Liang Yu Shyu, Li Ang Lee*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Background: Learning through a 360° virtual reality (VR) or 2D video represents an alternative way to learn complex medical education tasks. However, there is currently no consensus on how best to assess the effects of different learning materials on cognitive load estimates, heart rate variability (HRV), learning outcomes, and learning experience when learning history taking and physical examination (H&P) skills.

Objective: The aim of this study was to investigate how learning materials (ie, 360° VR or 2D video) affect learning outcomes and experience, through changes in cognitive load estimates and HRV, in learning H&P skills.

Methods: This pilot system design study included 32 undergraduate medical students at an academic teaching hospital. The students were randomly assigned, with a 1:1 allocation, to a 360° VR video group or a 2D video group, matched by age, sex, and cognitive style. The two video modules differed in visual angle and degree of self-determination. Learning outcomes were evaluated using the Milestone reporting form. Subjective and objective cognitive loads were estimated using the Paas Cognitive Load Scale, the National Aeronautics and Space Administration Task Load Index, and secondary-task reaction time. Cardiac autonomic function was assessed using HRV measurements. Learning experience was assessed using the AttrakDiff2 questionnaire and qualitative feedback. Statistical significance was accepted at a two-sided P value of <.01.

Results: All 32 participants received the intended intervention. The sample consisted of 20 (63%) males and 12 (38%) females, with a median age of 24 (IQR 23-25) years. The 360° VR video group seemed to achieve a higher Milestone level than the 2D video group (P=.04). The secondary-task reaction time at the 10th minute was significantly longer in the 360° VR video group than in the 2D video group (P<.001). Multiple logistic regression models of the overall cohort showed that the 360° VR video module was independently and positively associated with a reaction time at the 10th minute of ≥3.6 seconds (exp(B)=18.8, 95% CI 3.2-110.8; P=.001) and with a Milestone level of ≥3 (exp(B)=15.0, 95% CI 2.3-99.6; P=.005). However, a reaction time at the 10th minute of ≥3.6 seconds was not related to a Milestone level of ≥3. A low-frequency to high-frequency (LF/HF) ratio between the 5th and 10th minute of ≥1.43 seemed to be inversely associated with a hedonic stimulation score of ≥2.0 (exp(B)=0.14, 95% CI 0.03-0.68; P=.015) after adjusting for video module. The main qualitative feedback indicated that the 360° VR video module was fun but caused mild dizziness, whereas the 2D video module was easy to follow but tedious.

Conclusions: Our preliminary results suggest that 360° VR video learning may be associated with a better Milestone level than 2D video learning, and that this association did not seem to be related to cognitive load estimates or HRV indexes in these novice learners. Of note, an increase in sympathovagal balance may have been associated with a lower hedonic stimulation score, which may reflect how well the different video modules met the learners' needs and prompted learning.
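The HRV index discussed above, the low-frequency to high-frequency (LF/HF) ratio, is a standard frequency-domain measure of sympathovagal balance. As an illustration only (not the authors' analysis pipeline), a minimal sketch of estimating this ratio from a series of RR intervals using Welch's method, with the conventional LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands:

```python
import numpy as np
from scipy.signal import welch

def lf_hf_ratio(rr_ms, fs=4.0):
    """Estimate the LF/HF ratio from RR intervals (in ms).

    The irregularly sampled tachogram is resampled onto an even
    time grid at `fs` Hz, its power spectral density is estimated
    with Welch's method, and power is summed over the conventional
    LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands.
    """
    rr_ms = np.asarray(rr_ms, dtype=float)
    # Beat times (s) are the cumulative sum of the RR intervals
    t = np.cumsum(rr_ms) / 1000.0
    # Resample onto an evenly spaced grid for spectral analysis
    t_even = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = np.interp(t_even, t, rr_ms)
    rr_even -= rr_even.mean()  # remove the DC component
    f, psd = welch(rr_even, fs=fs, nperseg=min(256, len(rr_even)))
    # Equal bin widths, so summing PSD bins gives the same ratio
    lf = psd[(f >= 0.04) & (f < 0.15)].sum()
    hf = psd[(f >= 0.15) & (f < 0.40)].sum()
    return lf / hf

# Synthetic ~5-minute tachogram: mean RR of 800 ms with a strong
# 0.25 Hz (respiratory/HF-band) oscillation plus mild noise
rng = np.random.default_rng(0)
n = 400
t_beat = np.cumsum(np.full(n, 0.8))
rr = 800 + 40 * np.sin(2 * np.pi * 0.25 * t_beat) + rng.normal(0, 5, n)
print(lf_hf_ratio(rr))  # HF-dominant signal, so the ratio is well below 1
```

The `lf_hf_ratio` helper and the 4 Hz resampling rate are assumptions for this sketch; the study's actual HRV processing is not described in the abstract.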

Original language: English
Article number: e13124
Journal: JMIR Serious Games
Issue number: 4
State: Published - Oct 2021


Keywords

  • Cognitive load
  • Heart rate variability
  • Learning outcome
  • Secondary-task reaction time
  • Video learning
  • Virtual reality


