
VR Eyes: Emotions Dataset (VREED)

2021
Virtual reality

VREED (VR Eyes: Emotions Dataset) is a multimodal affective dataset in which emotions were triggered using immersive 360° Video-Based Virtual Environments (360-VEs) delivered via a Virtual Reality (VR) headset. Behavioural signals (eye tracking) and physiological signals (Electrocardiogram (ECG) and Galvanic Skin Response (GSR)) were captured, together with self-reported responses, from healthy participants (n=34) experiencing 360-VEs (n=12, 1-3 min each) selected through focus groups and a pilot trial. The paper (citation below) presents a statistical analysis confirming the validity of the selected 360-VEs in eliciting the desired emotions, and a preliminary machine learning analysis that achieved performance comparable to the state of the art reported in the affective computing literature using non-immersive modalities.

Please cite the paper when using the dataset: Tabbaa, L., Searle, R., Ang, C.S., Bafti, S.B., Hossain, M.M., Intarasirisawat, J., and Glancy, M. 2021. VREED: Virtual Reality Emotion Recognition Dataset Using Eye Tracking & Physiological Measures. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 269.904, doi: https://doi.org/10.1145/3495002.

Notes

The dataset contains the following elements:

Dataset summary: a description of all the content and variables within the dataset.
Stimuli selection: the VE selection process, the pilot trial, and the final list and description of VEs.
Self-reported questionnaires: results of the questionnaires filled in by participants.
Eye-tracking data: pre-processed signals and extracted features.
ECG/GSR data: pre-processed signals and extracted features.
Support documentation: a verbal instructions protocol, sample questionnaires, and the randomisation of VEs per participant.
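To illustrate the kind of features that are typically extracted from ECG and GSR signals like those in this dataset, here is a minimal sketch in plain Python. The specific feature definitions below (mean heart rate, SDNN, a crude skin-conductance-response count) are common in affective computing but are assumptions for illustration; they are not taken from the dataset's own documentation, and the dataset's pre-computed features may differ.

```python
# Hypothetical examples of physiological features; the exact features shipped
# with VREED may differ from these illustrative definitions.

def mean_heart_rate(rr_intervals):
    """Mean heart rate in beats per minute, from R-R intervals in seconds."""
    mean_rr = sum(rr_intervals) / len(rr_intervals)
    return 60.0 / mean_rr

def sdnn(rr_intervals):
    """SDNN: standard deviation of R-R intervals, a common HRV feature."""
    mean_rr = sum(rr_intervals) / len(rr_intervals)
    variance = sum((rr - mean_rr) ** 2 for rr in rr_intervals) / len(rr_intervals)
    return variance ** 0.5

def scr_count(gsr_samples, rise_threshold=0.05):
    """Crude count of skin-conductance responses: sample-to-sample rises
    exceeding a threshold (in microsiemens). Real SCR detection is more
    involved (smoothing, amplitude and recovery criteria)."""
    return sum(
        1
        for prev, cur in zip(gsr_samples, gsr_samples[1:])
        if cur - prev > rise_threshold
    )
```

For example, R-R intervals of one second each correspond to a heart rate of 60 bpm, and a flat GSR trace with two sharp rises yields an SCR count of two.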