VREED (VR Eyes: Emotions Dataset) is a multimodal affective dataset in which emotions were triggered using immersive 360° Video-Based Virtual Environments (360-VEs) delivered via a Virtual Reality (VR) headset. Behavioural signals (eye tracking) and physiological signals (Electrocardiogram (ECG) and Galvanic Skin Response (GSR)) were captured, together with self-reported responses, from healthy participants (n=34) experiencing 360-VEs (n=12, 1-3 min each) selected through focus groups and a pilot trial. The paper (citation below) presents statistical analysis confirming the validity of the selected 360-VEs in eliciting the desired emotions, and preliminary machine learning analysis demonstrating performance comparable to the state of the art reported in the affective computing literature using non-immersive modalities.
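As a rough illustration of how per-trial physiological signals in a dataset like this might be summarised into features for machine learning, here is a minimal Python sketch. The function name, the feature set, and the synthetic GSR trace are illustrative assumptions, not the actual VREED file format or the feature pipeline used in the paper.

```python
# Hypothetical per-trial feature extraction for a GSR (skin conductance)
# signal. The column layout, sampling rate, and feature choices are
# assumptions for illustration only, not the VREED release format.

def gsr_features(samples):
    """Basic statistical features over one GSR trial (values in microsiemens)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return {
        "mean": mean,                          # tonic level
        "std": var ** 0.5,                     # overall variability
        "range": max(samples) - min(samples),  # peak-to-trough spread
    }

# Synthetic stand-in for a 1-minute trial sampled at 4 Hz (240 samples).
trial = [2.0 + 0.1 * (i % 5) for i in range(240)]
feats = gsr_features(trial)
```

In a real pipeline each 1-3 min trial would yield one such feature vector per modality (eye tracking, ECG, GSR), which can then be paired with the participant's self-reported emotion label.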
Please cite the paper when using the dataset: Tabbaa, L., Searle, R., Ang, C.S., Bafti, S.B., Hossain, M.M., Intarasirisawat, J., and Glancy, M. 2021. VREED: Virtual Reality Emotion Recognition Dataset Using Eye Tracking & Physiological Measures. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5, 4, Article 178. doi: https://doi.org/10.1145/3495002.