Emotional impact of video quality: Self-assessment and facial expression recognition

Porcu S.; Atzori L.
2019-01-01

Abstract

As known from everyday contexts of multimedia usage, suddenly occurring quality impairments are capable of causing strong negative emotions in human users. This is particularly the case if the displayed content is highly relevant to current motives and behavioral goals. The present study investigated the effects of visual degradations on the quality perception and emotional state of participants who were exposed to a series of short video clips. After each video playback, participants had to decide whether a certain event had happened in the video. For data collection, subjective measures of quality and emotion were complemented by behavioral measures derived from capturing participants' spontaneous facial expressions. For data analysis, two general approaches were combined. First, a multivariate analysis of variance made it possible to examine the effects of visual degradation factors on perceived quality and subjective emotional dimensions. It mainly revealed that perceived quality and emotional valence were both sensitive to degradation intensity, whereas the impact of degradation length was limited once task-relevant video content had already been obscured. Second, using a machine learning approach, an automatic Video Quality of Experience (VQoE) prediction system based on the recorded facial expressions was derived, demonstrating a strong correlation between facial expressions and perceived quality. In this way, estimates of VQoE might be delivered in an objective, continuous, and concealed manner, diminishing any further need for subjective self-reports.
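The abstract does not detail the feature set or learning algorithm behind the facial-expression-based VQoE predictor. The following is a minimal, hypothetical Python sketch of such a pipeline, assuming per-clip facial action-unit features (for example, as exported by a tool like OpenFace) and a random-forest regressor; the data and parameters below are illustrative placeholders, not the authors' implementation.

# Hypothetical sketch: predict a per-clip quality rating from
# facial-expression features (assumption: per-clip averages of
# action-unit intensities) and report the correlation between
# predicted and rated quality. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Placeholder data: 120 clips x 17 action-unit features, ratings on a 1-5 scale.
X = rng.random((120, 17))                                   # facial-expression features per clip
y = np.clip(1 + 4 * X[:, :3].mean(axis=1)                   # synthetic subjective ratings
            + rng.normal(0, 0.3, 120), 1, 5)

model = RandomForestRegressor(n_estimators=200, random_state=0)
y_pred = cross_val_predict(model, X, y, cv=5)               # cross-validated per-clip estimates

# Correlation between predicted and rated quality, analogous to the
# "strong correlation" the abstract reports (value here is meaningless
# beyond illustrating the evaluation step).
r, _ = pearsonr(y, y_pred)
print(f"Pearson r between predicted and rated quality: {r:.2f}")

In such a setup, the facial-expression features would be captured continuously while the participant watches, which is what allows the abstract's claim of objective, continuous, and concealed VQoE estimation without explicit self-reports.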
Year: 2019
ISBN: 978-1-5386-8212-8

Keywords: Emotion; Facial expression recognition; Machine learning; Quality of Experience; Valence; Video quality

Use this identifier to cite or link to this document: https://hdl.handle.net/11584/340046

Citations (Scopus): 6