Measuring Expressed Emotion

B-Automotive uses a continuous approach (Apparent VAD) to measure expressed emotion, which better fits the real human experience of emotional states. This approach allows emotion regions to be defined and the transitions towards and away from those regions to be measured.

Where appropriate, this continuous representation can be mapped back to a much less exact categorical one, for example excited, calm, or angry.
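As a rough sketch only (the region shape, coordinates, and radius below are illustrative assumptions, not B-Automotive's actual definitions), an emotion region in VAD space and the movement of a trajectory towards it could be modelled like this:

```python
from dataclasses import dataclass
from math import sqrt

@dataclass
class VADPoint:
    valence: float    # how negative .. positive the person is feeling, e.g. -1.0 .. 1.0
    arousal: float    # how passively .. actively engaged they are, e.g. -1.0 .. 1.0
    dominance: float  # how able they feel to deal with the cause of the emotion

@dataclass
class EmotionRegion:
    """A labelled spherical region in VAD space (illustrative shape only)."""
    label: str
    centre: VADPoint
    radius: float

    def distance(self, p: VADPoint) -> float:
        return sqrt((p.valence - self.centre.valence) ** 2
                    + (p.arousal - self.centre.arousal) ** 2
                    + (p.dominance - self.centre.dominance) ** 2)

    def contains(self, p: VADPoint) -> bool:
        return self.distance(p) <= self.radius

# Hypothetical region: high valence and high arousal, roughly "excited".
excited = EmotionRegion("excited", VADPoint(0.7, 0.7, 0.5), radius=0.3)

# A short time series of VAD estimates; a falling distance means the expressed
# emotion is moving towards the region, a rising distance means away from it.
trajectory = [VADPoint(0.1, 0.2, 0.4),
              VADPoint(0.4, 0.5, 0.5),
              VADPoint(0.65, 0.7, 0.5)]
print([round(excited.distance(p), 3) for p in trajectory])
print([excited.contains(p) for p in trajectory])
```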

We use cameras to monitor the facial muscle movements underpinning facial expression, identifying how strongly those muscles are activated. We also determine the direction of eye gaze and the pose of the head.

This brings objective measures, such as the frequency and intensity of facial muscle actions, head actions, and social gaze, to areas that have traditionally been dominated by subjective interpretation.
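For illustration, the per-frame measurements and the summary statistics derived from them might be organised along these lines; the field names, units, and activation threshold are assumptions rather than the actual data model:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class FrameMeasurement:
    """One camera frame's objective measurements (illustrative fields)."""
    timestamp_s: float
    au_intensities: dict[str, float]        # facial action units, 0.0 .. 1.0 activation
    gaze_direction: tuple[float, float]     # yaw and pitch of eye gaze, degrees
    head_pose: tuple[float, float, float]   # yaw, pitch, roll of the head, degrees

def action_frequency(frames: list[FrameMeasurement], au: str,
                     threshold: float = 0.2) -> float:
    """Fraction of frames in which a facial action is active
    (activation above an assumed threshold)."""
    if not frames:
        return 0.0
    active = sum(1 for f in frames if f.au_intensities.get(au, 0.0) > threshold)
    return active / len(frames)

def mean_intensity(frames: list[FrameMeasurement], au: str) -> float:
    """Average activation of a facial action across the recording."""
    if not frames:
        return 0.0
    return mean(f.au_intensities.get(au, 0.0) for f in frames)
```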

Our continuous approach to measuring expressed emotion (VAD) uses machine learning to analyse face and voice data during a number of predetermined tasks. Over time, our software identifies how actively engaged the user is.

We call this Arousal. 

We use the same approach to assess how positive or negative the user is feeling (Valence). 

We are currently deploying a third dimension, Dominance: how able an individual feels to deal with the cause of the emotion.
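A minimal sketch of how per-frame estimates for the three dimensions could be produced is shown below; the feature representations and the model interface are hypothetical, since the underlying machine learning models are not described here:

```python
from dataclasses import dataclass
from typing import Protocol, Sequence

@dataclass
class VADEstimate:
    timestamp_s: float
    valence: float    # how positive or negative the user is feeling
    arousal: float    # how actively engaged the user is
    dominance: float  # how able the user feels to deal with the cause

class VADModel(Protocol):
    """Hypothetical interface for a learned face and voice model."""
    def predict(self, face_features: Sequence[float],
                voice_features: Sequence[float]) -> tuple[float, float, float]: ...

def estimate_over_task(model: VADModel,
                       frames: list[tuple[float, Sequence[float], Sequence[float]]]
                       ) -> list[VADEstimate]:
    """Run the model over synchronised face/voice features captured during one
    predetermined task, producing a continuous VAD time series."""
    estimates: list[VADEstimate] = []
    for timestamp_s, face_features, voice_features in frames:
        v, a, d = model.predict(face_features, voice_features)
        estimates.append(VADEstimate(timestamp_s, v, a, d))
    return estimates
```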

By plotting these three values, with Valence on the x-axis, Arousal on the y-axis, and Dominance on the z-axis giving the plot depth, we can pick a point or collection of points within the three-dimensional space and give it a label.

These Valence and Arousal scores are continuous and take the temporal dynamics of expressed emotion into account.
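One simple way temporal dynamics could be respected (an illustrative assumption on our part, not a description of the production pipeline) is to smooth the raw per-frame scores, for instance with an exponential moving average:

```python
def smooth(scores: list[float], alpha: float = 0.3) -> list[float]:
    """Exponential moving average: each smoothed value blends the new observation
    with the running estimate, damping frame-to-frame noise while preserving
    slower emotional trends."""
    smoothed: list[float] = []
    for score in scores:
        if not smoothed:
            smoothed.append(score)
        else:
            smoothed.append(alpha * score + (1 - alpha) * smoothed[-1])
    return smoothed

raw_arousal = [0.10, 0.15, 0.60, 0.55, 0.58, 0.20]
print(smooth(raw_arousal))
```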

To help understand and communicate Valence and Arousal, Russell's Circumplex Model of Emotions is sometimes used to translate the continuous representation into a discrete emotion or state label, for example excited, calm, or angry. By performing this analysis over time and in a continuous three-dimensional space, we can accommodate many more labels.
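As a rough illustration of that translation step, a continuous point can be assigned the nearest labelled prototype; the prototype coordinates below are loosely inspired by the circumplex layout and are assumptions, not calibrated values:

```python
from math import dist

# Hypothetical prototype coordinates (valence, arousal, dominance) for a few labels.
PROTOTYPES = {
    "excited": (0.7, 0.8, 0.5),
    "calm":    (0.6, -0.6, 0.4),
    "angry":   (-0.7, 0.7, 0.6),
}

def nearest_label(valence: float, arousal: float, dominance: float) -> str:
    """Map a continuous VAD point to the closest discrete emotion label."""
    point = (valence, arousal, dominance)
    return min(PROTOTYPES, key=lambda label: dist(point, PROTOTYPES[label]))

print(nearest_label(0.55, 0.75, 0.5))  # -> "excited"
```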