Intelligent Cockpits

We use in-car cameras to monitor facially expressed behaviour through facial muscle movements (Facial Action Units, or AUs), gaze behaviour, and head pose. Audio information can also be used, but it does not generalise well to noisy, multi-occupant environments like automotive cabins; this is an area we continue to research actively.
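For illustration, the per-frame signals described above could be gathered into a structure like the following. This is a minimal sketch only: the field names, AU count, and units are assumptions, not B-Automotive's actual schema.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FrameObservation:
    """One camera frame's worth of facially expressed behaviour signals.
    Field names, AU count, and units are illustrative assumptions."""
    timestamp_ms: int
    au_intensities: List[float]            # Facial Action Unit intensities, e.g. 17 AUs on a 0-5 scale
    gaze_yaw: float                        # horizontal gaze angle, radians
    gaze_pitch: float                      # vertical gaze angle, radians
    head_pose: Tuple[float, float, float]  # head rotation (yaw, pitch, roll), radians
```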

Expressed emotional and mental states can be represented either as discrete categories or as points on continuous dimensions.

B-Automotive uses a continuous approach (Apparent VAD: valence, arousal, and dominance). This better fits the real human experience of emotional states and avoids having to quantify the fuzzy boundaries and subjective definitions of discrete emotional categories, which vary between individuals.

Where appropriate, this continuous measure of state can be mapped back to a much less exact categorical representation.
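As a rough illustration of such a mapping, the sketch below assigns a continuous VA point to one of four coarse quadrant labels plus a neutral zone. The labels, thresholds, and dead-zone radius are illustrative assumptions; any production mapping would be more nuanced.

```python
def va_to_category(valence: float, arousal: float, neutral_radius: float = 0.2) -> str:
    """Map a continuous (valence, arousal) point, each in [-1, 1], to a coarse
    categorical label. Labels and thresholds are illustrative assumptions."""
    # Points near the origin carry little emotional signal: call them neutral
    if valence**2 + arousal**2 < neutral_radius**2:
        return "neutral"
    if valence >= 0:
        return "excited" if arousal >= 0 else "content"
    return "frustrated" if arousal >= 0 else "sad"

# Example: a mildly positive, highly engaged occupant
print(va_to_category(0.4, 0.7))  # -> "excited"
```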

B-Automotive predicts valence (how positive or negative the occupant is feeling) and arousal (how actively engaged the occupant is) scores using a deep-learning-based temporal model. Together, these two scores define a 2D VA space that covers all possible underlying emotional states.
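A minimal sketch of what such a temporal model could look like, assuming PyTorch, a GRU backbone, and per-frame features built from AU intensities, gaze, and head pose. The architecture, feature dimension, and window length are assumptions for illustration, not the production model.

```python
import torch
import torch.nn as nn

class TemporalVAPredictor(nn.Module):
    """Predict valence and arousal from a window of per-frame features
    (e.g. AU intensities + gaze + head pose). Sizes are assumptions."""
    def __init__(self, feature_dim: int = 22, hidden_dim: int = 64):
        super().__init__()
        self.gru = nn.GRU(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 2)  # outputs: (valence, arousal)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, feature_dim)
        _, h_n = self.gru(frames)
        # tanh keeps both scores in [-1, 1], matching the 2D VA space
        return torch.tanh(self.head(h_n[-1]))

# Example: one 3-second window at 10 fps, 22 features per frame
model = TemporalVAPredictor()
window = torch.randn(1, 30, 22)
valence, arousal = model(window)[0]
```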

You can incorporate this information into your vehicle’s on-board computing systems (e.g. via the CAN bus) and fine-tune the car’s environment: adjusting the temperature and air conditioning, seat position, in-car entertainment, lighting, and ride based on real-time passenger feedback, allowing each journey to be customised to your passengers’ preferences.
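As a sketch of how VA scores might drive cabin adaptation, the rule-based example below reacts to a stressed or low-energy occupant. The thresholds, actions, and the `can_bus.send` interface are hypothetical, not a real CAN API.

```python
def adapt_cabin(valence: float, arousal: float, can_bus) -> None:
    """Illustrative rule-based cabin adaptation from VA scores.
    Thresholds, actions, and the can_bus interface are assumptions."""
    if valence < -0.3 and arousal > 0.5:
        # Occupant appears stressed: soften lighting, calm the audio
        can_bus.send("lighting", brightness=0.3, hue="warm")
        can_bus.send("media", playlist="calm")
    elif valence < -0.3 and arousal < -0.3:
        # Occupant appears low-energy: brighten the cabin, perk up the audio
        can_bus.send("lighting", brightness=0.8, hue="cool")
        can_bus.send("media", playlist="upbeat")
    # Otherwise leave the current settings alone

# Example with a stub bus that just prints the commands it would send
class StubBus:
    def send(self, channel, **params):
        print(channel, params)

adapt_cabin(-0.5, 0.7, StubBus())
```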