What Is Expressive Behaviour Analysis?

If you’ve visited BLUESKEYE AI’s website, or seen any of our marketing material, you might well know that Blueskeye uses machine learning to OBJECTIVELY MEASURE social, emotional, and medically relevant expressed behaviour, BETTER than ever before.

But what exactly do we mean when we say expressed behaviour? Is there behaviour other than expressed behaviour? And what makes expressed behaviour social, emotional, or medically relevant?

In this article, I will answer these questions and explain exactly what we mean by expressed behaviour.

Our remarkable face

Our face is probably the most used and most versatile interactive interface you will ever encounter. It is no exaggeration to say that the face is our main means of interaction with the world around us, both in terms of sensing and in terms of signalling. It harbours all five senses and is the exclusive site of four of them: smell, taste, vision, and hearing. Our sensing apparatus shapes our face and head, with the eyes, nose, and ears prominently in view. Our lips are partly shaped the way they are to accommodate the senses of touch and temperature: plump, protruding, with thin skin and plenty of blood flowing through.

Facial expressive behaviour

Equally important is the face’s ability to signal a wide variety of information. Besides housing your speech apparatus, the face is our most important mode of non-verbal communication, with a wealth of communicative signals coming from its facial displays, head movements and gaze directions. It is therefore only natural that humans have been studying facial expressions since the ancient Greeks, and possibly even before that. Some of this signalling is done consciously, some unconsciously. Some of the states that cause the signals are transient, such as emotion, others are permanent, such as identity.

Voice expressive behaviour

The voice is another great way of expressing ourselves in a way that others can clearly pick up. We split what’s expressed by the voice into two categories:


  1. Verbal signals (WHAT is said), and

  2. Non-verbal signals (HOW it’s said).


Both are important, and you cannot fully rely on just one category. For example, if someone says ‘Yeah, because you’re really good at that’, then based solely on the verbal signals you would assess that this is a positive statement. However, if this was said in a really sarcastic tone, the non-verbal signals would indicate that the statement was not serious. If the statement is accompanied by an angry face, then you can be pretty sure that the opposite was meant.

What expressive behaviour does BLUESKEYE AI measure?

Blueskeye measures expressive behaviour in units called ‘Behaviour primitives’. They stand out for being short in duration, clearly understood by people, unambiguous, and objective. These include the intensity of facial muscle actions, gaze direction, and the speed at which someone is uttering words, among others.
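To make the idea concrete, here is a minimal sketch of how such primitives might be represented in code. The class and field names are purely illustrative assumptions, not BLUESKEYE AI’s actual API; they simply capture the properties described above: short in duration, unambiguous, and objective.

```python
from dataclasses import dataclass

# Hypothetical representation of a behaviour primitive: a short,
# timestamped, objective measurement with an explicit unit.
@dataclass
class BehaviourPrimitive:
    name: str        # e.g. "AU4_intensity", "gaze_yaw", "speech_rate"
    value: float     # the measured value, in the primitive's unit
    unit: str        # e.g. "0-5 intensity", "degrees", "words/sec"
    t_start: float   # start of the measurement window (seconds)
    t_end: float     # end of the measurement window (seconds)

# A few example measurements from one moment of a recording.
frame = [
    BehaviourPrimitive("AU4_intensity", 2.3, "0-5 intensity", 10.0, 10.033),
    BehaviourPrimitive("gaze_yaw", -12.5, "degrees", 10.0, 10.033),
    BehaviourPrimitive("speech_rate", 3.1, "words/sec", 8.0, 11.0),
]

for p in frame:
    print(f"{p.name}: {p.value} {p.unit}")
```

Note how each primitive is directly observable and carries its own unit; higher-level interpretation (emotion, medical relevance) is deliberately left out of this layer.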

Below is a list of the behaviour primitives we can measure:

Behaviour primitives of expressive behaviour measured by BLUESKEYE AI

NB: Not all behaviour primitives are included in all our products. For details please refer to the technical specifications of each product.

Other measurable behaviour

Beyond using your face, voice, and body, there are other patterns of behaviour that can be used to express yourself. Writing and painting come to mind, for example.

There are other things you can measure about a person that aren’t normally used for expression, but which can still tell us a lot about them: either (mental) states they are temporarily in, or individual traits that stay with them for the better part of their lives. A great many non-expressive behaviours provide evidence about a person’s likely state or traits, including such diverse measures as how fast you type, what websites you frequent, and how much time you spend at home.

Two groups of measurements are particularly worth mentioning due to their strong relation to emotion and medical conditions: biophysical and biochemical measures.

Biophysical measures quantify mechanical and conductance properties of the body that aren’t already captured by the expressive behaviour categories of the face and body. The most used measures are heart rate, heart rate variability, breathing rate, and electrodermal activity.
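As an illustration of what these measures look like in practice, here is a short sketch (not BLUESKEYE AI code) computing heart rate and one common heart-rate-variability statistic, RMSSD, from a list of RR intervals, the times between successive heartbeats.

```python
import math

def heart_rate_bpm(rr_intervals):
    """Mean heart rate in beats per minute from RR intervals in seconds."""
    mean_rr = sum(rr_intervals) / len(rr_intervals)
    return 60.0 / mean_rr

def rmssd_ms(rr_intervals):
    """Root mean square of successive RR differences, in milliseconds."""
    diffs = [(b - a) * 1000.0 for a, b in zip(rr_intervals, rr_intervals[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Synthetic RR intervals (seconds) standing in for real sensor data.
rr = [0.80, 0.82, 0.78, 0.85, 0.79]
print(f"HR:    {heart_rate_bpm(rr):.1f} bpm")
print(f"RMSSD: {rmssd_ms(rr):.1f} ms")
```

RMSSD is only one of several HRV statistics in common use; the point is simply that these biophysical measures reduce to objective numbers, just like behaviour primitives.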

Biochemical measures quantify chemical concentrations in different parts of the body. Good examples are hormones such as cortisol, neurotransmitters such as dopamine and serotonin, blood alcohol levels, or blood oxygen levels.

Some biophysical measures can be derived from the face using computer vision, such as heart rate using remote photoplethysmography (rPPG).
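The core idea behind rPPG can be sketched in a few lines (these are assumptions for illustration, not BLUESKEYE AI’s implementation): subtle periodic colour changes in facial skin, driven by blood flow, show up as a dominant frequency in the mean green-channel signal of a face region. Here we stand in for that signal with a noisy sine wave at 1.2 Hz, i.e. 72 beats per minute.

```python
import numpy as np

fps = 30.0                       # camera frame rate
t = np.arange(0, 20, 1 / fps)    # 20 seconds of "video"
rng = np.random.default_rng(0)

# Fake mean green-channel trace: 1.2 Hz pulse component plus noise.
signal = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)

# Find the dominant frequency within the plausible heart-rate band
# (0.7-4.0 Hz, roughly 42-240 bpm).
freqs = np.fft.rfftfreq(t.size, d=1 / fps)
power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
band = (freqs >= 0.7) & (freqs <= 4.0)
hr_hz = freqs[band][np.argmax(power[band])]
print(f"Estimated heart rate: {hr_hz * 60:.0f} bpm")
```

Real rPPG pipelines add face detection, skin-region selection, motion compensation, and more robust spectral estimation, but the principle is the same: a camera-only, non-contact route to a biophysical measure.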

Social, emotional, and medically relevant expressive behaviour

Once you have measured exactly what behaviour was expressed, you can start answering a much more meaningful question: WHY did the person in question express themselves that way? This causal inference is what drives the real value. There are (at least) three different explanations for why someone expressed themselves:


  1. For the purpose of social signalling

  2. Caused by an underlying emotion

  3. Caused by a medical condition


As you can imagine, it doesn’t really matter that you recognise that someone has lowered their eyebrows in a clear frown (FACS AU4). What matters is whether that was done to signal that they didn’t understand what a robot is showing, that they are feeling sad, or that they’re feeling acute pain. All three are possible in principle. To distinguish between these possibilities, BLUESKEYE AI combines multiple expressive behaviour signals over a period of time using its award-winning, proprietary technology.

Traits and states

Another way to think about a person’s expressive behaviour is what it says about traits and states. The difference lies in their temporal nature: traits are (almost) permanent, whilst states vary over time. Good examples of traits are someone’s personality as expressed by e.g. conscientiousness or extraversion. Good examples of states are emotion, fatigue, or depression.

Most traits and states influence your expressive behaviour, but they do so in a variety of ways. Some are more clearly picked up from facial behaviour, some from pupillometry, and others from speech. Some states can be determined from as little as a single three-second recording of behaviour (e.g. apparent emotion), whilst others require multiple longer recordings spread out over a number of days to derive a clear indication (e.g. the effectiveness of a treatment for depression).

How we help you to measure behaviour

BLUESKEYE AI only produces software, not hardware. Our SDKs and software services measure expressive behaviour using off-the-shelf, consumer-grade, non-invasive sensors such as your mobile phone camera and microphone.

BLUESKEYE AI technology can interface well with data collected from wearables such as smart watches or the GPS on your mobile phone.

BLUESKEYE AI technology is developed using privacy by design, security by design, and a host of other ethical AI principles.

Blueskeye has specific products for the Automotive, Health, and Social Robotics sectors. Follow the link for each to learn more:


  • B-Automotive SDK for in-cabin occupant monitoring to improve comfort, health, and safety

  • B-Healthy Platform to support clinical trials with objective measurement of digital endpoints

  • B-Social SDK to imbue your robot with the ability to see users and improve empathy

