What is Affective Computing?

Technology plays a crucial role in our daily lives. From smartphones to home appliances, it has made our lives more convenient and efficient. Now, through the field of affective computing, sometimes known as emotion AI, technology is also making it possible to understand and interpret human emotions.

Affective computing is the study and development of systems and devices that can recognise, interpret, process, and simulate human emotions. This exciting and increasingly popular field combines computer science, psychology, and neuroscience to create systems that detect human emotions in real time and respond appropriately, in a more human-like way.

Affective computing makes use of technologies such as natural language processing, computer vision, and machine learning to analyse speech, text, and facial expressions, and combines their outputs to infer the emotion behind a person’s behaviour. The aim is to enable computers to interact with humans in a more human-like way. Demand for this can be found across many industries, including healthcare, automotive and other safety-critical sectors, as well as entertainment, marketing, and customer service.
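
To make the idea of combining modalities concrete, here is a minimal sketch of one common pattern, late fusion: each modality (face, voice, text) produces its own probability distribution over a handful of emotion labels, and a confidence-weighted average combines them into a single estimate. The labels, weights, and scores below are invented for the example; this illustrates a generic technique, not any particular production pipeline.

```python
# Late fusion sketch: combine per-modality emotion estimates.
# All labels, scores, and weights are illustrative assumptions.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse(modality_scores, weights):
    """Confidence-weighted average of per-modality emotion distributions.

    modality_scores: {modality: {emotion: probability}}
    weights:         {modality: relative confidence}
    """
    total = sum(weights[m] for m in modality_scores)
    return {
        emotion: sum(
            weights[m] * scores.get(emotion, 0.0)
            for m, scores in modality_scores.items()
        ) / total
        for emotion in EMOTIONS
    }

# Example: the face and voice disagree with the words being spoken.
scores = {
    "face":  {"happy": 0.1, "sad": 0.6, "angry": 0.1, "neutral": 0.2},
    "voice": {"happy": 0.2, "sad": 0.5, "angry": 0.1, "neutral": 0.2},
    "text":  {"happy": 0.7, "sad": 0.1, "angry": 0.0, "neutral": 0.2},
}
weights = {"face": 0.5, "voice": 0.3, "text": 0.2}

fused = fuse(scores, weights)
print(max(fused, key=fused.get))  # "sad": face and voice outweigh the words
```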

Our primary aim at BlueSkeye AI is to use machine learning to automatically analyse face and voice data and interpret medically relevant expressed behaviour to help clinicians, patients and their friends and families assess, treat, and monitor health, mood, and mental state. 

This is desperately needed. According to the World Health Organisation, a billion people worldwide are living with mental illness.

We use cameras, for example in a smartphone, to monitor the facial muscle movements underpinning facial expression, identifying how strongly those muscles are activated. Our software also determines the direction of eye gaze and the pose of the head. This brings objective measures, such as the frequency and intensity of facial muscle actions, head actions, and social gaze, to areas that have traditionally been dominated by subjective interpretation. Combined with analysis of the voice during a number of predetermined tasks, our software identifies over time how actively engaged the user is. We call this Arousal.
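
As a rough illustration of what such objective measures might look like in code, the sketch below takes a series of per-frame action unit (AU) intensity estimates and computes an activation frequency and mean intensity. The AU numbering follows the Facial Action Coding System (AU12 is the lip corner puller involved in smiling); the threshold, frame rate, and data are assumptions made for the example, not our production parameters.

```python
# Turn per-frame facial action unit (AU) intensities into objective measures:
# how often the action occurs (events per minute) and how strong it is.
# Threshold and example data are illustrative assumptions.

def activation_stats(intensities, fps, threshold=0.2):
    """Frequency (activations/minute) and mean intensity of a facial action."""
    active = [x for x in intensities if x >= threshold]
    # Count rising edges: frames where the action switches from off to on.
    events = sum(
        1 for prev, cur in zip([0.0] + intensities[:-1], intensities)
        if prev < threshold <= cur
    )
    minutes = len(intensities) / fps / 60
    frequency = events / minutes if minutes else 0.0
    mean_intensity = sum(active) / len(active) if active else 0.0
    return frequency, mean_intensity

# Ten seconds of made-up AU12 (smile) intensities sampled at 30 fps:
# two smiles of differing strength separated by neutral stretches.
au12 = [0.0] * 60 + [0.5] * 90 + [0.0] * 60 + [0.8] * 30 + [0.0] * 60
freq, mean = activation_stats(au12, fps=30)
print(f"AU12: {freq:.1f} activations/min, mean intensity {mean:.2f}")
```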

We use the same approach to assess how positive or negative the user is feeling (Valence) and how able they feel to deal with the cause of the emotion (Dominance).

By plotting these three values, with Valence on the x-axis, Arousal on the y-axis, and Dominance as the depth of the plot, our software can pick a point, or collection of points, within the three-dimensional space and give it a label. Typically we use the labels commonly applied to emotion, for example excited, calm, or angry. But because the analysis runs over time and in a continuous three-dimensional space, it can accommodate many more labels.
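
To make the labelling step concrete, the sketch below assigns a label to a point in valence-arousal-dominance space by finding its nearest emotion “prototype”. The prototype coordinates are illustrative assumptions, not our calibrated values, and each axis is assumed to run from -1 to 1.

```python
import math

# Hypothetical prototype positions in (valence, arousal, dominance) space;
# a real system would calibrate these against annotated data.
PROTOTYPES = {
    "excited": ( 0.7,  0.7,  0.5),
    "calm":    ( 0.6, -0.6,  0.4),
    "angry":   (-0.6,  0.7,  0.3),
    "sad":     (-0.6, -0.5, -0.4),
}

def label(valence, arousal, dominance):
    """Return the emotion prototype closest to the given VAD point."""
    point = (valence, arousal, dominance)
    return min(PROTOTYPES, key=lambda name: math.dist(point, PROTOTYPES[name]))

print(label(0.5, 0.8, 0.4))     # near the "excited" prototype
print(label(-0.4, -0.6, -0.2))  # near the "sad" prototype
```

Because the underlying space is continuous, more prototypes, or a trajectory of points tracked over time, can be accommodated without changing the approach.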

Our technology is currently in use in a smartphone app, Avocado, which has helped 150,000 mums-to-be use interactive tasks, such as recording a pregnancy diary, singing a nursery rhyme, or reading a book out loud to their baby, to automatically assess and respond to mood changes during their pregnancy and postpartum journeys.

In the automotive industry we are working with a global car manufacturer to enable in-cabin sensing. In addition to safety monitoring, BlueSkeye AI technology offers automotive manufacturers the opportunity to personalise the driver’s and passengers’ in-car and journey experience. By continuously monitoring facial expressions, facial muscle actions, the direction and context of occupants’ gaze, body pose, and tone of voice, our technology can detect micro differences in behaviour, and these insights are used to trigger interventions in the vehicle cabin.

Using our technology, smarter cars gather and respond to these cues, changing the in-cabin environment or the ride itself and constantly adapting the in-car experience to maintain or improve the driver’s and passengers’ mood, allowing them to arrive at their destination relaxed and happy.

We’ve also been working with a global FMCG company’s researchers to enable reliable, repeatable scientific measurement of pain while a consumer performs their daily personal care routine during and after using the company’s products. Previously this pain data was only available subjectively through consumer reporting and could not be measured longitudinally. BlueSkeye has helped the company capture quantitative, measurable pain data to guide product decision-making as part of its consumer focus group activities.

Our technology has even helped film makers adapt their storylines to fit their audience’s emotional response. In his 2023 film Before We Disappear, filmmaker and researcher Richard Ramchurn used an ordinary computer camera and BlueSkeye software to read emotional cues and instruct a real-time edit of the film. Cinema audiences tend not to exhibit strong emotion when watching a film, but our software was sensitive enough to pick up the small variations and emotional cues needed to adapt the film to viewer reactions.

In the future, affective computing could inform the development of empathetic social robots designed to meet the care needs of an ageing population.

While affective computing has many potential benefits, it also raises privacy and ethical concerns. For example, using technology to monitor and track a person’s emotions raises questions about the protection of personal data and the potential for misuse of this information.

At BlueSkeye we created our technology with privacy included by design. Data collection and storage are minimised wherever practical, and we process all data on people’s own devices, without using the cloud. Users choose who they share their data with and when, always with end-to-end encryption.

And as with any machine learning, the outputs of a model are driven by the quality and biases of the input data. Our AI models are also designed to be interpretable and transparent, with predictions based on readily verifiable data, resulting in an AI system whose outputs can be checked independently. We continuously update and test our technology to account for differences in expressed gender and ethnicity.

In conclusion, affective computing is a rapidly growing field that has the potential to revolutionise the way we interact with technology. By understanding and processing human emotions, we can create more intuitive and personalised technology that improves our lives and provides new solutions to emotional and mental health problems. 

If you would like to know more about what we do and how we do it, please get in touch. 
