Yes, you can use BLUESKEYE AI to measure apparent emotion: the EU AI Act's restrictions do not prohibit it.

Using AI to measure emotion is something to be taken very seriously.

In fact, the EU AI Act restricts the use of AI that aims to infer what emotion a person actually experiences, in order to protect workers and students from employers and teachers who might try to control them by measuring their deepest feelings. This is a worthy goal that BLUESKEYE AI fully supports. The authors of the EU AI Act also argue that current AI systems are not yet able to infer what emotion a person really experiences. On this point, I think they are some way off the mark. Based on years of research in this area, it is clear that automatically measuring felt emotion is indeed very hard; but I am convinced that with a proper multimodal approach that goes beyond face and voice analysis, it is possible. See the excellent work of Jonathan Gratch of USC ICT on this topic.

Luckily, this concern does not apply to BLUESKEYE's technology, as we do not infer what a person experiences.

Still, the question we often get is: "Is it OK to use BLUESKEYE AI to measure emotion, given the EU AI Act's heavy-handed rhetoric on the topic of measuring emotion?"

The simple answer is: yes, you can use the apparent emotion measurement of the B-Automotive, B-Healthy, or B-Social SDKs, integrate it into your products, or use its outputs in your research. BLUESKEYE AI is fully compliant with the EU AI Act, and follows ISO/IEC 42001 and other relevant standards to demonstrate that.

What Does BLUESKEYE AI Measure About Emotion Then?

BLUESKEYE AI measures facial muscle actions, tone of voice, gaze direction, and other expressive behaviour: signals that any person can readily observe, and so can verify for themselves that BLUESKEYE's technology measures accurately. The EU AI Act explicitly permits this, stating specifically that recognising smiles and other expressive behaviour is allowed and should not be confused with recognising felt emotion.

Based on this detected expressive behaviour, we then infer how someone observing that behaviour would interpret it in terms of emotion. We call this apparent emotion, and assessing someone's apparent emotion is what you and I constantly do when we interact with others to decide how to proceed with an interaction. Crucially, this is not about reading what someone feels, but what someone is communicating to you. You are meant to read these expressions; in fact, not doing so greatly impedes natural interaction. Like measuring smiles, apparent emotion recognition is not prohibited by the EU AI Act, so you are entirely allowed to use it.
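To make the distinction concrete, here is a minimal sketch of the idea: observable expressive behaviour (here, facial action unit intensities from the FACS coding scheme) is mapped to an observer-style interpretation, not to a claim about inner feeling. All function names, thresholds, and the mapping itself are illustrative assumptions, not the real SDK API.

```python
# Hypothetical sketch: from detected expressive behaviour to an
# "apparent emotion" reading, i.e. how an observer would likely
# interpret it. Not the BLUESKEYE SDK API; names are made up.

def apparent_emotion(action_units: dict) -> str:
    """Interpret facial action unit intensities (0.0 to 1.0).

    FACS action units used here:
      AU4  = brow lowerer
      AU6  = cheek raiser
      AU12 = lip corner puller
    """
    au4 = action_units.get("AU4", 0.0)
    au6 = action_units.get("AU6", 0.0)
    au12 = action_units.get("AU12", 0.0)

    # Cheek raise plus lip corner pull reads as a genuine-looking smile.
    if au6 > 0.5 and au12 > 0.5:
        return "appears happy"
    # Strongly lowered brows read as displeasure to an observer.
    if au4 > 0.5:
        return "appears displeased"
    return "appears neutral"

print(apparent_emotion({"AU6": 0.8, "AU12": 0.9}))  # appears happy
```

Note that the output is deliberately phrased as "appears": the system reports what the behaviour communicates, which may or may not match what the person feels.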

If you, as a user of BLUESKEYE's SDKs, want to go beyond the apparent emotion recognition we provide and decide to infer a person's felt emotion, for example with additional sensor data, you still have broad options for how to use this in the EU. The only settings where you are not allowed to deploy felt emotion recognition are the workplace and education, and even in those two settings it is still allowed for safety or health reasons. As you can see, statements like 'measuring emotion is illegal in the EU' are not just misleading, they are incorrect.

This is not to say that the EU AI Act doesn't affect how you use BLUESKEYE's SDKs in your own products. Apparent emotion recognition, expressive behaviour detection, and the face re-identification provided by BLUESKEYE AI are high-risk applications of AI in the terminology of the EU AI Act. What this means in practice is that BLUESKEYE has had to develop its technology to comply with the ISO/IEC 42001 standard, which includes, among other things, creating an AI Impact Assessment, setting up an AI Council and an AI working group, maintaining risk registers, and assessing every new feature development for risk. Using BLUESKEYE's technology does mean that you must also show that your product adheres to the EU AI Act; we can provide you with the technical files to demonstrate that our part of your product is fully compliant.

Measuring social, emotional, and medically relevant expressive behaviour is a very powerful tool that will significantly improve the lives of people if applied correctly. The EU AI Act is there to protect consumers and ensure this technology is used for good, something that everyone at BLUESKEYE AI is fully behind.

Want to learn more? Contact me on LinkedIn or request a demo.
