B-Automotive will revolutionise the passenger experience, making the car of the future safer, smarter and more responsive to passenger mood, wellbeing and health.
B-Automotive uses machine learning to objectively and automatically analyse face and voice data, interpreting medically relevant expressive behaviour to assist in assessing, monitoring and responding to health, mood and mental state.
Using B-Automotive’s interactive, AI-driven face and voice analysis modules, you will be able to:
Meet current and planned international safety monitoring requirements
Personalise each passenger’s journey experience by responding to both their mood and their health
Detect the early signs of degenerative illness
B-Automotive requires one Armv8 (or later) SoC and one 1920 × 1080 (2 MP) NIR camera with an IR illuminator per occupant. Each camera must be positioned 0.4–0.8 m from, and within ±48° of, the centre of the scanned occupant’s eye line.
A GPU is optional and will be used if available.
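The mounting envelope above (0.4–0.8 m distance, within ±48° of the occupant’s eye-line centre) can be sketched as a simple validation check. This is an illustrative example only; the names `CameraPlacement` and `placement_ok` are hypothetical and not part of any B-Automotive SDK.

```python
from dataclasses import dataclass

@dataclass
class CameraPlacement:
    distance_m: float   # distance from the occupant's eye line, in metres
    angle_deg: float    # offset from the eye-line centre, in degrees

def placement_ok(p: CameraPlacement) -> bool:
    """Check a camera position against the stated mounting envelope:
    0.4-0.8 m from the eye line, within +/-48 degrees of its centre."""
    return 0.4 <= p.distance_m <= 0.8 and abs(p.angle_deg) <= 48.0

# A camera 0.6 m away, offset 30 degrees, is inside the envelope;
# one at 0.9 m is too far away.
print(placement_ok(CameraPlacement(0.6, 30.0)))
print(placement_ok(CameraPlacement(0.9, 10.0)))
```

A check like this could run per occupant at installation or calibration time, since the requirement applies to each camera independently.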