Melexis, a global provider of automotive semiconductors, and emotion3D, a supplier of camera-based automotive in-cabin analysis software, have joined forces to offer a unique 3D Time-of-Flight (ToF) demonstrator.
This solution combines the driver-monitoring system (DMS) with high-precision 3D driver localization, to dynamically align augmented reality head-up displays (AR HUD) objects.
Melexis and emotion3D’s novel DMS covers all core functions, such as driver drowsiness and attention warning, conforming to the EU’s General Safety Regulation and Euro NCAP’s testing protocols. In addition, the demonstrator provides the 3D locations of the driver’s facial landmarks.
These are essential for an optimal AR HUD user experience: the objects projected by the HUD require precise alignment with real-world objects, tracking the driver’s dynamic position changes.
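The alignment problem described above is essentially parallax correction: as the driver's eye position moves, an overlay on the HUD's virtual image plane must shift so it stays visually locked onto the real-world target. The sketch below is a simplified, hypothetical illustration of that geometry; the coordinate frame, plane distance, and function names are assumptions for illustration, not part of the Melexis/emotion3D system.

```python
import numpy as np

# Illustrative sketch only: parallax correction for an AR HUD overlay.
# Assumed frame: x right, y up, z forward (metres), origin at a nominal
# eye reference point. The HUD virtual image plane distance is a guess.
HUD_PLANE_Z = 2.5  # virtual image plane 2.5 m ahead of the driver (assumed)

def hud_overlay_point(eye_xyz, target_xyz, plane_z=HUD_PLANE_Z):
    """Intersect the eye->target ray with the HUD virtual image plane.

    Returns the 3D point on the plane where the overlay must be drawn
    so that it visually coincides with the real-world target.
    """
    eye = np.asarray(eye_xyz, dtype=float)
    target = np.asarray(target_xyz, dtype=float)
    direction = target - eye
    if direction[2] <= 0:
        raise ValueError("target must be in front of the eye")
    t = (plane_z - eye[2]) / direction[2]  # ray parameter at the plane
    return eye + t * direction             # point on the HUD plane

# If the driver's head shifts 5 cm to the right, the overlay point
# shifts too, keeping the graphic aligned with the distant target:
p0 = hud_overlay_point([0.0, 0.0, 0.0], [1.0, 0.5, 50.0])
p1 = hud_overlay_point([0.05, 0.0, 0.0], [1.0, 0.5, 50.0])
```

This is why the system needs the driver's 3D eye position, not just a 2D face detection: the ray from eye to target cannot be constructed without depth.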
“Regulatory requirements make it necessary to integrate DMS into new vehicles, and augmented reality head-up displays are becoming more and more popular,” said Florian Seitner, CEO of emotion3D. “Our combined system offers a highly precise and cost-efficient solution for automotive manufacturers.”
The demonstrator consists of a camera built around Melexis’ MLX75027 3D ToF sensor and emotion3D’s advanced in-cabin analysis software (E3D ICMS). The software can be flexibly integrated in any automotive SoC of choice.
“We combine accurate and robust 3D eye-position detection for the HUD with sunlight-invariant eye-gaze and eye-openness detection for leaner DMS algorithm implementations,” said Gualtiero Bagnuoli, product marketing manager at Melexis. “Use of ToF technology is key. It is very easy to get accurate depth data from the 3D ToF sensor with low processing effort. The result is that the DMS and HUD algorithms work impressively well with wide field-of-view lenses and VGA-resolution ToF sensors.”