Melexis and emotion3D have joined forces to offer a unique 3D Time-of-Flight (ToF) demonstrator that combines a driver monitoring system (DMS) with high-precision 3D driver localization, used to dynamically align augmented reality head-up display (AR HUD) objects. The demonstrator consists of a camera built around Melexis’ MLX75027 3D ToF sensor and emotion3D’s advanced in-cabin analysis software (E3D ICMS). Melexis and emotion3D’s novel DMS covers all basic functions, such as driver drowsiness and attention warnings, to conform to the EU’s General Safety Regulation and Euro NCAP’s testing protocols.
Moreover, the demonstrator provides the 3D locations of the driver’s facial landmarks, which are essential for an optimal AR HUD user experience: the virtual objects projected by the HUD must stay precisely aligned with real-world objects as the driver’s head position changes.
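The alignment step described above can be sketched as a simple line-plane intersection: given the driver’s 3D eye position (as reported by the ToF camera) and the real-world point to be annotated, the overlay must be rendered where the driver’s sight line crosses the HUD’s virtual image plane. The coordinate frame, function name, and fixed-plane assumption below are purely illustrative and not part of the Melexis/emotion3D software:

```python
import numpy as np

def hud_overlay_point(eye, target, hud_plane_z):
    """Intersect the eye->target sight line with the HUD virtual image
    plane, assumed parallel to the x-y plane at z = hud_plane_z
    (z pointing forward from the driver, units in metres)."""
    eye = np.asarray(eye, dtype=float)
    target = np.asarray(target, dtype=float)
    direction = target - eye
    if direction[2] == 0:
        raise ValueError("sight line is parallel to the HUD plane")
    # Parameter t locates the plane crossing along the sight line.
    t = (hud_plane_z - eye[2]) / direction[2]
    if not (0.0 < t < 1.0):
        raise ValueError("HUD plane does not lie between eye and target")
    return eye + t * direction

# Example: eye 0.5 m above the origin, target 10 m ahead at road level,
# HUD virtual image plane 1 m ahead of the driver.
p = hud_overlay_point([0.0, 0.5, 0.0], [0.0, 0.0, 10.0], 1.0)
# p is (0.0, 0.45, 1.0): the overlay sits slightly below eye height.
```

Because the eye position enters this calculation directly, even a few centimetres of head movement shifts the required overlay position, which is why continuous 3D eye localization matters for AR HUDs.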
Talking about the benefits of ToF technology, Gualtiero Bagnuoli, Product Marketing Manager at Melexis, commented: “We combine accurate and robust 3D eye position detection for HUD with sunlight invariant eye gaze and eye openness detection for leaner DMS algorithm implementations. Use of ToF technology is key. It is very easy to get accurate depth data from the 3D ToF sensor with low processing effort. The result is that the DMS and HUD algorithms work impressively well with wide-field of view lenses and VGA resolution ToF sensors.”
Florian Seitner, CEO of emotion3D, said: “Regulatory requirements make it necessary to integrate DMS into new vehicles, and augmented reality head-up displays are becoming more and more popular. Our combined system offers a highly precise and cost-efficient solution for automotive manufacturers.”