In-vehicle sensing belongs to a group of advanced perception technologies classified as mission-critical systems, meaning they are directly linked to people’s safety. Euro NCAP is aware of this and has established strict standards that OEMs must meet before getting the green light to install their in-cabin monitoring systems in production vehicles.
Thanks to the advances they bring, especially in terms of accurate sensing under adverse lighting conditions, NIR cameras are set to become an essential piece of driver and occupant monitoring system development. And guess what, they will also be key to passing the Euro NCAP evaluation of Driver State Monitoring systems (DMS).
Let’s start from the beginning.
What is near-infrared (NIR) imaging?
Near-IR (NIR) imaging refers to a range of wavelengths in the light spectrum between roughly 800 and 2500 nm (the effective spectral response of NIR cameras extends up to about 1060 nm). In other words, near-infrared is a portion of the electromagnetic spectrum just beyond the range of what humans can see.
NIR light has a longer wavelength than visible light, which makes it easier to transmit through materials like plastic or paper. For the same reason, it also interacts with materials differently than visible light does, and it has certain properties that make it very interesting for in-cabin sensing and other computer vision applications:
- Penetrates materials more easily
- Reduces color saturation in imaged objects
- Removes unwanted glare and reflections
- Suppresses unwanted details in various inspection and detection cases
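To make the wavelength ranges above concrete, here is a minimal Python sketch that classifies a wavelength into a spectral band. The band boundaries follow the approximate figures quoted in this article (visible up to ~800 nm, NIR from 800 to 2500 nm); exact boundaries vary by source and sensor.

```python
# Approximate spectral band boundaries in nanometres, per the figures
# used in this article. Boundaries are illustrative, not authoritative.
BANDS = [
    ("ultraviolet", 100, 380),
    ("visible", 380, 800),
    ("near-infrared", 800, 2500),
]

def classify_wavelength(nm: float) -> str:
    """Return the spectral band a wavelength (in nm) falls into."""
    for name, low, high in BANDS:
        if low <= nm < high:
            return name
    return "out of modelled range"

# A 940 nm NIR illuminator (commonly used in driver monitoring) is
# invisible to humans but well within a typical NIR camera's effective
# spectral response (up to ~1060 nm).
print(classify_wavelength(550))  # green visible light
print(classify_wavelength(940))  # NIR illuminator
```

Note that a camera’s *effective* response is narrower than the full NIR band: per the figures above, sensitivity drops off around 1060 nm even though NIR itself extends to 2500 nm.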
Applications of NIR imaging
Besides the in-cabin monitoring case, which we will talk about next in this article, near-infrared imaging can solve problems in several markets and industries:
- Computer vision
- Surveillance and security
- License plate recognition
- Factory automation
- Life sciences
Why do near-infrared cameras enable advances in in-vehicle sensing systems?
A Euro NCAP-compliant in-cabin driver state monitoring system must demonstrate that it can sense a wide variety of driver behaviors and states under any real driving circumstances. This means any lighting conditions, environmental conditions, face occlusions, and much more.
It’s no secret that even these “not so extreme” conditions can defy typical RGB sensors and cameras, putting the whole system’s reliability at risk. Imagine how they would fare in genuinely extreme situations…
Near-infrared cameras spring into action to complement RGB cameras in these challenging conditions and keep system performance at a reliable level.
NIR cameras track and capture clear images day or night, in shaded or well-lit environments. They achieve impressive performance under challenging conditions such as reflections, face occlusions (glasses and sunglasses, protective masks, beards, etc.), extreme head rotations, and many other situations.
This makes them an essential component of a driver monitoring system, and a genuine trump card for improving and enabling advances in today’s in-vehicle sensing systems.
But is that all? Are NIR cameras also the “insurance policy” that guarantees success at the Euro NCAP evaluation of driver state monitoring systems?
NIR cameras: key for gaze detection in challenging lighting and occlusion conditions
Euro NCAP will evaluate the following driver states: distraction, fatigue, and unresponsiveness. To detect distraction, the in-cabin system must be able to identify head movement, eye movement, and body-lean behaviors. To detect fatigue, the system needs to identify different degrees of eye closure. And to determine that a driver is unresponsive, the system needs to detect when the driver’s gaze has been away from the forward road view, or their eyes have been closed, for a certain number of seconds.
Now dear readers, I would like to ask you a question. What is the common denominator of the three blocks mentioned above to detect distraction, fatigue, and unresponsiveness?
Exactly! The eyes and gaze. As we saw in the previous section, these are two elements that can be difficult to monitor when the system faces adverse lighting conditions or face occlusions, and NIR cameras are the key to monitoring them effectively.
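The detection logic described above can be sketched as simple threshold rules over per-frame eye-closure and gaze signals. This is a hedged illustration only: the thresholds, signal names, and data structure below are hypothetical, not the actual Euro NCAP criteria, which are defined in the official assessment protocols.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration -- NOT Euro NCAP values.
EYE_CLOSURE_FATIGUE = 0.8            # eye-closure degree (0 = open, 1 = closed)
EYES_CLOSED_UNRESPONSIVE_S = 3.0     # seconds of continuous eye closure
GAZE_OFF_ROAD_UNRESPONSIVE_S = 4.0   # seconds of continuous gaze off the road

@dataclass
class DriverFrame:
    eye_closure: float   # degree of eye closure estimated per frame
    gaze_on_road: bool   # whether gaze points at the forward road view
    timestamp_s: float

def assess(frames: list[DriverFrame]) -> set[str]:
    """Return the driver states flagged over a window of frames."""
    states: set[str] = set()
    eyes_closed_since = None
    gaze_off_since = None
    for f in frames:
        # Fatigue: a high enough degree of eye closure on any frame.
        if f.eye_closure >= EYE_CLOSURE_FATIGUE:
            states.add("fatigue")
            if eyes_closed_since is None:
                eyes_closed_since = f.timestamp_s
            if f.timestamp_s - eyes_closed_since >= EYES_CLOSED_UNRESPONSIVE_S:
                states.add("unresponsive")
        else:
            eyes_closed_since = None
        # Unresponsive: gaze continuously away from the road too long.
        if not f.gaze_on_road:
            if gaze_off_since is None:
                gaze_off_since = f.timestamp_s
            if f.timestamp_s - gaze_off_since >= GAZE_OFF_ROAD_UNRESPONSIVE_S:
                states.add("unresponsive")
        else:
            gaze_off_since = None
    return states
```

The eye-closure and gaze signals feeding these rules are exactly what NIR cameras make reliably measurable at night or through sunglasses; a production system would, of course, use learned models and tuned thresholds rather than this toy logic.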
A new challenge for developers arises… How do you gather effective and accurate data to develop in-cabin monitoring AI models based on NIR cameras?
Stay tuned to our social channels; this is a matter we will be tackling very soon.
Anyverse™ is the hyperspectral synthetic data generation platform for advanced perception. It accelerates the development of autonomous systems and state-of-the-art sensors, covering all data needs throughout the entire development cycle: from the initial design and prototyping stages, through training and testing, to the final fine-tuning of the system to maximize its capabilities and performance.
Anyverse™ brings you different modules for scene generation, rendering, and sensor simulation, whether you are:
– Designing an advanced in-cabin perception system
– Training, validating, and testing in-cabin systems AI, or
– Enhancing and fine-tuning your in-cabin perception system,
Anyverse™ is the right solution for you.
The data you generate with Anyverse™ helps you build a robust in-cabin monitoring system, ready to operate under any circumstances: sensing a wide variety of drivers, identifying driver and occupant states and behaviors, detecting driver distraction, fatigue, unresponsiveness, and much more.