Why is simulating the near infrared key for in-cabin sensing?


Many of our clients who need synthetic data have asked us to simulate the near infrared (NIR), both to produce training data and to support the design and configuration of their in-cabin monitoring systems in the early stages of development.

In this article we will try to answer several questions: why is the near infrared band key for (camera-based) in-cabin monitoring systems to perform well in low light? Why is simulating the NIR a challenge? What solutions have been used so far to simulate it? How does Anyverse simulate it?

Why is the near infrared band needed for in-cabin monitoring simulation?

In-cabin monitoring solutions incorporate dynamic near infrared lighting systems that illuminate the driver (and the occupants) at night, or in low-light conditions, without disturbing them. The human eye does not perceive light in this part of the spectrum, which lets the interior monitoring system keep working normally in low light.

How does it work?

The near infrared system incorporates LEDs that emit only in the near infrared band. This light is not visible to the human eye, so it does not disturb the driver, but it is visible to the cameras.
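The asymmetry above can be put into rough numbers. The sketch below compares the human eye's photopic sensitivity with a typical silicon CMOS sensor's quantum efficiency at a few wavelengths. The values are illustrative textbook-style figures, not measurements of any specific sensor or eye model.

```python
# Illustrative comparison: why a 940 nm LED is invisible to the driver
# but bright to a CMOS camera. All values are rough, representative
# numbers chosen for illustration only.

# Relative human eye sensitivity (CIE photopic curve, peak = 1.0 at 555 nm)
EYE_SENSITIVITY = {
    555: 1.0,       # green: peak of human vision
    700: 0.004,     # deep red: barely visible
    850: 0.0,       # NIR: effectively invisible (true value ~1e-7)
    940: 0.0,       # NIR: invisible
}

# Typical silicon CMOS quantum efficiency (fraction of photons detected)
SENSOR_QE = {
    555: 0.80,
    700: 0.60,
    850: 0.35,      # still quite sensitive in the NIR
    940: 0.15,      # usable with enough LED power
}

def visibility_ratio(wavelength_nm):
    """How much more the sensor 'sees' than the eye at this wavelength."""
    eye = EYE_SENSITIVITY[wavelength_nm]
    qe = SENSOR_QE[wavelength_nm]
    return float("inf") if eye == 0 else qe / eye

for wl in (555, 700, 850, 940):
    print(wl, "nm  eye:", EYE_SENSITIVITY[wl], " sensor QE:", SENSOR_QE[wl])
```

At 850 or 940 nm the eye's sensitivity is essentially zero while the sensor still detects a meaningful fraction of the photons, which is exactly why these wavelengths are the standard choice for in-cabin illumination.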


Image 1 – Image without active LED illumination

Image 2 – Same image with active NIR LED Illumination

Why is it a challenge to simulate the near infrared?

Simulating the near infrared is not easy. First, you need to accurately characterize the emission of the light sources in the NIR band. Then, you need to simulate the response of the materials at these new wavelengths, which differs from their response in the visible spectrum. This is the greatest challenge, because this information is hard to gather for all materials.

Finally, you need to simulate the response and behavior of the sensor used by your system for the NIR band.
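The three pieces above can be sketched as a single spectral integral: the signal reaching a pixel is the product of the LED emission, the material reflectance, and the sensor response, summed over wavelength. The curves below are made up purely for illustration; a real simulation would use measured spectra.

```python
# Minimal sketch of the three components a NIR simulation must model:
# (1) LED emission spectrum, (2) material spectral reflectance,
# (3) sensor spectral response. All curves are invented illustrative
# shapes, sampled every 10 nm from 700 to 1000 nm.

import math

WAVELENGTHS = list(range(700, 1001, 10))  # nm
BIN_NM = 10  # sampling step

def led_emission(wl):
    """Hypothetical 940 nm NIR LED: Gaussian spectrum, ~30 nm wide."""
    return math.exp(-((wl - 940) / 18.0) ** 2)

def reflectance(wl):
    """Hypothetical material whose reflectance rises in the NIR."""
    return 0.35 + 0.3 * (wl - 700) / 300.0

def sensor_qe(wl):
    """Hypothetical silicon sensor QE, falling off toward 1000 nm."""
    return max(0.0, 0.5 - 0.45 * (wl - 700) / 300.0)

def pixel_signal():
    """Integrate emission * reflectance * QE over the NIR band."""
    return sum(
        led_emission(wl) * reflectance(wl) * sensor_qe(wl)
        for wl in WAVELENGTHS
    ) * BIN_NM

print("relative pixel signal:", pixel_signal())
```

The point of the sketch is that an error in any one of the three spectral curves multiplies through to the final pixel value, which is why each component has to be characterized separately.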

How has the near-infrared been simulated so far?

Traditional graphics engines have struggled to simulate this band. They have typically worked around the challenge by producing high-quality grayscale images and applying post-processing effects, but this approach differs greatly from a correct near infrared simulation.
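One way to see why the grayscale workaround falls short: two materials can look identical in the visible band yet reflect very differently in the NIR (dark fabrics are a classic case). A grayscale conversion of an RGB render cannot tell them apart, while a true NIR simulation can. The materials and numbers below are invented for illustration.

```python
# Why grayscale-from-RGB is not a NIR simulation: two hypothetical
# materials with identical visible (RGB) reflectance but very different
# reflectance at 940 nm. All numbers are illustrative.

materials = {
    #                   visible reflectance      NIR reflectance @ 940 nm
    "dark_fabric_a": {"rgb": (0.05, 0.05, 0.05), "nir": 0.55},
    "dark_fabric_b": {"rgb": (0.05, 0.05, 0.05), "nir": 0.08},
}

def grayscale_from_rgb(rgb):
    """Rec. 709 luminance: all a conventional RGB engine can offer."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

for name, m in materials.items():
    print(name, "grayscale:", grayscale_from_rgb(m["rgb"]), "NIR:", m["nir"])

# Both materials get the same grayscale value, yet a real NIR camera
# would render one far brighter than the other.
```

This is exactly the kind of error that ends up baked into a training dataset if NIR images are faked from visible-band renders.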

How does Anyverse simulate the near infrared?

The more advanced a perception system is, and the further its sensing departs from human vision, as is the case with in-cabin monitoring systems, the more you need an accurate physics-based simulation.

The Anyverse render engine is hyperspectral and provides physically based simulation features. We are extending the range of wavelengths supported to include the NIR band, up to 1000 nm.

At the same time, we are working with partners to correctly characterize the reflectivity of materials in this new part of the spectrum. The result will be a physically correct NIR simulation. If you are designing a perception system that consumes raw output from a NIR sensor, or NIR images produced by cameras, you will be able to generate as much data as you need to train your system without resorting to artificial image trickery.

Stay tuned! 

About Anyverse™

Anyverse™ helps you continuously improve your deep learning perception models to reduce your system’s time to market by applying new software 2.0 processes. Our synthetic data production platform allows us to provide high-fidelity, accurate, and balanced datasets. Along with a data-driven iterative process, we can help you reach the required model performance.

With Anyverse™, you can accurately simulate any camera sensor and decide which one will perform better with your perception system. No more complex and expensive experiments with real devices, thanks to our state-of-the-art photometric pipeline.

Need to know more?

Visit our website, anyverse.ai, anytime, or follow our LinkedIn, Instagram, and Twitter profiles.
