ANYVERSE

Get better AI design and development results with a revolutionary technology stack

Automatically generate any scenario your AI model may need.

Render training images that are closest to reality with our proprietary hyperspectral renderer.

Faithfully simulate the specific sensors your system is equipped with.

Make sensible sensor decisions

When you train AI models with synthetic data, it is crucial to ensure they generalize well to real-world data. Beyond a proper physics description of the synthetic 3D world, accurate modeling of the target sensor is essential to guarantee that the data matches what the system sees in real operation. Anyverse’s sensor simulation uses an accurate hyperspectral light transport model: it combines a physics description of lights and materials in a 3D scene with a detailed simulation of the sensor’s intrinsics. The sensor pipeline calculates the propagation of energy per wavelength, accounting for many subtle physical effects, such as the conversion of photons into voltage and then into RAW digital values, before producing a final image.

Filters: low-pass filter, infrared cut-off filter, color filter array.

Sensor pipeline: exposure time, conversion gain, well capacity, voltage swing, dark current and noise (shot noise, read noise), exposure (global shutter, rolling shutter), motion blur and vibrations.

ISP: apply a model provided by the user, or export the RAW data to an external ISP module.

Validation: using the ISET Toolbox from Stanford University.
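The photon-to-digital-number chain described above can be sketched in a few lines. The following is a toy model, not Anyverse’s actual pipeline; the parameter names mirror the stages listed (QE, well capacity, conversion gain, voltage swing, read noise), but all default values are illustrative:

```python
import numpy as np

def simulate_pixel_response(photons, qe=0.6, well_capacity=10_000,
                            conversion_gain=60e-6, voltage_swing=1.0,
                            read_noise_e=2.0, bit_depth=12, rng=None):
    """Toy pixel pipeline: photons -> electrons -> voltage -> RAW DN."""
    rng = rng or np.random.default_rng(0)
    # Shot noise: photon arrival is Poisson-distributed.
    electrons = rng.poisson(photons * qe).astype(float)
    # Read noise: additive Gaussian noise, in electrons.
    electrons += rng.normal(0.0, read_noise_e, size=np.shape(electrons))
    # Full-well clipping: the pixel cannot hold more charge than its well.
    electrons = np.clip(electrons, 0, well_capacity)
    # Conversion gain turns charge into voltage (volts per electron).
    voltage = np.clip(electrons * conversion_gain, 0.0, voltage_swing)
    # The ADC quantizes voltage into RAW digital numbers (DN).
    return np.round(voltage / voltage_swing * (2**bit_depth - 1)).astype(int)
```

For example, a bright pixel that would collect more electrons than the well capacity saturates and clips, which is exactly the behavior a perception model trained on ideal images never sees.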


Optical Systems: 

Anyverse performs a geometric simulation of the optical system using ray tracing, capturing effects such as extreme optical distortion, lens shading, lens blur (depth of field), complex assemblies of multiple lenses, and more.
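As a rough illustration of two of these effects: radial distortion is commonly modeled as a polynomial in the squared radius (the Brown-Conrady model), and natural lens shading follows the cos-fourth falloff law. This is generic textbook optics, not Anyverse’s implementation; the coefficients are placeholders:

```python
import numpy as np

def radial_distort(x, y, k1=-0.2, k2=0.05):
    """Brown-Conrady radial distortion on normalized image coordinates.
    k1, k2 are illustrative coefficients, not values from any real lens."""
    r2 = x**2 + y**2
    factor = 1 + k1 * r2 + k2 * r2**2
    return x * factor, y * factor

def cos4_shading(x, y, focal=1.0):
    """Natural vignetting: image irradiance falls off as cos^4 of the
    off-axis angle (the classic 'cos-fourth' law)."""
    r2 = x**2 + y**2
    cos_theta = focal / np.sqrt(focal**2 + r2)
    return cos_theta**4
```

With a negative k1 (barrel distortion), off-center points are pulled toward the image center, while the principal point stays fixed; shading is 1.0 on-axis and falls toward the corners.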

Multiple sensors:

Perception model performance depends on the sensor that captured the data. Anyverse™ technology lets you simulate different sensors and ISP configurations to choose the right sensor for your perception system.

Our proprietary physically-based render engine

The Anyverse™ hyperspectral render engine implements a pure spectral ray tracer that computes the spectral radiance of every light beam interacting with materials in the scene, simulating lights, cameras, and materials with physical accuracy. This allows a very detailed simulation of the amount of light reaching the camera sensor, producing a final image that contains all the spectral information from the 3D scene (via Anyverse’s sensor simulation pipeline).

Together, the hyperspectral renderer and the sensor simulation pipeline form the core of Anyverse’s synthetic data platform, which helps our customers generate the data they need to develop advanced sensing and perception systems.

Our render engine simulates energy as an electromagnetic wave all the way through the rendering pipeline to the final digital values. Light sources in Anyverse™ emit using a characteristic spectral profile that depends on the type of light source (LED, incandescent bulb, sky, etc.). The materials’ behavior is also wavelength-dependent, which allows for typical effects like dispersion. These features are key to an accurate camera sensor simulation.
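To show how wavelength-dependent radiance eventually becomes color, here is a generic colorimetry sketch (not Anyverse’s renderer): a sampled spectrum is integrated against an analytic approximation of the CIE 1931 color matching functions (the piecewise-Gaussian fit of Wyman, Sloan, and Shirley, 2013) to obtain a tristimulus value:

```python
import numpy as np

def gauss(lam, mu, s1, s2):
    """Piecewise Gaussian lobe used in the analytic CMF fit."""
    s = np.where(lam < mu, s1, s2)
    return np.exp(-0.5 * ((lam - mu) / s) ** 2)

def cie_xyz_bar(lam):
    """Analytic approximation of the CIE 1931 2-degree observer
    (Wyman, Sloan, Shirley 2013); lam in nanometres."""
    x = (1.056 * gauss(lam, 599.8, 37.9, 31.0)
         + 0.362 * gauss(lam, 442.0, 16.0, 26.7)
         - 0.065 * gauss(lam, 501.1, 20.4, 26.2))
    y = (0.821 * gauss(lam, 568.8, 46.9, 40.5)
         + 0.286 * gauss(lam, 530.9, 16.3, 31.1))
    z = (1.217 * gauss(lam, 437.0, 11.8, 36.0)
         + 0.681 * gauss(lam, 459.0, 26.0, 13.8))
    return x, y, z

def spectrum_to_xyz(wavelengths, radiance):
    """Integrate a uniformly sampled spectral radiance against the
    matching functions to get a tristimulus (X, Y, Z) value."""
    xb, yb, zb = cie_xyz_bar(wavelengths)
    dlam = wavelengths[1] - wavelengths[0]  # assumes uniform sampling
    return (float(np.sum(radiance * xb) * dlam),
            float(np.sum(radiance * yb) * dlam),
            float(np.sum(radiance * zb) * dlam))
```

A spectral renderer carries per-wavelength radiance all the way to the sensor; a projection like this (or the sensor’s own QE curves) is only applied at the very end.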

Make sensible sensor decisions

You have full control over several sensor parameters…

Pixel size

Fill factor

Exposure time

Well capacity

Conversion gain

Noise

QE curves

Pixel vignetting

Analog offset/gain

Infrared filter

Low-pass filter

Color filter array

…to generate a RAW image, with full control over the Image Signal Processor (ISP) parameters as well:

White balance

RGB to XYZ matrix

XYZ to device matrix

Gamma
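The four ISP stages listed above compose into a short pipeline. Here is a toy sketch, with placeholder white-balance gains and the standard sRGB-to-XYZ matrix standing in for calibrated device matrices:

```python
import numpy as np

def minimal_isp(raw_rgb, wb_gains=(2.0, 1.0, 1.6),
                rgb_to_xyz=None, xyz_to_device=None, gamma=2.2):
    """Toy ISP applying the four stages above to linear RGB in [0, 1].
    Default gains and matrices are illustrative, not calibrated."""
    if rgb_to_xyz is None:
        # Standard linear sRGB -> XYZ (D65) matrix.
        rgb_to_xyz = np.array([[0.4124, 0.3576, 0.1805],
                               [0.2126, 0.7152, 0.0722],
                               [0.0193, 0.1192, 0.9505]])
    if xyz_to_device is None:
        # Map back to the same device space for this illustration.
        xyz_to_device = np.linalg.inv(rgb_to_xyz)
    img = raw_rgb * np.asarray(wb_gains)   # 1. white balance
    img = img @ rgb_to_xyz.T               # 2. RGB -> XYZ
    img = img @ xyz_to_device.T            # 3. XYZ -> device
    img = np.clip(img, 0.0, 1.0)
    return img ** (1.0 / gamma)            # 4. gamma encoding
```

With unit gains and these default matrices the color transform round-trips, so only the gamma curve changes the pixel values; with real calibration data each stage shapes the final image.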

Scene generation

The world looks different depending on where you look. Anyverse™ has developed technology to recreate a diverse synthetic reality and generate training and validation data: produce thousands of variations of a 3D scene by changing camera position, lighting, and weather conditions.

Weather conditions: rain, hail, sand, haze, dense fog, smoke, and more

Automatic simulation of traffic, cyclists and pedestrians

Lighting conditions: low visibility, dark conditions, sun in front, glare conditions, and more

In addition, with Anyverse’s procedural engine and API, you can programmatically control your scenes and automatically generate all data variations.
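Programmatic variation generation usually means sampling over a grid of scene axes. The sketch below uses plain Python and hypothetical axis names; it is not Anyverse’s actual API, just the general pattern a procedural engine follows:

```python
import itertools
import random

# Hypothetical variation axes; names and values are illustrative only.
WEATHER = ["clear", "rain", "dense_fog", "haze"]
TIME_OF_DAY = [6.0, 12.0, 18.0, 22.0]   # hours
CAMERA_HEIGHT_M = [1.2, 1.6, 2.4]

def generate_variations(seed=0, n=10):
    """Sample n scene configurations from the Cartesian product of
    the variation axes, the way a procedural engine might drive a
    batch-generation job."""
    grid = list(itertools.product(WEATHER, TIME_OF_DAY, CAMERA_HEIGHT_M))
    rng = random.Random(seed)  # seeded for reproducible datasets
    picks = rng.sample(grid, n)
    return [{"weather": w, "time_of_day": t, "camera_height_m": h}
            for (w, t, h) in picks]
```

Each returned dictionary would parameterize one render; fixing the seed makes the same dataset reproducible across runs.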

Variation axes: assets in the scene, weather conditions, layout, camera position, time of day, light sources, behavior.

We'd love to know more about your project's technical needs and how we fit.

Looking for the right synthetic data to speed up your system? Enter the Anyverse now.

Client Story

Would you like to know how Cron AI has improved LiDAR simulation accuracy with physically correct synthetic data?

Let's talk about synthetic data!
