A flexible and accurate synthetic data generation platform

Craft the data you need for your perception system in minutes.

Design scenarios for your use case with endless variations.

Generate your datasets in the cloud.

Anyverse™ brings you a scalable synthetic data software platform to design, train, validate, or test your perception system’s AI. It provides unparalleled computing power in the cloud to generate all the data you need in a fraction of the time and cost of classic real-world data collection.

Hyperspectral (pixel-accurate) render engine

Accurate sensor simulation

Procedural (API-based) scene generation

Graphical interface for dataset design

Built-in assets library

Scalable (API-based) cloud data production engine

A flexible modular platform

Anyverse offers a modular platform for scene generation, rendering, and sensor simulation, letting you decide which modules fit best with your workflow. You may want to use all modules or connect individual parts of Anyverse to your data pipeline. Anyverse Studio is the graphical interface application that enables you to visually develop your scenes and datasets.


Anyverse Studio

User interface for the Anyverse platform. Graphically design the base for your datasets. Whether you need static annotated images or sequences, use Anyverse’s extensive asset library to compose a scene. Apply dynamic behaviors and program the environmental variability you need with Python scripts. Produce your datasets in the cloud and explore the results in Anyverse Studio, including all the associated ground-truth data.

GUI Scene design

camera definition

sensor & ISP definition

2D & 3D viewports

3D assets management

dynamic behavior

weather conditions

illumination conditions

python scripting API

endless variability

script based dataset generation
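The scripting hooks above lend themselves to programmatic variability. As a plain-Python sketch (the actual Anyverse scripting API is not shown here, and the parameter names are illustrative), the variation space of a scenario can be enumerated up front, with each resulting configuration then submitted as one dataset generation job:

```python
import itertools
import json

# Hypothetical variation axes -- the real parameter names exposed by
# Anyverse's Python scripting API may differ.
weathers = ["clear", "overcast", "rain", "fog"]
times_of_day = [6, 9, 12, 17, 20]        # hour of day
camera_heights_m = [1.2, 1.5, 2.0]

# Cartesian product: every combination becomes one scene variation.
variations = [
    {"weather": w, "time_of_day": t, "camera_height_m": h}
    for w, t, h in itertools.product(weathers, times_of_day, camera_heights_m)
]

print(len(variations))           # 4 weathers x 5 times x 3 heights
print(json.dumps(variations[0]))
```

Sampling the full product like this is what "endless variability" means in practice: a few short lists already yield dozens of distinct scene configurations.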



The Anyverse render engine implements a pure spectral ray tracer that computes the spectral radiance of every light beam interacting with the materials in the scene, simulating lights and materials with close physical fidelity.


custom lens

(256 bands sampling)

motion blur

global & rolling shutter

complex environments (sky and water)

high bit-depth image output

raw sensor data

photometric accuracy


Ground-truth channels

Anyverse's datasets comprise several kinds of information: color images, raw sensor data, JSON metadata files, and ground-truth channels.


JSON file with all the meta information: camera and object positions, 2D and 3D bounding boxes, characters’ poses, environment information such as time of day and weather conditions, and more.
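Because the metadata is plain JSON, it slots directly into a training pipeline. A minimal sketch of consuming it with the standard library, using an illustrative snippet (the real Anyverse schema and field names may differ):

```python
import json

# Illustrative metadata snippet -- the actual Anyverse JSON schema may differ.
sample = """
{
  "environment": {"time_of_day": "17:30", "weather": "overcast"},
  "objects": [
    {"class": "pedestrian", "bbox_2d": [412, 230, 468, 370]},
    {"class": "car", "bbox_2d": [120, 260, 340, 420]}
  ]
}
"""

meta = json.loads(sample)
pedestrians = [o for o in meta["objects"] if o["class"] == "pedestrian"]
for obj in pedestrians:
    x_min, y_min, x_max, y_max = obj["bbox_2d"]
    # Bounding-box area in pixels, e.g. for filtering tiny instances.
    print(obj["class"], (x_max - x_min) * (y_max - y_min))
```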

16- or 32-bit color image generated from the render and the sensor and ISP simulation. Typically used to feed your model training pipeline.

Image in which every pixel for every object class in the scene has a specific unique color according to Anyverse’s ontology. Used as ground truth to help the models understand what pixels belong to what object.

Only for objects of interest. The pixels of different instances of the same object class have different colors. This helps the AI understand different instances of the same class during training.

Every different material has a different color at the pixel level. This channel can be interesting for use cases that depend on the materials of the objects of interest.

Every pixel in this image has three 32-bit channels holding the x, y, and z coordinates of the pixel in the world reference system. It is useful for spatial reference.

Contains the XYZ image. Color in the Anyverse rendering system is encoded as spectra rather than RGB triplets; the spectral information is converted to XYZ images using the CIE 1931 system.
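The spectrum-to-XYZ conversion mentioned above is an integration of the spectral radiance against the CIE 1931 color-matching functions. A self-contained sketch, using crude Gaussian fits to the matching curves purely for illustration (production code, including Anyverse's, would use the tabulated CIE values):

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Rough Gaussian approximations of the CIE 1931 2-degree observer
# matching functions -- illustrative only, not the tabulated curves.
def xbar(l):  # wavelength l in nm
    return 1.056 * gauss(l, 599.8, 37.9) + 0.362 * gauss(l, 442.0, 16.5)

def ybar(l):
    return 1.014 * gauss(l, 556.3, 46.0)

def zbar(l):
    return 1.839 * gauss(l, 449.8, 17.5)

def spectrum_to_xyz(radiance, lo=380.0, hi=780.0, bands=256):
    """Integrate a spectral radiance function against the matching curves."""
    dl = (hi - lo) / bands
    X = Y = Z = 0.0
    for i in range(bands):
        l = lo + (i + 0.5) * dl   # band-center wavelength
        r = radiance(l)
        X += r * xbar(l) * dl
        Y += r * ybar(l) * dl
        Z += r * zbar(l) * dl
    return X, Y, Z

# Flat (equal-energy) spectrum as a smoke test.
X, Y, Z = spectrum_to_xyz(lambda l: 1.0)
```

The 256-band sampling matches the render engine's spectral resolution quoted earlier; each band contributes one term to the integral.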

This channel is sometimes called albedo. It contains the color image without the contribution of the lights. It is useful as a reference when the color image is not generated.

It contains the material roughness for every pixel as a value between 0 and 1. A material with a roughness of 1 appears as white pixels, whereas a material with 0 roughness appears as black pixels.

In this channel Anyverse encodes the velocity vector of every pixel in the image in the world coordinate system. The values are non-zero for objects moving in the scene when the sample is taken. Useful for dynamic use cases.

This is the raw image coming out of the sensor simulation without any ISP. Useful if you have your own ISP simulation to generate a final color image.

This channel contains a value between 0 and 1 representing the distance of every pixel to the camera. These values can be easily converted to meters. Useful to train AI models that deal with distance estimation of objects.

For every pixel in this channel you have the normal vector for the surface (geometry) that pixel belongs to, in the world reference system.

For every pixel in this channel you have the normal vector for the surface that pixel belongs to, including the texture effect, in the world reference system.
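As an example of consuming one of these channels, the depth channel stores a normalized value in [0, 1] that "can be easily converted to meters". A minimal sketch, assuming a simple linear encoding between hypothetical near/far clip distances (the actual Anyverse encoding may be inverse or logarithmic; check the channel documentation):

```python
def depth_to_meters(d_norm, near_m=0.1, far_m=300.0):
    """Map a normalized depth value in [0, 1] back to meters.

    Assumes a linear mapping between hypothetical near/far clip
    distances -- the real Anyverse encoding may differ.
    """
    if not 0.0 <= d_norm <= 1.0:
        raise ValueError("normalized depth must be in [0, 1]")
    return near_m + d_norm * (far_m - near_m)

print(depth_to_meters(0.5))  # midpoint between near and far planes
```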



Simulate your camera sensor accurately. Using the spectral information provided by the render, Anyverse simulates the physics happening at the sensor to implement its Sensor Simulation Pipeline.

pixel size

fill factor

exposure time

well capacity

QE curves

analog offset/gain

color filter array

conversion gain

RGB to XYZ matrix

XYZ to device matrix

pixel vignetting
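To make the role of these parameters concrete, here is a toy per-pixel model of the chain they describe: photons are converted to electrons via quantum efficiency, clipped at the well capacity, then mapped through analog gain and conversion gain to a digital number. All parameter values are made up for illustration; this is a sketch of the idea, not Anyverse's implementation.

```python
def simulate_pixel(photons, qe=0.6, well_capacity=12000,
                   dark_offset_e=10.0, analog_gain=2.0,
                   conversion_gain_dn_per_e=0.01, bit_depth=12):
    """Toy per-pixel sensor model (illustrative parameter values only).

    photons -> electrons (quantum efficiency + dark offset)
    -> full-well clipping -> analog gain and conversion gain
    -> digital number (DN), clamped to the ADC bit depth.
    """
    electrons = photons * qe + dark_offset_e
    electrons = min(electrons, well_capacity)        # full-well clipping
    dn = electrons * analog_gain * conversion_gain_dn_per_e
    return min(int(round(dn)), 2 ** bit_depth - 1)   # quantize and clamp

print(simulate_pixel(5000))    # mid-range signal
print(simulate_pixel(10**9))   # saturated pixel, limited by well capacity
```

In the real pipeline, the spectral render output determines the photon counts per band, and the QE curves, color filter array, and vignetting vary these conversions per wavelength and per pixel.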

Use cases

The right synthetic data makes all the difference in advanced perception and machine learning. It can fill data gaps or complement real-world footage across a variety of industries. At Anyverse™ we can simulate any scenario and cover a whole range of corner cases with an accuracy that boosts your AI.

You made it to the end! DO YOU WANT TO KNOW MORE?

Share your specs with us and harness Anyverse's power to overcome your system's challenges.

Looking to start your Synthetic Data journey or need help with your current project? We'd love to know more.

Looking for the right synthetic data to speed up your system? Enter the Anyverse now.

Client Story

Would you like to know how Cron AI has improved LiDAR simulation accuracy with physically correct synthetic data?

Let's talk about synthetic data!
