ANYVERSE

A breakthrough synthetic data solution for advanced perception.

Get the right data for your perception model – sensor-specific, with ground truth included.

Our Workflow

Get in touch and share your sensor specs, scene and data needs, and we generate custom synthetic datasets for you.

Here is the workflow we apply:

01. Sensors

Sensors are crucial for perception training and testing, so this is where we start. We collect your requirements and build the exact sensor model(s) you need: we define camera parameters, choose a specific lens, and add LiDAR if required.

  • Lens type, FOV, color filter, response curves, sensor size
  • Raw sensor data
  • Image Processing functions
  • LiDAR settings
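As an illustration, a sensor specification like the one above can be captured as a simple structure. All field names here are hypothetical, for sketching purposes only – they are not an actual Anyverse interface:

```python
# Illustrative sketch: a camera spec plus a LiDAR spec as they might be
# shared with the team. Field names are hypothetical, not a real API.
camera_spec = {
    "lens": "fisheye",             # lens type
    "fov_deg": 120,                # horizontal field of view
    "color_filter": "RGGB",        # Bayer color filter pattern
    "sensor_size_mm": (6.4, 4.8),  # physical sensor size
    "output": ["raw", "rgb"],      # raw sensor data plus processed image
}

lidar_spec = {
    "channels": 64,       # number of laser channels
    "range_m": 120,       # maximum range in metres
    "rotation_hz": 10,    # spin rate
}
```

The point is simply that every parameter listed above – lens, FOV, color filter, raw output, LiDAR settings – becomes an explicit, reviewable value.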

02. Ego-vehicle

We define your ego car or any other ego vehicle, such as a UAV or a robot. Then we add, position and rotate all the sensors created in the previous step. There is no limit to the number of cameras or LiDARs.

  • Ego vehicle set-up
  • Add and position cameras/LiDAR
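A sketch of how such a sensor rig might look, with each sensor given a position (x, y, z in metres) and a rotation (roll, pitch, yaw in degrees) relative to the vehicle. The names and layout are illustrative assumptions, not an Anyverse format:

```python
# Hypothetical ego-vehicle rig: sensors positioned and rotated relative
# to the vehicle body. Purely illustrative field names.
ego = {
    "type": "car",  # could equally be "uav" or "robot"
    "sensors": [
        {"name": "front_cam",  "pos": (2.0, 0.0, 1.4),  "rot": (0, 0, 0)},
        {"name": "rear_cam",   "pos": (-2.0, 0.0, 1.4), "rot": (0, 0, 180)},
        {"name": "roof_lidar", "pos": (0.0, 0.0, 1.8),  "rot": (0, 0, 0)},
    ],
}
# There is no limit on sensor count: the list simply grows as needed.
```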

03. Scenario

Once your vehicle and its sensors are all set, it's time to build the scenario. We start by setting scene features and then add all the additional objects needed to complete the scene, including the ego-vehicle itself.

  • Scene model and assets
  • Dynamic assets – traffic and pedestrians
  • Ego-vehicle behavior
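To make the step concrete, a scenario along these lines could be described as a static scene model plus dynamic assets and an ego-vehicle behavior. Again, every name below is a hypothetical placeholder:

```python
# Illustrative scenario description: a scene model, dynamic assets,
# and the ego-vehicle's behavior within it. Names are assumptions.
scenario = {
    "scene": "urban_intersection",   # scene model and static assets
    "dynamic_assets": {
        "vehicles": 12,              # background traffic
        "pedestrians": 8,
    },
    "ego_behavior": {
        "route": "straight_through",
        "speed_kmh": 30,
    },
}
```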

04. Variability

Next, we define variability ranges – object materials and textures, weather and lighting conditions, and other parameters – so each generated scene differs from the last. Perfect for covering everyday corner cases and challenging situations.

  • Weather and lighting conditions
  • Object materials and textures
  • Object positions and dynamic parameters
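The idea of variability ranges can be sketched as ranges that are sampled per scene. This is a minimal illustration, assuming hypothetical parameter names, not the actual variability engine:

```python
import random

# Illustrative only: each parameter is a (min, max) range, and every
# generated scene draws a fresh value from within its range.
variability = {
    "sun_elevation_deg": (5, 60),    # lighting conditions
    "rain_intensity": (0.0, 0.8),    # weather
    "pedestrian_count": (0, 25),     # dynamic object parameters
}

def sample_scene(ranges, seed=None):
    """Draw one concrete scene configuration from the defined ranges."""
    rng = random.Random(seed)
    scene = {}
    for name, (lo, hi) in ranges.items():
        if isinstance(lo, int) and isinstance(hi, int):
            scene[name] = rng.randint(lo, hi)     # discrete parameter
        else:
            scene[name] = rng.uniform(lo, hi)     # continuous parameter
    return scene
```

Sampling many scenes from the same ranges is what turns one scenario into a varied dataset while keeping every parameter under control.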

05. Data generation

With the sensors, ego-vehicle, scenario and variability ranges in place, we generate your custom synthetic dataset: sensor-specific output for every camera and LiDAR, together with automatically generated, pixel-accurate ground truth.

  • Sensor-specific data – raw and processed
  • Pixel-accurate ground-truth channels

Key benefits

Sensor data model

Custom-defined sensors, optical and LiDAR, to fit your exact perception model. Get unprocessed high-bit-depth raw data in addition to your RGB.

Ground-truth Data

Automatically generated and with no margin for error. Choose from a number of pixel-accurate channels, in addition to class labels, bounding boxes and more.

Variability Under Control

Set the ranges of variability for anything – light, weather, objects, textures, positions, behaviors – covering possible everyday corner cases.

Any synthetic data questions? Drop us a message!

Let's talk about synthetic data!