
Frequently asked questions (FAQs)

Dataset production

What is a GigaPixel?

The GigaPixel is the unit of measurement for producing synthetic data with Anyverse and corresponds to 1 billion (10⁹) pixels. One GigaPixel corresponds to approximately 1,000 images at HD resolution.

Each time an image is produced, the corresponding number of pixels is deducted from your balance. Producing at higher resolutions consumes more pixels per image, so consumption varies with the resolution you choose.
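The arithmetic above can be sketched in a few lines of Python. This is an illustrative estimate only: the 1-GigaPixel definition comes from this FAQ, while the assumption that "HD" means 1280×720 is ours, not Anyverse's.

```python
# Sketch: estimating GigaPixel consumption for a dataset.
# Assumes 1 GigaPixel = 1e9 pixels and "HD" = 1280x720
# (the resolution is an illustrative assumption).

GIGAPIXEL = 1_000_000_000  # pixels per GigaPixel

def gigapixels_consumed(num_images: int, width: int, height: int) -> float:
    """Return the GigaPixels deducted for producing `num_images`
    frames at the given resolution."""
    return num_images * width * height / GIGAPIXEL

# 1,000 images at 1280x720 consume roughly 0.92 GigaPixels,
# consistent with "1 GigaPixel is about 1,000 HD images".
print(round(gigapixels_consumed(1000, 1280, 720), 3))   # 0.922
# Higher resolutions consume proportionally more:
print(round(gigapixels_consumed(1000, 1920, 1080), 3))  # 2.074
```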

You can purchase additional GigaPixels to increase your production capacity. The Standard subscription includes 1 GigaPixel free of charge.

How can I purchase additional GigaPixels?

If you have already used up your GigaPixels, or simply plan to produce more images, you can purchase more GigaPixels in Anyverse™ Studio online with a one-time credit card payment.

Log in to Anyverse and open the purchase page by clicking the shopping cart icon in the left side menu, or through the user configuration screen: User configuration / Plan / Go to store.

Hyperspectral render engine

Why is hyperspectral rendering better than other standard graphic engines?

Hyperspectral rendering is the only way to simulate optics and sensors accurately. Faithfully simulating the sensor reduces the domain gap in your synthetic datasets, generating images closer to what real cameras would capture. Training with such images gives your deep learning perception model a better chance of generalizing to real-world images, improving your perception system's performance cost-effectively.

Images generated from hyperspectral data are richer and more accurate: the color of every pixel is computed from all the wavelengths sampled by the renderer. Standard (non-hyperspectral) ray-tracing engines, by contrast, can generate visually pleasing images but with less physical accuracy.
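The idea of computing a pixel's color from many sampled wavelengths can be sketched as follows. The Gaussian sensitivity curves below are illustrative stand-ins, not real camera data or Anyverse's actual implementation: a real sensor model would use measured spectral response curves.

```python
import numpy as np

# Sketch: a pixel's color as the integral of spectral radiance,
# sampled across many wavelengths, against the sensor's spectral
# sensitivity curves. All curves here are toy assumptions.

wavelengths = np.linspace(400, 700, 31)  # nm, 10 nm sampling

def gaussian(center, width=40.0):
    """Toy bell-shaped sensitivity curve centered on `center` nm."""
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Illustrative RGB sensitivity curves (assumed, not measured).
sensitivities = np.stack([gaussian(600),   # "red"
                          gaussian(550),   # "green"
                          gaussian(450)])  # "blue"

def spectrum_to_rgb(radiance: np.ndarray) -> np.ndarray:
    """Integrate a sampled spectrum against each sensitivity curve
    (rectangle rule over the 10 nm bins)."""
    return sensitivities @ radiance * 10.0  # 10 nm bin width

# A flat (white-ish) spectrum yields a roughly balanced response;
# any spectral detail between the three channels is preserved up
# to the sampling, unlike a renderer that only tracks RGB.
flat = np.ones_like(wavelengths)
print(spectrum_to_rgb(flat))
```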

Anyverse plans

What are the differences between Standard and Premium?

The Standard subscription plan gives you full access to all Anyverse features, with free initial credits for data production and limited production bandwidth. It's perfect for small organizations or teams that want to generate high-quality synthetic data.

The Anyverse Premium plan is designed for medium and large organizations. It allows clients to customize the subscription with extended capabilities such as unlimited data storage, additional cloud nodes for high-performance production, and management tools for users and teams. Premium plans are billed annually and are eligible for special discounts on additional GigaPixels.

Data management

How long can I store my data?

If you are a Standard user, you can store your datasets in the cloud for up to 90 days at no additional cost. After that period your data will be deleted, so you must download it and store it elsewhere. We will notify you before this happens so you have time to arrange the data migration. You can always upgrade to a Premium plan for a longer storage period; contact Sales if you wish to upgrade.

If you are a Premium user, your data storage period is customized. Check your subscription for more details or contact your sales representative.

What ownership do I have over the datasets I produce?

Datasets created in Anyverse are licensed to you on an exclusive, royalty-free, perpetual basis. You can use the datasets for any commercial purpose except in specific restricted cases. Sublicensing and reselling restrictions may apply to synthetic data.

See the Anyverse End User License Agreement (EULA) for more information.

Can I resell the datasets?

Anyverse may apply sublicensing and reselling restrictions to synthetic data created with its platform unless a prior legal agreement is in place.

Please contact info@anyverse.ai or see the Anyverse End User License Agreement (EULA) for more information.

Dataset generation

Why do the images I produce often have a dominant green color?

The dominant green color in the images you produce can be attributed to the Bayer pattern used in digital image sensors. The Bayer pattern is a mosaic of red, green, and blue color filters placed over individual pixels on the sensor; full-color images are then reconstructed from it through a technique called color interpolation (demosaicing).

In the Bayer pattern, there are twice as many green filters as red or blue filters. This design choice is based on the fact that human vision is more sensitive to green light compared to red and blue. By having more green pixels, the image sensor can capture a higher level of detail and accurately represent the luminance component of the image.
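The 2:1:1 green-to-red-to-blue ratio follows directly from the repeating 2×2 filter tile. A minimal sketch (illustrative, not Anyverse-specific) using the common RGGB tile layout:

```python
import numpy as np

# Sketch: an RGGB Bayer color-filter mask, showing that half of
# the sensor's pixels sit behind green filters.

def bayer_mask(height: int, width: int) -> np.ndarray:
    """Return an array of 'R', 'G', 'B' labels laid out as a
    2x2 RGGB tile repeated across the sensor (even dims assumed)."""
    tile = np.array([['R', 'G'],
                     ['G', 'B']])
    return np.tile(tile, (height // 2, width // 2))

mask = bayer_mask(4, 4)
total = mask.size
print((mask == 'G').sum() / total)  # 0.5  -> half the pixels are green
print((mask == 'R').sum() / total)  # 0.25
```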

During image processing, the missing color values at each pixel are interpolated from the surrounding pixels. This interpolation can sometimes lead to a perceived dominance of green in the final image, especially in areas with fine detail or in scenes dominated by green elements.

It’s important to note that this green dominance is not a flaw, but a characteristic of the Bayer pattern used in digital cameras. To achieve a balanced and natural color representation, the image signal processor (ISP) applies various algorithms to reconstruct the full range of colors from the captured data. If you want a different color balance in your images, you can adjust the white balance settings in your ISP. For instance, a white balance triplet of R: 1.418, G: 1.0, B: 2.3265 produces images that look more natural to the human visual system.
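Applying such a white-balance triplet amounts to a per-channel multiplication. The sketch below shows this with the R: 1.418, G: 1.0, B: 2.3265 gains quoted above; the input patch and the [0, 1] clipping range are our own illustrative assumptions, not Anyverse's ISP pipeline.

```python
import numpy as np

# Sketch: applying per-channel white-balance gains to an RGB image,
# using the R:1.418, G:1.0, B:2.3265 triplet from the FAQ answer.

def apply_white_balance(rgb: np.ndarray,
                        gains=(1.418, 1.0, 2.3265)) -> np.ndarray:
    """Multiply each channel by its gain and clip to [0, 1]."""
    balanced = rgb * np.asarray(gains)  # broadcast over the last axis
    return np.clip(balanced, 0.0, 1.0)

# A uniform green-dominant patch, standing in for a raw-like image.
raw = np.full((2, 2, 3), [0.30, 0.50, 0.20])
out = apply_white_balance(raw)
print(out[0, 0])  # red and blue are boosted, reducing the green cast
```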

Can I add custom 3D objects (assets) to the library?

Currently, it’s not possible to import your own 3D objects into our built-in asset library. But don’t worry: we are already working on it, and you’ll be able to import your own assets very soon. Stay tuned, we’ll keep you posted.

In the meantime, if you need any particular assets, please contact us and we’ll see what we can do 🙂

How do I build new scenarios?

You can build your scenarios from preset workspaces, from our base scene library, or from scratch in Anyverse Studio. Anyverse also includes a very powerful Python API to create scenes programmatically: add assets, apply dynamic behaviors, control environmental variability, and much more.

Check out the Anyverse quick platform tutorials for more details.

Can I simulate custom sensors?

Anyverse allows you to accurately simulate any optical sensor your perception system implements. It supports a variety of sensors and lenses in the visible and non-visible (near-infrared) bands, and exposes a variety of intrinsic sensor parameters that you can copy directly from the manufacturer's data sheet.

How can I speed up dataset production?

If time is a key variable for the success of your project, you can add additional cloud nodes through a subscription plan. More nodes mean more images produced in parallel. Contact Sales for more information.

I have special scenario requirements. Do you offer custom services?

Our team is always willing to help. Contact us and tell us more about your scenario and specific requirements. We’ll study your case and give you an answer as soon as possible.
