Anyverse – A Success Story

Javier Salado is the Technical Product Manager at Anyverse, whose mission is to provide the most accurate synthetic data solution on the market. Anyverse will be joining us at all four of our 2023 events, starting with InCabin Phoenix in March. Ana Bennett, Marketing Director for Sense Media, caught up with him to get the full low-down on their brand new platform…

Ana Bennett: How does 2023 look for Anyverse?


Javier Salado: 2023 is already a very significant year for our clients and for us: we have just launched Anyverse™, the hyperspectral synthetic data platform for advanced perception systems. These are computer vision-based systems, such as driver monitoring or self-driving systems, in which the accuracy of the data used for development makes the difference; in other words, it is instrumental to accomplishing their critical mission and directly linked to people’s safety.

Ana Bennett: Great news, it seems 2023 is the year! And perfect timing. We are happy that you have decided to count on us and our events to spread the news and introduce Anyverse™ to the market.


Javier Salado: Perfect timing, yes! We have already participated in several Sense Media events, and the truth is that we have always achieved solid results, thanks to the number of good leads generated, the participation of key companies in the sector, the networking… not to mention the amazing support your team has always provided to help us maximize each event. We can’t think of a better showcase to introduce Anyverse™ and to show everything it can do.

Ana Bennett: We are so glad to hear that; it is important for us to collaborate with the key companies in the industry and the most disruptive within each market segment. So it’s a pleasure to have you both at InCabin Phoenix and Brussels, as well as AutoSens Detroit and Brussels.

But tell us more about Anyverse™ – is it available now?


Javier Salado: Definitely, yes! A lot of customers were waiting for this moment, and we are very excited to finally put the platform in the hands of users. Anyverse™ transfers all the hyperspectral synthetic data generation power to the user: companies can now produce as much data as they want, with full autonomy. We have been developing Anyverse for years, and we ourselves have been its first users, having generated vast amounts of data for our clients.

Ana Bennett: What can InCabin Phoenix attendees expect from you this March?


Javier Salado: What can they expect? Anyverse™ is an advanced, modular hyperspectral synthetic data generation platform that accelerates the development of autonomous systems and state-of-the-art sensors, supplying all the data needed throughout the entire development cycle: from the initial stages of design and prototyping, through training and testing, to the final fine-tuning of the system to maximize its capabilities and performance.

Anyverse™ can be used for three main goals:

  • Designing an advanced perception system – Developers can now decide, with accurate hyperspectral synthetic data, what the best configuration is for their computer vision AI system and sensors, while reducing the costs and time to market of their advanced perception system.

Ana Bennett: Let me interrupt for a second, you insist on the platform’s capability to export hyperspectral data, but I was wondering, what exactly differentiates hyperspectral synthetic data from “traditional” synthetic data?


Javier Salado: No problem; in fact, it’s a very good question that we are happy to answer, because our proprietary hyperspectral render engine is one of the key differentiators that makes Anyverse unique among data providers.

First of all, hyperspectral synthetic data is the only way to physically simulate optics and sensors. It’s as simple as that. Faithfully simulating the sensor is a great way to reduce the domain gap in your synthetic datasets, generating images closer to what real cameras would capture. By training with such images, you have a better chance that your perception deep learning model will generalize to real-world images, improving the performance of your perception system in a very cost-effective way.

Even if you don’t have sensor simulation capability, the images generated from hyperspectral data are richer and more accurate: the information used to calculate the color of every pixel in the final RGB image includes all the wavelengths sampled by the renderer, giving much more detail to the final result. Non-hyperspectral ray tracing render engines, by contrast, can generate nice images, but with less detail and quality.

And, if you are designing your own sensors, you can directly use the hyperspectral data to programmatically simulate the response of your sensor before implementing it on silicon. It is a powerful tool that can significantly reduce the cost and time to market of new sensors.
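The sensor simulation step Javier describes can be sketched in a few lines. This is a hypothetical illustration, not Anyverse’s actual API: it assumes per-pixel hyperspectral radiance samples and toy Gaussian sensitivity curves, then integrates each pixel’s spectrum against the sensor’s spectral sensitivities to produce an RGB response.

```python
import numpy as np

# Hypothetical sketch (not Anyverse's actual API): simulating an RGB
# sensor's response from hyperspectral radiance by integrating each
# pixel's spectrum against the sensor's spectral sensitivity curves.

wavelengths = np.linspace(400, 700, 31)        # nm, 10 nm bands (assumed)
band_width = wavelengths[1] - wavelengths[0]

def gaussian(center_nm, width_nm):
    """Toy spectral sensitivity curve modeled as a Gaussian."""
    return np.exp(-0.5 * ((wavelengths - center_nm) / width_nm) ** 2)

# Assumed sensitivities for the three channels, shape (3, bands).
sensitivity = np.stack([
    gaussian(600, 40),   # red
    gaussian(540, 40),   # green
    gaussian(460, 40),   # blue
])

# A tiny 2x2 "image" of per-pixel spectral radiance, shape (H, W, bands).
radiance = np.random.default_rng(0).uniform(0.0, 1.0, size=(2, 2, 31))

# Integrate radiance x sensitivity over wavelength for every pixel/channel.
# Broadcasting: (H, W, 1, bands) * (3, bands) -> (H, W, 3, bands).
rgb = (radiance[..., None, :] * sensitivity).sum(axis=-1) * band_width

print(rgb.shape)  # (2, 2, 3)
```

A real sensor model would add optics, noise, quantization, and measured sensitivity curves, but this integration step is the part that fundamentally requires hyperspectral input rather than pre-baked RGB.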

Ana Bennett: So we could say that your hyperspectral render is the heart of the platform…

Javier Salado: Both: the hyperspectral render engine and the sensor simulation pipeline are the core of Anyverse™.

Ana Bennett: I see, and I believe you will have some technologists queuing at your Phoenix stand to ask you more about this…

Let’s resume the conversation, what other interesting applications does the platform have?

Javier Salado:


Of course, the other two main use cases for the platform are:

  • Training, validating, and testing computer vision AI systems – Users can generate as much synthetic data as they need in a fraction of the time and cost of classic real-world data. We are talking about complex data, very close to real-world data, with spectral information and metadata. And all of this automatically and autonomously, without going out into the “streets”, investing less time and fewer resources, and obtaining wider variability than a “real-world data only” approach can provide. Users just need to connect the platform to their data pipelines and start designing custom scenarios and generating datasets with ground truth in the cloud.

  • Perception system enhancement and fine-tuning – Companies can take advantage of hyperspectral synthetic data to fine-tune their perception systems for mission-critical use cases. Simulating any extreme or corner case ensures higher levels of confidence in the overall performance of the system: situations with a low probability of occurrence but a high potential for damage if they do occur, for which data is difficult to obtain in the real world due to probability, privacy, danger, etc. The platform can simulate these situations to achieve robust, safe, and reliable perception systems. When safety is at stake, the highest data precision is mandatory: false positives or malfunctions due to ineffective training are unacceptable.

Ana Bennett: We can’t wait to check out that demo at Phoenix! Out of curiosity, when you talk about advanced perception systems and the capabilities of the platform, I get the impression that it could be helpful for other markets or end applications, apart from obviously in-cabin and automotive.

Javier Salado: Indeed. Since we offer a very exclusive and unique solution for these applications, one that few data companies can currently match, we have strong traction in in-cabin monitoring and AV/ADAS. But we also have clients from various other markets, such as inspection, surveillance, security, and defense, and for various applications, such as infrastructure inspection, detection of injuries in risk scenarios, etc.

Ana Bennett: I understand, and going back to Phoenix, and more specifically to your data generation solution for the development of in-cabin technology, what makes you different from other data generation solutions? Or in other words, why did your clients choose you?

Javier Salado:


Summing up, we solve many of the problems our clients face when they try to gather the data they need, and above all, with the quality and accuracy necessary for these systems not only to function safely and reliably but also to meet the standards Euro NCAP demands.

Today, we can say that we provide a comprehensive platform so that our clients’ interior sensing systems are capable of monitoring a huge variety of drivers and passengers (of any ethnicity, characteristics, or age) operating under any circumstance, identifying their behaviors, distractions, gaze, etc. In short, we make our clients’ monitoring systems more robust, reliable, and Euro NCAP compliant, while saving them problems with privacy policies (remember that this technology directly involves children and adults), time, and resources.

Ana Bennett: It is clear that being able to simulate any situation and behavior of the vehicle’s occupants, and to generate data that performs comparably to real-world data, has tangible advantages for developers.

Javier, thank you for taking the time to speak with us! We look forward to seeing you on-site in Phoenix to see the platform in action and continue sharing insights.

Javier Salado: Thank you, Ana, we can’t wait to show the platform live to all attendees who want to stop by our stand. In addition, we invite any company that wishes to get a platform demonstration to request it via our website anytime. We will be pleased to showcase it for them. Thanks for everything.

We remind all readers that the Anyverse platform is now available. Visit www.anyverse.ai for further information.

As well as a full exhibition featuring the Anyverse demo, InCabin Phoenix will feature star speakers from companies covering every aspect of in-cabin technology, which you’ll be able to enjoy across two stages. Explore our Live Agenda and start saving your unmissable sessions.
