NVIDIA Reveals Omniverse Microservices
On Monday, NVIDIA revealed its Omniverse Cloud Sensor RTX, a collection of microservices that enable accurate sensor simulation to fast-track the development of autonomous machines of every type.
Sensors provide humanoids, smart spaces, autonomous vehicles, mobile robots, and industrial manipulators with the data they need to interpret the physical world and make decisions. With these microservices, developers can test sensor perception and the associated AI software at scale in realistic, physically accurate virtual environments before deploying in the real world, improving safety while saving time and cost.
Rev Lebaredian, vice president of Omniverse and simulation technology at NVIDIA, said that developing safe, reliable autonomous machines powered by generative physical AI requires training and testing in physically based virtual worlds.
The Omniverse Cloud Sensor RTX microservices will let developers easily build large-scale digital twins of factories, cities, and even Earth, helping accelerate the next wave of AI.
Built on the OpenUSD framework and powered by NVIDIA RTX ray-tracing and neural-rendering technologies, Omniverse Cloud Sensor RTX speeds up the creation of simulated environments by combining real-world data from videos, cameras, radar, and lidar with synthetic data.
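The idea of padding scarce real sensor captures with physically based synthetic samples can be illustrated with a small sketch. This is not NVIDIA's API: the `blend_sensor_data` function, its parameters, and the Gaussian-jitter stand-in for a real sensor simulator are all hypothetical, shown only to make the synthetic-plus-real data concept concrete.

```python
import random

def blend_sensor_data(real_samples, synthetic_count, noise_sigma=0.05, seed=42):
    """Augment scarce real sensor readings with synthetic ones.

    Hypothetical sketch: synthetic readings are made by jittering
    randomly chosen real readings with Gaussian noise -- a crude
    stand-in for the physically based simulation a tool like
    Sensor RTX would perform.
    """
    rng = random.Random(seed)
    synthetic = [
        max(0.0, rng.choice(real_samples) + rng.gauss(0.0, noise_sigma))
        for _ in range(synthetic_count)
    ]
    # Tag each sample's origin so downstream training can
    # weight real and synthetic sources differently.
    return ([("real", r) for r in real_samples]
            + [("synthetic", s) for s in synthetic])

# Example: four real lidar range readings (metres) grown into a 20-sample set.
real = [2.4, 2.5, 7.9, 8.1]
dataset = blend_sensor_data(real, synthetic_count=16)
print(len(dataset))  # 20
```

In a real pipeline the synthetic half would come from a rendered virtual scene rather than noise injection, but the shape of the output, a mixed, source-tagged training set, is the same.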
The microservices can be used to simulate a broad range of activities, even in scenarios where real-world data is scarce.