While the consumer metaverse has, outside of gaming, been overhyped and damaged by premature efforts to sell metaverse real estate, the commercial metaverse is in full production and delivering benefits now, as companies like BMW, working with Siemens, have discovered. At the heart of this industrial effort is NVIDIA with its AI-enhanced tool Omniverse, and at Siggraph this year, starting on August 8th, we’ll get a chance to see just how far this technology has advanced.
Let’s explore what may be the most interesting part of this development: the increasing integration of AI, which makes the digital-twin aspect of the metaverse real and is foundational to the technology’s future.
Why AI is critical to the metaverse
Whether we are talking entertainment or simulation, the metaverse without AI won’t be very useful. AI is what enables the metaverse to emulate the real world for simulation and gaming; without it, you have a metaverse frozen in time. Granted, this limitation may be acceptable for fields like architecture, which has historically relied on fixed images to represent buildings yet to be built.
But even in architecture, if you can add realistic lighting that shows how external light will affect those working in and around the building, you are more likely to end up with a final product that doesn’t need to be modified. For instance, a few years back, a huge glass building concentrated so much reflected light and heat that it was melting parts off nearby cars and posing a real safety hazard to anyone walking through that laser-like beam. The fix was neither trivial nor cheap. Had the architects been able to create that building in NVIDIA’s Omniverse first, with its ability to better emulate the real world, the problem might have been caught before construction, avoiding both the liability for damages and the cost of modifying the building to mitigate them, not to mention the damage to the architecture firm’s reputation.
Particularly when designing increasingly automated structures like BMW’s cutting-edge new factory, AI was critical to assuring that the result was safe and optimized long before the foundations were laid. This undoubtedly cut significantly the testing time that would otherwise have been needed once the building was up, and it eliminated most if not all of the post-build changes that are typically far more expensive after a building is completed. In addition, AI can better emulate how people and equipment interact, minimizing injuries by anticipating that people won’t always walk where they are supposed to or remain alert to the dangers around them. You can even use AI to emulate what would happen if someone were distracted, providing better protection against human error.
This last hits home for me because my uncle lost most of his hand when he gestured without thinking and put his hand overhead into a piece of equipment. This accident effectively ended his career.
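To make the idea concrete, here is a toy Monte Carlo sketch of that kind of analysis. This is purely my own illustration, not how Omniverse actually models workers: it assumes a distracted worker drifts randomly off a marked walkway and estimates how often that drift brings them dangerously close to a robot’s route.

```python
import random

def collision_risk(robot_path, walkway, drift=1.0, trials=10_000, radius=0.5):
    """Toy Monte Carlo estimate: how often does a worker who drifts
    randomly off the marked walkway come within `radius` metres of
    some point on the robot's route?  (Illustrative only.)"""
    hits = 0
    for _ in range(trials):
        # A distracted worker starts somewhere on the walkway...
        wx, wy = random.choice(walkway)
        # ...and wanders off it by a normally distributed offset.
        wx += random.gauss(0, drift)
        wy += random.gauss(0, drift)
        # Count a "hit" if they end up too close to any robot waypoint.
        if any((wx - rx) ** 2 + (wy - ry) ** 2 < radius ** 2
               for rx, ry in robot_path):
            hits += 1
    return hits / trials

# Hypothetical layout: robot travels along y = 0; walkway runs parallel at y = 3.
robot = [(x, 0.0) for x in range(10)]
walkway = [(x, 3.0) for x in range(10)]
print(f"Estimated collision risk: {collision_risk(robot, walkway):.2%}")
```

Even a crude model like this shows why simulating human unpredictability matters: widen the drift (a more distracted workforce) or narrow the clearance, and the estimated risk climbs, long before any concrete is poured.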
Wrapping up
At Siggraph this year we will get an update on just how far NVIDIA’s metaverse tool, Omniverse, has advanced, as well as a near-term look into its future. It has come a long way in a very short time to become a critical tool for everything from training autonomous cars (effectively advancing the availability of safer autonomous vehicles by years) to planning the factories of tomorrow, which will be heavily populated with robots working alongside human employees who are at a severe physical disadvantage should the two collide. That makes this technology critical to protecting lives that could otherwise be harmed or even ended if these autonomous machines are undertrained as they are rushed to market.
Because Omniverse can anticipate and showcase problems at machine speeds, years of training can be done in hours, significantly reducing the time and cost of training on the critical path to production and better assuring that this training will be adequate prior to deployment. If you want to step away from the overhyped metaverse elements like metaverse real estate and see where the metaverse is viable today, don’t miss the NVIDIA keynote at Siggraph this month. (Oh, and NVIDIA is famous for making its keynotes interesting, so it is often worth watching for the entertainment value alone.) If you want a sense of what an NVIDIA keynote looks like, check out this one from earlier this year. It is amazing.