NVIDIA is working on several big, worldwide problems. These projects include efforts to mitigate climate change, improve and expand renewable energy, address current and future pandemics with faster cures, accelerate critical drug discovery, and help scientists understand the universe around us.
These efforts require massive computing resources, spanning HPC (high-performance computing), AI, and supercomputers, both to fully understand the related problems and to come up with viable ways to mitigate or eliminate them.
Let’s talk about how HPC and supercomputing are changing and how NVIDIA is working to create a better world. This is all drawn from the ISC22 keynote by Ian Buck, VP and GM of NVIDIA’s Accelerated Computing group (you can watch the keynote here).
Current workloads for HPC and supercomputers
Buck started out with an overview of the critical workloads now running on these massively powerful, current-generation supercomputers. They break down into five areas: edge computing focused on microscopy image processing; transformer models; simulation at ever-increasing scales; digital twins (like Earth 2, focused on combating climate change); and quantum computing, which, once it fully matures, will disruptively revolutionize much of the computing environment, particularly massive data-set analysis, security, and communications.
These efforts and the related advancements and technologies are dramatically changing what comprises computer science and ensure that the computing world of tomorrow will be very different from the computing world of today.
Let’s break down a couple of these highlighted current workloads.
One of the largest efforts NVIDIA is involved in is Earth 2, a global simulation of the Earth at relatively high resolution, so that weather events can be better anticipated and the related loss of life significantly reduced. This resolution started at kilometer scale and has been refined over time. Simulation has also been used to model what the COVID-19 spike protein does and how to mitigate the related damage. If you know a virus well, you can craft a cure for it, and supercomputers have been invaluable in helping craft the cures that have emerged.
FourCastNet, a cooperative, collaborative effort, has proven better able to predict global-scale air movement (atmospheric rivers) critical to anticipating future weather events. Other efforts look at more efficient ways to extract and produce energy using atomic energy (one effort achieved a 1,000% speed improvement), model plasma physics inside a reactor, and, in Siemens’s case, better plan and implement predictive maintenance to reduce costs while simultaneously increasing reliability.
Rise of HPC at the edge
These efforts are focused on analyzing the world around us. Light, gravity, magnetic, and other sensors produce a massive amount of data that is simply too large to store. HPC at the edge, and the resulting model enhancements, turn that data into concentrated information that can more accurately identify problems and come up with cost-effective, targeted solutions to mitigate them.

A lot of this work is done with images that can range from world scale down to microscopic. Supercomputers can reduce an image to its critical elements, then use that information to expand the image back out into something far easier to interpret. One of the interesting demonstrations looked at cells to reconstruct biology in real time. Applied to cancer, this could be critical both to better understanding how cancer works and to finding less dangerous, more effective ways to treat it. Researchers can not only see these cells interact at a level never seen before, but they can also interact with the cancerous material to determine causes and effects.
Digital twins are one of the critical parts of the coming metaverse, tied to simulation at scale, providing a way to create a digitally simulated world that stays in sync with the real one. Earth 2 is the planned digital twin of the planet, and Buck showcased visually how FourCastNet can already predict possible outcomes, roughly five orders of magnitude faster than traditional numerical simulation, to identify future catastrophic weather events with greater accuracy than ever before.
Quantum computing is still in its infancy, but supercomputers are being used to emulate future quantum computers and to develop the skills and tools that can be applied once this technology becomes viable. Working with partners, NVIDIA is creating a substantial foundation of knowledge that will be critical to making fast use of quantum computers once they arrive.
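What "emulating a quantum computer" means in practice is classical statevector simulation. A minimal sketch, using plain numpy rather than NVIDIA's actual tooling, shows the idea at toy scale (2 qubits); supercomputers do the same thing across dozens of qubits, where the state vector grows exponentially:

```python
# Illustrative sketch only (not NVIDIA's API): classical statevector
# emulation of a 2-qubit circuit that prepares an entangled Bell state.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # flips qubit 1 when
                 [0, 1, 0, 0],                 # qubit 0 is |1>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state                  # Hadamard on qubit 0
state = CNOT @ state                           # entangle the qubits
probs = np.abs(state) ** 2                     # measurement probabilities

print(probs)  # [0.5 0.  0.  0.5]
```

Each added qubit doubles the state vector, which is exactly why emulating tomorrow's quantum hardware is itself a supercomputing workload today.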
Wrapping up: Venado
This all builds to the announcement at the show of the most powerful ARM-based supercomputer yet. Venado, to be installed at Los Alamos National Laboratory, is focused on energy, unmanned vehicles (for firefighting), and modeling fusion energy production. It could be one of the most impactful supercomputer efforts yet created. Building a digital twin of a fusion reactor cuts development cost and risk substantially, helping assure the viability, reliability, and safety of the result without physically building any part of it. Because they can better explore the digital twin of the reactor, researchers can create effective tools to maintain and optimize the resulting reaction. Once the actual reactor is built, the digital twin and the reactor will be linked so that predictive maintenance, failures, and enhancements can be modeled without adversely impacting the physical reactor’s energy production.
Venado is a true showcase of the current capabilities of NVIDIA’s hardware and software, as well as the foundation for the next generation of supercomputers, which will likely be designed with the help of the current generation, suggesting the next performance jump will be substantially greater.
If you were concerned about the current speed of advancement, things are about to move one hell of a lot faster as HPC capabilities pivot to focus on creating their replacements.