
IBM and the Emergence of Quantum Advantage

This week, IBM presented a major quantum computing update, and it was fascinating to watch. IBM does a nice job of presenting technology. It steps away from the typical speeds-and-feeds pitch to showcase how a technology can and should be used or, in this case, what is being done to create the technology and make it work.

In the presentation, you could clearly see an increasingly blended solution that moves past the monolithic concept of quantum supremacy and instead surfaces the concept of “quantum advantage.” This concept, a blend of computational systems that will create a new class of high-performance computers (HPCs), is slated to revolutionize the server market. It should not only blend the server technologies that preceded it, but also mark a partial return to the mainframe concepts that once dominated the market in terms of reliability, security and, particularly, scale.

Let’s explore what appears to be the future of quantum advantage servers and how they might transform the computing industry.

The birth of quantum advantage servers

Initially, the industry looked at quantum computers much as it did the early computers that came to market: as highly insular, monolithic machines focused exclusively on the massive analytical tasks that quantum computers are best at accomplishing. That approach was called quantum supremacy, and it no longer makes sense.

However, we went through this before with GPUs and discovered that it didn’t make sense to take a new, high-performance technology and treat it as if it had to stand alone. On the contrary, just as we learned that CPUs and GPUs must exist in the same hardware to realize the full benefits and breadth of both, we are learning that quantum processing units (QPUs) need to be peers to more traditional processors, not only to cover the breadth of workloads out there, but to ensure each technology is available should the task require it.
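To make that pairing concrete, consider how hybrid classical-quantum work is already written today. The following is a minimal sketch using Qiskit, IBM’s open-source quantum SDK, with a local simulator (qiskit-aer) standing in for a real QPU; it simply illustrates classical code handing a small quantum kernel to a quantum backend and then taking the results back for classical post-processing.

```python
# Minimal sketch of the CPU/QPU "peer" model using Qiskit, IBM's
# open-source quantum SDK. A local simulator (qiskit-aer) stands in
# for a real QPU here; on actual hardware, the job submission step
# would target an IBM Quantum backend instead.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# Classical code (CPU) prepares the problem: a 2-qubit Bell state.
circuit = QuantumCircuit(2)
circuit.h(0)       # put qubit 0 into superposition
circuit.cx(0, 1)   # entangle qubit 0 with qubit 1
circuit.measure_all()

# The "QPU" (here, a simulator) executes only the quantum kernel...
job = AerSimulator().run(circuit, shots=1024)

# ...and classical code (CPU) takes over again to post-process results.
counts = job.result().get_counts()
print(counts)  # e.g. {'00': ~512, '11': ~512}
```

The pattern is the point: the quantum processor handles only the narrow kernel it is uniquely good at, while conventional processors do everything around it.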

We now also have neural processing units (NPUs) to add to the mix, focused increasingly on large language models and on improving the human interfaces and AI capabilities in our servers. Thus, the heterogeneous nature of quantum advantage is replacing the largely failed homogeneous quantum supremacy approach. It ensures that all of these technologies are available to the user depending on need, and that a new class of quantum advantage servers can rise to the quantum opportunity while still handling the huge number of jobs that don’t, and likely never will, require quantum computing.
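As a purely illustrative sketch of that heterogeneous idea, here is how a scheduler in such a server might route each workload to the processor best suited to it. None of these names (Workload, route and so on) reflect a real scheduler API; they are hypothetical and exist only to show the routing logic.

```python
# Hypothetical sketch of heterogeneous dispatch in a quantum advantage
# server. These names are illustrative, not a real scheduler API; the
# idea is that each workload lands on the processor type best suited
# to it, with the CPU as the general-purpose default.
from dataclasses import dataclass
from enum import Enum, auto


class Processor(Enum):
    CPU = auto()   # general-purpose logic and orchestration
    GPU = auto()   # massively parallel math: graphics, training, simulation
    NPU = auto()   # neural inference, e.g. large language models
    QPU = auto()   # the narrow class of quantum-amenable problems


@dataclass
class Workload:
    name: str
    kind: str  # e.g. "inference", "parallel_math", "quantum", "general"


def route(workload: Workload) -> Processor:
    """Pick the processor best suited to a workload's kind."""
    table = {
        "inference": Processor.NPU,
        "parallel_math": Processor.GPU,
        "quantum": Processor.QPU,
    }
    # Most jobs don't need exotic silicon: default to the CPU.
    return table.get(workload.kind, Processor.CPU)


jobs = [
    Workload("chatbot response", "inference"),
    Workload("molecule simulation", "quantum"),
    Workload("video transcode", "parallel_math"),
    Workload("payroll batch", "general"),
]
for job in jobs:
    print(f"{job.name} -> {route(job).name}")
```

Note the design choice of the CPU as the fallback: as argued above, most jobs will never need a QPU, so the specialized silicon is reserved for the workloads that actually benefit from it.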

The next generation of computing

Sometime between now and 2033, we are likely to see the birth of quantum advantage servers, which will have CPUs, GPUs, focused accelerators, NPUs and QPUs. They’ll undoubtedly force a new architecture and cooling solutions far more advanced than we have today. And thanks to the processing headroom that quantum technology will provide, you will likely need far fewer of these servers; when they arrive, they should exceed the performance of today’s supercomputers while being far smaller and more energy-efficient.

Advances in interconnects, fabrics and data transport both internal and external to the server (likely optical at first), along with huge jumps in AI capability, will be the hallmarks of these new servers. I expect one of the big hurdles will be addressing the timing differences among the various components and rigorously hunting down and eliminating latency so this class of blended-technology server can truly shine and reach its full potential.

I expect, and IBM is certainly betting, that these quantum advantage servers will be game-changers when they arrive and mature. It should be noted that IBM’s efforts to create a quantum computing infrastructure remain largely unmatched, helping ensure that IBM won’t become obsolete when most server, and many workstation, technologies die out because they simply can’t compete with the coming wave of quantum advantage servers.

Wrapping up: Is the future of computers a quantum mainframe?

As I look at this coming birth of quantum advantage servers, it strikes me that while you’ll still need a lot of CPUs, GPUs and even NPUs and accelerators, you shouldn’t need that many QPUs, given the kinds of jobs quantum computing is good at performing. Even quantum encryption shouldn’t be that resource-intensive, suggesting these future servers may be more like mainframes: basically a cloud service in a box, built so that they can be rapidly deployed and so that latency between the components is minimized.

There is no doubt in my mind that IBM is already mocking up these new servers so that when quantum is ready, it’ll bring out a line that will shock the world. It is no longer a question of “if” so much as “when.” I expect the “when” will now be before 2030, partially tied to the recent massive shift in the date when we anticipate the arrival of the Singularity.
