IBM Advances Viable Quantum Computing One Big Step with Error-Correcting Codes


Quantum computing is one of several game-changing technologies IBM is driving toward, alongside generative AI, and like generative AI, it is years from being ready. One of the elements that will delay the use of quantum computers is error correction, a critical process for assuring the accuracy of the results these future machines will provide.

Error correction was believed to be a distant advancement that would arrive only after quantum computers were far more viable than they are today. But IBM broke that assumption with an August announcement indicating it has found a path that could deliver the technology years before the effort was even expected to begin, let alone be ready.

Let’s talk about why IBM’s error-correcting codes for near-term quantum computers are critical to the timely creation of this technology and should accelerate the pace at which quantum computers come to market.

Dependencies

With any new technology, there are critical dependencies. If those dependencies are not addressed, you will have problems, potentially big ones, and the biggest are likely to involve quality. We’re seeing that play out today with generative AI: systems that still lack effective error-correction technology are less trusted now than they were before their release.

Quality is always important in any product, but in a product that could be used to plan future spaceflight, predict weather that can create catastrophic events, identify threats both terrestrial (climate change) and extraterrestrial (asteroids), or assess risks of catastrophic failure (nuclear power plants), data accuracy is not just important; it may mean the difference between life and death.

Thus, the safe deployment of any advanced technology is predicated on the creation of a quality measurement and assurance tool that can ensure the answers these systems provide are reliably accurate.

Quantum computing’s unique problem and IBM’s solution

Traditional error correction on classical computers is comparatively simple: you are dealing with bit-flip errors, where bits improperly change state from 1 to 0 or vice versa. Quantum computers are far more complex. Their error correction, which must itself run on quantum hardware, has to handle phase errors in addition to bit flips and correct them without corrupting the extra information that qubits carry, since directly measuring a qubit destroys its state.
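To make that contrast concrete, here is a minimal sketch of the classical case, the simple baseline that quantum codes must go far beyond. This is purely illustrative and is not IBM’s method: it shows a three-bit repetition code with majority voting, and the closing comments note why the same trick cannot simply be copied over to qubits.

    # Illustrative sketch (not IBM's code): a classical 3-bit repetition code.
    # Classical error correction only has to handle bit flips (0 <-> 1);
    # redundant copies plus a majority vote recover the original bit.

    def encode(bit: int) -> list[int]:
        """Triplicate one bit so a single flip can be outvoted."""
        return [bit, bit, bit]

    def correct(received: list[int]) -> int:
        """Majority vote: tolerates any single bit-flip error."""
        return 1 if sum(received) >= 2 else 0

    # Simulate a single bit-flip error on the middle copy.
    codeword = encode(1)   # [1, 1, 1]
    codeword[1] ^= 1       # noise flips one bit -> [1, 0, 1]
    assert correct(codeword) == 1

    # A quantum code cannot work this way: qubits cannot be copied
    # (the no-cloning theorem) or read directly without destroying
    # their state, and phase errors have no classical analog.
    # Real quantum codes instead measure parity "syndromes" on
    # ancilla qubits to locate errors without touching the data.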

The effort must also work across large numbers of qubits without demanding all-to-all connectivity, because at scale such connectivity is impractical. You do not want to have to spin up a larger quantum computer just to identify and correct the problems of a smaller one. So, the approach must be extremely focused.

IBM’s approach is focused, doesn’t compromise the information in the qubits as it corrects them, can be implemented with a small quantum computer, and should still meet or exceed the data-quality requirements of IBM’s customers.

Wrapping up

From the smallest and least powerful computer to the largest and most powerful, accuracy is critical to trusting the results. Because quantum computers will be used on the biggest and most critical of the world’s problems, they will need an even higher level of quality assurance. Yet due to the size of the data sets, the complexity of the process, and the unique nature of quantum computing, assuring that quality was believed to be nearly impossible.

IBM, one of the market leaders in quantum computing, has figured a way out of this bind by creating an error-correcting solution that is more affordable, easier to implement, and acceptably comprehensive, while providing the level of quality the market will demand.

This move is a testament to IBM’s quantum leadership and highlights yet again why IBM’s historical focus on quality, integrity, and exceeding customer expectations makes it the company at the forefront of practical quantum computing today.

Rob Enderle: As President and Principal Analyst of the Enderle Group, Rob provides regional and global companies with guidance in how to create credible dialogue with the market, target customer needs, create new business opportunities, anticipate technology changes, select vendors and products, and practice zero dollar marketing. For over 20 years Rob has worked for and with companies like Microsoft, HP, IBM, Dell, Toshiba, Gateway, Sony, USAA, Texas Instruments, AMD, Intel, Credit Suisse First Boston, ROLM, and Siemens.
