Quantum is smart. The progression toward embryonic quantum computing services represents a shift (you know what kind) in the amount of intelligence we can apply to computing, data analytics and reasoning, the like of which we have never seen. Without recounting a brief history of quantum again, let's assume we understand that binary computing is based upon 1s and 0s and that, to provide a seemingly infinite and broader plane of compute, quantum works with 1s, 0s and superpositions of every value in between.
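As a minimal sketch of that "every value in between" idea: a qubit's state can be written as a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. The numbers below are illustrative, not tied to any particular quantum hardware.

```python
import math

# A classical bit is 0 or 1. A qubit holds a superposition: a pair of
# complex amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1,
# where |alpha|^2 is the probability of measuring 0 and |beta|^2
# the probability of measuring 1.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)  # equal superposition

prob_zero = abs(alpha) ** 2
prob_one = abs(beta) ** 2

assert abs(prob_zero + prob_one - 1) < 1e-9  # amplitudes must be normalized
print(f"P(0) = {prob_zero:.2f}, P(1) = {prob_one:.2f}")  # P(0) = 0.50, P(1) = 0.50
```

Any other normalized pair of amplitudes is an equally valid state, which is the sense in which a qubit covers the whole continuum between 0 and 1 until it is measured.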
But with great power comes great responsibility and, in this case, greater fragility (a quantum state can be said to be brittle or fragile, mostly due to the speed at which machines at this level work) and greater potential for errors. It makes logical enough sense: all computing devices are prone to calculation errors as a result of architectural disconnects, misconfigurations or plain old bugs. The antidote in this case is known as quantum error correction.

Out Of Noisy NISQ

The use of quantum error correction techniques is supposed to help us get out of the Noisy Intermediate-Scale Quantum (NISQ) era.
NISQ is a term laid down by Caltech's John Preskill to describe the error rates that occur in quantum computers, often as a result of the "noise in quantum gates", which can be caused by factors ranging from thermal fluctuations to nearby mobile phones and cosmic rays. That is a story and a study in and of itself, so let's move on to quantum error correction in all haste. Transitioning beyond the noisy NISQ era and scaling quantum hardware to enable transformative applications will require quantum error correction.
This is the proposition made by Steve Brierley, founder and CEO of Riverlane, an organization that specialises in this exact practice. His firm’s Quantum Error Correction Report 2024 seeks to provide an understanding of the current state of QEC development and uncover some of the opportunities and challenges ahead through an analysis of public data from well-known hardware companies. “Solving quantum error correction is no longer a distant goal - it has become the fundamental cornerstone of quantum computing, and its implementation is rapidly expanding across the industry,” said Brierley.
"Recent developments in quantum error correction, combined with continued device improvements, have unlocked the next generation of quantum computers that will surpass classical computers within the next five years."

Planting Pivotal Steps

It's quite a statement to make. We did use the word embryonic in the first line of this tale and we know that public cloud and deeply embedded AI functions are still relatively formative and germinal, at the prototyping and experimental stage, for so many organizations... let alone the leap (apologies, there's that word) to quantum.
Regardless, Riverlane thinks it can lay down the "pivotal steps needed" to advance quantum computing from the experimental stages to real-world applications. To help us get there, Riverlane says it is building Deltaflow, a QEC computing stack that corrects quantum computer errors across every type of qubit, turning unstable physical qubits into error-free logical qubits. At Deltaflow's core is a quantum error decoder.
The company says that the essential error correction challenges are largely the same in all quantum computers; there are some differences between qubit types and the architectures of different quantum computers, but essentially QEC is a classic large-scale, real-time information processing task - only a very complex one due to the nature of quantum computers. "Deltaflow helps quantum computers accelerate their path to performance beyond the supercomputing threshold. Today, the world's best quantum computers can perform at most a few thousand reliable quantum operations (QuOps) before failure due to mounting data errors.
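Riverlane's actual decoder is far more sophisticated than anything shown here, but the underlying idea of turning many unreliable physical units into one more reliable logical unit can be illustrated with the three-bit repetition code, the classical ancestor of quantum error correction. This is a toy sketch, not Deltaflow's method.

```python
import random
from collections import Counter

# Toy illustration: encode one logical bit as three physical bits;
# a majority-vote decoder corrects any single bit-flip error.

def encode(bit):
    return [bit, bit, bit]

def add_noise(bits, p):
    # Flip each physical bit independently with probability p.
    return [b ^ 1 if random.random() < p else b for b in bits]

def decode(bits):
    # Majority vote: correct whenever at most one bit flipped.
    return Counter(bits).most_common(1)[0][0]

random.seed(0)
p = 0.05  # physical error rate
trials = 100_000
logical_errors = sum(decode(add_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"physical error rate: {p}, logical error rate: {logical_errors / trials:.4f}")
# The logical rate is roughly 3*p^2, well below the physical rate p.
```

Real QEC codes (such as the surface code) do the analogous thing with qubits, where the extra twist is that errors must be inferred from indirect "syndrome" measurements without ever reading the encoded quantum state directly.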
This must scale to millions of quantum operations (MegaQuOps) and ultimately trillions (TeraQuOps) for quantum computers to fulfil their vast potential," notes the company in its report. Deltaflow also includes an essential orchestration layer. This coordinates and synchronizes the complex correction cycle, which needs to happen very quickly - in real time - to prevent a backlog of errors building up and overwhelming the quantum computer.
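The real-time constraint on that orchestration layer can be made concrete with a simple queueing sketch: syndromes arrive every cycle, and if decoding takes longer than one cycle, unprocessed work accumulates without bound. The cycle and decode times below are made-up numbers for illustration only.

```python
# Toy model of the real-time requirement: if the decoder is slower
# than the syndrome-measurement cycle, a backlog builds up and
# eventually overwhelms the quantum computer.

def backlog_after(rounds, cycle_time_us, decode_time_us):
    backlog = 0.0  # microseconds of unprocessed decoding work
    for _ in range(rounds):
        # Each round adds one decode job and drains one cycle's worth of time.
        backlog = max(0.0, backlog + decode_time_us - cycle_time_us)
    return backlog

print(backlog_after(1000, 1.0, 0.8))  # decoder keeps up: backlog stays at zero
print(backlog_after(1000, 1.0, 1.2))  # decoder too slow: backlog grows every round
```

The asymmetry is the point: a decoder that is even slightly too slow falls further behind on every round, which is why QEC decoding is treated as a hard real-time systems problem rather than an offline computation.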
“Achieving one million error-free quantum operations marks a pivotal moment when quantum computers surpass the capabilities of any classical supercomputer. The quantum community is aiming for this next landmark goal, confident that QEC will achieve the MegaQuOp within two to three years. As quantum computers enter and then move beyond the MegaQuOp regime, the range and complexity of quantum applications will increase,” explains Brierley.
"Other challenges include improving QEC codes to reduce the 'QEC overhead' and integrating QEC with quantum control systems. Along with the difficulty of scaling and stabilising quantum systems, the ongoing process of engineering a dedicated QEC stack is a complex undertaking." Earl Campbell, VP of quantum science at Riverlane, underlines Brierley's comments, stating that five years ago we would not have believed the quantum experiments realized in the last 12 months were possible.
He admits that while some difficult engineering challenges remain, entering the QEC era means there's now little doubt that quantum computing will become a reality.

How Will We Quantify Quantum?

Entering the noise-free QEC era has implications. As this technology advances beyond previous generations, current performance metrics may no longer be adequate for comparing systems or capturing the complexities of increasingly fault-tolerant machines.
Future quantum technologies will need new metrics and perhaps even new application-specific benchmarks. Other challenges in the noise-free quantum era range from managing the complexity of large-scale quantum systems to ensuring equitable access to this powerful technology. Consensus-based international standards, which involve sharing global best practices, will have an important role to play.
"Quantum computing holds immense promise for tackling today's global challenges, but for everyone to benefit and for no country to be left behind we need international standards that operate across borders," says Gilles Thonet, deputy secretary-general of the International Electrotechnical Commission (IEC), an organization whose standards are used in almost 170 countries. Thonet contends that standards are already building a solid foundation for collaboration, starting with a common terminology that makes it possible to exchange information accurately. He also says that standards prioritize safety, address the concerns of society and are vital for ensuring the interoperability of quantum computing elements.
Quantum may also need new delivery platforms and software application development tools. Some commentators in the tech industry suggest that the most likely first manifestation of quantum in organizations will not be a buzzing new quantum machine installation in the company datacenter; more likely, it will be quantum-as-a-service delivered via the cloud computing model of service-based computing, as the major hyperscalers (and major vendors such as IBM) are among the first to deliver working machine services. Either way, quantum is coming, so pretty soon you'll need to know your bits from your qubits.
Riverlane Navigates Upstream On Quantum Error Correction