Hybrid computing is pushing control to the center of quantum progress
Quantum computing has entered a new era.
The field is no longer defined just by laboratory experiments or proof-of-concept demonstrations. Researchers and companies are working toward sustained, error-corrected workloads that must run reliably over long periods of time, and qubits are no longer the main bottleneck. As hybrid quantum-classical computing becomes central to real systems, it’s crucial to pay close attention to the system architecture – and, in particular, to the role of control.
“In any computer technology, control is a first-order problem,” says Yonatan Cohen, the CTO of Quantum Machines. “First, it determines how you interact with the device and how you program it. And second, it has significant implications for the scalability of the entire computer.”
Control determines how a quantum computer is programmed, how it interacts with classical resources, and how it scales. It is a defining part of the machine. In practice, this increasingly means enabling hybrid control flows, where the quantum control system can tightly integrate with classical accelerators such as GPUs and FPGAs to handle feedback, optimization, and decision-making in real time.
Why control can no longer be abstracted away
Take superconducting qubits as an example.
The size, power consumption, and cost of the control system all scale linearly with the number of qubits. Extrapolated to the scales required for fault-tolerant quantum computing, those trends lead to completely impractical systems. Addressing this is not a matter of incremental optimization; it requires orders-of-magnitude improvements in density, power efficiency, and cost, while still meeting analog performance requirements. “We really need to drive the cost down, decrease the power consumption and make things smaller. If we don’t do that, the entire quantum computer won’t be viable,” says Cohen. The control layer must be both extremely precise and deeply integrated with classical infrastructure.
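To make the scaling problem concrete, consider a back-of-the-envelope extrapolation. The per-qubit figures below are placeholders chosen purely for illustration, not measured values or vendor specifications; only the linear-scaling assumption comes from the discussion above.

```python
# Back-of-the-envelope extrapolation of control-system footprint.
# All per-qubit figures are illustrative placeholders, not real specs;
# the point is the linear scaling, not the absolute numbers.

WATTS_PER_QUBIT = 10.0        # hypothetical control power per qubit
DOLLARS_PER_QUBIT = 1_000.0   # hypothetical control cost per qubit
RACK_UNITS_PER_QUBIT = 0.05   # hypothetical physical footprint per qubit

for n_qubits in (100, 10_000, 1_000_000):
    power_kw = n_qubits * WATTS_PER_QUBIT / 1e3
    cost_musd = n_qubits * DOLLARS_PER_QUBIT / 1e6
    rack_units = n_qubits * RACK_UNITS_PER_QUBIT
    print(f"{n_qubits:>9,} qubits -> {power_kw:>8,.0f} kW, "
          f"${cost_musd:>7,.1f}M, {rack_units:>8,.0f} rack units")
```

Whatever the true per-qubit numbers are, linear scaling carries them unchanged into the million-qubit regime, which is why orders-of-magnitude improvements are needed rather than incremental gains.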
“First, we need control systems that are very reliable and stable. And very importantly, some of the qubits, as well as the control parameters in today’s quantum computers, drift. This can be detrimental to the performance of the computer and to the ability to perform quantum error correction at scale,” says Cohen. Qubit parameters change over time, as do the optimal control settings needed to operate them. This creates a serious obstacle for large-scale, error-corrected computation.
As qubit quality improves and researchers move from fairly short experiments to ever more sustained execution, it’s imperative that the control system improve in parallel. This is the main focus of Quantum Machines: to ensure that control doesn’t bottleneck progress in the field.
In addition to building stable, reliable electronics, it is therefore crucial to be able to recalibrate qubits and control parameters while the quantum program is running. Calibration must be embedded directly into the execution flow, with tight coupling between quantum hardware and classical processors that analyze data and make decisions on the fly. At Quantum Machines, this capability is supported by Qualibrate, a calibration and characterization platform designed to automate, orchestrate, and scale these runtime calibration workflows as systems grow in size and complexity.
That’s the essence of a hybrid control approach.
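As a rough illustration of what calibration embedded in the execution flow might look like, here is a minimal sketch. The drift model, thresholds, and function names are hypothetical stand-ins, not the Qualibrate API; the point is the structure, with recalibration interleaved into the program’s runtime rather than run as a separate offline pass.

```python
import random

DRIFT_THRESHOLD = 0.02   # hypothetical tolerance on a control parameter
STEPS = 50               # program slices in this toy run
CHECK_EVERY = 10         # interleave a drift check every N slices

def retune(qubit):
    """Placeholder for a fast, targeted recalibration routine."""
    qubit["detuning"] = 0.0

qubits = [{"detuning": 0.0} for _ in range(4)]

for step in range(1, STEPS + 1):
    # ... the next slice of the quantum program would execute here ...
    for q in qubits:                      # simulated slow parameter drift
        q["detuning"] += random.gauss(0.0, 0.004)

    if step % CHECK_EVERY == 0:           # inline calibration checkpoint
        for i, q in enumerate(qubits):
            if abs(q["detuning"]) > DRIFT_THRESHOLD:
                retune(q)                 # fix drift mid-run, on the fly
                print(f"step {step}: retuned qubit {i}")
```

In a real system the checkpoint decisions and the retuning would be made by classical processors tightly coupled to the controller, so that the checks add minimal latency to the program itself.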
“These real-time calibrations that we’ve been enabling were demonstrated by many of our academic partners as proofs of concept over the years,” says Cohen. “And now they’re starting to also be adopted by the large-scale commercial efforts. For example, Diraq is relying on these capabilities to calibrate and stabilize their quantum processors. And it’s really, really great to see.”
Control as the enabler of hybrid quantum-classical loops
Crucially, the same infrastructure required for runtime calibration is also required for quantum error correction. Syndrome extraction, decoding, and feedback must happen fast enough to keep pace with errors as they occur. “This capability is the same capability that we need to do the error correction decoding itself,” stresses Cohen.
In this sense, hybrid computing is not an optional enhancement but a structural necessity. Without low-latency classical processing tightly integrated into the control loop, large-scale error correction is simply not practical. Control systems must offer both flexibility, because algorithms and techniques are still evolving, and extremely high performance, because timing margins grow ever tighter at scale. This tight loop increasingly spans the quantum controller and external classical accelerators, which must process data and return results fast enough to keep pace with real-time error correction and control requirements.
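To see why latency dominates, consider the shape of the loop itself. The sketch below is schematic: the decoder is a trivial stand-in, and the microsecond budget is an illustrative figure for superconducting platforms rather than a specification.

```python
import time

CYCLE_BUDGET_S = 1e-6   # illustrative ~1 µs syndrome cycle; real budgets
                        # depend on the platform and the code being run

def extract_syndrome():
    """Placeholder: measure stabilizers, return a syndrome bitstring."""
    return 0b0110        # simulated measurement outcome

def decode(syndrome):
    """Placeholder decoder: real systems offload this step to FPGAs or
    GPUs sitting next to the controller."""
    return syndrome      # trivial stand-in 'correction'

def apply_correction(correction):
    """Placeholder: feed the correction back into the control flow."""
    pass

# Extraction, decoding, and feedback must all fit inside the cycle
# budget; otherwise errors accumulate faster than they are corrected.
for cycle in range(5):
    t0 = time.perf_counter()
    apply_correction(decode(extract_syndrome()))
    elapsed = time.perf_counter() - t0
    status = "OK" if elapsed <= CYCLE_BUDGET_S else "over budget"
    print(f"cycle {cycle}: {elapsed * 1e6:.2f} µs ({status})")
```

Even this trivial loop, run in interpreted Python on a commodity CPU, will struggle to meet a microsecond budget, which is exactly why decoding and feedback are pushed onto accelerators tightly coupled to the controller rather than into general-purpose software.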
As control systems and qubit technologies improve, the balance of challenges in the field is beginning to shift. While qubits remain difficult to build and operate, recent progress has made large-scale quantum computers feel increasingly plausible. “The amazing progress on qubits is making me very optimistic,” says Cohen.
Still, that optimism highlights a new concern: the relative lack of mature algorithms and compelling use cases that can take advantage of hybrid quantum-classical systems. Many of the most promising applications are expected to rely on tight integration between quantum processors and classical HPC rather than on purely quantum execution. Developing such algorithms is challenging and requires close collaboration across disciplines. This again underscores the importance of control: hybrid algorithms depend on fast, deterministic interaction between quantum and classical components, and the control system is where that interaction happens.
Hybrid computing is the dominant paradigm
Given these trends, it is increasingly clear that hybrid quantum-classical computing will define the next stage of the field. Rather than replacing classical computation, quantum processors will act as specialized accelerators within larger workflows. Control systems are what make this possible. “The hybrid approach is definitely key,” says Cohen. “We believe that some of these use cases are going to be hybrid use cases, where the quantum processor and classical high-performance compute resources have to work together to bring us quantum advantage.”
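As a concrete, if simplified, picture of a quantum processor acting as an accelerator inside a classical workflow, here is a sketch of a variational-style hybrid loop. The run_on_qpu function is a hypothetical stand-in, simulated classically here; in a real deployment it would submit a parameterised circuit through the control system and return measured expectation values.

```python
import math
import random

def run_on_qpu(params):
    """Hypothetical quantum step: in a real hybrid workflow this would
    execute a parameterised circuit on the QPU and return a measured
    expectation value. Simulated with a simple cost landscape here."""
    return sum(math.sin(p) ** 2 for p in params)

def classical_update(params, step=0.1, eps=1e-3):
    """Classical optimizer step (finite-difference gradient descent),
    the kind of work that runs on CPUs/GPUs alongside the controller."""
    base = run_on_qpu(params)
    grads = []
    for i in range(len(params)):
        shifted = list(params)
        shifted[i] += eps
        grads.append((run_on_qpu(shifted) - base) / eps)
    return [p - step * g for p, g in zip(params, grads)]

# Tight quantum-classical loop: every optimizer iteration calls back
# into the (simulated) QPU, so interface latency sets the overall pace.
params = [random.uniform(-math.pi, math.pi) for _ in range(4)]
for _ in range(200):
    params = classical_update(params)
print(f"final cost: {run_on_qpu(params):.4f}")
```

The structure is the point: the optimizer and any heavier post-processing run on classical compute resources, while the QPU is invoked inside the loop, so the speed and determinism of the quantum-classical interface set the pace of the whole computation.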
Of course, it’s also important that novel quantum algorithms are developed in parallel, to help unlock the full potential of ever-faster maturing quantum computing technology. “We really don’t have enough algorithms and use cases,” says Cohen. “And I believe that there are not enough people working on this challenge. This is really what the community now needs, in order to fulfill the entire vision of a large-scale quantum computer that also does useful things for humankind.”
This perspective has important implications for system design. Control platforms must be modular enough to support different qubit modalities, yet, over time, unified enough to present hardware-agnostic interfaces to higher-level software. They must integrate seamlessly with classical compute resources while maintaining the precise timing and signal quality required by quantum hardware. Because of its position between qubits and software, the control layer is also a natural integration point for the broader quantum ecosystem: it connects hardware vendors, software stacks, error-correction frameworks, cloud platforms, and HPC providers into a single operational system.
“On the one hand, the control system needs to talk to different qubits,” says Cohen. “And on the other hand, control is the integration point to the rest of the stack – the software stack, the HPC and so on. That makes the position of Quantum Machines very important to really work with the entire community and make sure that we’re developing the right interfaces and the right technology for the community so that we can move faster in the roadmap.”