Nu Quantum CEO predicts encryption-breaking machines will arrive by 2033

The Infrastructure Gap in the Quantum Race

For decades, the narrative surrounding quantum computing has been one of isolated, cryogenic monoliths. We have obsessed over increasing the qubit count on a single chip, effectively trying to build the ultimate mainframe before we even have a network. But Carmen Palacios-Berraquero, the founder and CEO of Nu Quantum, is flipping that script. She is betting the future of the industry on the networking layer—the "connective tissue" that will transform individual quantum processing units (QPUs) into a distributed, scalable powerhouse. This is the same evolutionary leap that took classical computing from the basement of research labs to the global cloud infrastructure that dominates the economy today.

Nu Quantum isn't just another hardware play; it is a category creator. By focusing on the entanglement fabric—the ability to link processors through photons—Palacios-Berraquero is addressing the single biggest bottleneck in the field. Without networking, quantum computers are limited by the physical constraints of a single cooling environment. With it, we enter the era of the quantum data center. This shift from a "bigger chip" mentality to a "networked cluster" approach is what will finally bring quantum out of the academic realm and into the commercial market.

Replicating the Cloud Architecture for Entanglement

The parallels between what Nu Quantum is building and the existing classical cloud stack are striking. In the classical world, we use Network Interface Cards (NICs), routers, and orchestration layers to allow thousands of GPUs to work as one. In the quantum realm, the components are fundamentally different but the logic is identical. You need an interface to convert static qubit information into traveling photons, a quantum networking unit (QNU) for switching, and a control layer to manage the "entanglement fabric."

This isn't just an academic exercise in networking. It is the only viable path to scale. Palacios-Berraquero notes that while we currently celebrate machines with dozens of qubits, the industry needs thousands, if not millions, of error-corrected qubits to solve world-changing problems. Attempting to cram all those qubits onto one chip is a scientific nightmare. However, by using a "stamp and repeat" method—taking reliable, smaller processors and networking them together—the industry shifts from a scientific roadmap dependent on Nobel-worthy breakthroughs to an engineering roadmap driven by iterative refinement. This is how you build a trillion-dollar industry: you stop looking for miracles and start optimizing the interconnects.
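The arithmetic behind that engineering roadmap is worth spelling out. The figures below are hypothetical round numbers chosen for illustration, not published specifications, but they show why networking identical modules beats building one giant chip.

```python
import math

# Hypothetical round numbers for illustration only.
physical_per_module = 1_000   # physical qubits on one reliable, small chip
overhead = 100                # assumed physical-to-logical error-correction ratio
target_logical = 10_000       # logical qubits for world-changing workloads

logical_per_module = physical_per_module // overhead       # 10 per module
modules_needed = math.ceil(target_logical / logical_per_module)

print(f"Logical qubits per module: {logical_per_module}")  # 10
print(f"Modules to network:        {modules_needed}")      # 1000
```

Under these assumptions, stamping out a thousand copies of a proven chip and interconnecting them is an iteration problem; fabricating a single million-qubit processor is not.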

The Engineering Pivot and the Logical Era

In 2018, Nu Quantum began by developing component-level devices at the University of Cambridge. But a visionary founder knows when to pivot. Palacios-Berraquero realized that the real value wasn't in selling individual components, but in building the entanglement fabric that connects them.
