For those who are old enough to remember, think of the major leap that moving from vacuum tubes to transistors and microprocessors was for the computing industry. Now also think about the massive jump between the processing power of your home computer and the capabilities of your smartphone or watch. But can you imagine massive computing power at the subatomic level that can operate one million times faster (for certain applications) than any device you have in your office or home? That’s the promise of quantum computing.
The potential impact of quantum computing on technologies such as security and artificial intelligence will enable new applications and capabilities in fields such as healthcare and pharmaceuticals, financial services, and aerospace and defense. However, a lot must still happen in quantum computing development, in how quantum computers operate and communicate with each other, and in how new quantum-native algorithms get developed, before any of its practical applications can see the light of day.
A broader definition of quantum computing
A quantum computer uses a quantum bit, or “qubit,” as its fundamental computing unit, just as bits are used in today’s computing. However, where a traditional bit is always in exactly one of two states, “one” or “zero,” a qubit can exist in a superposition of both, so a register of n qubits can represent 2 to the power of n states at the same time. By leveraging this massive parallelism, you can do a lot of things simultaneously using qubits, such as modeling drug interactions for all 20,000-plus proteins encoded in the human genome.
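The exponential state space described above can be illustrated with a few lines of plain Python (a simplified sketch, not a real quantum simulator): after putting each of n qubits into an equal superposition, the register carries amplitude on all 2^n basis states at once, while a classical n-bit register holds exactly one of those values.

```python
import math

def uniform_superposition(n):
    """Statevector of n qubits in an equal superposition:
    all 2**n basis states share the same amplitude 1/sqrt(2**n)."""
    dim = 2 ** n
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

# A classical 3-bit register stores one of 8 values;
# 3 qubits can carry amplitude on all 8 values simultaneously.
state = uniform_superposition(3)
print(len(state))                             # 8 basis states
print(round(sum(a * a for a in state), 10))   # probabilities sum to 1.0
```

Doubling the qubit count squares the number of representable states, which is why moving from tens to hundreds of qubits matters so much.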
The challenges of bringing quantum computing to market
There are many challenges to quantum computing that are pushing out the availability of scalable quantum computing systems and applications. For example:
1. Keeping larger numbers of qubits stable for longer periods of time
We need a lot of qubits to solve real, meaningful problems, which means moving from single-digit qubits on a single chip to tens and potentially hundreds of qubits. Today, Intel has verified package designs and fabrication on 17- and 49-qubit chips. But thousands of qubits will be needed to build meaningful applications.
In addition, qubits are very sensitive to temperature and operating conditions in and around the quantum computer, causing their state to change in just a matter of microseconds. As a result, qubits need to be super cooled to remain stable and operate. Furthermore, sophisticated error correction codes will be required to account for errors caused by unstable qubits.
2. Interconnecting quantum computers
Research into methods for interconnecting quantum computers is in its early stages. Currently, there is no “quantum network” for connecting quantum computers outside of research labs. Initially, existing optical networking technology can be leveraged to connect quantum computers across WAN distances using a combination of classical networking gear and quantum communication routers. For LAN distances with a line of sight between the quantum computers, you can employ both fiber optic networks and free space networks directly between quantum networking equipment. Free space networks use light propagating in free space to wirelessly transmit data for telecommunications or computer networking, rather than a physical solid medium like fiber optic cable. A lot of qubits will be required to implement error correction over long distances between quantum computers.
3. Creating quantum computing design algorithms
Because a quantum computer can be in many states at the same time, algorithms must be fundamentally redesigned to take advantage of it. In simple terms, one can think of designing massively parallel algorithms using this new computing model. Many types of search algorithms can be sped up using quantum algorithms. For example, big data search and sorting (Grover’s algorithm), AI optimization (National University of Singapore’s algorithm) and complex decoding for cybersecurity (Shor’s algorithm) are a few sample native quantum computing algorithms.
Due to these complex issues, we are realistically another 7-10 years away from having quantum computing systems and applications that solve meaningful problems. Initially we will have hybrid computers that combine classical and quantum computers. A number of commercial vendors, such as D-Wave, Google, Intel and IBM, as well as research and development communities, are trying to accelerate that timeline. We also anticipate that the first quantum computing solutions will be provided as a service by major IaaS players such as Google and IBM to make them more accessible to a greater number of businesses.
The practical applications of quantum computing
Morgan Stanley predicts the market for quantum computing could double over the next decade to $10 billion. This growth is mainly driven by the stakeholders of the quantum computing market, but also will be stimulated by the growing use of quantum cryptography for security applications.
Quantum computing will become increasingly essential as we reach the limits of Moore’s law. We need to find new computing types and approaches to more quickly resolve certain problems that take too long to solve using traditional computing. For example, as we start to build more complex AI models, we will need more and more computing power to find the solution to a problem in a huge “search space,” where there are many options to consider. To equate this to a game of chess, quantum computing helps AI applications think thousands of moves ahead all at once.
Quantum computing is already getting a lot of attention from the private and public sectors in some key markets:
- Security: When it comes to security, quantum computing can be a double-edged sword. On the positive side, it can help create systems that are fortified against quantum cyberattacks. For example, an enterprise could deploy quantum cryptographic key distribution to protect its customer data. But it can also potentially help bad actors crack vulnerable security and encryption systems.
- Healthcare: In the areas of personalized medicine and drug development, healthcare and pharmaceutical companies could use quantum computing to model complex molecular interactions, such as simulating chains of chemical reactions to create new ways to cure cancer.
- Energy: Increased data analysis can help companies better optimize oil and gas extraction processes and improve real-time monitoring of their equipment to reduce accidents.
- Financial services: Brokerage firms can reinvent data analytics to come up with the best algorithms to develop new forms of portfolio optimization and risk management.
- Aerospace: Commercial aerospace and defense industries can develop more efficient aircraft navigation patterns by calculating multiple simulations based on various traffic scenarios and weather conditions.
How we see Equinix playing a role in this emerging paradigm
Initially, quantum computing will not be a commodity and will be primarily offered as a service by the large hyperscalers. Thus, many enterprises will use these services in the cloud and, in many cases, will want to keep control over their data or will want to leverage quantum computing innovation from multiple cloud providers. Since Equinix data centers are strategically located close to many hyperscaler data centers (more than 2,900 cloud and IT service providers have their edge nodes at Equinix), we are strategically positioned to extend newer generations of our Equinix Cloud Exchange Fabric™ services to connect user data with the quantum compute operations in multiple clouds.
Today, the raw data generated by Internet of Things (IoT) devices gets moved to the core clouds (AWS, Microsoft Azure, Google, IBM), where the AI models get built, and then these models are pushed to the edge, where they are used for real-time applications. However, in the future, with massive amounts of data being produced at the edge (due to billions of IoT devices), there will not be enough bandwidth available to move data to the core hyperscaler clouds to create AI models using quantum computing. Instead, you would need to build AI models locally at the edge (near the IoT devices) and then send the local AI models to the core data center to build more accurate global AI models. Equinix data centers are ideally suited to host quantum computing for creating these local models due to their presence in most of the major global markets (200 data centers in 52 metros across the world).
It is important to note that quantum computing is not a panacea for all types of computing problems. Traditional computers will continue to be efficient in providing the necessary compute power to meet most needs. Still, as the introduction of microprocessors radically changed everything we once knew or imagined about traditional computing, the potential outcomes of quantum computing promise to exceed our wildest expectations.
Read about our global interconnection platform by downloading our Platform Equinix Vision paper.