High-Performance Computing, Quantum Computing and Data Centers

Posted by Stephan Lam on January 23, 2025

Different data centers have different priorities. Some focus on the capacity to support growing numbers of users, applications, and devices. Others need highly efficient storage for data-intensive workloads. But in an increasing number of data centers, processing power is king.

High levels of computing power and speed are required to run advanced applications and solve complex problems. As applications such as deep learning and predictive analytics become more prevalent, more organizations are implementing high-performance computing (HPC) in their data centers. Researchers are also advancing the development of quantum computers that harness quantum mechanics to solve problems that are out of the reach of the most powerful supercomputers.

These technologies require different approaches to data center infrastructure. HPC systems generate large amounts of heat that cannot be dissipated efficiently with traditional air cooling. Quantum computers require supercooling to near absolute zero. Data center operators should keep these trends in mind as they plan for the future.

What Is High-Performance Computing?

HPC aggregates computing power to deliver significantly more performance than the typical computer. High-speed computers are clustered together to work in parallel. HPC clusters may include tens of thousands of nodes and deliver one million times the performance of traditional servers.
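
To make the parallelism concrete, here is a minimal sketch of the divide-and-combine pattern HPC clusters rely on, written with the mpi4py library. The process count and problem size are illustrative assumptions, and a real cluster would launch the job through a scheduler such as Slurm:

    # Minimal data-parallel sketch using MPI, the message-passing standard
    # most HPC clusters use. Requires mpi4py and an MPI runtime.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()  # this process's ID within the job
    size = comm.Get_size()  # total number of parallel processes

    # Each process sums its own slice of a large range of integers.
    n = 100_000_000  # illustrative problem size
    local_sum = sum(range(rank, n, size))

    # Combine the partial results on rank 0; the same pattern scales from
    # a handful of cores to tens of thousands of nodes.
    total = comm.reduce(local_sum, op=MPI.SUM, root=0)
    if rank == 0:
        print(f"Sum of 0..{n - 1:,} = {total:,}")

Launched with, say, mpirun -n 8 python hpc_sum.py (the file name is hypothetical), the work splits across eight processes; on a cluster, those processes spread across physical nodes.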

Traditionally, HPC involved multimillion-dollar supercomputers with thousands of processors, taking up thousands of square feet. These machines were usually found in research labs, government, and higher education, where scientists could use the compute power to tackle important issues in medicine, defense, energy, and other areas. HPC systems are still costly, but prices are coming down, and entry-level systems are available.

In certain use cases, HPC may ultimately save money by processing data faster and improving application performance. HPC systems can also enable powerful simulations that reduce the need for expensive physical models and repeated testing. HPC is used in several industries, including:

  • Aerospace
  • Automotive
  • Energy
  • Financial services
  • Healthcare
  • Government
  • Manufacturing

In any case, high-performance computing is evolving rapidly. In 2022, the Frontier supercomputer crossed the exascale barrier, meaning it could perform more than one quintillion (10^18) calculations per second. Just a couple of years later, the El Capitan supercomputer raised the bar even higher.

What Is Quantum Computing?

Quantum computing performs calculations using the quantum states of subatomic particles rather than electrical pulses. Traditional processors store bits of information as 0 or 1. Quantum computing is based on quantum bits, or qubits, which can represent 0 and 1 simultaneously, a concept called “superposition.” This built-in parallelism lets a quantum computer work across a vast number of possible states at once.
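
Superposition is easy to see in a toy state-vector simulation. The short sketch below, in plain NumPy, is a minimal illustration of the idea rather than how production quantum software is written:

    import numpy as np

    # A qubit's state is a 2-element complex vector: the amplitudes of |0> and |1>.
    zero = np.array([1, 0], dtype=complex)  # the classical "0" state

    # The Hadamard gate rotates |0> into an equal superposition of 0 and 1.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    psi = H @ zero

    # Measurement probabilities are the squared magnitudes of the amplitudes.
    print(np.abs(psi) ** 2)  # [0.5 0.5]: equally likely to read 0 or 1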

Entanglement means that two qubits are so closely correlated that neither can be described independently of the other, regardless of the distance between them. Measuring one entangled qubit instantly determines the result its partner will give. By combining entanglement with superposition, quantum algorithms can achieve exponential speedups over traditional computers on certain problems.
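
Extending the toy simulation above makes entanglement concrete. This sketch prepares a Bell state, the simplest entangled state of two qubits; again, it is an illustration, not production code:

    import numpy as np

    # Two-qubit state: four amplitudes, for |00>, |01>, |10>, and |11>.
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0  # start in |00>

    # Hadamard on the first qubit, then a CNOT, yields the Bell state
    # (|00> + |11>) / sqrt(2): the two qubits' outcomes always agree.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    I = np.eye(2, dtype=complex)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)
    state = CNOT @ np.kron(H, I) @ state

    print(np.abs(state) ** 2)  # [0.5 0. 0. 0.5]: only 00 or 11, never 01 or 10

Measuring either qubit immediately fixes the other's value; there is no outcome in which the two disagree.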

HPC and Quantum Computing in the Data Center

HPC systems have relatively low availability requirements but high power requirements and very high utilization. They generate intense heat and often create “hot spots.” The data center must have adequate floor space, power, and cooling capacity to handle HPC. Server racks must be able to support heavier equipment. Standard load ratings won't cut it. Furthermore, liquid cooling is often required, and legacy data centers may not be able to meet these requirements without major infrastructure changes.
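
A back-of-the-envelope calculation shows why air cooling runs out of headroom at HPC densities. The rack load and temperature-rise figures below are illustrative assumptions, not specifications from any particular deployment:

    # Rough airflow estimate for cooling one rack with air.
    rack_power_w = 40_000  # assumed HPC rack load, watts
    delta_t_f = 20         # assumed air temperature rise through the rack, deg F

    # Rule of thumb: CFM ~= 3.16 * watts / delta-T(F), derived from
    # Q(BTU/hr) = 1.08 * CFM * delta-T and 1 W = 3.412 BTU/hr.
    cfm = 3.16 * rack_power_w / delta_t_f
    print(f"~{cfm:,.0f} CFM of chilled air required")  # ~6,320 CFM

    # A traditional 5 kW rack under the same assumptions needs only ~790 CFM.
    # That roughly 8x jump in airflow is one reason liquid cooling takes over.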

Quantum computing has unique data center requirements. As a result, these systems are being deployed in specially designed data centers with cryogenic cooling equipment. As quantum computing evolves, it may become less dependent on this specialized infrastructure.

Both technologies have requirements that go well beyond traditional data center infrastructure, which represents a significant barrier to mass adoption. Building a data center to support quantum and high-performance computing requires massive investment. The solution we’ll likely see as these systems develop is an as-a-service model, and the hyperscale cloud providers are already exploring these concepts today.

Preparing Your Data Center for the Future

Data centers have a 20- or 30-year lifespan, so now’s the time to begin planning for the deployment of HPC and even quantum computing. Key components of that plan are the server cabinets used in your data center infrastructure.

The Enconnex InfiniRack data center cabinet is designed to take your data center into the future:

  • Load ratings of 4,000 pounds static and 3,000 pounds dynamic to handle high-density deployments
  • Four heavy-duty casters for easy transport and deployment
  • Maximum internal space with a low-profile frame design and dual integrated PDU channels
  • Highly efficient airflow management through cable brush panels, grommet-sealed cable openings, and optional air dam kits

The InfiniRack can be configured in virtually unlimited ways and ships fully assembled with all accessories installed to your specifications. One of our data center infrastructure specialists is ready to assist you with your order. Contact us today.


Stephan has over 15 years of IT experience, including all aspects of data center operations, project management, service delivery, and sales engineering.
