What is the role of liquid cooling connectors in AI data centers?

By Aharon Etengoff | February 26, 2025

Artificial intelligence (AI) and machine learning (ML) applications consume significant power and generate considerable heat in data centers. High-performance AI accelerators — such as graphics processing units (GPUs), tensor processing units (TPUs), and application-specific integrated circuits (ASICs) — increasingly require more efficient cooling methods to maintain safe and optimal thermal operating levels.

This article discusses the growing energy demands of AI and ML and explores the rise of liquid cooling for these high-performance workloads. It also reviews key design requirements for liquid-cooling connectors and highlights evolving industry standards formulated by the Open Compute Project (OCP).

The increasing energy demands of AI and ML

Accounting for 10% to 20% of all energy consumed in US data centers (Figure 1), AI-driven applications are considerably more power-intensive than many conventional workloads. For example, a ChatGPT query draws ten times more energy than a standard Google search. As computational power requirements for AI model training double every nine months, data centers may soon consume as much energy as entire countries.

Figure 1. An inside look at a data center showcasing extensive server infrastructure. (Image: DataCenterKnowledge)

With thermal design power (TDP) requirements reaching 1,500 W and average rack power increasing from 8.5 kW to 12 kW, effective cooling systems are critical to maintaining optimal data center temperatures of 70 to 75°F (21 to 24°C). Cooling infrastructure now accounts for approximately 40% of total energy consumption in some facilities, prompting organizations such as The Green Grid to develop a Liquid Cooling Total Cost of Ownership Calculation Tool (tggTCO).
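To make these figures concrete, the short sketch below estimates the heat load of a hypothetical liquid-cooled AI rack from its accelerator TDPs and the cooling overhead implied by a roughly 40% cooling share. The accelerator count and the non-accelerator IT load are illustrative assumptions, not figures from this article.

```python
# Rough heat-load estimate for a hypothetical liquid-cooled AI rack.
# The 1,500 W TDP and ~40% cooling share come from the text above; the
# accelerator count and "other IT load" are illustrative assumptions.

ACCELERATOR_TDP_W = 1_500      # per-accelerator thermal design power
ACCELERATORS_PER_RACK = 6      # assumed count for this sketch
OTHER_IT_LOAD_W = 2_000        # assumed CPUs, memory, NICs, storage

it_load_w = ACCELERATOR_TDP_W * ACCELERATORS_PER_RACK + OTHER_IT_LOAD_W
print(f"IT heat load per rack: {it_load_w / 1000:.1f} kW")

# If cooling accounts for ~40% of total facility energy, then roughly
# 0.4 / 0.6 watts of cooling energy accompany every watt of IT load.
cooling_share = 0.40
cooling_overhead_w = it_load_w * cooling_share / (1 - cooling_share)
print(f"Implied cooling energy: {cooling_overhead_w / 1000:.1f} kW")
print(f"Total per rack: {(it_load_w + cooling_overhead_w) / 1000:.1f} kW")
```

Under these assumptions, an 11 kW rack carries roughly 7 kW of additional cooling energy, which illustrates why reducing the cooling share is such a significant efficiency lever.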

The rise of liquid cooling for AI and ML workloads

Many liquid cooling systems circulate dielectric fluids or water-based solutions through pipes or channels placed near or directly on components like GPUs. This process effectively dissipates thermal buildup in data centers running a wide range of high-performance AI and ML applications, large language models (LLMs), and training workloads. These fluids offer superior thermal conductivity and greater heat transfer capacity than traditional air cooling, fan-based systems, or passive heat sinks.
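To put the claim of superior heat transfer capacity in perspective, the back-of-the-envelope sketch below compares the volumetric heat capacity of water and air using standard room-temperature property values; the numbers are textbook figures, not measurements from this article.

```python
# Volumetric heat capacity (density * specific heat) tells you how much
# heat a given volume of coolant can carry per degree of temperature rise.
# Property values are standard figures at roughly room temperature.

water_density = 997.0        # kg/m^3
water_cp = 4_186.0           # J/(kg*K)

air_density = 1.2            # kg/m^3
air_cp = 1_005.0             # J/(kg*K)

water_vol_cp = water_density * water_cp   # J/(m^3*K)
air_vol_cp = air_density * air_cp         # J/(m^3*K)

print(f"Water: {water_vol_cp / 1e6:.2f} MJ/(m^3*K)")
print(f"Air:   {air_vol_cp / 1e3:.2f} kJ/(m^3*K)")
print(f"Ratio: ~{water_vol_cp / air_vol_cp:,.0f}x per unit volume")
```

By this measure, a water-based coolant carries on the order of a few thousand times more heat per unit volume than air, which is why liquid loops can handle dense racks that air systems struggle to cool.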

Figure 2. Data center immersion-cooling system with rack-mounted service rails for easy maintenance and hot swaps. (Image: GreenRevolutionCooling)

Data centers typically implement liquid cooling using two primary methods: cold plate and immersion cooling (Figure 2). Cold plate cooling circulates dielectric coolant over or near the hottest components, delivering high performance at the chip level yet still relying on supplemental air cooling to dissipate residual heat. As rack densities increase, cold plate liquid cooling scales more efficiently than stand-alone air-cooling systems, which often struggle to dissipate heat from densely packed equipment.

Immersion cooling further improves energy efficiency by significantly reducing the use of auxiliary fans and by capturing nearly 100% of generated heat for dissipation or reuse. This cooling method, however, often requires new facility designs, structural modifications, and upgraded or new power distribution systems.

Precision liquid cooling, which occupies a middle ground between cold plate and immersion cooling, uses minute amounts of dielectric coolant to target the hottest components and effectively cool the entire system. This hybrid method, which eliminates water use for cooling, can reduce energy consumption by up to 40%.

Key performance requirements for liquid cooling connectors

When designing liquid-cooled AI systems, data center architects select connectors that meet key performance requirements, such as withstanding coolant temperatures up to 50°C (122°F), handling flow rates up to 13 liters per minute (LPM), and keeping pressure drops around 0.25 psi.
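As a quick sanity check on those connector-level figures, the sketch below estimates how much heat a 13 LPM coolant loop can carry, assuming a water-like fluid and a 10°C temperature rise across the cold plate; the temperature rise and fluid properties are assumptions made for illustration.

```python
# Heat carried by a coolant loop: Q = rho * V_dot * cp * dT.
# The 13 LPM flow rate comes from the text above; the 10 K temperature
# rise and water-like fluid properties are illustrative assumptions.

FLOW_LPM = 13.0              # liters per minute
DELTA_T_K = 10.0             # assumed coolant temperature rise
RHO = 997.0                  # kg/m^3, water-like coolant
CP = 4_186.0                 # J/(kg*K)

flow_m3_per_s = FLOW_LPM / 1000.0 / 60.0
heat_w = RHO * flow_m3_per_s * CP * DELTA_T_K
print(f"Heat removed at {FLOW_LPM} LPM, dT = {DELTA_T_K} K: {heat_w / 1000:.1f} kW")
```

Under these assumptions, a single 13 LPM loop moves roughly 9 kW of heat, enough for several 1,500 W accelerators on one cold-plate circuit.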

Figure 3. Data center infrastructure submerged in a liquid cooling solution. (Image: AKCP)

Additionally, these connectors must ensure easy serviceability and compatibility with water-based or dielectric fluid mixtures (Figure 3), preventing corrosion and leaks. Liquid cooling connectors should also integrate seamlessly with in-rack manifolds and existing cooling infrastructure.

Additional key liquid cooling connector features include:

  • Quick disconnect: facilitates easy, dripless connection and disconnection for routine maintenance and emergency access in AI and ML data centers.
  • Large diameter: accommodates high flow rates, typically with a 5/8-inch inner diameter for server cooling in AI racks.
  • Thermal resistance: optimizes heat transfer by reducing thermal resistance, which is critical for cooling efficiency.
  • Manifold compatibility: aligns fluid connectors with three-inch square stainless-steel tubing for optimized coolant distribution.
  • Hybrid designs: combine high-speed data transfer and liquid cooling channels for AI systems.
  • Rugged designs: ensure durability and prevent leaks in challenging conditions, such as fluctuating temperatures, abrupt pressure drops, and strong vibrations.

Many companies, such as CPC (Colder Products Company), Koolance, Parker Hannifin, Danfoss Power Solutions, and CEJN, offer liquid cooling connectors for high-performance AI workloads in the data center. These manufacturers provide various quick disconnect fittings, couplings, and other components designed to manage thermal efficiency.

Evolving industry standards for liquid-cooling connectors

Industry organizations like the Open Compute Project (OCP) are developing open standards for liquid cooling connectors in data centers. The evolving OCP Large Quick Connector Specification outlines a universal quick connect, with standardized interface dimensions and performance requirements.

These include a working pressure of 35 psi at 60°C, a maximum operating pressure of 175 psi (12 bar), a flow rate of over 100 liters per minute (LPM), and ergonomic designs limiting mating torque to less than 5 Nm. Connectors must also handle temperatures from -4°F to 140°F (-20°C to 60°C), with shipping ranges of -40°F to 158°F (-40°C to 70°C). Additional criteria specify fluid loss under 0.15 mL per disconnect and a service life of at least 10 years of continuous use.
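As a sketch of how such a specification might be applied during connector selection, the snippet below checks a hypothetical candidate connector against the OCP figures quoted above. The data structure, field names, and candidate values are invented for illustration and are not part of the OCP specification itself.

```python
# Minimal sketch: screening a candidate quick connector against the OCP
# Large Quick Connector figures quoted above. The ConnectorSpec structure
# and candidate values are hypothetical, not an official OCP tool.
from dataclasses import dataclass

@dataclass
class ConnectorSpec:
    working_pressure_psi: float
    max_pressure_psi: float
    flow_rate_lpm: float
    mating_torque_nm: float
    min_temp_c: float
    max_temp_c: float
    fluid_loss_ml: float          # per disconnect
    service_life_years: float

# Requirements taken from the OCP figures in the text above.
OCP_REQUIRED = ConnectorSpec(35, 175, 100, 5, -20, 60, 0.15, 10)

def meets_ocp(c: ConnectorSpec) -> bool:
    """True if the candidate meets or exceeds every OCP figure."""
    return (
        c.working_pressure_psi >= OCP_REQUIRED.working_pressure_psi
        and c.max_pressure_psi >= OCP_REQUIRED.max_pressure_psi
        and c.flow_rate_lpm >= OCP_REQUIRED.flow_rate_lpm
        and c.mating_torque_nm <= OCP_REQUIRED.mating_torque_nm
        and c.min_temp_c <= OCP_REQUIRED.min_temp_c
        and c.max_temp_c >= OCP_REQUIRED.max_temp_c
        and c.fluid_loss_ml <= OCP_REQUIRED.fluid_loss_ml
        and c.service_life_years >= OCP_REQUIRED.service_life_years
    )

# Hypothetical candidate connector, for illustration only.
candidate = ConnectorSpec(40, 200, 120, 4.5, -25, 65, 0.10, 12)
print("Meets OCP figures:", meets_ocp(candidate))
```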

Summary

High-performance AI accelerators increasingly require efficient cooling to maintain safe, optimal thermal levels in data centers. Liquid cooling systems, which circulate dielectric fluids or water-based solutions near or directly on GPUs and TPUs, provide superior thermal conductivity and capacity compared to traditional air cooling, fan systems, or passive heat sinks. Liquid cooling connectors, designed for demanding environments, must withstand temperatures up to 50°C (122°F), handle flow rates up to 13 LPM, and maintain pressure drops around 0.25 psi.

Related EE World Online content

How Are High-Speed Board-to-Board Connectors Used in ML and AI Systems?
Driving Standards: The Latest Liquid-Cooling Cables and Connectors for Data Centers
Where Are Liquid Cooled Connectors and Connectors for Liquid Cooling Used in EVs?
Where Are Liquid-Cooled Industrial Connectors Used?
Liquid Cooling For High-Performance Thermal Management

References

The Basics of Liquid Cooling in AI Data Centers, FlexPower Modules
High-Power Liquid Cooling Design: Direct-to-Chip Solution Requirements for 500-kW Racks, ChillDyne
Six Things to Consider When Introducing Liquid Cooling Into Your Data Center, Data Center Dynamics
Harnessing Liquid Cooling in AI Data Centers, Power Electronics News
How AI Is Fueling a Boom in Data Centers and Energy Demand, Time
Cooling the AI Revolution in Data Centers, DataCenterFrontier
Data Center Cooling: The Unexpected Challenge to AI, Spectra
Supporting AI Workloads: The Future of Data Center Cooling, DataCenterPost
The Advantages of Liquid Cooling, Data Center Frontier
Answering the Top FAQs on AI and Liquid Cooling, Schneider Electric Blog
Large Quick Connector Specification, Open Compute Project
How Immersion Cooling Helps Reduce Operational Costs in Data Centers, GRC

