"We strive to deliver the highest level of product and service so that when customers choose CoolIT, they know they are getting the best."
Geoff Lyon, CEO/CTO, CoolIT Systems
Proven Technology
Achievements
CoolIT named Gartner 2015 Cool Vendor for Data Center Power & Cooling
Geoff Lyon elected as Chair of The Green Grid Liquid Cooling Work Group
50 issued patents
Global Installations
Technology Partners
Solution Partners
Performance. Efficiency. Density.

[Charts: rack power, rack density, annualized CAPEX, and annualized OPEX comparisons]

                           Air Cooled    Rack DCLC
Total IT Power             1.5 MW        1.5 MW
Total Data Center Power*   2.7 MW        1.8 MW
# of Racks Required        100           55
IT Density                 15 kW/rack    27.7 kW/rack

*Including cooling
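The rack counts and power ratios in this comparison follow directly from the table's own numbers. A minimal arithmetic sketch in Python (rounding partial racks up is an assumption; all figures come from the table above):

```python
import math

# Figures from the comparison table above (1.5 MW of IT load in both cases).
TOTAL_IT_POWER_KW = 1500

scenarios = {
    "Air Cooled": {"density_kw_per_rack": 15.0, "total_dc_power_kw": 2700},
    "Rack DCLC":  {"density_kw_per_rack": 27.7, "total_dc_power_kw": 1800},
}

for name, s in scenarios.items():
    # A partial rack still occupies floor space, so round up (assumption).
    racks = math.ceil(TOTAL_IT_POWER_KW / s["density_kw_per_rack"])
    # Ratio of total data center power (including cooling) to IT power.
    overhead = s["total_dc_power_kw"] / TOTAL_IT_POWER_KW
    print(f"{name}: {racks} racks, {overhead:.1f}x total-to-IT power")

# Air Cooled: 100 racks, 1.8x total-to-IT power
# Rack DCLC: 55 racks, 1.2x total-to-IT power
```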
Modular Components
Retrofittable Cooling Solutions for All IT Demands
Rack DCLC uses a three-module approach (Server Module, Manifold Module and Heat Exchanger Module) that allows tremendous product flexibility when integrating liquid cooling. Rack DCLC Modules are designed for a flexible fit to various compute environments. While Server Modules and Manifold Modules are installed with each system and are local to the rack, the appropriate heat rejection method may vary; Rack DCLC therefore offers a variety of Heat Exchanger Modules depending on load requirements and the availability of facility water.
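To make the module pairing concrete, here is a minimal illustrative sketch. This is not CoolIT tooling: the module names, capacities, and facility-water requirements are taken from this document, while the helper pick_heat_exchanger and its rule (smallest listed module that covers the load) are assumptions for illustration only:

```python
# Illustrative only: pairs a heat load with a Heat Exchanger Module.
# Capacities come from this brochure; the selection rule is an assumption.
HEAT_EXCHANGER_MODULES = [
    # (name, cooling capacity in kW, requires facility water?)
    ("AHx20", 20, False),    # liquid-to-air, no facility water
    ("AHx35", 35, False),    # liquid-to-air, no facility water
    ("CHx40", 40, True),     # liquid-to-liquid, per rack
    ("CHx650", 650, True),   # liquid-to-liquid, centralized
]

def pick_heat_exchanger(load_kw: float, has_facility_water: bool) -> str:
    """Return the smallest listed module that covers the load."""
    for name, capacity_kw, needs_water in HEAT_EXCHANGER_MODULES:
        if needs_water and not has_facility_water:
            continue  # CHx modules need facility water
        if capacity_kw >= load_kw:
            return name
    raise ValueError("no single listed module covers this load")

print(pick_heat_exchanger(27, has_facility_water=False))  # AHx35
print(pick_heat_exchanger(38, has_facility_water=True))   # CHx40
```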
Server Module
CoolIT's Server Modules can cool any combination of CPU, GPU, Xeon Phi and memory components, with customization available for VR, ASIC, FPGA, and other devices. A server installed with CoolIT Systems' Rack DCLC technology remains hot-swappable and simple to service.
[Images: CPU, GPU, and memory cold plates; V-groove micro-channel cold plate; FEP tubing]
Manifold Module
The Manifold Module is the system that transports liquid between the Heat Exchanger Modules and any number of Server Modules. The Manifold Module can be customized to suit any server or rack environment. A stainless steel body and non-drip quick connects make CoolIT's Manifold Modules incredibly safe and robust.
[Images: Rack Manifold, Chassis Manifold]
AHx (Liquid-to-Air): no facility water required

Modules:
Server Module
Manifold Module
CHx650 Module
CHx40 Module
AHx35 Module
AHx20 Module
CHx650 Module
Key Features
Manages 650kW of IT load
W3-W5 warm water cooling
Redundant centralized pumps
Full control system
Key Benefits
Significantly reduces energy costs
Dramatically increases CPU/GPU density
Eliminates the need for chillers and CRACs
Keeps servers hot-swappable and easy to service
Enables remote system health monitoring
The CHx650 Module can be customized to fit various data center environments. Standard equipment groups offer N+1 redundancy and, when used in tandem, can provide Tier 4 resiliency.
[Chart: CHx650 Module maximum cooling load (kW) vs. facility liquid input temperature (°C)]

Cooling Capacity (@ 30°C facility liquid): 650 kW
Power Consumption (Max): 4.2 kW
ROI (months): 1-6
Facility Water Required: Yes
CHx40 Module
Key Features
Supports 40kW+ cooling capacity per rack
W3-W5 warm water cooling
Redundant centralized pumps
2U form factor
Leak detection and mitigation system
Full control system
Key Benefits
Dramatically increases CPU/GPU density
Eliminates the need for chillers and CRACs
Keeps servers hot-swappable and easy to service
Enables remote system health monitoring
Cooling Capacity (@ 30°C facility water): 40 kW
Power Consumption (Max): 110-120 W
ROI (months): 1-6
Facility Water Required: Yes

[Chart: CHx40 Module cooling capacity (kW) vs. facility water input temperature (°C)]
AHx35 Module
Liquid-to-Air heat exchanger for faster, more powerful clusters in less space.
The AHx35 Module is a unique top-of-rack solution that can manage 35kW+ of processor load without the need for facility water. It dissipates heat to the surrounding environment via a liquid-to-air heat exchanger.
Key Features
Supports 35kW+ cooling capacity per rack
No facility water required
Redundant centralized pumps
Key Benefits
Dramatically increases CPU/GPU density
Stand-alone, rack-based liquid cooling
Enables remote system health monitoring
Keeps servers hot-swappable and easy to service
Easily upgradable to a CHx Module
Cooling Capacity (@ 25°C air): 35 kW
Power Consumption (Max): 795 W
ROI (months): 1-12
Facility Water Required: No

[Chart: AHx35 Module cooling capacity (kW) vs. facility air inlet temperature (°C)]
AHx20 Module
Key Features
Supports 20kW+ cooling capacity per rack
No facility water required
Redundant centralized pumps
Leak detection and mitigation system
Full control system
Key Benefits
Dramatically increases CPU/GPU density
Stand-alone, rack-based liquid cooling
Keeps servers hot-swappable and easy to service
Enables remote system health monitoring
Easily upgradable to a CHx Module
Cooling Capacity (@ 25°C air): 20 kW
Power Consumption (Max): 300 W
ROI (months): 1-12
Facility Water Required: No

[Chart: AHx20 Module cooling capacity (kW) vs. facility air inlet temperature (°C)]
Case Study
CoolIT Deploys Rack DCLC Solution to Poznan Supercomputing & Networking
Center in Partnership with Huawei and Itprojekt
The Challenge
Poznan Supercomputing & Networking Center (PSNC) was looking for
a safe and reliable way to increase HPC compute density while lowering
overall operating costs. In addition, PSNC wanted to leverage the
lower operating temperatures from liquid cooling to increase processor
performance.
The Solution
The PSNC solution combines Huawei CH121 blade servers in the E9000 chassis with CoolIT Systems' Rack DCLC CHx40 centralized pumping solution to manage 100% of the CPU and memory thermal load. Dual left and right Manifold Modules were positioned on the front of the rack to suit the front-I/O server architecture and ensure quick, easy service access to the blades. The cluster, delivered by Warsaw-based systems integrator Itprojekt, is an extremely efficient solution that utilizes inlet water temperatures of 40°C and beyond.
HPC Setup
CoolIT Systems Rack DCLC CHx40
CPU and Memory cooled by liquid
Huawei E9000 chassis
Huawei CH121 server
Intel Xeon Processor E5-2697 v3
Memory: DDR4 8GB
128 servers in 3 racks
Results
80% of total IT load managed by liquid cooling
75% fan speed reduction
30kW per rack
910 GFLOPS per node (116 TFLOPS total)
40°C primary fluid supply temperature
First-ever liquid-cooled Huawei cluster
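The aggregate throughput in the results above is simply the per-node figure scaled by the node count; a quick sanity check (variable names are illustrative):

```python
# Sanity check of the PSNC results above.
nodes = 128                  # 128 servers in 3 racks
gflops_per_node = 910
total_tflops = nodes * gflops_per_node / 1000
print(f"{total_tflops:.1f} TFLOPS")  # 116.5, consistent with the ~116 TFLOPS reported
```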
Case Study
Liquid Cooling for Heat Reuse: Center for Biological Sequence Analysis at the
Technical University of Denmark
The Challenge
The Center for Biological Sequence Analysis (CBS) at the Technical University of Denmark is striving towards CO2 neutrality in its High Performance Computing (HPC) facility, known as Computerome. An equally important goal is provisioning these computational resources as efficiently as possible within a modular data center environment. CoolIT Systems accepted the challenge to deliver a liquid cooled system that dramatically lowers the energy consumed for cooling and provides a workable system to recycle the waste heat from the servers.
The Solution
CoolIT Systems is addressing CBS's need to be CO2 neutral in the university's HPC facility by replacing traditional air cooling with highly efficient liquid cooling. The CBS solution combines rack mounted HP ProLiant DL560 Gen8 servers with CoolIT Systems' direct contact liquid cooling and Rack DCLC CHx40 centralized pumping solution to manage the dual-CPU thermal load. The liquid is distributed directly to the processors through stainless steel manifolds that utilize all-metal, dry-break quick connects for hot-swappable servicing of any single server. With 16,000+ cores and 92 TB of memory, Computerome is now ranked 121st among the Top500 supercomputers in the world. The cluster, delivered by Go Virtual, accepts an inlet water temperature of 40°C and beyond to gain maximum efficiency. CBS enables waste heat recovery by routing the high temperature liquid from the servers to provide heat for the adjacent buildings and the nearby town of Roskilde.
HPC Setup
CoolIT Systems Rack DCLC CHx40
Dual CPU liquid cooled server modules
HP ProLiant DL560 Gen8 servers
Intel Xeon Processor E5-4600 v2 (130 W) CPUs
Modular data center
40°C primary fluid supply temperature
Results
72% of total IT load managed by liquid cooling
14 kW total rack load (130 W CPUs x 108)
563 GFLOPS per server (at peak performance)
Further results will be published after testing
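The rack load quoted above likewise follows from the CPU count and wattage listed in the results; a quick sanity check (variable names are illustrative):

```python
# Sanity check of the Computerome rack load above.
cpus = 108                   # 130 W CPUs per the results list
watts_per_cpu = 130
rack_load_kw = cpus * watts_per_cpu / 1000
print(f"{rack_load_kw:.2f} kW")  # 14.04, consistent with the ~14 kW reported
```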
@coolitsystems
sales@coolitsystems.com
www.coolitsystems.com
T: +1 403.235.4895
Toll Free: 1.866.621.2665
Bay 10, 2928 Sunridge Way N.E., Calgary, AB, T1Y 7H9, CANADA