Rolf Brink
Founder/CEO
+31 88 96 000 00
www.asperitas.com
COPYRIGHT © 2017 BY ASPERITAS
Asperitas, Robertus Nurksweg 5, 2033AA Haarlem, The Netherlands
Information in this document can be freely used and distributed for internal use only. Copying or
re-distribution for commercial use is strictly prohibited without written permission from
Asperitas.
If you would like to use the original PowerPoint version of this document, please send an email
to: Marketing@Asperitas.com
FOR MORE INFORMATION, FEEDBACK OR TO SUGGEST IMPROVEMENTS FOR THIS DOCUMENT,
PLEASE SEND YOUR SUGGESTIONS OR INQUIRIES TO:
WHITEPAPERS@ASPERITAS.COM
THE DATACENTRE OF THE FUTURE
A DATACENTRE IS NOT ABOUT ICT, POWER, COOLING OR SECURITY.
IT IS NOT EVEN ABOUT SCALE OR AVAILABILITY OF THESE SYSTEMS.
IT IS ABOUT THE AVAILABILITY OF INFORMATION…
INCREASE IN INFORMATION FOOTPRINT
DEMAND FOR HIGH DENSITY CLOUD
CONSOLIDATION OF POWER DEMAND
OVERLOADING OF OUTDATED POWER GRID
GLOBAL NETWORK LOAD
CREATING EXERGY
THE CHALLENGES
Datacentres, Server rooms, Network hubs etc.
Estimated 4% of global electricity production (25 PWh)
4% = 1 PWh = 1,000,000,000,000,000 Wh (10¹⁵ Wh)
ENERGY FOOTPRINT OF INFORMATION FACILITIES
EXPLANATION PREVIOUS SLIDE (ENERGY FOOTPRINT)
▪ No real data is available to substantiate the percentage mentioned; it is a very rough estimate on a global scale.
▪ The fact that this information is unavailable is part of the problem: you cannot solve a problem that cannot be identified.
▪ https://yearbook.enerdata.net/electricity/world-electricity-production-statistics.html
▪ The only available source for a global figure is the European Commission, where Neelie Kroes mentioned datacentres being responsible for 2% of global electrical energy production and ICT in general for 8-10% (2012).
GLOBAL PUE: APPROX. 1.7
Global estimated PUE breakdown (share of total facility energy):
▪ Cooling: 37%
▪ UPS/No-break: 2%
▪ Other: 2%
▪ ICT: 59%, of which:
  ▪ Power supply (15% of ICT): 9% of total
  ▪ Fans (20% of ICT): 10% of total
  ▪ Information (65% of ICT): 40% of total
EXPLANATION PREVIOUS SLIDE (GLOBAL PUE ESTIMATE)
▪ No real data is available to substantiate the percentages mentioned; everything is a very rough estimate on a global scale.
▪ For both the power supply and fan percentages, the estimates are based on the following factors:
  ▪ Most datacentres (colo) have no control over the internal temperature management of the IT.
  ▪ Environmental temperatures in datacentres are usually too high for optimal power efficiency of IT.
  ▪ IT is in general greatly under-utilised (30% is already high in many cloud environments; corporate IT in colo is often worse), which creates a very high overhead for both the PSU and fans.
  ▪ Most cloud and corporate back-office platforms run on inefficient, cheap IT hardware, driven by CAPEX rather than OPEX; the reason is that power consumption is not part of the IT budget.
  ▪ Virtually all high-density cloud environments (5-10 kW/rack) are based on 1U servers, which are the least efficient when it comes to fan overhead.
ACTUAL INFORMATION EFFICIENCY
▪ 1000 TWh total energy
▪ Actual cooling: 471 TWh
▪ Actual power loss: 112 TWh (≈ 0.40 EJ)
▪ Other overhead: 17 TWh (≈ 0.06 EJ)
▪ Energy for information: 400 TWh (≈ 1.44 EJ)
▪ 1000 TWh / 400 TWh
▪ PUE equivalent: 2.5
Actual efficiency breakdown:
▪ Cooling + IT fans: 47%
▪ UPS + IT power supply: 11%
▪ Other: 2%
▪ Information: 40%
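To make the arithmetic explicit, here is a minimal sketch of the "PUE equivalent" calculation above (Python; the 1000 TWh total and its split are the deck's own rough estimates):

```python
# Minimal sketch of the "PUE equivalent" arithmetic, using the deck's
# rough global estimates (not measured data).
TOTAL_TWH = 1000.0          # total energy footprint of information facilities

cooling_and_fans = 471.0    # facility cooling + IT fan overhead, TWh
power_loss = 112.0          # UPS + IT power-supply losses, TWh
other = 17.0                # remaining facility overhead, TWh

information = TOTAL_TWH - cooling_and_fans - power_loss - other
print(f"Energy for information: {information:.0f} TWh")   # 400 TWh
print(f"PUE equivalent: {TOTAL_TWH / information:.1f}")    # 2.5
```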
Global thermal energy production by information:
1,904,400,000,000,000,000 J or 1.9 EJ (10¹⁸ J) (529 TWh × 3,600 J/Wh)
Energy for heat rejection
EXERGY DESTRUCTION: 471,000,000,000,000 Wh (471 TWh)
ENERGY TRANSFORMATION
EXCLUDE COOLING INSTALLATIONS
REDUCE IT OVERHEAD
BALANCE THE POWER GRID
REDUCE THE DATA NETWORK LOAD
BECOME ENERGY PRODUCERS
WHAT IF INFORMATION FACILITIES COULD...
1 MW critical load, ΔT of 10°C
Thermal production: 1 MJ/s
1°C rise with air:
1005 J/(kg·°C) × 0.001205 kg/L = 1.211 J/(L·°C)
1 MJ/s requires: 1,000,000 J/s ÷ (10°C × 1.2 J/(L·°C))
≈ 83,333 L/s AIR
COOLING THE CLOUD
Water required for ΔT of 10°C:
4187 J/(kg·°C) × 1 kg/L = 4187 J/(L·°C)
10°C with 1 MJ/s: 24 L/s
AIR VS WATER at 1 MJ/s: 83,333 L/s vs 24 L/s
THE DATACENTRE OF THE FUTURE: 6 L/s
AND IS AN ENERGY PRODUCER…
Liquid can travel 200 TIMES THE DISTANCE with the same thermal losses
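All flow figures in this comparison follow from one relation: flow = heat / (specific heat × density × ΔT). A minimal sketch with the constants used on these slides (the 83,333 L/s figure rounds air's volumetric heat capacity up to 1.2 J/(L·°C)):

```python
# Air-vs-water flow comparison with the slides' constants:
# air ~1005 J/(kg*degC) at 1.205 g/L, water ~4187 J/(kg*degC) at 1 kg/L.
HEAT_W = 1_000_000.0  # 1 MW critical load -> 1 MJ/s of heat

def flow_l_per_s(heat_w, cp_j_per_kg_degc, density_kg_per_l, delta_t_degc):
    """Volumetric flow needed to carry heat_w watts at a given delta-T."""
    return heat_w / (cp_j_per_kg_degc * density_kg_per_l * delta_t_degc)

print(flow_l_per_s(HEAT_W, 1005.0, 0.001205, 10.0))  # air, dT=10:   ~82,600 L/s
print(flow_l_per_s(HEAT_W, 4187.0, 1.0, 10.0))       # water, dT=10: ~24 L/s
print(flow_l_per_s(HEAT_W, 4187.0, 1.0, 40.0))       # water, dT=40: ~6 L/s
```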
EXPLANATION PREVIOUS SLIDES (WATER VS AIR)
▪ For simplicity, an identical ΔT is maintained for both approaches.
▪ After feedback from reviewers and the audience of Datacentre Transformation Manchester, the comparison was raised to 10°C.
▪ Air will usually allow the ΔT to become higher than 10°C, although due to poor utilisation this is not always achieved.
▪ In CRAC water circuits, the ΔT is usually below 10°C.
DON’T ASK WHICH TECHNOLOGY TO USE
CHOOSE BETWEEN AIR OR LIQUID
THEN COMBINE LIQUID TECHNOLOGIES
THE WRONG QUESTION
TLC - IMMERSED COMPUTING®
▪ 100% Removal of heat from the IT
▪ Highest IT efficiency by eliminating fans
▪ No air required
▪ Level of intelligence
▪ Management control and insight
▪ Automatic optimisation of the water circuit
▪ Optimised for high density cloud/HPC nodes
▪ Varying servers
▪ Flexible IT hardware
▪ Feed: 18-40°C / 55°C Extreme / max ΔT 10°C
DLC - DIRECT-TO-CHIP LIQUID COOLING
▪ Removes heat from hottest parts of the IT
▪ Increased IT efficiency by reduced fan power
▪ Requires additional cooling (ILC)
▪ Level of intelligence
▪ Management control and insight
▪ Automatic optimisation of the water circuit
▪ Optimised for HPC racks with identical nodes
▪ Very high temperature chips
▪ High density computing
▪ Feed: 18-45°C / max ΔT 15°C
ILC - (ACTIVE) REAR DOOR COOLING
▪ 100% Removal of heat from the IT
▪ Small IT efficiency gain through assisted circulation
▪ Acts as air handler in the room
▪ Level of intelligence
▪ Management control and insight
▪ Automatic optimisation of the water circuit
▪ Optimised for IT with limited liquid compatibility
▪ Storage
▪ Network
▪ Legacy systems and high maintenance servers
▪ Feed: 18-23°C / 28°C Extreme / max ΔT 12°C
OPTIMISING LIQUID INFRASTRUCTURES

NORMAL
TECHNOLOGY        INLET     OUTLET
CRAC (generic)    6-18°C    12-25°C
ILC (U-Systems)   18-23°C   23-28°C
DLC (Asetek)      18-45°C   24-55°C
TLC (Asperitas)   18-40°C   22-48°C

EXTREME
TECHNOLOGY        INLET     OUTLET
CRAC (generic)    21°C      30°C
ILC (U-Systems)   28°C      32°C
DLC (Asetek)      45°C      65°C
TLC (Asperitas)   55°C      65°C
INCREASING ΔT WITH TEMPERATURE CHAINING
▪ Serial implementation of the infrastructure (see the sketch after this list)
Example chain, facility input 17°C:
▪ CRAC, parallel: +5°C → output 22°C
▪ ILC, parallel: +6°C → output 28°C
▪ TLC & DLC, paired: +16°C → output 44°C
▪ TLC & DLC, paired: +16°C → output 60°C
Facility output: 60°C
CREATE HIGH ∆T
3-stage cooling for low water volume:
▪ Down to 35°C: free-air
▪ Between 35-28°C: adiabatic
▪ Below 28°C: chiller
USABLE HEAT
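A minimal sketch of the chaining principle: stages plumbed in series share one water stream, so each stage's ΔT adds to the facility-level ΔT (stage names and values taken from the example above):

```python
# Serial temperature chaining: the same water passes every stage, so the
# facility delta-T is the sum of the per-stage delta-Ts.
stages = [
    ("CRAC, parallel", 5.0),
    ("ILC, parallel", 6.0),
    ("TLC & DLC, paired", 16.0),
    ("TLC & DLC, paired", 16.0),
]

temp_c = 17.0  # facility input, degC
for name, delta_t in stages:
    temp_c += delta_t
    print(f"{name}: +{delta_t:.0f} degC -> output {temp_c:.0f} degC")
print(f"Facility output: {temp_c:.0f} degC")  # 60 degC
```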
TEMPERATURE CHAINING EXAMPLE
▪ Closed room 3-stage configuration
▪ ILC setup maintains air temperature
▪ Water volume decreased by 85%
▪ ΔT 6°C: 29.9 L/s
▪ ΔT 40°C: 4.5 L/s
▪ Cooling options
  ▪ Closed cooling circuit with pumps and coolers
  ▪ Closed cooling circuit with pumps and reuse behind heat exchanger
  ▪ Open cooling circuit with external water source supplied for reuse
[Diagram: closed-room 3-stage chain. Facility input 20°C; Stage 1 (ILC, 120 kW): +6°C, to 26°C; Stage 2 (DLC & TLC, 160 kW TLC + 60 kW DLC): +15°C; Stage 3 (optimised DLC & TLC, 400 kW TLC + 40 kW DLC): +19°C; intermediate temperature 45°C; facility output 60°C. Individual units each add between +2°C and +17°C.]
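A short check of the quoted volumes, assuming a total load of roughly 750 kW (the diagram's stage loads sum to about 780 kW, while the quoted flows correspond to roughly 750 kW):

```python
# Check of the example's flow figures: flow = power / (cp_water * delta_T).
CP_WATER = 4187.0    # J/(kg*degC), density ~1 kg/L
TOTAL_W = 750_000.0  # assumed total load, W (see lead-in)

flow_low_dt = TOTAL_W / (CP_WATER * 6.0)     # ~29.9 L/s at dT = 6 degC
flow_high_dt = TOTAL_W / (CP_WATER * 40.0)   # ~4.5 L/s at dT = 40 degC
print(f"dT 6 degC: {flow_low_dt:.1f} L/s, dT 40 degC: {flow_high_dt:.1f} L/s")
print(f"Volume reduction: {1 - flow_high_dt / flow_low_dt:.0%}")  # ~85%
```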
REUSE MICRO INFRASTRUCTURE
▪ Micro datacentre or server room
▪ Open water circuit
▪ Reusable requirement: 65°C
▪ Variable volume
▪ Feedback loop for constant temperature
[Diagram: variable-temperature facility input (5-40°C) is brought to a constant 51°C by the feedback loop, then raised by two +7°C stages to the 65°C facility output.]
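One plausible reading of the feedback loop, sketched below: part of the 65°C output is recirculated into the variable 5-40°C supply so the IT stages always see a constant inlet (51°C + 7°C + 7°C = 65°C). The recirculation mechanism is an assumption for illustration; the slide only shows the loop:

```python
# Hypothetical feedback loop: blend a fraction of the 65 degC output back
# into the variable-temperature supply to hold a constant 51 degC inlet.
# (Recirculation is an assumed mechanism, not stated in the deck.)
T_OUT, T_SET = 65.0, 51.0

def recirc_fraction(t_supply_degc):
    """Fraction of the mixed stream drawn from the 65 degC output."""
    return (T_SET - t_supply_degc) / (T_OUT - t_supply_degc)

for t_supply in (5.0, 20.0, 40.0):
    print(f"supply {t_supply:>4.1f} degC -> recirculate {recirc_fraction(t_supply):.0%}")
```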
Water required for ΔT of 40°C:
4187 J/(kg·°C) × 1 kg/L = 4187 J/(L·°C)
40°C with 1 MJ/s: 6 L/s
AIR VS WATER at 1 MJ/s: 83,333 L/s vs 6 L/s
TEMPERATURE CHAINING IMPACT
DATACENTRE DESIGN

DESIGNED FOR AIR
▪ Cooling options:
  ▪ 100% Chillers
  ▪ 100% Free air/adiabatic + 100% chillers (off)
▪ High volume, low ∆T (5-20°C)
▪ Fluid handling
  ▪ Spacious high-capacity air ducting
  ▪ Air filtration
  ▪ Hot/cold aisle separation
▪ Information density (avg): 1.5 kW/m²
▪ Concrete floor + raised floors
▪ Power
  ▪ UPS (IT only): 100%
  ▪ Gensets (facility): 100%

DESIGNED FOR MIXED LIQUID
▪ Cooling options:
  ▪ External cold water supply by reuser
  ▪ 100% Free air/adiabatic + 5% chillers
▪ Low volume, high ∆T (20+°C)
▪ Fluid handling
  ▪ Normal-capacity water circuit
  ▪ Water quality management
  ▪ Minimal "fresh-air" ventilation
▪ Information density (mixed): 12 kW/m²
▪ Bare concrete floor
▪ Power (compared to air)
  ▪ UPS (IT only): 90%
  ▪ Gensets: 60%
Minimised energy footprint
Minimised installation requirements
Flexibility by minimal environmental impact
FOCUS ON 24/7 HEAT CONSUMERS
SITE PLANNING AND QUALIFICATION
REDEFINING THE LANDSCAPE
▪ Large facilities
▪ Core Datacentres
▪ On the edge of urban areas
▪ Distributed micro facilities
▪ Edge nodes
▪ Inside the urban area
▪ Energy balancing
▪ Distributed minimised power load
▪ Focus on heat reuse
DISTRIBUTED MICRO EDGE NODES
▪ 10-100 kW
▪ Edge of network, within urban areas
▪ IoT capture and processing
▪ Data caching (Netflix, YouTube, etc.)
▪ Localised cloud services (SaaS, PaaS, IaaS)
▪ Minimised facilities
▪ External cooling input
▪ 24/7 energy rejection for reuse
▪ Geo redundant
▪ Tesla Powerpack for controlled failover
▪ District data hub
EDGE ENERGY REUSE
Spas, swimming pools (100% reuse)
Hospitals, hotels with hot water loops (100% reuse)
Urban fish/vegetable farms with aquaponics (100% reuse)
District heating (100% reuse)
Aquifers for heat storage (75% reuse)
Water mains (29% reuse)
Canals, lakes and sewage (exergy destruction)
CORE DATACENTRES
▪ Large facilities
▪ No-break systems
▪ Limited cooling infrastructure
▪ 24/7 information availability
▪ Edge management
▪ Replicated Edge back-end
▪ Communication hub
▪ Industrial scale reuse infrastructures
▪ 100% 24/7 heat reuse
▪ Agriculture
▪ Spas
▪ Cooking, pressurising, sterilisation, bleaching
▪ Distillation, concentrating, drying or kilning
▪ Chemical
▪ Exergy destruction
▪ Rivers/ocean
▪ Liquid-to-air rejection
EDGE MANAGEMENT
Emerging platforms for decentralised cluster management
Integration with energy and heat management
[Diagram: cloud customers exchange computing jobs and results with a job-scheduling layer that balances cloud demand against heat demand; distribution management for heating (.ware) places jobs on information hardware (Q.RADS, O.MAR, AIC24) installed at heat-demand customers (.rads), who receive free and green heat.]
LIQUID INFORMATION FACILITIES
▪ Reduced or eliminated technical installations
▪ Cooling
▪ No-break
▪ Reduced build cost
▪ No raised floors
▪ Reduced space for fluid handling
▪ Increased, distributed power density
▪ Reduced m²
▪ Reduced operational cost
▪ Reduced maintenance on installations
▪ High IT density
▪ Higher-spec IT hardware, also for cloud
▪ Reduced software cost
LIQUID IS COMING: HOW TO PREPARE?
▪ Design for water (redundant) to IT
▪ Sufficient ability to distribute piping to whitespace
▪ Plan for reusable heat
▪ Plan for a liquid way of work
▪ IT maintenance rooms for wet equipment
▪ Staff training for liquid management
▪ Proper supplies and tooling
WHAT NEEDS TO BE DONE?
▪ Focus on heat consumption without dependency
▪ Not with an invoice
▪ Free cooling guarantee
▪ Government involvement
▪ Incentives
▪ Intermediate for (industrial) heat reuse
▪ Information footprint as part of district planning
▪ More (low grade) heating networks
▪ Focus on TCO, not CAPEX
▪ PUE – need for a new “easy” metric
▪ PUE figures are widely manipulated
▪ PUE discourages IT efficiency
▪ It needs to give insight into actual inefficiency
[Chart: the global estimated PUE breakdown shown earlier (Cooling 37%, UPS/No-break 2%, Other 2%, ICT 59%: power supply 9%, fans 10%, information 40%), labelled "PUE inefficiency"]
FEAR OF WATER
THE DATACENTRE OF THE FUTURE?
THE FUTURE IS NOW!