Keeping Data Centers Cool as Heat Rises Inside and Out

This post was originally published on Network Computing

With region-wide summer heat waves blanketing the U.S. as a backdrop, data center operators are searching for ways to cut growing cooling costs, even as they take steps to boost the performance of server central processing units (CPUs) to prepare for artificial intelligence (AI)- and machine learning (ML)-driven computing loads.

Electricity use is staggering and headed upwards. Data centers account for about 1-1.5% of global electricity consumption, according to various sources, including the International Energy Agency.

That share is expected to grow as cloud services, edge computing, IoT, and AI take hold. Efficiency improvements are expected to help, but they are likely to be offset by ever-growing workloads and storage demands.

Data center power consumption is driven by two main factors. The first is the need for ever more computational power. Increasingly powerful CPUs, and now GPUs and other processors, are being deployed to meet compute demand, and as processing power increases, so does power draw. In response, the industry has focused on improving the power efficiency of its processors to deliver more compute per watt.

A focus on cooling

The other major share of data center electricity goes into cooling. Higher-performance CPUs typically generate more heat, which must be removed.

The two major types of cooling

Read the rest of this post, which was originally published on Network Computing.
