Article: "Achieving Data Center Energy Efficiency: Solutions Must Be Widely Adopted" - Part 2

A business culture of energy responsibility must be built that sits alongside current data center practices of reliability and profitability.
In the first part of this series, we explored the relationship between today’s data-driven world and the unprecedented growth in data centers. Energy consumption and energy efficiency now need to be treated as important business criteria by data center leaders. The challenge for much of the industry has been that while the problems are well known, the solutions have not been widely adopted.

There are three main strategies data centers can use to address power inefficiency and density issues. As the practices of cutting-edge data center leaders demonstrate, these approaches can work together to produce significant business change and social impact.

Well-documented practices of facility efficiency and software technology: Well understood across much of the industry for the past 10 years, facility-wide energy improvements can be gained from the use of new environmental and cooling controls. Software brings the advantage of easy scaling and distribution. Data center infrastructure management (DCIM) systems are employed to gain operational efficiency through alignment of IT and facilities teams. AI, machine learning, and robotics are being used to optimize real-time adjustments in complex, interrelated electrical and mechanical control systems.

Renewable energy programs and multi-year commitments from future-focused leaders: Renewable energy use (including solar, wind, and hydroelectric) has largely been the pursuit of the industry leaders designing and building new hyper-scale data centers. These companies with energy-intensive assets enter into multi-year purchase agreements with utility providers, guaranteeing access to the reliable, sustainable, and cost-effective supply of energy their business demands. As a leader in renewable energy, Google announced that it reached its “100 percent renewables” goal for powering its data centers in 2017 – a journey that started in 2010.

Largely untapped solutions around hardware: In the average data center, approximately 30 percent of power is wasted as it travels through the multiple power conversions needed to get from the electrical grid to the microprocessors of an individual server. With either a large centralized UPS (uninterruptible power supply) or many small decentralized units, along with hundreds of thousands of server and rack power supplies in most data centers, even a small improvement per power converter can lead to significant gains in energy efficiency, density, and business performance. Companies that design and build their own optimized servers, as well as those that rent services from multi-tenant data centers, need to be aware of this as the next tool of change around energy use.

Energy efficiency and density can be gained in three areas:

  • The UPS, where high-voltage electricity from the grid is converted and brought into the building. (Most data centers have a large centralized unit, with an evolution underway toward many smaller decentralized units for risk mitigation.)
  • Rack power supplies at the server racks, where AC is converted to DC.
  • Individual server power supplies, where high-voltage DC is converted to low-voltage DC on the server board.
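The compounding effect of these cascaded conversions can be sketched with a quick calculation. The stage efficiencies below are illustrative assumptions chosen to roughly match the 30 percent loss figure above, not measured values:

```python
# Sketch of cascaded power-conversion losses in a data center.
# Stage efficiencies are illustrative assumptions, not measured values.

def end_to_end_efficiency(stages):
    """Overall efficiency is the product of each stage's efficiency."""
    total = 1.0
    for eff in stages:
        total *= eff
    return total

# Hypothetical chain: UPS -> rack AC/DC supply -> server DC/DC converter.
baseline = [0.92, 0.90, 0.85]   # assumed conventional silicon stages
improved = [0.97, 0.96, 0.93]   # assumed higher-efficiency stages

print(f"Baseline: {end_to_end_efficiency(baseline):.1%} of grid power delivered")
print(f"Improved: {end_to_end_efficiency(improved):.1%} of grid power delivered")
```

With these assumed figures, the baseline chain delivers about 70 percent of grid power (matching the roughly 30 percent loss cited above), while modest per-stage improvements raise delivery to nearly 87 percent; because stage efficiencies multiply, a few points per converter compound across the whole chain.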

How can today’s technology help us gain these hardware-focused energy efficiencies for data centers?

GaN Semiconductor Technology Driving Hardware Change for Data Centers

GaN power transistors uniquely enable data centers to address the energy inefficiency and power density of the UPS, rack AC/DC power supplies, and server DC/DC power supplies. Smaller, more efficient GaN-based power supplies directly lower a data center’s power bills and indirectly reduce cooling system costs and rack overheating issues. With GaN power transistors, six power supplies can perform the work of 10 silicon-based units in every server rack.

The resulting impact of this increased density and efficiency on operating costs, revenue, and the environment is clear. In the case of a major Tier 1 data center operator:

  • Increased operational savings from energy ($5,600/rack): $241 million/year
  • Additional revenue from greater server density ($5,100/rack): $1.1 billion/year
  • Lower capital expenditures from postponing construction: $840 million
  • Greener, more sustainable business profile

For all major categories of data centers – from the enterprise to multi-tenant to hyper-scale – hardware energy efficiencies must be an important part of the executive business strategy.

Hyper-scale and high-performance computing data centers have been pushing the envelope in many areas of power efficiency and density, but they now need to add a changed perspective on hardware as a driver of both their economics and their sustainability profile. And as every company becomes a data company, both enterprise data centers and customers of multi-tenant data centers must rethink their relationship with energy and make it a core business decision, not an afterthought. It’s time to play catch-up and abandon 10-year-old practices and equipment, so that the enterprise data center no longer looks and is managed like an old server closet, only many times larger.

Cultural Change and New Business Metrics Must Be Part of the Data Center Hardware-Energy Solution

While GaN technology is here today to enable energy efficiency and power density changes through hardware, executives in all data center segments must also be ready to lead the organizational culture change and the development of new business metrics that accompany it.

  • Culture: Data centers and enterprise purchasers of their services need to build a culture of energy responsibility that sits alongside the existing culture of reliability and profitability.
  • C-suite leadership: The entire C-suite must articulate, and drive across the organization, the metrics and the mandate that make energy efficiency a priority – in practice, not just in the words of the annual report.
  • Cross-department integration and organizational re-alignment: New incentive models and collaboration must be created between groups that were largely disconnected in the decision-making process – purchasing, engineering, IT, sustainability, and finance teams. Discussions must go beyond today’s unit cost for a power supply to the total business impact, taking into consideration all costs and revenues.
  • Metrics and business criteria: Companies need to look at how they can optimize all aspects of data centers, including energy use. The practices of the top 5 percent of data centers need to be understood and embraced by the executives leading decisions in multi-tenant and enterprise data centers.

It’s Time to Act – Both for Individual Businesses and for the Planet

With millions of data centers worldwide – and growing by the minute – addressing energy inefficiencies can quickly add up to big impacts both on the business bottom line and on the environment.

Executives must proactively address data center energy efficiencies with all of the tools available – from software to renewables to hardware.  And a business culture of energy responsibility must be built that sits alongside current data center practices of reliability and profitability.

We need to focus on architecting the future of data centers with both business and the planet’s health at the forefront. Otherwise, the ‘dark side’ of our data story – a story that could have read like a utopian fairy tale of unlimited bounty – will write a final chapter that no one is going to like.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. To read the original article, click here. To read part one, click here.