Unlocking the Secrets of Sustainability

Lessons from Home Storage Innovations for Enterprise Solutions

Look at any graph of data generated, consumed, copied, and stored, and you'll see a steep climb. Organizations and IT teams are weighing strategies for managing this growth as data is projected to reach 180 zettabytes by 2025, up from 64.2 ZB in 2020 – all while government agencies, communities, and users demand more sustainable solutions and practices in the data center.

Most organizations – perhaps as many as 90 percent – still run their data centers with processors built on the x86 architecture, including solutions like Dell PowerMax, NetApp AFF, Pure Storage FlashArray, and IBM FlashSystem. However, with data centers accounting for about 1.5 percent of global electricity use, the C-Suite and boards are feeling the pressure from community sentiment and government regulations. There's a growing call for IT professionals to deliver a more sustainable option.

That push will grow more insistent as we adopt new technologies, such as Artificial Intelligence. Researchers from the University of Massachusetts, Amherst, found that training AI models can emit more than 626,000 pounds of carbon dioxide equivalent – nearly five times the lifetime emissions of the average American car, including the emissions from manufacturing the car itself. Considering AI's need for highly intense processing of massive amounts of data, it should be no surprise that the explosion of AI is increasing the urgency to become more energy efficient throughout the data center.

Organizations must explore all the possibilities for a more efficient data center to appease their boards and follow through on carbon-zero commitments. These options include alternate sources of compute technology, such as Arm-based processors, to power intelligent storage components and arrays.

What can home storage teach us?

Looking at consumer storage, we find some clues for how to become more sustainable. No one wants a big, loud server taking up significant space in their home, nor do they want their data storage to add significantly to their energy bills. To stay competitive, makers of home network attached storage (NAS) systems, which have been around for about 20 years, build them to accommodate these consumer wishes. That means giving them superior storage capacity in small footprints, low thermal management requirements, and minimal power consumption. They are also easy to manage, capable of being brought up and run by nontechnical people in just a few clicks. Most of these systems are built with Arm cores, running on Marvell, Annapurna Labs, and Realtek CPUs. Consumers benefit from this storage because of improved TCO over x86 alternatives.

In the enterprise, space is precious. Every additional rack unit means additional investment beyond just the hardware – additional floor space, higher energy consumption, cooling, networking, and management overhead. Modern Arm-based processors in the enterprise offer more consistency, better TCO, and a smoother management experience.

Evaluating Arm for Integration

Exploring central processing units (CPUs) with Arm-based processors as an alternative to the much more common x86 processors is a promising solution to the sustainability challenge. However, only a few service providers are innovating with Arm-based CPUs in the data center; market share remains in the single digits, though it is climbing.

Companies that have introduced Arm include:

  • Amazon's Graviton processors.
  • Ampere's Altra, which offers 128 cores.
  • Google Cloud's Tau T2A Compute Engine virtual machines.
  • Huawei's Kunpeng, used in the company's cloud business.
  • Marvell's ThunderX2, which supports up to 4TB of DDR4 memory.

How Arm cores can increase data center sustainability

Contributing to the challenge is the world's colossal boom in data – including from IoT devices at the edge – and new, AI-enabled analytics tools that make finding insights in that data easier. Organizations also face government regulations that dictate how long they must keep data. As a result, organizations keep practically all their data just in case they find a need for it later. Unfortunately, that strains data centers and maxes out energy capacity.

A promising approach is to use Arm to run data storage. Certain features of x86 processors – such as rich instruction sets and support for hardware-accelerated operations – explain why x86 is usually the top choice for the intensive data processing and management tasks required for enterprise storage. However, Arm's innovative system architecture promises significant benefits.

When exploring more sustainable enterprise storage, look for the following:

  • Intelligent compute engines embedded into components.
  • Hardware compression and encryption to offload the CPU.
  • Better IO/Watt performance to improve power budget efficiency.
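The IO/Watt figure in the checklist above is simple to compute once you have sustained IOPS and power draw for each candidate system. The sketch below shows the arithmetic; the array names and figures are hypothetical illustrations, not vendor benchmarks.

```python
# Minimal sketch: rank candidate storage arrays by IO/Watt.
# All names and numbers are hypothetical, for illustration only.

def io_per_watt(iops: float, watts: float) -> float:
    """Sustained IOPS delivered per watt of power drawn."""
    return iops / watts

# (name, sustained 4K random read IOPS, power draw in watts)
candidates = [
    ("array-a-x86", 500_000, 1_200),
    ("array-b-arm", 450_000, 700),
]

for name, iops, watts in sorted(
    candidates, key=lambda c: io_per_watt(c[1], c[2]), reverse=True
):
    print(f"{name}: {io_per_watt(iops, watts):.0f} IOPS/W")
```

In this illustration, the Arm-based array delivers fewer raw IOPS but more IOPS per watt – exactly the trade-off a power-budget-driven evaluation is meant to surface.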

The trend of distributing embedded processing throughout more of the data center architecture will likely keep gaining momentum: the need for compute continues to grow, while the need to improve sustainability constrains it. Thinking about it now can put you ahead of the game.

JB Baker