Computational storage and the new direction of computing

Aggravation, unforeseen delays, wasted time, high costs: commuting is consistently ranked as the worst part of the day by people around the world and is a major driver of work-from-home policies.

Computers feel the same way. Computational storage is part of an emerging trend aimed at making data centers, edge servers, IoT devices, cars, and other digital systems more productive and efficient by moving less data. In computational storage, an entire computer system—complete with DRAM, I/O, application processors, dedicated storage, and system software—is squeezed within the confines of an SSD so it can locally handle repetitive, preliminary, and/or data-intensive tasks.

Why? Because moving data can consume excessive amounts of money, time, energy, and IT resources. “For certain applications such as in-drive compression, hardware engines consuming less than one watt can achieve the same throughput as more than 140 traditional server cores,” said JB Baker, vice president of marketing and product management at ScaleFlux. “That’s 1,500 watts, and we can do the same job with one watt.”
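
As a rough sanity check on those figures, the sketch below simply derives a per-core wattage from the numbers in the quote; it is illustrative arithmetic, not a measured specification.

```python
# Back-of-the-envelope check of the power comparison quoted above.
# The 1,500 W and 140-core figures come from the quote; the per-core
# wattage is derived from them and is not a measured value.

server_cores = 140
server_watts_total = 1_500
csd_engine_watts = 1  # in-drive compression engine, per the quote

watts_per_core = server_watts_total / server_cores        # ~10.7 W per core
savings_factor = server_watts_total / csd_engine_watts    # 1,500x

print(f"~{watts_per_core:.1f} W per traditional server core")
print(f"~{savings_factor:,.0f}x less power for the same compression throughput")
```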

Unnecessary data traffic is also bad for the environment. A 2018 Google-sponsored study found that 62.7% of computing energy is consumed shuttling data between memory, storage, and processors across a wide range of applications. Computational storage could therefore reduce emissions while improving performance.

And then there’s the looming capacity issue. Cloud workloads and internet traffic have grown 10- and 16-fold, respectively, over the past decade and will likely grow at this rate or faster in the coming years as AI-enhanced medical imaging, autonomous robots, and other data-intensive applications move from concept to commercial deployment.

Unfortunately, servers, rack space and operating budgets are struggling to grow at this same exponential rate. For example, Amsterdam and other cities have enforced strict data center size limits, forcing cloud providers and their customers to figure out how to do more with the same footprint.

Consider a traditional two-socket server configuration with 16 drives. An ordinary server might contain 64 compute cores (two processors with 32 cores each). With computational storage, the same server could potentially have 136 compute engines: the 64 server cores plus 72 application accelerators embedded in its drives for preliminary tasks. Multiplied by the number of servers per rack, racks per data center, and data centers per cloud empire, computational storage drives have the power to increase the return on investment of millions of square feet of real estate.
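
A minimal sketch of that arithmetic, with the servers-per-rack count added purely as an illustrative assumption:

```python
# Rough arithmetic for the two-socket example above. The servers-per-rack
# figure is an illustrative assumption, not part of the original example.

host_cores = 2 * 32          # two 32-core processors
drive_accelerators = 72      # application cores embedded across the drives
drives_per_server = 16

total_engines = host_cores + drive_accelerators                  # 136
accelerators_per_drive = drive_accelerators / drives_per_server  # ~4.5

servers_per_rack = 20        # assumption for illustration only
print(f"{total_engines} compute engines per server "
      f"({host_cores} host cores + {drive_accelerators} in-drive accelerators)")
print(f"~{accelerators_per_drive:.1f} application cores per drive")
print(f"~{total_engines * servers_per_rack} compute engines per {servers_per_rack}-server rack")
```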

The fine print

So, if computational storage is so beneficial, why isn’t it already ubiquitous? The reason is simple: a confluence of advances, from hardware to software to standards, must come together to make a paradigm shift in processing commercially viable. Those factors are all lining up now.

For example, computational storage drives must accommodate the same power and space constraints as conventional SSDs in traditional servers. That means the computing element can only consume two to three watts of the eight watts allocated to a drive in a server.

While some early computational SSDs relied on FPGAs, companies like NGD Systems and ScaleFlux are embracing systems-on-chips (SoCs) built around Arm processors originally developed for smartphones. (An eight-core computational storage drive SoC can dedicate four cores to drive management and the rest to applications.) SSDs typically already carry quite a bit of DRAM — roughly 1 GB for every terabyte of capacity. In some cases, the compute element can use that DRAM as a resource, and manufacturers can also add more.
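
To make those ratios concrete, here is a small sketch for a hypothetical 8 TB computational SSD; the capacity and the four-core split for drive management are assumptions used only for illustration.

```python
# Sketch of the per-drive resource split described above for a hypothetical
# 8 TB computational SSD. Capacity and core split are illustrative assumptions.

drive_capacity_tb = 8
dram_gb_per_tb = 1             # typical ratio cited above
soc_cores = 8
cores_for_drive_management = 4

on_drive_dram_gb = drive_capacity_tb * dram_gb_per_tb
application_cores = soc_cores - cores_for_drive_management

print(f"{on_drive_dram_gb} GB of on-drive DRAM potentially shareable with applications")
print(f"{application_cores} of {soc_cores} SoC cores available for application workloads")
```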

Additionally, a computational storage drive can support a standard cloud-native software stack: a Linux operating system and containers managed with Docker or Kubernetes. Databases and machine learning algorithms for image recognition and other applications can also be loaded onto the drive.
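
Conceptually, the point of running that stack on the drive is to push work to the data rather than pull data to the host. There is no standard Python API for computational storage yet, so the CSDClient class below is a hypothetical placeholder that simulates in-drive filtering in host memory; it only illustrates the data flow, namely that just the filtered results cross the bus.

```python
# Hypothetical sketch of pushing a filter down to a computational storage
# drive. CSDClient is an invented placeholder, not a real library; the
# "drive" simulates in-drive processing with an in-memory list so the
# example is runnable.

class CSDClient:
    """Stand-in for a computational storage drive interface (hypothetical)."""

    def __init__(self, records):
        # On a real drive these records would live in flash, not host memory.
        self._records = records

    def run_filter(self, predicate):
        # A real CSD would run this on its own Arm cores inside its on-drive
        # Linux/container runtime and return only the matches over PCIe.
        return [r for r in self._records if predicate(r)]


drive = CSDClient([{"label": "cat"}, {"label": "dog"}, {"label": "cat"}])
matches = drive.run_filter(lambda r: r["label"] == "cat")
print(f"{len(matches)} matching records returned to the host")  # 2, not the full dataset
```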

Standards will also need to be finalized. The Storage Networking Industry Association (SNIA) released its 0.8 specification last year, covering a wide range of issues such as security and configuration; a full specification is expected later this year.

Other innovations to expect: more ML acceleration and specialized SoCs, faster interconnects, improved on-chip security, better software for analyzing data in real time, and tools for merging data from distributed arrays of drives.

Over time, we may also see the emergence of compute capabilities added to traditional spinning hard drives, which remain the workhorse of cloud storage.

A double-edged advantage

Some early use cases will occur at the edge, with the computational storage drive acting as an edge within the edge. Microsoft Research and NGD Systems, for example, have found that computational storage drives can dramatically increase the number of image queries that can be served by processing data directly on the CSDs — one of the most discussed use cases — and that throughput increases linearly as drives are added.
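
A toy illustration of that linear scaling: if each drive serves roughly the same number of queries per second on its own cores, aggregate throughput grows with the drive count. The per-drive figure below is an assumed placeholder, not a number from the study.

```python
# Toy illustration of linear throughput scaling with drive count.
# The per-drive rate is an assumption for illustration, not a benchmark result.

queries_per_second_per_drive = 50   # assumed placeholder
for drives in (1, 4, 8, 16):
    print(f"{drives:>2} drives -> ~{drives * queries_per_second_per_drive} queries/s")
```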

Devices with limited bandwidth and often low latency requirements, such as aircraft or autonomous vehicles, are another prime target. More than 8,000 planes carrying more than 1.2 million people are in the air at any given time. Machine learning for predictive maintenance can be performed effectively in flight with computational storage to increase safety and reduce turnaround times.

Cloud providers are also experimenting with computational storage drives and will soon begin moving to commercial deployment. In addition to offloading tasks from more powerful application processors, computational storage drives could improve security by running scans for malware and other threats locally.

The alternative?

Some will say that the solution is obvious: reduce the data workload! Companies collect far more data than they use anyway.

This approach, however, ignores one of the sad truths of the digital world: we don’t know what data we need until we already have it. The only realistic choice is to design ways to deal effectively with the massive onslaught of data coming our way. Computational storage drives will be an essential element in allowing us to filter that data without getting bogged down in the details. The insights generated from this data can unlock capabilities and use cases that can transform entire industries.

Mohamed Awad is Vice President of IoT and Embedded at Arm.
