With computational storage, computations are offloaded to the storage device itself, which improves efficiency and conserves resources.
Technologies such as artificial intelligence and IoT have largely arrived in everyday life, but many other innovative developments are arguably underestimated. One of them is computational storage, which is being discussed more and more frequently in specialist circles.
The idea behind the concept, also abbreviated CS, is simple: For decades, data in IT has been loaded from a storage medium into main memory, processed there, and then written back. The focus has been on powerful CPUs or GPUs performing the computing tasks. With computational storage, by contrast, tasks can be carried out directly on the storage device, which, according to studies, could improve performance, reduce latency, and cut energy consumption.
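As a rough illustration of why this helps, the following sketch contrasts the classic load-process-store path with in-storage filtering. It is a conceptual model in plain Python, not any vendor's actual device API: the "drive-side" function simply counts how many records would need to cross the bus to the host in each approach.

```python
# Conceptual sketch (not a real device API): classic host-side processing
# vs. in-storage ("computational") filtering.

RECORDS = [{"id": i, "value": i * 7 % 100} for i in range(10_000)]

def host_side_filter(threshold):
    """Classic path: every record crosses the bus to the host CPU."""
    transferred = len(RECORDS)  # all records are moved to main memory
    hits = [r for r in RECORDS if r["value"] > threshold]
    return hits, transferred

def in_storage_filter(threshold):
    """Computational-storage path: a coprocessor on the drive applies
    the predicate, so only matching records cross the bus."""
    hits = [r for r in RECORDS if r["value"] > threshold]  # runs "on the drive"
    transferred = len(hits)  # only the hits are moved
    return hits, transferred

hits_a, moved_a = host_side_filter(90)
hits_b, moved_b = in_storage_filter(90)
print(f"host-side moved {moved_a} records, in-storage moved {moved_b}")
```

The results are identical either way; the difference is the volume of data shipped between storage and CPU, which is exactly what offloading is meant to shrink.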
There are other approaches as well, such as keeping as much data as possible in main memory. SAP HANA, for example, uses this method, which however leads to very high costs and reaches its limits with very large amounts of data.
Instead of this energy-intensive detour, coprocessors integrated into the storage device can take over some of the work, an approach known as offloading or in-situ processing.
Apple's computers also make extensive use of offloading, even if it is not computational storage: tasks such as encryption and video encoding are handled by the T2 chip, while AI and ML calculations run on a dedicated coprocessor.
Applications Ready For The Market
In the computational storage devices (CSDs) already in use, either a highly specialized chip such as an SoC or a programmable FPGA performs calculations independently. The first steps in this direction are increasingly intelligent SSD controllers that encrypt and compress data in real time. In the future, such CSDs could take over calculations in areas such as AI and analytics. ARM also sees edge computing and the classification and identification of image files as possible tasks.
The first products are already available: ScaleFlux offers the CSD 2000 storage device for data centers, whose transparent compression makes it possible to use particularly inexpensive QLC flash. NGD Systems, backed by the US Air Force and Space Force, even runs its own cloud application (IoT Edge) and the ML framework TensorFlow directly on an SSD.
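The CSD 2000's compression is implemented in drive hardware, but its effect can be modeled in a few lines of software. The sketch below uses Python's `zlib` as a stand-in compressor to show how transparent compression shrinks the physical footprint of compressible data, which is what stretches the capacity of cheap QLC flash; the function name and compression level are illustrative, not taken from the product.

```python
import zlib

def write_with_transparent_compression(payload: bytes) -> int:
    """Model of in-drive transparent compression: return the number of
    bytes that would actually land on the flash (zlib as a stand-in)."""
    return len(zlib.compress(payload, 6))

# Repetitive log-style data compresses well.
page = b"timestamp=2021-01-01;status=OK;" * 256
raw = len(page)
stored = write_with_transparent_compression(page)
print(f"logical {raw} bytes -> physical {stored} bytes on flash")
```

Because the drive compresses and decompresses transparently, the host still sees the full logical capacity while far fewer bits are written to the physical medium.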
The Newport Platform, an EDSFF E1.S device built around an integrated ARM processor running Linux, is available in capacities of up to 12 TB. There are also offerings from Eideticom and Samsung. In November, Samsung and Xilinx presented the SmartSSD, a CSD based on a Xilinx FPGA, a Kintex UltraScale+ KU15P with at least 4 GB of RAM.
The tasks these solutions are meant to fulfill differ widely. At first, the simplest jobs are being tackled: preparatory calculations such as indexing or encryption. The big obstacle at the moment is that applications would have to be rewritten for this kind of offloading, and there are as yet no standardized APIs. However, the NVMe consortium wants to add computational storage to its standard, which could help the technology achieve a breakthrough from next year.