Samsung presents FIMDRAM, a new memory with artificial intelligence integrated into the memory itself
A new type of RAM created by Samsung runs Artificial Intelligence processes directly on the memory chip, doubling AI performance and cutting power consumption by more than 70%.
In any computing device, from a PC to a mobile, the tasks an app requires are carried out by a processor, either the CPU or the GPU (the graphics chip). Now that Artificial Intelligence is in fashion, many mobiles and computers include chips with specialized processors dedicated exclusively to Artificial Intelligence tasks.
Samsung has surprised today with its proposal to integrate artificial intelligence into the memory itself, giving rise to a new type of RAM, FIMDRAM, built from memory chips called HBM-PIM.
What Samsung has done is actually quite simple to understand. It has taken its High Bandwidth Memory (HBM) chips and added an extra layer of Processing-in-Memory (PIM) units, obtaining chips called HBM-PIM:
This high-speed HBM memory contains Programmable Computing Units (PCUs) inserted between the memory banks themselves, as explained by GSM Arena.
These units are capable of 16-bit floating point operations with a limited instruction set. They can add, multiply, and move data within the memory itself.
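The kind of half-precision arithmetic a PCU performs can be modeled in software. This is an illustrative sketch only: the PCU's real instruction set is not public, and the function name here is hypothetical. NumPy's `float16` type matches the 16-bit floating point format mentioned above.

```python
import numpy as np

# Illustrative model of a PCU-style operation: a fused multiply-add
# in 16-bit floating point, applied element-wise to data "in memory".
# The function name and interface are hypothetical, not Samsung's API.
def pcu_multiply_add(weights, activations, acc):
    """Element-wise w * a + acc, entirely in half precision."""
    w = np.asarray(weights, dtype=np.float16)
    a = np.asarray(activations, dtype=np.float16)
    return (w * a + np.float16(acc)).astype(np.float16)

result = pcu_multiply_add([0.5, 2.0], [4.0, 0.25], 1.0)
print(result)  # [3.  1.5]
```

Multiply-add is the core operation of neural network inference, which is why even this small instruction set is enough for AI workloads.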
The great advantage of this system is that data does not need to travel between memory and the CPU, as it normally does, because the PCUs operate directly on the data where it resides.
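The difference between the two flows can be sketched conceptually. This is not real hardware behavior, just a minimal model of why in-memory processing saves the two bus transfers a conventional processor needs:

```python
# Conceptual sketch: conventional compute vs. processing-in-memory.
# Both produce the same result; the difference is where the work happens.

def conventional_flow(memory, op):
    data = list(memory)              # 1. copy data from memory to processor
    results = [op(x) for x in data]  # 2. compute on the processor
    memory[:] = results              # 3. copy results back to memory
    return memory

def pim_flow(memory, op):
    for i in range(len(memory)):     # compute in place: no bus transfers
        memory[i] = op(memory[i])
    return memory

double = lambda x: x * 2
print(conventional_flow([1, 2, 3], double))  # [2, 4, 6]
print(pim_flow([1, 2, 3], double))           # [2, 4, 6]
```

The results are identical; what the PIM approach removes is steps 1 and 3, which are precisely where the time and energy of AI workloads are spent.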
Samsung has managed to run the PCUs at 300 MHz, for a processing power of 1.2 TFLOPS per chip. Best of all, the memory transmits data at 2.4 Gbps per pin without increasing power consumption.
Processing artificial intelligence tasks directly in RAM, bypassing the CPU or GPU, offers two major benefits. First, because the data does not have to be moved to the processor, the energy consumption of AI tasks drops by 71%. And since the data stays in RAM, the transfer time is saved, which doubles AI performance.
These figures are very promising, although there is one small drawback. Because space must be reserved for the PCUs, the capacity of each memory layer is halved, from 8 to 4 Gbits. Samsung works around this by combining layers with and without PCUs, obtaining chips with 6 Gbits of capacity per layer. It has named this new memory FIMDRAM.
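The 6-Gbit figure follows from simple arithmetic, assuming (as the article implies) that plain 8-Gbit layers and halved 4-Gbit PIM layers are mixed in equal numbers:

```python
# Sketch of the capacity trade-off: mixing equal numbers of plain
# layers (8 Gbits) and PIM layers (4 Gbits, half reserved for PCUs).
# The equal 4+4 split is an assumption for illustration.
PLAIN_LAYER_GBIT = 8
PIM_LAYER_GBIT = 4

def average_capacity_per_layer(n_plain, n_pim):
    total = n_plain * PLAIN_LAYER_GBIT + n_pim * PIM_LAYER_GBIT
    return total / (n_plain + n_pim)

print(average_capacity_per_layer(4, 4))  # 6.0 Gbits per layer
```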
Samsung has already sent test chips to its partners working in Artificial Intelligence, and expects a first development document in July.
It will still be a while before we see this FIMDRAM memory in consumer devices, but its features are very promising for the development of artificial intelligence.