Samsung Announces HBM2 Memory With Integrated AI Processor
takyon writes:
Samsung's New HBM2 Memory Thinks for Itself: 1.2 TFLOPS of Embedded Processing Power
Today, Samsung announced that its new HBM2-based memory has an integrated AI processor that can deliver up to 1.2 TFLOPS of embedded computing power, allowing the memory chip itself to perform operations that are usually reserved for CPUs, GPUs, ASICs, or FPGAs.
The new HBM-PIM (processing-in-memory) chips embed an AI engine inside each memory bank, offloading processing operations to the HBM itself. The new class of memory is designed to alleviate the burden of moving data between memory and processors, which often costs more in power and time than the actual compute operations.
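The data-movement bottleneck that PIM targets can be illustrated with a toy cost model. This is purely a sketch: the bank sizes, per-byte movement cost, and per-operation cost below are made-up assumptions for illustration, not Samsung's figures or the actual HBM-PIM design.

```python
# Toy cost model contrasting a conventional design (all data shipped to the
# processor) with processing-in-memory (each bank computes locally and ships
# only a small result). All constants are illustrative assumptions.

ENERGY_PER_BYTE_MOVED = 10.0  # assumed cost to move one byte across the bus
ENERGY_PER_OP = 1.0           # assumed cost of one arithmetic operation

def conventional_cost(bank_sizes_bytes):
    """Every byte crosses the bus to the processor, then gets operated on."""
    total = sum(bank_sizes_bytes)
    return total * ENERGY_PER_BYTE_MOVED + total * ENERGY_PER_OP

def pim_cost(bank_sizes_bytes, result_bytes=8):
    """Each bank computes in place; only a small per-bank result moves."""
    total = sum(bank_sizes_bytes)
    moved = result_bytes * len(bank_sizes_bytes)
    return moved * ENERGY_PER_BYTE_MOVED + total * ENERGY_PER_OP

banks = [1_000_000] * 8  # eight hypothetical banks of 1 MB each
print(conventional_cost(banks))  # dominated by the data-movement term
print(pim_cost(banks))           # movement term nearly vanishes
```

Under these assumed constants the conventional path spends roughly ten times more energy than the PIM path, almost all of it on moving bytes rather than computing on them, which is the imbalance the article describes.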
[...] As with most in-memory processing techniques, we expect this tech will press the boundaries of the memory chips' cooling limitations, especially given that HBM chips are typically deployed in stacks that aren't exactly conducive to easy cooling. Samsung's presentation did not cover how HBM-PIM addresses those challenges.
HBM: High Bandwidth Memory.
ASIC: Application-Specific Integrated Circuit.
FPGA: Field-Programmable Gate Array.
Read more of this story at SoylentNews.