MIT helps move Neural Nets back to Analog
by staff from High-Performance Computing News Analysis | insideHPC
MIT researchers have developed a special-purpose chip that increases the speed of neural-network computations by three to seven times over its predecessors, while reducing power consumption by 94 to 95 percent. "The computation these algorithms do can be simplified to one specific operation, called the dot product. Our approach was, can we implement this dot-product functionality inside the memory so that you don't need to transfer this data back and forth?"
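To see why the dot product dominates this workload, note that each neuron in a fully connected layer computes a dot product of its weight vector with the layer's input. A minimal Python sketch (illustrative only, not MIT's chip design; the function names `dot` and `layer` are made up for this example):

```python
def dot(w, x):
    # The core operation the chip moves into memory: sum of
    # element-wise products of a weight vector and an input vector.
    return sum(wi * xi for wi, xi in zip(w, x))

def layer(weights, x):
    # One fully connected layer = one dot product per neuron.
    # On a conventional processor, every weight and input must be
    # shuttled from memory to the ALU to compute this; in-memory
    # analog computation avoids that transfer.
    return [dot(w, x) for w in weights]

weights = [[0.5, -1.0, 2.0],   # neuron 1 weights (example values)
           [1.5,  0.5, -0.5]]  # neuron 2 weights
x = [1.0, 2.0, 3.0]            # input vector
print(layer(weights, x))       # prints [4.5, 1.0]
```

The memory-to-processor data movement this sketch implies is exactly the traffic the in-memory approach eliminates.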