Analog In-Memory Computing Attention Mechanism for Fast and Energy-Efficient Large Language Models
by Brian Wang from NextBigFuture.com
A Nature paper describes an analog in-memory computing (IMC) architecture tailored to the attention mechanism in large language models (LLMs), with the goal of drastically reducing latency and energy consumption during inference. The design leverages gain-cell crossbar arrays (capacitor-based memory devices built from oxide-semiconductor field-effect transistors such as IGZO or ITO) to store the key (K) and value (V) ...
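To make the mapping concrete, here is a minimal sketch (not the paper's circuit) of why attention suits this kind of in-memory hardware: each decoding step reduces to two matrix-vector products against the cached K and V matrices, which is exactly the operation a crossbar array performs in place when those matrices are held as stored charge/conductance. The code uses standard scaled dot-product attention in NumPy; the crossbar comments mark where the analog arrays would sit, and any quantization or analog-noise modeling is omitted.

```python
import numpy as np

def attention_step(q, K, V):
    """One decoding step: q is (d,); K and V are (seq_len, d).

    In the IMC scheme described above, K and V would reside in two
    gain-cell crossbar arrays, so both matrix-vector products below
    happen where the KV cache is stored instead of being shuttled to
    digital compute units.
    """
    d = q.shape[0]
    scores = K @ q / np.sqrt(d)           # crossbar 1: K stored in the array, q applied as inputs
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()               # softmax, assumed handled by peripheral circuitry
    return V.T @ weights                   # crossbar 2: V stored in the array, weights applied as inputs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seq_len, d = 16, 8
    K = rng.standard_normal((seq_len, d))
    V = rng.standard_normal((seq_len, d))
    q = rng.standard_normal(d)
    print(attention_step(q, K, V))
```

Because the KV cache grows with sequence length, keeping it resident in the crossbars and computing against it in analog is where the claimed latency and energy savings would come from.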