New Chip Can Process and Classify Nearly Two Billion Images Per Second
upstart writes:
New Chip Can Process and Classify Nearly Two Billion Images per Second - Technology Org:
In traditional neural networks used for image recognition, an image of the target object is first formed on an image sensor, such as the one in a smartphone's digital camera. The image sensor converts light into electrical signals, and ultimately into binary data, which can then be processed, analyzed, stored, and classified by computer chips. Speeding up this pipeline is key to improving any number of applications, such as face recognition, automatically detecting text in photos, or helping self-driving cars recognize obstacles.
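The conventional pipeline described above can be sketched in a few lines of Python. This is purely illustrative: the sensor model and the threshold "classifier" are toy stand-ins, not anything from the article, and a real system would use camera hardware and a trained network.

```python
# Illustrative sketch of the traditional image-recognition pipeline:
# light -> electrical signal -> binary data -> clocked digital classification.
# Both functions below are hypothetical stand-ins for real components.

def sensor_read(light_levels, bits=8):
    """Quantize analog light intensities (0.0-1.0) into binary pixel values."""
    max_val = (1 << bits) - 1
    return [min(max_val, int(level * max_val)) for level in light_levels]

def classify(pixels, threshold=128):
    """Toy clocked classifier: one sequential pass over stored pixel data."""
    bright = sum(1 for p in pixels if p >= threshold)
    return "bright" if bright > len(pixels) // 2 else "dark"

light = [0.9, 0.8, 0.75, 0.2, 0.85]   # analog intensities hitting the sensor
pixels = sensor_read(light)           # optical-to-electrical + digitization
print(classify(pixels))               # prints "bright"
```

Every stage here runs one after another on the processor's clock, which is exactly the bottleneck the quoted passage goes on to describe.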
[...] The current speed limit of these technologies is set by the clock-based schedule of computation steps in a computer processor, where computations occur one after another on a linear schedule.
To address this limitation, [...] have removed the four main time-consuming culprits in the traditional computer chip: the conversion of optical to electrical signals, the need for converting the input data to binary format, a large memory module, and clock-based computations.
They have achieved this through direct processing of light received from the object of interest using an optical deep neural network implemented on a 9.3 square millimeter chip.
[...] "Our chip processes information through what we call 'computation-by-propagation,' meaning that, unlike clock-based systems, computations occur as light propagates through the chip," says Aflatouni. "We are also skipping the step of converting optical signals to electrical signals because our chip can read and process optical signals directly, and both of these changes make our chip a significantly faster technology."
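The "computation-by-propagation" idea can be approximated in software: the optical network is a stack of fixed coupling layers, and the result simply emerges as light passes through them, with no clocked steps or memory in the hardware. The sketch below is an assumption-laden toy model (the layer values are made up, and a simulation on a normal computer necessarily serializes what the optics do at light speed).

```python
# Toy model of "computation-by-propagation": an optical deep network as a
# stack of fixed transmission layers. In hardware the couplings are etched
# into the chip; here they are hypothetical weight matrices.

def propagate(amplitudes, layers):
    """Pass input light amplitudes through each fixed coupling layer in turn."""
    for weights in layers:
        amplitudes = [
            sum(w * a for w, a in zip(row, amplitudes)) for row in weights
        ]
    return amplitudes

# Two hypothetical 3x3 coupling layers.
layers = [
    [[0.5, 0.5, 0.0],
     [0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5]],
    [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]],
]

incoming_light = [1.0, 0.0, 0.0]          # light from the object of interest
outputs = propagate(incoming_light, layers)
label = outputs.index(max(outputs))       # brightest output port = the class
print(label)                              # prints 0
```

Note how the model never converts to electrical signals, never digitizes, and stores no intermediate results: the input is transformed into a classification purely by passing through the fixed layers, which is the contrast with clock-based chips that Aflatouni is drawing.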
Read more of this story at SoylentNews.