MIT boffins cram ML training into microcontroller memory
Neat algorithmic tricks squeeze training into 256KB of RAM, barely enough for inference, let alone teaching
Researchers claim to have developed techniques that enable training a machine-learning model in less than a quarter of a megabyte of memory, making it suitable for microcontrollers and other edge hardware with limited resources.
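To see why a quarter of a megabyte is such a tight budget, consider a rough memory tally for a toy network. Inference only needs the weights plus the largest single activation buffer, but vanilla backpropagation must also retain every intermediate activation and a gradient per weight. The sketch below uses entirely hypothetical layer shapes (not the researchers' model) and assumes int8 storage, purely to illustrate the gap.

```python
# Back-of-the-envelope memory tally for a hypothetical tiny convnet,
# illustrating why training is far heavier than inference on a 256KB part.
# Layer shapes are invented for illustration only.

layers = [
    # (name, output activation elements, parameter count)
    ("conv1", 32 * 32 * 8,  3 * 3 * 1 * 8),
    ("conv2", 16 * 16 * 16, 3 * 3 * 8 * 16),
    ("fc",    10,           16 * 16 * 16 * 10),
]

BYTES = 1  # assume int8-quantised values throughout

params = sum(p for _, _, p in layers) * BYTES
# Inference: only the largest activation buffer needs to live at once.
peak_act = max(a for _, a, _ in layers) * BYTES
# Training: every activation is kept for backprop, plus a gradient per weight.
all_act = sum(a for _, a, _ in layers) * BYTES
grads = params * BYTES

print(f"inference ~ {params + peak_act} bytes")
print(f"training  ~ {params + all_act + grads} bytes")
```

Even on this toy model the training footprint is roughly double the inference footprint; on realistic networks, where activations dominate, the multiplier is far larger, which is what makes sub-256KB training noteworthy.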