PSyLab

(Processing Systems Lab)

Feb 03, 2018

PSyLab's paper on hardware-algorithm co-design receives Best Paper award at DATE.

The paper exploits the specific structure of memory failures in SRAM cells and combines it with a modification to the backpropagation algorithm to enable efficient acceleration of deep neural networks (DNNs).

Memory constitutes a major component of DNN power dissipation, in part due to the large number of weights that need to be stored, and in part due to the limited scalability that Vdd-scaling offers SRAM structures. This paper exploits the observation that reading SRAM cells powered with supply voltages below Vmin leads to errors that are random in space but largely static over time across multiple reads. The resulting memory errors are consistent, much like static memory faults, and can be "learned around" by the neural network itself.
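To make the idea concrete, here is a minimal sketch, not the paper's actual implementation, of how static bit errors in low-voltage weight memory can be folded into training: a fixed random bit-flip mask is drawn once and applied to the quantized weights in every forward pass, so the network adapts to the faulty reads. The quantization scheme, error rate, and all function names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_static_fault_mask(shape, bit_error_rate=0.02, n_bits=8):
    """Pick, once, which bit positions of which 8-bit weights are faulty.

    The mask is static: it models errors that are random in space but
    consistent across reads, as described above. (Error rate is illustrative.)
    """
    flip = rng.random(shape + (n_bits,)) < bit_error_rate
    bit_values = (1 << np.arange(n_bits)).astype(np.uint8)
    return (flip * bit_values).sum(axis=-1).astype(np.uint8)  # one XOR mask per weight

def apply_faults(weights, xor_mask, scale):
    """Quantize weights to 8 bits, inject the fixed bit flips, dequantize."""
    q = np.clip(np.round(weights / scale), -128, 127).astype(np.int8)
    corrupted = (q.view(np.uint8) ^ xor_mask).view(np.int8)
    return corrupted.astype(np.float32) * scale

# Toy training loop: a single linear layer learning around its own faults.
X = rng.standard_normal((256, 16)).astype(np.float32)
true_w = rng.standard_normal((16,)).astype(np.float32)
y = X @ true_w

w = np.zeros(16, dtype=np.float32)
scale = 0.05                                   # assumed fixed quantization step
mask = make_static_fault_mask(w.shape)         # drawn once, reused every step

for step in range(500):
    w_faulty = apply_faults(w, mask, scale)    # forward pass sees the faulty weights
    pred = X @ w_faulty
    grad = X.T @ (pred - y) / len(X)           # gradient passed straight through to stored weights
    w -= 0.1 * grad

print("final loss:", float(np.mean((X @ apply_faults(w, mask, scale) - y) ** 2)))
```

Because the same corruption is present during training and deployment, the loss that is minimized is the loss of the network as it will actually be read from the low-voltage SRAM.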

The project is the result of a collaboration between PSyLab and Luis Ceze's group at the University of Washington, which provided the neural network accelerator hardware description upon which this work was built. More details can be found on the MATIC page.

Congratulations to Sung Kim and the rest of the team!