Quantum-tunneling memory could boost AI energy efficiency by 100x

There's a potential solution on the cards for the energy expenditure problems plaguing AI training, and it sounds simple: just strengthen the "synapses" that move electrons through a memory array.

Shantanu Chakrabartty, professor of electrical and systems engineering at Washington University in St Louis, USA, and two of his colleagues have authored a paper, published in Nature, explaining how they used the natural properties of electrons to reduce the energy needed to train machine learning models.

The researchers set out to build a learning-in-memory synaptic array whose digital synapses operate dynamically rather than statically, meaning they consume energy only when changing state, not when maintaining one.
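
To see why that dynamic behaviour matters, consider the toy model below. It is a back-of-envelope sketch, not the team's circuit, and the energy constants are made-up placeholders; it simply compares a memory cell that pays to hold its state every cycle against a synapse that pays only when the state actually flips:

```python
# Toy energy-accounting model: a "static" cell burns energy every cycle
# just to hold its state, while a "dynamic" synapse pays only on writes.
# All constants are illustrative placeholders, not figures from the paper.

HOLD_COST = 1.0    # energy per cycle to maintain state in a static cell (arbitrary units)
WRITE_COST = 5.0   # energy to flip a stored state in either design (arbitrary units)

def static_cell_energy(updates, total_cycles):
    """Static cell: pays to hold state every cycle, plus a cost per write."""
    return total_cycles * HOLD_COST + len(updates) * WRITE_COST

def dynamic_synapse_energy(updates, total_cycles):
    """Dynamic synapse: pays only when the stored state changes."""
    return len(updates) * WRITE_COST

# During training, any given weight is updated only occasionally:
updates = [3, 47, 912]      # cycles at which this synapse's state flips
total_cycles = 1_000_000    # length of the training run

print(static_cell_energy(updates, total_cycles))     # 1000015.0
print(dynamic_synapse_energy(updates, total_cycles)) # 15.0
```

With a million-cycle run and only three writes, the static cell burns roughly a million energy units while the dynamic synapse burns 15, and the gap widens with run length, which is the intuition behind the efficiency claim.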

To test the concept, the team built CMOS circuits whose energy barriers were, they said, strong enough to make the memory non-volatile, and which grow stronger (i.e., better able to maintain that non-volatility) as the array's training progresses.
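
The paper's device physics isn't reproduced here, but the strengthening-barrier idea can be sketched in a few lines. Assume, purely for illustration, that each training update also raises a synapse's energy barrier, and that the odds of the stored state leaking away fall off Arrhenius-style with barrier height; the growth rule and constants below are assumptions, not the dynamics from the paper:

```python
import math

# Sketch of the strengthening-barrier idea: the barrier grows as training
# progresses, so the stored state becomes exponentially harder to disturb
# without spending any energy to hold it.

class ConsolidatingSynapse:
    def __init__(self, barrier=0.5, growth=0.1, kT=0.025):
        self.barrier = barrier   # energy barrier height (eV, illustrative)
        self.growth = growth     # barrier increase per training update
        self.kT = kT             # thermal energy scale (~room temperature)

    def train_update(self):
        # Each learning update also raises the barrier a little,
        # consolidating the stored state.
        self.barrier += self.growth

    def retention_escape_rate(self):
        # Arrhenius-style estimate: higher barriers mean exponentially
        # lower odds of the state leaking away on its own.
        return math.exp(-self.barrier / self.kT)

syn = ConsolidatingSynapse()
for step in range(5):
    print(f"update {step}: escape rate ~ {syn.retention_escape_rate():.3e}")
    syn.train_update()
```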

The result is a more efficient array that could reduce the energy requirements of ML training by 100x – "and this is a pessimistic projection," Chakrabartty told The Register.

That 100x improvement is for a small-scale system, Chakrabartty said. Larger-scale models would show an even greater improvement, especially if the memory were integrated with the processor on a single wafer – something he and his team are currently working to achieve.

How to get your digital synapses firing

Machine learning model training is incredibly energy inefficient. Washington University in St Louis said that training a single top-of-the-line AI was responsible for more than 625,000 pounds (283.5 metric tonnes) of CO2 emissions in 2019 …
