For anyone not close to this technology, this might seem far-fetched, but stick with me here, because researchers at MIT are working on nanoscale projects that promise new efficiencies and new designs for cutting-edge AI systems!
The strategy is called analog deep learning, and it uses networks of programmable resistors to process data differently.
Basically, computations are carried out in memory, rather than shuttling the relevant data through a separate processor. The hardware setup also relies on circuits called analog-to-digital converters (ADCs), which do exactly what the name suggests: turn analog signals into digital values.
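To make the in-memory idea concrete, here is a minimal sketch of the underlying physics: if a crossbar of programmable resistors stores a weight matrix as conductances, then applying input voltages produces output currents that are exactly the matrix-vector product (Ohm's law per resistor, Kirchhoff's current law per output wire). The names and numbers below are purely illustrative, not MIT's actual hardware.

```python
# Illustrative sketch: an analog in-memory matrix-vector multiply.
# A crossbar of programmable resistors stores weights as conductances G;
# applying input voltages V yields output currents I = G @ V, so the
# multiply-accumulate happens where the data lives, not in a processor.
# All values are hypothetical, for illustration only.

def crossbar_mvm(conductances, voltages):
    """Current on each output line: I_i = sum_j G[i][j] * V[j]."""
    return [sum(g * v for g, v in zip(row, voltages))
            for row in conductances]

# A 2x3 weight matrix stored as conductances (siemens)
G = [[0.5, 0.1, 0.2],
     [0.3, 0.4, 0.0]]
V = [1.0, 0.5, 2.0]   # input voltages (volts)

print(crossbar_mvm(G, V))  # output currents, in amperes
```

An ADC's job in such a system is then to read those analog output currents back into digital values for the next stage.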
What are analog-to-digital converters used for in deep neural networks? Major use cases include radar and other situations where an analog input must be fed into a digital system that will attempt to break it down and understand it.
In many cases, the data arrives in real time or is otherwise demanding to handle.
One of the main concerns with the ADC stage is energy efficiency: digitizing and processing all of this data is extremely energy-expensive.
So researchers are now looking at how to move beyond some of these traditional approaches. Specifically, the folks at MIT are using protons to drive the programmable devices that do the processing in these networks, tuning properties like conductivity to create the new models.
“The mechanism of operation of the device is the electrochemical insertion of the smallest ion, the proton, into an insulating oxide to modulate its electronic conductivity,” says lead author Bilge Yildiz, MIT professor of nuclear science and engineering and of materials science and engineering, in an MIT News story published last July. “As we are working with very thin devices, we could accelerate the movement of this ion using a strong electric field and push these ionic devices to the nanosecond operating regime.”
Take a look at the rest of this explanation from MIT News, or at this presentation by Tanner Andrulis, which explains why we need ADC systems and how to manage ADC range.
Andrulis presents an interesting twist on this: you can narrow your ADC range and find other ways to deal with outlier values, in order to squeeze out even more efficiency. The rest of the video relates ADC design to neural network performance.
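The idea of narrowing an ADC's range can be sketched as a clipping quantizer: values outside a deliberately reduced range saturate, sacrificing accuracy on rare outliers so the ADC's limited bit budget resolves the common-case signal more finely. This is a generic illustration of the trade-off, not Andrulis's actual method; all parameters are hypothetical.

```python
# Illustrative sketch: quantizing analog values with a narrowed ADC range.
# Values beyond [-full_scale, full_scale] saturate (clip), trading error
# on rare outliers for finer resolution on typical values.
# Parameters are hypothetical, for illustration only.

def adc_quantize(value, full_scale, bits):
    """Clip to [-full_scale, full_scale], then round to 2**bits - 1 steps."""
    levels = 2 ** bits - 1
    clipped = max(-full_scale, min(full_scale, value))
    step = 2 * full_scale / levels
    return round((clipped + full_scale) / step) * step - full_scale

# A wide range covers the outlier but wastes resolution on typical values;
# a narrow range clips the outlier but resolves typical values more finely.
samples = [0.02, -0.05, 0.01, 0.9]                 # one outlier at 0.9
wide   = [adc_quantize(x, 1.0, 4) for x in samples]
narrow = [adc_quantize(x, 0.1, 4) for x in samples]
```

In hardware, a narrower range also means a cheaper, lower-energy conversion per sample, which is where the efficiency gain comes from.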
What does all this have to do with AI? This is another kind of infrastructure for mimicking the biological synapses of the human brain. One could argue that today's powerful generative AI and other types of artificial intelligence rest on exactly this capability: the ability of systems to take something analog and represent it digitally. This type of research isn't slowing down either, so keep an eye on this blog for more!