Made between two metal interconnect layers, the structure has four layers: TiN at the bottom, Si-doped HfO2, a Ti scavenging layer, and then TiN on the top.
As fabricated this is a ferroelectric capacitor structure but, after a single electrical forming operation – which grows conductive filaments through the hafnium oxide – it becomes a re-programmable memristor structure.
This allows either device to be made at will anywhere in this layer.
“Ferroelectric capacitors allow rapid, low-energy updates, but their read operations are destructive, making them unsuitable for inference,” according to CEA-Leti. “Memristors excel at inference because they can store analog weights, are energy-efficient during read operations, and support in-memory computing. [However] while the analog precision of memristors suffices for inference, it falls short for learning, which demands small, progressive weight adjustments.”
In use, the idea is that off-chip training could initially be used to programme an inferencing model into the conductances of an array of memristors.
The IC would subsequently run a learning algorithm that stores its neural network weights digitally in an array of ferroelectric capacitors, and these new weights would be used to update analogue weights in the memristors.
“Forward and backward passes use low-precision weights stored in analog in memristors, while updates are achieved using higher-precision ferroelectric-caps,” said research leader Michele Martemucci. “Memristors are periodically reprogrammed based on the [three] most-significant bits stored in ferroelectric caps.”
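The update scheme can be sketched numerically. This is a minimal illustration, not CEA-Leti's code: the names, the 10-bit width and the integer-gradient update are assumptions made for clarity; only the idea of accumulating updates at high digital precision and periodically transferring the three most-significant bits to the analogue store comes from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

BITS = 10   # signed digital weight resolution (ten capacitors per weight)
MSB = 3     # most-significant bits periodically transferred to memristors

# High-precision signed weights live digitally (ferroelectric capacitors).
digital_w = rng.integers(-2**(BITS - 1), 2**(BITS - 1), size=(4, 3))

def transfer_msb(w_digital):
    """Quantise to the top MSB bits, as in the periodic memristor reprogramming."""
    step = 2 ** (BITS - MSB)
    return (w_digital // step) * step

# Low-precision analogue copies (memristors) serve the forward/backward passes.
analog_w = transfer_msb(digital_w)

# A gradient update touches only the high-precision digital store...
grad = rng.integers(-3, 4, size=digital_w.shape)
digital_w = np.clip(digital_w - grad, -2**(BITS - 1), 2**(BITS - 1) - 1)

# ...and the analogue weights are refreshed from its three MSBs periodically.
analog_w = transfer_msb(digital_w)
```

The point of the split is that the many small gradient steps never hit the memristors' limited write precision or endurance; the memristors only see occasional coarse reprogramming events.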
Made over a standard 130nm CMOS IC, the proof-of-concept device has 16,384 ferroelectric capacitor cells (each cell has one capacitor and one transistor) and 2,048 memristor cells (one memristor and one transistor).
Ten capacitors store a single signed digital integer, and the memristors are operated in pairs to store a differential (and therefore signed) value.
To avoid needing DACs to transfer data from the digital capacitors to the analogue memristors, the three most-significant capacitors have area (and therefore charge) ratios of 1:2:4, allowing a simple analogue summation to be used to set a memristor.
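A toy calculation shows why the 1:2:4 ratios make a DAC unnecessary, and how a differential pair carries sign. The unit charges, conductance scale and function names here are assumptions for illustration; only the bit weighting and the differential-pair idea come from the article.

```python
# Three MSB capacitors with area/charge ratios 1:2:4 sum in the analogue
# domain, so their combined charge directly encodes the 3-bit value.
def msb_to_level(b2, b1, b0):
    """b2..b0 are the three most-significant bits (b2 heaviest)."""
    return b0 * 1 + b1 * 2 + b2 * 4   # total charge, in units of the smallest cap

# A signed weight maps onto a differential memristor pair: w = G_plus - G_minus.
def encode_differential(level):
    g_plus = level if level > 0 else 0
    g_minus = -level if level < 0 else 0
    return g_plus, g_minus

level = msb_to_level(1, 0, 1)              # bits 101 -> summed charge 5
g_plus, g_minus = encode_differential(level)
assert g_plus - g_minus == level
```

Because the capacitor areas already perform the binary weighting, moving a digital weight into a memristor reduces to one analogue charge summation per device rather than a full digital-to-analogue conversion.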
CEA-Leti worked with Université Grenoble Alpes, CEA-List, CNRS, University of Bordeaux, Bordeaux INP, IMS France, Université Paris-Saclay, and Center for Nanosciences and Nanotechnologies.
For detail, read the Nature Electronics paper ‘A ferroelectric-memristor memory for both training and inference’. This clearly-written paper can be read in full without payment.
According to this paper: The IC-based neural network “was trained using a stochastic gradient descent algorithm… results obtained across several edge benchmarks are competitive with those achieved by floating-point precision software models, without the endurance limitations associated with hardware constraints. We observed month-long stability of the memristor states… weight transfer corrects any drift in the memristor conductance levels”.
The illustration is a semi-physical artist’s impression – the sandwich would be where the capacitors are drawn. Credit: E.Vianello, M.PlouseyDupouy, CEA