Product Name: Weebit ReRAM IP
Manufacturer: Weebit Nano
Product Category: Computing
Supporting Documentation (if available)
In edge AI, a significant part of on-chip memory is used to store the synaptic weights needed for artificial neural network calculations. Designers can apply clever algorithms on DRAM or SRAM, but this is increasingly difficult under ever-stricter power and cost requirements. DRAM and SRAM are also volatile – a problem for the many applications that must wake quickly from standby.
Traditional non-volatile memory (NVM) is also challenged: embedded flash can’t scale below 28nm, so it can’t be integrated with an inference engine in a single SoC on an advanced process. Today’s designers generally use two-chip solutions, but these are too expensive for edge products, and continuously fetching weights from external memory increases latency and power consumption.
What’s needed is an embedded NVM that can support the same level of inference as SRAM or DRAM at extremely low power and cost, integrated on a single die.
Weebit ReRAM is a new embedded NVM designed to be the successor to flash for edge AI. Because a ReRAM cell is ~3-4x smaller than a typical SRAM cell, significantly more memory can be integrated on-chip, supporting larger neural networks at the same die size and cost. Weebit ReRAM scales well below 28nm, is non-volatile so weights are retained through power-down for fast wake-up, and is inherently secure.
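The cell-size claim above implies a simple capacity trade-off that a back-of-the-envelope sketch can illustrate. The 1 Mb SRAM budget below and the exact 3x/4x ratios are illustrative assumptions drawn from the "~3-4x" range in the text, not Weebit-published macro figures:

```python
# Rough capacity arithmetic: how many ReRAM bits fit in the silicon area
# of a given SRAM array, assuming only the stated cell-size ratio.
# All numbers here are illustrative assumptions, not vendor data.

def nvm_bits_for_same_area(sram_bits: int, cell_ratio: float) -> int:
    """Bits of ReRAM that fit in the area occupied by `sram_bits` of SRAM,
    given a ReRAM cell ~`cell_ratio`x smaller than an SRAM cell."""
    return int(sram_bits * cell_ratio)

sram_bits = 1 * 1024 * 1024  # hypothetical 1 Mb SRAM area budget
for ratio in (3.0, 4.0):     # the ~3-4x range cited in the text
    mb = nvm_bits_for_same_area(sram_bits, ratio) / (1024 * 1024)
    print(f"{ratio:.0f}x smaller cell -> ~{mb:.0f} Mb ReRAM in the same area")
```

Under these assumptions, the same die area that holds 1 Mb of SRAM-resident weights could hold roughly 3-4 Mb of ReRAM-resident weights, which is the basis of the larger-network claim.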
Because of its physical and functional similarities to biological synapses, ReRAM is also ideal for neuromorphic in-memory computing.
Weebit’s first embedded ReRAM IP is now available through the US foundry SkyWater Technology and is fully qualified, giving customers confidence in its production readiness. Customer engagements are underway.