Research brings analog computers just one step from digital – The Source – Washington University in St. Louis Newsroom

The future of computing may be analog.

The digital design of our everyday computers is good for reading email and gaming, but today’s problem-solving computers work with vast amounts of data. Shuttling that data between where it is stored and where it is processed creates performance bottlenecks rooted in the way computers are built.

The next computer revolution might be a new kind of hardware, called processing-in-memory (PIM), an emerging computing paradigm that merges the memory and processing unit and does its computations using the physical properties of the machine — no 1s or 0s needed to do the processing digitally. 

At Washington University in St. Louis, researchers from the lab of Xuan “Silvia” Zhang, associate professor in the Preston M. Green Department of Electrical & Systems Engineering at the McKelvey School of Engineering, have designed a new PIM circuit, which brings the flexibility of neural networks to bear on PIM computing. The circuit has the potential to increase PIM computing’s performance by orders of magnitude beyond its current theoretical capabilities.


Their research was published online Oct. 27 in the journal IEEE Transactions on Computers. The work was a collaboration with Li Jiang at Shanghai Jiao Tong University in China.

Traditionally designed computers are built using a von Neumann architecture. Part of this design separates the memory — where data is stored — and the processor — where the actual computing is performed.

“Computing challenges today are data-intensive,” Zhang said. “We need to crunch tons of data, which creates a performance bottleneck at the interface of the processor and the memory.”

PIM computers aim to bypass this problem by merging the memory and the processing into one unit.

Computing, especially computing for today’s machine-learning algorithms, is essentially a complex — extremely complex — series of additions and multiplications. In a traditional, digital central processing unit (CPU), this is done using transistors, which basically are voltage-controlled gates to either allow current to flow or not to flow. These two states represent 1 and 0, respectively. Using this digital code — binary code — a CPU can do any and all of the arithmetic needed to make a computer work.

The kind of PIM Zhang’s lab is working on is called resistive random-access memory PIM, or RRAM-PIM. Whereas a conventional computer stores bits as charge in the capacitors of its memory cells, RRAM-PIM computers rely on resistors, hence the name. These resistors are both the memory and the processor.

The bonus? “In resistive memory, you do not have to translate to digital, or binary. You can remain in the analog domain,” Zhang said. This is the key to making RRAM-PIM computers so much more efficient.

“If you need to add, you connect two currents,” Zhang said. “If you need to multiply, you can tweak the value of the resistor.”
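The idea Zhang describes follows directly from two basic circuit laws: multiplying means applying a voltage across a resistor (Ohm’s law, with the stored value encoded as a conductance), and adding means letting the resulting currents join on a shared wire (Kirchhoff’s current law). A minimal sketch of that multiply-and-accumulate behavior, with hypothetical voltage and conductance values, might look like this:

```python
# Sketch (not the team's actual circuit) of analog multiply-accumulate
# in an RRAM array.
# Ohm's law: each cell contributes a current I = G * V, where the stored
# weight is encoded as a conductance G (the inverse of resistance).
# Kirchhoff's current law: currents joining on a shared wire simply add.

def rram_dot_product(voltages, conductances):
    """Multiply each input voltage by a cell's conductance (Ohm's law),
    then sum the resulting currents on a shared line (Kirchhoff's law)."""
    return sum(v * g for v, g in zip(voltages, conductances))

# Hypothetical example: inputs encoded as voltages, weights as conductances.
inputs = [0.5, 1.0, 0.25]   # volts
weights = [2.0, 0.5, 4.0]   # siemens
print(rram_dot_product(inputs, weights))  # 1.0 + 0.5 + 1.0 = 2.5 amps
```

In hardware, of course, no arithmetic instructions run at all — the physics of the circuit produces the sum of products directly.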

But at some point, the information does need to be translated into a digital format to interface with the technologies we are familiar with. That’s where RRAM-PIM hit its bottleneck — converting the analog information into a digital format. Then Zhang and Weidong Cao, a postdoctoral research associate in Zhang’s lab, introduced neural approximators.

“A neural approximator is built upon a neural network that can approximate arbitrary functions,” Zhang said. Given any function, the neural approximator can compute the same result more efficiently.
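The claim that a neural network can reproduce an arbitrary function is the universal approximation theorem. A toy illustration (not the paper’s circuit): a two-neuron network with ReLU activations reproduces the target function f(x) = |x| exactly, since |x| = relu(x) + relu(−x). Larger networks approximate more complicated functions the same way, by summing weighted activations.

```python
# Toy illustration of a neural approximator: a tiny ReLU network
# that exactly reproduces the target function f(x) = |x|.

def relu(x):
    return max(0.0, x)

def neural_abs(x):
    # Hidden layer: two neurons with weights +1 and -1;
    # output layer sums both activations with weight 1.
    return relu(1.0 * x) + relu(-1.0 * x)

for x in (-2.0, -0.5, 0.0, 3.0):
    assert neural_abs(x) == abs(x)
print("network matches |x| on all test points")
```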

In this case, the team designed neural approximator circuits that could help clear the bottleneck.

In the RRAM-PIM architecture, once the resistors …

Source: https://source.wustl.edu/2021/12/__trashed-6/
