Indiana University      Research & Creative Activity      April 1998 Volume XXI Number 2



Analog

by Leigh Hedger

Each new decade sees a new wave of innovative technology. Experts in the 1950s thought that the world's computing needs could be supplied by half a dozen computers. The 1990s saw digital processors become so inexpensive that a single car could contain dozens of them; today just one Porsche contains ninety-two. According to Paul Saffo, director of the Institute for the Future, changes just as dramatic will be caused by massive arrays of sensors that will allow computers to control our environment.

In Saffo's vision of the future, we will have smart rooms that clean themselves with a carpet of billions of tiny computer-controlled hairs. Airplanes will have wings with micromachine actuators that will automatically adjust to handle turbulence. These smart machines will require computers that can process vast amounts of sensor data rapidly, far faster than today's digital computers. To make Saffo's vision a reality may well require a rebirth of technology that many consider obsolete in this era of digital computers--analog computers.

Mills
Jonathan Mills, Associate Professor of Computer Science, Indiana University Bloomington

At present, only a few researchers are delving into the world of general purpose analog computing. Part of the vanguard is Jonathan Wayne Mills, an associate professor of computer science at Indiana University Bloomington and director of the Adaptive Systems Laboratory.

All analog computers use direct analogies to perform mathematical functions. To add 3 and 2, for example, an analog computer would add voltages that correspond to those numbers and instantaneously answer "5." Mills's patented new analog computer uses radically simplified electronic components and "continuous value logic" circuits that let it run extremely fast and process more sensory inputs than a digital computer can handle.
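The voltage-addition idea can be sketched in a few lines. The sketch below models an idealized "summing amplifier," a classic op-amp circuit whose output is the (inverted) sum of its input voltages; the function name and the single-step framing are our illustration, not a description of Mills's hardware.

```python
# Illustrative sketch (not Mills's circuit): an ideal op-amp summing
# amplifier with equal resistors obeys V_out = -(V1 + V2 + ...).
# The "computation" is the physics of the node; there is no program.

def summing_amplifier(*input_volts):
    """Ideal inverting summer: output is minus the sum of the inputs."""
    return -sum(input_volts)

# Encode the numbers 3 and 2 as 3 V and 2 V:
v_out = summing_amplifier(3.0, 2.0)
print(abs(v_out))  # magnitude of the output encodes the answer: 5.0
```

A digital computer reaches the same answer by stepping through an addition instruction; the analog circuit simply settles to it.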

Digital computers were invented in the mid-1940s. The inventions of the transistor and the integrated circuit resulted in their explosive growth. Digital computers represent the world around them using ones and zeros. To deal with even simple data, such as the colors on a computer screen, a digital computer must use many ones and zeros. To manipulate this data, a digital computer uses a program, a sequence of tiny steps. Animating a picture may require a digital computer to execute millions of steps every second.

But digital computers are limited by the "clock rate," the number of steps that can be executed in a second. Here physics takes its toll: the clock rate cannot be made arbitrarily fast, for eventually the ones and zeros would have to travel through the computer faster than the speed of light. Electronics also limit digital computers. As digital chips are built with tens of millions of transistors, they consume increasing amounts of power, and the odds increase that a chip will be defective.
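The light-speed limit is easy to check with back-of-envelope arithmetic: in one tick of a 1 GHz clock, light travels only about 30 centimeters, so a signal cannot cross a room-sized machine within a single cycle. The clock rates below are chosen only for illustration.

```python
# How far light travels during one clock period -- the hard ceiling
# on how far a signal can move inside a computer per cycle.

C = 299_792_458.0  # speed of light in a vacuum, meters per second

def distance_per_cycle(clock_hz):
    """Distance light covers in one clock period, in meters."""
    return C / clock_hz

print(distance_per_cycle(1e9))    # ~0.3 m per cycle at 1 GHz
print(distance_per_cycle(100e9))  # ~3 mm per cycle at 100 GHz
```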

Chips
The integrated circuit on the left is a conductive sheet. Although it has no transistors, it can simulate the operation of nervous tissues faster than a digital computer. The chip on the right is a Lukasiewicz Logic Array that was patented by Indiana University. Although it is built from transistors, it computes continuous logic values instead of binary zeros and ones.

Analog computing systems--which include slide rules--have been used for more than a century and a half to solve mathematical equations. In the 1950s and 1960s, electronic analog computers were used to design mechanical systems from bridges to turbine blades, and to model the behavior of airplane wings and rivers. The analog equivalent of the digital computer is a refrigerator-sized box that contains hundreds of special electronic circuits called operational amplifiers. On the front of the box is a plugboard, similar to an old-fashioned telephone switchboard, that is used to configure the analog computer to solve different problems. The analog computer is not programmed, like a digital computer, but is rewired each time a new problem is to be solved.

Digital computers replaced general purpose analog computers by the early 1970s because analog computers had limited precision and were difficult to reconfigure. Still, for the right kind of problem, such as processing information from thousands of sensors as fast as it is received, analog computers are an attractive alternative to digital computers. Because they simulate a problem directly and solve it in one step, they are faster than digital computers. Analog computer chips use less power and can be made much larger than digital computer chips.

Mills calls his new kind of analog computer a Kirchhoff-Lukasiewicz Machine, named after physicist Gustav Robert Kirchhoff and mathematician Jan Lukasiewicz. Kirchhoff used thin sheets of copper to study electrical current flow and heat transfer problems in the mid-nineteenth century. Lukasiewicz was among the first to study continuous-valued logics. In Mills's computer, the thin substrate of a computer chip, a sheet of silicon without transistors, solves partial differential equations just as Kirchhoff's copper sheets did 150 years ago. Arrays of transistors, connected to act like simple electronic diodes, compute the functions that the silicon sheets cannot. Combinations of these two simple components, silicon sheets and diodes, can be configured to simulate complex systems in real time, processing massive amounts of data from sensors.
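What the sheet computes can be imitated, slowly, on a digital machine. A conductive sheet with voltages pinned at its edges settles into a solution of Laplace's equation; the sketch below reproduces that behavior by repeated neighbor-averaging (Jacobi relaxation) over a small grid. The grid size and boundary voltages are arbitrary choices for illustration -- the sheet reaches in one physical settling what costs the program hundreds of iterations.

```python
# Digital imitation of a transistorless conductive sheet: each interior
# cell relaxes toward the average of its four neighbors, converging to
# a solution of Laplace's equation with the edge voltages held fixed.

def relax_sheet(grid, iterations=500):
    """Jacobi relaxation on a list-of-lists voltage grid; edges stay fixed."""
    rows, cols = len(grid), len(grid[0])
    for _ in range(iterations):
        new = [row[:] for row in grid]
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                new[r][c] = (grid[r-1][c] + grid[r+1][c] +
                             grid[r][c-1] + grid[r][c+1]) / 4.0
        grid = new
    return grid

# A 5x5 "sheet": top edge held at 1 V, the other edges at 0 V.
sheet = [[1.0] * 5] + [[0.0] * 5 for _ in range(4)]
solved = relax_sheet(sheet)
print(round(solved[2][2], 3))  # center settles to 0.25, by symmetry
```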

Learning to use such simple components to build analog computers that solve complex problems was not easy. Mills has been working on the problem since 1990. His work with analog started when he became interested in computing functions with Lukasiewicz logic. Working at IUB with J. Michael Dunn, Oscar R. Ewing Professor of Philosophy and professor of computer science, and two graduate students, Charles Daffinger and M. Gordon Beavers, Mills began designing integrated circuits based on the continuous-valued Lukasiewicz logic, instead of the two-valued Boolean logic used by digital computers.

Using Boolean logic, digital programs are broken down into true or false statements represented by ones and zeros. Lukasiewicz logic gates recognize the grays that are not found in the black and white Boolean world. Analog computers see zeros, ones, and all the values in between. This continuous system of logic, Mills realized, was an ideal match for the processing style of analog circuits.
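The "grays" have a precise form. The standard Lukasiewicz connectives operate on truth values anywhere in the interval from 0 to 1; on the endpoints they agree exactly with Boolean logic. These are the textbook definitions of the logic, not necessarily the exact transfer functions of Mills's gates.

```python
# Standard Lukasiewicz connectives on truth values in [0, 1].
# At 0 and 1 they reduce to ordinary Boolean NOT/AND/OR/IMPLIES;
# in between they yield the intermediate values Boolean logic lacks.

def l_not(a):        return 1.0 - a
def l_and(a, b):     return max(0.0, a + b - 1.0)  # strong conjunction
def l_or(a, b):      return min(1.0, a + b)        # strong disjunction
def l_implies(a, b): return min(1.0, 1.0 - a + b)

print(l_and(1.0, 1.0))       # 1.0 -- agrees with Boolean AND
print(l_and(0.75, 0.5))      # 0.25 -- a partially true result
print(l_implies(0.75, 0.25)) # 0.5
```

These operations are just sums, differences, and clamps -- exactly the kind of arithmetic analog voltages and diodes perform naturally, which is why Mills found the match so close.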

While Daffinger worked to improve the performance of the circuits they had already built, Mills began to look at various applications. "At the time, I thought these circuits were more powerful than they actually were," Mills says. "I thought a single Lukasiewicz logic gate could act like a neuron in the brain. I was wrong about that."

After Daffinger died accidentally in 1991 and the research changed course, Mills returned to the design of Lukasiewicz logic gates. That was when he realized he was developing an entirely different way of building analog computers. "The fundamental gate was nothing more than a diode, which is not even as complicated as a transistor," Mills says. "So where we initially had circuits with nine transistors, it now turned out that we could make the same circuit by throwing away those nine transistors and replacing them with one diode." But that, Mills says, posed a puzzle: if diodes were so good for computation, why weren't they already being used in computers? The question stumped him for two years.

His cure for researcher's block was to go back to his childhood fascination with butterflies. He began studying butterfly wing patterns and trying to model them with an array of Lukasiewicz logic gates. He developed a type of electronic retina in which patterns of active cells resemble the patterns on butterfly wings. "Wing patterns are produced by a chemical process called reaction-diffusion, which is also described by differential equations," Mills says. "So I thought, maybe if I put enough of these logic gates together I would get an analog computer that would solve partial differential equations."

Analog
Although analog circuits are small, almost all of the space on this board is used to interface them to a digital computer for testing. After testing, the analog circuit can be used by itself.

After this realization, Mills spent the next six months simulating this computer before attempting to build one. He found that his readings in "old-fashioned" analog from the late 1950s and mid-1960s described circuits that did things similar to his Lukasiewicz circuits, "except without having to have all of the sea of Lukasiewicz logic gates. You could collapse all of those discrete gates into a continuous conductive sheet and compute the same thing, but now with a circuit that needed no transistors or diodes at all--which seemed unbelievable."

He continued this research on his own, having reduced the logic to a diode, then reducing the sea of thousands of diodes to a conductive sheet. At this point, he was stumped again. "I was left with logic gates composed of diodes, and a partial differential equation solver that I could simulate with carbon paper," Mills says. It was so simple. But the next question became: What can we do with this?

The answer was not easy to find. It involved learning how to replace his knowledge of digital computing with its analog equivalent. The core of configuring Kirchhoff-Lukasiewicz Machines lies in "fitting" problems into the conductive sheet, then "shaping" the sheet's outputs with Lukasiewicz logic gates. This requires visualizing the problem to solve, or the system to model, and mapping it to the machine. To do this, he had to throw off the mindset he had learned in digital programming and invent a new style of analog computing--turning complex systems into machine configurations without using mathematical equations. "It was constant blindness punctuated by flashes of insight. There was no cookbook that told me how to use these devices or how to shape problems," he said.

During one seminar, Mills and his students spent sixteen weeks studying these circuits. At the end, "we weren't quite sure how to build even the simplest neural network circuit, a kind of analog circuit that models the robust way that the brain learns to recognize patterns. We thought we were getting close, but it was very frustrating for the students who were seeing firsthand a professor grappling with a problem. There were no books, no papers, no previous lecture notes for students searching for answers. Then they realized: nobody knows this stuff."

Learning how to overcome these kinds of frustrations has had a cautionary impact on Mills's approach to research and sense of responsibility. "If you advocate something that seems so implausibly simple, it's your responsibility to be able to explain it in understandable detail. My attitude is that if it's out in left field, it's my job to be able to explain it well and answer questions so that other people can learn how to use it and develop it further."

That has been how much of Mills's time has been spent over the past few years. In 1995, for example, he helped develop a proposal for an analog controller for the IU Cyclotron. In this application, the conductive sheet accepts four sensor outputs--north, south, east, and west--to model a cyclotron beam cross-section. The Lukasiewicz logic gates are used to generate a control signal. If the beam drifts away from center, the Lukasiewicz logic gates will generate increasingly stronger signals to correct the error. While trying to understand this application, Mills continued to learn more about the sheets--transistorless computing devices--and how to use them with continuous-value logic gates as sensors and processors.
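The centering idea can be sketched as a simple proportional controller: the four sensors report beam intensity, and the correction signal grows with the north-south and east-west imbalance. The function, sensor encoding, and gain value below are hypothetical illustrations, not details of the actual cyclotron proposal.

```python
# Hypothetical sketch of beam centering from four pickup sensors.
# A centered beam produces equal readings and zero correction; the
# farther it drifts, the stronger the restoring signal becomes.

GAIN = 2.0  # illustrative proportional gain, not a real design value

def correction(north, south, east, west):
    """Return (vertical, horizontal) correction signals from sensor
    intensities; zero when the beam is centered."""
    return (GAIN * (south - north), GAIN * (west - east))

print(correction(1.0, 1.0, 1.0, 1.0))  # (0.0, 0.0): centered, no correction
print(correction(1.4, 0.6, 1.0, 1.0))  # drifted north: nonzero push back
```

In the analog version, of course, no code runs at all: the sheet and gates compute the correction continuously, as fast as the sensors change.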

In recent years, more people have been discussing analog computers as models for cognitive and sensory processes. "Right now, the computer is a blind organism," Mills says. "It has very few senses--such as a keyboard or a camera. Researching intelligent devices that will observe, measure, and react within their physical environment would have an impact worldwide." In other words, computers of the future will also be sensing devices, capable of reacting in real time, in the real world to physical stimuli.

Mills and a team of graduate students now are constructing a model of the brain of a barn owl. This model will simulate several owl behaviors sparked by sensory inputs. For example, the brain model is given a simulated auditory input corresponding to a prey animal making a noise, as well as inputs about its own (simulated) owl anatomy: its crop is empty, its body contains the chemical produced by hunger. All these inputs are represented as voltages and currents and are applied to a Kirchhoff-Lukasiewicz Machine that contains structures analogous to the cochlea, crop, portions of the cortex, and other organs. The analog computer then produces simulated "action" outputs, in this example, voltages that correspond to the gross motor behavior of the owl's wings so that it "flies" out to hunt. The "behavior" of the barn owl analog model may, however, change as its sensory inputs change. It may not "hunt" in reaction to prey noise if the inputs indicate the owl's crop is full and therefore it's not "hungry."
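The hunt-or-rest decision just described is, at heart, a continuous conjunction of two drives. The toy rendering below (the function name, the inputs, and the min() combination are our illustration, not the lab's circuit) shows how a hunt impulse can be strong only when a prey noise is heard and the owl is hungry.

```python
# Toy version of the owl model's decision: inputs are normalized
# sensor "voltages" in [0, 1], and the hunt drive is limited by the
# weaker of the two -- a continuous AND rather than a Boolean one.

def hunt_drive(prey_noise, hunger):
    """Hunt impulse: high only when BOTH prey is heard AND owl is hungry."""
    return min(prey_noise, hunger)

print(hunt_drive(0.9, 0.8))  # hungry owl hears prey: strong drive, 0.8
print(hunt_drive(0.9, 0.0))  # crop full, no hunger: drive is 0.0
```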

Being able to model a brain will be a computing breakthrough. Says Mills, "Saffo's vast arrays of sensors will need computers that are as complicated as the brain, and that will merge sensing with computation. There are a lot of exciting technologies out there that might work--quantum computing, DNA computing. I've chosen to continue research in analog because I think there is a lot of life left in silicon computers. It's just that they won't look like the digital computers we're familiar with."

The speed and simplicity of fabrication of Kirchhoff-Lukasiewicz Machines suggests that analog computers do have a future. Mills's technology is capturing attention outside academia, including calls from NASA and Nortel, the Canadian telecommunications company, to discuss possible applications. "I'm thinking that within five to ten years, we will find a niche in which these processors are superior, efficient, and cost-effective," Mills says. "We may develop sensors that would detect chemicals in the environment or toxins within our bodies, such as life-threatening cholesterol levels. We might develop an implant that could predict heart attacks--sort of a biological beeper." Analog computing may yet hold surprises beyond anything we can imagine today.

