Indiana University      Research & Creative Activity      September 1999 Volume XXII Number 2

Big Gottlieb

Steven Gottlieb, professor of physics, Indiana University Bloomington


Steven Gottlieb is quiet, friendly, and more approachable than the esoteric nature of his work might suggest--at least to those uninitiated into the mysteries of contemporary physics. In conversation he is willing to discuss any aspect of what he does, though the discussion, on quarks, bosons, space-time lattices, and the like, threatens from time to time to drift into areas where an advanced appreciation of modern mathematics might come in handy. The discussion moves from supercomputers and their evolution ("What is the definition of a supercomputer? One that costs ten million dollars.") to MIMD (multiple instruction, multiple data) or "parallel processing" systems, to the present limitations on our ability to predict and understand nature at its smallest levels. As we head back after a bite to eat at the Union toward the building where Gottlieb, a professor of physics at Indiana University Bloomington, works--he wants to show me a parallel processing system he has constructed in the basement of Swain West--I ask the question that always seems so pedestrian. What are the practical applications of this work? "None, that I know of," he laughs, "except answering the big questions."

Since its earliest days, the field we now call "physics" (and, until relatively recently, "natural philosophy") has concerned itself with fundamental questions. All the dizzyingly complex math, the mind-bending concepts, the meticulous laboratory work are in the service of filling a basic human need: the desire to know what we are. What is the universe made of? What is matter? How does it work? How many times could you cut something in half before you couldn't halve it again (the question that motivated astronomer Carl Sagan to study physics)? Or, in other words, what is the world really like at the level of the atom?

Gottlieb is in the business of answering some of these questions. In various forms they have been with us for at least 2,500 years, since the days of Thales of Miletus (625-547 B.C.E.), credited with being the first Greek philosopher. The difference between the ancient Greek atomists and natural philosophers on the one hand and modern-day physicists on the other, however, is that only the latter group has access to technology that allows us for the first time to arrive not just at impressive conjecture but at verifiable answers. The questions are old, but the techniques that may finally answer them are as current as this year's generation of software. In Gottlieb's case, those answers are coming about at least partly as a result of the revolution in computing.

Resolution/Perception
Gottlieb's lattice picture of the proton can be distorted by lack of resolution, just as this famous painting can. At the lowest resolution, 9 by 12 pixels, most people would not recognize the work. At 12 by 16, it is still obscure. At the next finer resolution, people recognize the painting as the Mona Lisa. With two more refinements of the resolution, you can begin to see her eyes and smile, but only at the finest resolution does her beauty really show.
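The caption's point is easy to demonstrate. Here is a toy sketch (assuming only numpy and a random array as a stand-in for the painting; this is not the lattice code itself) in which averaging blocks of pixels coarsens an image, just as a coarser lattice blurs the picture of the proton:

```python
# Toy illustration: averaging blocks of pixels throws away detail,
# the same way a coarse lattice blurs a fine-grained picture.
import numpy as np

def downsample(image, block):
    """Average each `block` x `block` patch into a single pixel."""
    h, w = image.shape
    h, w = h - h % block, w - w % block        # trim to a multiple of block
    patches = image[:h, :w].reshape(h // block, block, w // block, block)
    return patches.mean(axis=(1, 3))

mona_lisa = np.random.rand(48, 36)             # stand-in for the painting
coarse = downsample(mona_lisa, 4)              # 12 x 9: the lowest resolution
print(mona_lisa.shape, "->", coarse.shape)     # (48, 36) -> (12, 9)
```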

Though today's physicists stand at the end of that long line of inquiring minds, Gottlieb himself is less the philosopher on the mountain and more the focused researcher, with his eye on the most minute parts of reality. His area of inquiry is quantum chromodynamics, and he concerns himself with such curious entities as quarks and gluons. A quark, fancifully named by physicist Murray Gell-Mann after a line in James Joyce's novel Finnegans Wake, is to our present knowledge one of the smallest units of matter. As far as 1999 physics is aware, you can't go smaller than a quark: there simply is no cutting them in half. Gluons are the carriers of the strong force, so called because they make the quarks "stick" together. They are analogous to photons, the energy packets that we perceive as light, such as the ones bouncing off this page and hitting your eyes right now.

Beyond these somewhat familiar items, modern physics has uncovered a surprisingly vast array of subatomic particles--especially surprising to physicists earlier in the century who believed the atom itself was as small as things were going to get (the name "atom" means "indivisible," though we now know atoms are nothing of the sort). Gottlieb is familiar with them all. His particular concern, however, is with lattices. "Imagine space-time as a lattice of points," he says, drawing a crosshatching of lines on paper. Because of the Heisenberg uncertainty principle, quantum field theory describes a particle not by its position but by the probability that it will be found at any given position. Physicists call this probability function a field.
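The lattice picture can be made concrete in a few lines. This is a minimal sketch assuming only numpy, not the MILC collaboration's actual software: a field assigns a value to every site of a four-dimensional grid, and the squared magnitude plays the role of a probability.

```python
# A minimal sketch of a "field" on a space-time lattice: one value per
# lattice site, in three space directions plus one time direction.
# Lattice dimensions here are illustrative.
import numpy as np

nx = ny = nz = 8   # three space directions
nt = 16            # one time direction

# One complex amplitude per site; its squared magnitude plays the
# role of a probability density in this toy picture.
field = np.zeros((nt, nx, ny, nz), dtype=complex)
field[0, 4, 4, 4] = 1.0   # a particle "localized" at a single site

prob = np.abs(field) ** 2
print(prob.sum())          # total probability: 1.0
```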

That's where supercomputers come in. A modern supercomputer can run the calculations necessary for describing each point on a space-time lattice that is much more finely woven than any we could construct in the past. The principle itself is not new; what is new is the ability to compute enough points to significantly reduce the error in our complete picture. The finer the lattice spacing, the more points are available to us, and the more points available, the better our ability to describe the object. With supercomputing, the number of points being calculated, in three dimensions of space and one of time, rises into the hundreds of thousands.
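The bookkeeping behind that claim is simple enough to check. The lattice sizes below are illustrative choices, not the collaboration's actual production runs:

```python
# Counting lattice sites: three space dimensions times one time
# dimension. Halving the spacing in every direction multiplies the
# number of sites sixteenfold.
for n_space, n_time in [(8, 16), (16, 32), (24, 64)]:
    sites = n_space ** 3 * n_time
    print(f"{n_space}^3 x {n_time} lattice: {sites:,} sites")
# 8^3  x 16:   8,192 sites
# 16^3 x 32: 131,072 sites
# 24^3 x 64: 884,736 sites (into the hundreds of thousands)
```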

The process is a purely mathematical one. "We are doing numerical solutions for situations where we can't do analytic," he says. In other words, supercomputers allow him and the research group within which he operates (the MIMD Lattice Computation Collaboration, or MILC) to solve many different equations at many different points on the lattice and then to integrate the results into a general picture. That picture is a description of how reality should work, based on our best current theory. Checking whether reality actually works that way requires comparing the calculated masses of the pi meson and of states made from other arrangements of quarks with the values determined by experimentalists. According to this picture, different arrangements of quarks produce particles with different masses. If the lattice gauge calculations are correct, the particle masses they predict will equal the mass values measured in experiments. The two methods of approaching the problem are thus intimately related: the closer our mathematical approximations come to predicting real-world data, the more confidence we can have in our models.
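That final comparison step might look, in schematic form, something like the sketch below. The "lattice" numbers are invented placeholders, not real MILC results; the experimental masses (in MeV) are the well-measured values:

```python
# A hedged sketch of comparing lattice predictions to experiment.
# The "predicted" masses are hypothetical; the experimental masses
# (in MeV) are the measured values.
experiment = {"pion": 139.6, "rho": 775.0, "nucleon": 939.0}
predicted = {"pion": 140.0, "rho": 780.0, "nucleon": 950.0}  # placeholders

for particle, m_exp in experiment.items():
    m_lat = predicted[particle]
    off = 100 * abs(m_lat - m_exp) / m_exp
    print(f"{particle}: lattice {m_lat} MeV vs. experiment {m_exp} MeV "
          f"({off:.1f}% difference)")
```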

"We want to solve the equations that describe gluons and quarks," Gottlieb says, reducing the process to its simplest form. "There's nothing mysterious about the equations themselves. They've been around since 1973. It's just that no one could provide solutions to them." Until now.

When Gottlieb was starting in physics, "people did computations on whatever was available to them . . . an old VAX (computer), say, which could handle maybe a half million operations per second." These days, the fastest supercomputers max out at between one and ten trillion operations per second (though they generally are run at much lower speeds). How many operations per second can the computer cluster he built at IU run? Gottlieb thinks about it.

"Three hundred fifty million . . . times thirty-two," he smiles. "I'll let you do the math." Three hundred fifty million is also the number of dollars the government has put recently into various information technology initiatives. The latest generation of supercomputers here and around the world is being used to model phenomenon too complex for previous computing systems to have described with any useful degree of accuracy. Supercom- puter modeling is being applied toward understanding how combustion works at a minute level; how weather systems interact, a notoriously difficult pursuit known as "climate modeling"; how airplane parts will behave under the stresses and strains of actual flight before anyone leaves the ground. Systems exist that even describe how nuclear explosions would occur, without anyone having to detonate anything. Called the Strategic Simulation Initiative, the drive to purchase computers and develop software that can produce such reliable simulations was announced recently by Vice President Al Gore and could also be of great usefulness to researchers in high-energy physics.

MILC
A comparison of two sets of lattice calculations. In blue are shown calculations using the quenched approximation. This approximation was invented at Indiana University in the early 1980s by Professor Donald Weingarten. In green are results when this simplifying approximation is removed. The quantity plotted is the ratio of masses of two particles: a nucleon made of three quarks, and a vector meson made of a quark and antiquark with their spins pointing in the same direction. The horizontal axis represents the lattice spacing, and the continuum limit is the left end of the lines drawn through the points. Note the difference between the two curves.
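The figure's "continuum limit" is reached by extrapolating the mass ratio to zero lattice spacing. Here is a hedged sketch of that idea, with made-up data points standing in for real lattice results:

```python
# A sketch of continuum extrapolation: fit results computed at several
# lattice spacings, then read off the value at zero spacing. The data
# points below are invented for illustration only.
import numpy as np

spacing = np.array([0.4, 0.3, 0.2, 0.1])    # lattice spacing (made-up units)
ratio = np.array([1.45, 1.40, 1.36, 1.31])  # nucleon / vector-meson mass (fake)

slope, intercept = np.polyfit(spacing, ratio, 1)  # straight-line fit
print(f"continuum-limit estimate (spacing -> 0): {intercept:.2f}")
```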

This program follows the Department of Energy's "Computational Grand Challenge," whose aim was to push forward the current generation of supercomputing capability. From meteorology to aviation to the pure physics in which Gottlieb's group is involved, more and more research is taking place in the hypothetical space of equations and models. More of what we understand about the world outside is coming from what massively parallel systems predict on the inside.

In the basement of the physics building here at IU, Gottlieb shows me the room where many Intel Pentium II PCs are placed on a rack. He bought the individual parts, and physics students helped assemble them into PCs and then network the machines during two days of Thanksgiving break. There are thirty-two PCs that can be used in parallel. "Look at that," he smiles. "With a little work you can build your own supercomputer!"

This homemade system operates on the same principle of massively distributed parallel processing as do the largest IU computers on which he generally works. Those machines are housed at the Wrubel Computing Center, and their cost is far greater than this homespun version's, but the idea is much the same.
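The division of labor can be sketched in a few lines. This is a modern illustration using Python and the mpi4py library (not what the cluster in Swain actually ran in 1999, and the lattice size is illustrative): each node takes its own share of lattice sites, works on them independently, and the partial results are combined at the end.

```python
# Sketch of the MIMD idea: every node runs this same program, but each
# works on its own slice of the lattice.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()          # this node's number, 0 .. size-1
size = comm.Get_size()          # e.g., 32 PCs on the rack

total_sites = 24 ** 3 * 64      # 884,736 lattice sites
my_sites = total_sites // size  # 27,648 sites per node with 32 nodes

# Stand-in for the real per-site physics computation:
local_result = sum(1.0 for _ in range(my_sites))

# Combine every node's partial result on node 0.
total = comm.reduce(local_result, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{size} nodes handled {total:,.0f} sites in parallel")
```

Launched with something like mpirun -np 32, each of the thirty-two PCs would run its own copy of this program and communicate over the network only when results need combining.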

Gottlieb says, "I can run the same types of problems in the basement of Swain that I run on the other IU supercomputers. This system even runs faster per node than the Paragon (supercomputer), though it's slower than the Origin 2000. The Paragon, purchased in 1993, also has much less memory per node, so I can run bigger problems on the cluster in Swain."

By themselves the thirty-two PCs in Swain Hall form a closed net, with each machine computing between 10,000 and 40,000 lattice points. There's something appealing about the symmetry of metaphors here: one thinks of the large nets with big holes used by people fishing in the time of Thales. Those lattices could snare the biggest, slowest-moving secrets of nature and bring them up into the light, but the smaller ones slipped through with ease. Now, with the introduction of MIMD computing, the hardest-to-catch prizes of the subatomic deep are gradually being hauled to the surface.
