by William Orem
Within the next decade, the burgeoning electronic activities we now loosely call "information technology" will change just about everything. The flow of data across various computerized networks will increase by a factor of 10,000 or more. The environments in which we work and live--meeting rooms, offices, perhaps even outdoors--will become ever more populated by networked sensor instruments, devices that can connect us at any moment to that silent roar of transmitting data. People will talk to computers and the computers will talk back. People will have computers of one sort or another wired into their clothing (or, as some have speculated, into their bodies) so that the distinction between being online and offline will become pedantic. "This," Dennis Gannon says, "is just the obvious stuff."
The student members of the CAT (Component Architecture Toolkit) team have provided much of the design and almost all of the implementation of the system, which won a high-performance computing award at the Supercomputing 98 conference. From left to right, row one: Andrew Whitaker, Juan Villaccís, Madhu Govindaraju; row two: Shridhar Diwan, Prafulla Deuskar, David Stern. --credit
The negative readings of such a world, of course, come to mind with histrionic ease. No doubt George Orwell will be frequently invoked in the years to come, though his vision of a despotic state that spies on its members through helicopters and hidden cameras is quite a different scenario. For a more optimistic tomorrow, however, ask Randall Bramley, an associate professor of computer science, and Gannon, a professor of computer science and chair of the Department of Computer Science at Indiana University Bloomington. They no doubt will point most readily to the young representatives of humanity who have helped them to design our common future: their students.
Bramley and Gannon experience a refreshingly high degree of camaraderie with their protégés--so much so that Gannon suggests it would be more accurate to credit the younger workers and leave their own names off any article describing their work. "To tell the truth, most of the actual work was done by our gang of bright and highly motivated students," Bramley insists. And Gannon agrees: "They make all this happen."
Bramley, Gannon, and their students are collectively endeavoring to revolutionize the ways in which individuals will interact--in this case, across vast computer networks that will allow any particular member to tap into the abilities of the whole. The potential of this new technology is vast, its capacity for awe more than slight. But in understanding it, we are first encouraged to think about hair dryers.
"When you plug in your hair dryer in the morning, do you care where the power is coming from?" Gannon asks. Talk of nationwide, intercontinental, and even worldwide computing networks that will make today's Internet "seem pretty prehistoric" may induce vertigo, but to the generation that grows up using such technology it will be as commonplace as household electricity is to us. And, perhaps, as little questioned: the energy that seems always to be sitting there, just waiting for a hair dryer to come along, is in fact arriving from some distance away. A power grid connects multiple generators, any one of which could have been the "source" of the power you use to dry hair, heat coffee, and burn the toast. But even conceptualizing the issue this way exposes the error in our thinking. The entire grid is the "source," and asking which part, in which state, actually burned your toast is mere pedantry.
The situation is--or will be--the same, though on a vastly more powerful level, with what is being called the National Center for Supercomputing Applications (NCSA) Alliance Grid. The idea behind the grid is to build a network of computers across the entire country, and IU is already an important hub in that burgeoning data mesh. Unlike a simple electricity grid, however, each node on the information grid will provide something different, "bringing its unique capabilities to the whole," as Gannon has it.
The CAT (Component Architecture Toolkit) is being used to solve a large sparse linear system distributed across several machines on the Internet. In this mode, the CAT functions as a rapid application tool for scientists interested in analyzing and developing solution strategies to linear systems. In this example, the user has specified the input linear system via a parameter dialog, and then clicked the execute button in that dialog to propagate the system to connected components. The user can also view the effects different filter components have on the original linear system, which may help guide him or her toward more effective solution strategies. The CAT facilitates access to the underlying computational resources, thereby enabling users to focus more on higher level problem solving. --credit
For example, why assume that an application that you decide to launch on your desktop computer at home need necessarily also run on your desktop at home, rather than somewhere else? Remembering the power grid that dries hair, why not have an information grid into which your machine could tap like a plug, connecting it with a whole range of linked computers? The command sent from your keyboard (or spoken into your computer's electronic ear) would then activate a series of programs that could run on separate machines, giving an integrated result that would be far superior to the electronic abilities of today.
"Our efforts are about making the boundary between your desktop machine and the world of giant computers on big networks vanish completely," Gannon says. "If you are asking your application to do some big computation or mine some large remote data, there is no reason that the application can't spread itself out into the network and run parts of itself on some big supercomputer or remote data analysis engine. That is what we mean by eliminating the boundaries."
For that matter, why keep memory storage right there inside the unit on your desk, when data could be stored and retrieved from elsewhere (as is already the case in, say, electronic lockers)? Why turn your machine, even briefly, into one capable of doing complex math or music or graphics when you could tap into another machine, or machines, devoted exclusively to that kind of computation? Why not have a computer that could be dropped into your pocket when you feel like going out for lunch?
"Wireless computing and communication will merge within the next few years," Gannon is sure. "That much is obvious. What is not obvious is that shortly after that, your electronic desktop will be able to follow you automatically when you move from your workstation, existing just as easily in your laptop, and then soon after in your hand-held wireless system." Another way to say this is that your "desktop" will be spread out all over the information grid, making it as easy to access "your" module for word processing as "your" module for calculating the best route, via satellite, to that new restaurant you heard of when you asked "your" information-mining robot to go find some really great Italian food.
One of the exigencies of living just when we do, at a time when the technological world is moving so rapidly, is that even cautiously conservative scenarios that look no further than six to ten years into tomorrow have the flavor of an overheated imagination. So what are the concrete, specific issues Bramley and Gannon's teams work with? Gannon says there are several.
First, there is the "Component Architecture Toolkit." This system, built by the students themselves, allows parts of applications running on separate machines to be connected to each other across network lines. In effect, this process brings a metacomputer into existence, employing the powers of many different computers in remote locations. More formally, and somewhat less grandiosely, this kind of operation is called a "distributed application." Though the term is not in popular use, distributed approaches to problem solving are commonplace. Any time you have held a potluck dinner, put on a play, or sat around a campfire singing rounds, you have engaged in a distributed application of a sort. If you don't see how, imagine doing any of these things alone. A better metaphor for a computerized distributed application, though, might be the magazine you are now reading. Five people singing rounds are all engaged in staggered versions of the same action. This magazine, however, is the creation of several different operational nodes, some of which specialize in photographs, others in writing, still others in editing and layout. Another part of the metaphor is that some of those actions can happen simultaneously, speeding up production. The same thing motivates many distributed computer applications: the work is spread out, and parts can be done at the same time so that results arrive faster.
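The idea of spreading work out so that parts run simultaneously can be sketched in a few lines of modern Python. This is only an illustration of the general principle, not the CAT itself: local worker processes stand in for the remote machines a real distributed application would use, and the function names are invented for the example.

```python
# A minimal sketch of a "distributed application": the work is partitioned
# and the parts run simultaneously on separate workers. Local processes
# stand in here for machines connected across a network.
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker computes its share of the overall result.
    return sum(x * x for x in chunk)

def distributed_sum_of_squares(data, workers=4):
    # Split the data into roughly equal chunks, one per worker.
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Farm the chunks out, then combine the partial results.
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(distributed_sum_of_squares(list(range(1000))))
```

As in the magazine metaphor, the combined result is the same as if one worker had done everything alone; the point of the arrangement is that the parts proceed at the same time.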
Second, distributed computations must be programmed as well; this is not at all the same thing as simply building the system. For example, you can put two violinists, a violist, and a cellist in a room together--your architecture problem is now solved--but this in itself will not be enough to produce Mozart. Common scores, a common ability to read the score, and a host of smaller communication issues must be established between the players. So it is with distributed computers. There is relatively little difficulty in establishing coherent communication between word processing programs (for example), even if they initially submit their output in different formats. Anyone who has converted a document from WordPerfect to Microsoft Word is aware that, if we are willing to endure some persistent translation mistakes, the changeover can be made. Having so few companies control so much of the programming for these systems contributes to their general uniformity.
Solving large sparse linear systems of equations is the major bottleneck for many scientific computations. This image shows the coefficient matrix for one system with 22,926 unknowns. Entries are colored according to their magnitude, with black indicating zero. The second image is a zoom shot, showing part of the matrix where individual entries can be discerned. The CAT/LSA is a high-level problem-solving environment for manipulating such systems dynamically and returning solutions to an application program running anywhere in the world. --credit
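The caption's point, that sparse systems have mostly zero entries and so reward specialized solvers, can be made concrete with a small example. This uses today's SciPy library as a stand-in for the kind of expert-built solver software the article describes; it is not the CAT/LSA, and the matrix here is a toy tridiagonal system rather than the 22,926-unknown system shown in the image.

```python
# Solving a sparse linear system Ax = b. Only about 3n of the n*n
# entries are nonzero, so sparse storage and a sparse solver pay off.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

n = 1000
# A tridiagonal coefficient matrix in compressed sparse row format.
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x = spsolve(A, b)

# Verify the solution by checking the residual: A @ x should equal b.
residual = np.linalg.norm(A @ x - b)
print(residual)
```

Storing all one million entries of this matrix densely would waste memory on zeros; the sparse format keeps only the roughly 3,000 nonzeros, which is exactly the economy that makes systems with tens of thousands of unknowns tractable.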
Not so with scientific programming, a third major issue Bramley and Gannon work with. In that arena, it is necessary not just to convert one kind of word-filled document into another without mistakes, but to create a system that can handle intensely detailed graphics, 3-D modeling, and various kinds of high-level mathematics. Another specialty, such as climate modeling, will employ still different techniques. All of which leads to the issue of scientific computation and visualization. "Bramley is the leader here," Gannon says.
"Throughout the history of computing for science and engineering," Bramley believes, "three trends have been invariant. Software constructs have become larger, the amount of software sharing has grown, and the number of tasks for which we use computers has grown." In the first trend, users have gone from writing programs that solve particular problems to "software libraries" that can be accessed to solve a range of problems. "Originally, computational scientists simply wrote all the software they needed for their work." When they came to rely on libraries, though, that "allowed the physicists to do what they do best--physics research--instead of spending their time trying to master the intricacies of floating-point arithmetic on computers."
The trend has continued in this direction. "It's unlikely that any chemist or physicist," Bramley says, "could develop competitive software for solving large systems of equations, when compared to the ones readily available over the Internet. Our work carries this trend even further, so that a scientist does not even need to download the software. Instead, his or her program can dynamically find and use software systems developed by experts anywhere on the Internet."
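Bramley's idea of a program that "dynamically finds and uses software systems developed by experts anywhere on the Internet" is, in modern terms, a remote procedure call. The sketch below uses Python's standard xmlrpc module as a stand-in for the CAT's remote components; the "solve" service, its one-equation arithmetic, and the localhost address are all invented for illustration and have nothing to do with the project's real interfaces.

```python
# A sketch of using expert software without downloading it: a client
# program calls a "solver" that lives on another machine. Here both
# sides run locally, with xmlrpc standing in for the real network plumbing.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def solve(coefficient, rhs):
    # A toy "expert" service: solve the one-unknown system a*x = b.
    return rhs / coefficient

# The "remote" machine publishes its solver. Port 0 lets the OS pick
# a free port, so the example runs anywhere.
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
server.register_function(solve, "solve")
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The scientist's program "finds" the solver by its address and uses
# it exactly as if it were a local library routine.
remote = ServerProxy(f"http://localhost:{port}")
print(remote.solve(4.0, 10.0))  # 2.5
```

The client never holds a copy of the solver's code; it only knows where the service lives and what question to ask, which is the boundary-dissolving arrangement Gannon describes.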
Beyond all this is the arena of other "computer-driven" activities, such as "the remote control of instruments, like telescopes at remote observatories; the real-time acquisition of data from instruments; mining large-scale databases of results from experiments; and the visualization and presentation of those results."
Are Gannon and Bramley themselves wowed by the speed with which technology is changing our world? "The pace of change is amazing," Gannon admits. "A good computer science department has to completely revise its entire undergraduate and graduate curriculum every two or three years."