
Final Report:
Design-Research for the
Indiana University Bloomington
World Wide Web

Theodore Frick
Michael Corry
Lisa Hansen
Barbara Maynes

Department of
Instructional Systems Technology

School of Education
Indiana University Bloomington
August 22, 1995

Table of Contents

  The IUB Home Page Evaluation and Remodeling Team
  The IST Design-Research Team
  Tasks Completed by the IST Design-Research Team
  Synopsis of the Design-Research for the IUB Top-Level Information Structure
    Needs Analysis: Initial Information Gathering
    Needs Analysis: Interview Process
    Needs Analysis: Organizing the Data
    Usability Testing
    Further Usability Testing of the Electronic Design
    Descriptions of the Versions Tested
    Results of Computer Usability Tests
    Further Needs Identified from the Usability Testing
  Appendices


The World Wide Web is currently undergoing an incredible growth period. Its capability for providing information on-line in a timely and visually attractive manner has made it an ideal vehicle for electronic information environments. University administrators and academic departments are beginning to realize that the Web is a further way of recruiting students to attend their institutions. Indiana University is no exception.

When people visit Indiana University on the World Wide Web, they normally first encounter the IU System Home Page for all eight campuses. When they next select IU Bloomington, they see the IUB Home Page, which is the electronic gateway into a growing number of Web sites at the Bloomington campus. Does the IUB Home Page provide an image of Indiana University that will help attract students to come here? Does it invite them to explore further what our campus has to offer? Does it provide information that addresses questions on their minds, and does it also meet the needs of students, faculty and staff who are already on campus?

The IUB Home Page Evaluation and Remodeling Team

To begin to address these questions, the IUB Home Page Evaluation and Remodeling Team was formed in early spring, 1995, and began its task of deciding if and how to revise the top-level organization of the IUB Web site. This interdisciplinary committee consisted of faculty and staff from the Bloomington campus, and was chaired initially by Caroline Beebe and subsequently by Toby Sitko, both from University Computing Services.

After several preliminary meetings and discussion, it became apparent that we needed to find out the sorts of things people would expect to find at the IUB Web site. We considered our target audience to be prospective students and their parents, as well as students currently enrolled, faculty and staff who are employed at IUB, and alumni. It became clear that a needs analysis should be conducted.

The IST Design-Research Team

In late April, Ted Frick, a member of this Remodeling Team and a faculty member in the Department of Instructional Systems Technology, volunteered to assemble and direct a team of IST graduate students for the purpose of conducting design research. The results of this research were expected to provide empirical evidence to support design decisions for the IUB Home Page. Thus, these decisions would not be a matter of opinion, preference or authoritarian power, but would be supported by data collected from observations of typical users as they interacted with the design.

Instructional systems technologists routinely perform needs analyses, create design prototypes, and do formative evaluation during the instructional development process. Frick saw this as an opportunity for IST graduate students to gain further practical experience, and as an opportunity to build on the success of his and his students' previous design of the IST Web site in the School of Education. This site has gained national recognition among peer programs in instructional technology, and has already proved to be a further recruiting tool for the top-ranked IST graduate program.

The approach used by the IST Design-Research Team has come to be known in recent years as user-centered design. The methodology consists largely of conducting usability tests of design prototypes with members of the target audience, identifying problems, and remedying them in an iterative manner. One of the best-known examples of this approach occurred over 20 years ago at Xerox PARC. Although Xerox chose not to market its Alto computers, the interface design was later adopted by Apple Computer. The Macintosh was born, and the rest, as they say, is history.

This report discusses the efforts and results of this IST Design-Research Team for the IUB Home Page. The Design-Research Team was coordinated by doctoral students Michael Corry and Lisa Hansen. Other team members included Robert Carteaux, Charles Kalnbach, Barbara Maynes, Cynthia Schultz, Nancy Schwartz, Melanie Stallings, Wendy Tamborrino, Godfrey Whyte, and David Winer.

Tasks Completed by the IST Design-Research Team

Between May 5 and August 21, 1995, the Design-Research Team:
  1. Conducted a needs analysis to determine what questions are asked most often of information providers at IUB;
  2. Created a new information structure, based on common themes that emerged when sorting and grouping these questions;
  3. Conducted initial usability tests on this new information structure (on paper), as well as on the current IUB home page (printed from the Web);
  4. Determined that the current home page was less effective than the new one when users tried to find answers to those frequently asked questions (finding answers also took 2 to 3 times longer with the current page);
  5. Modified the new information structure based on early usability testing, and then conducted more usability tests, which led to further refinements;
  6. Converted the paper version of the new information structure to a Web prototype (on computer), making adjustments to accommodate differences in "page" size on computer screens vs. paper;
  7. Further revised the Web prototype based on usability tests conducted on computers using Netscape, Mosaic and Lynx browsers on Macintosh and DOS/Windows platforms; and
  8. Made recommendations to the IUB Home Page Evaluation and Remodeling Team.

Synopsis of the Design-Research
for the IUB Top-Level Information Structure

Needs Analysis: Initial Information Gathering

Two concurrent information-gathering processes were begun in early May. One team of graduate students examined e-mail sent to the IUB webmasters and Web usage records from UCS. Suggestions, questions or complaints that Web surfers posed to the UCS webmasters revealed some user needs. Also, knowing which documents were being requested most often on the current IUB Web indicated further interests and needs of users.
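
To give a concrete sense of this kind of tally, the short sketch below counts which documents appear most often in a Web server access log. It is only an illustration: it assumes logs in the NCSA Common Log Format, and the file name and the top-20 cutoff are ours, not the actual UCS procedure.

# Rough sketch of tallying which Web documents are requested most often.
# Assumes NCSA Common Log Format; the file name and top-20 cutoff are
# illustrative, not the actual UCS setup.
from collections import Counter

def most_requested(log_path, top_n=20):
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            parts = line.split('"')
            if len(parts) < 2:
                continue                   # skip malformed lines
            request = parts[1].split()     # e.g. ['GET', '/index.html', 'HTTP/1.0']
            if len(request) >= 2 and request[0] == "GET":
                counts[request[1]] += 1    # tally the requested document
    return counts.most_common(top_n)

for doc, hits in most_requested("access_log"):
    print(f"{hits:7d}  {doc}")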

The other and more long-term team began what would prove to be the most exhaustive portion of the project -- a series of campus interviews to find out what information is being requested. This list of frequently asked questions would provide the basis for deciding how information would be organized.

Needs Analysis: Interview Process

We began by identifying offices and departments that were considered to have a high volume of phone calls, in-person visits, and e-mail from people who were requesting information. Thirty-three major offices were identified as having key roles in providing information and services to the campus community. These offices included the College of Arts and Sciences, all professional schools (e.g., Business, Education, Optometry, SPEA), Research and the University Graduate School, Admissions, Halls of Residence, Registrar, International Student Services, University Division, University Computing Services, Main Library, Athletics, External Relations, Communications Services, Financial Aid, Parking Operations, Campus Life, IU Foundation, Alumni Association, Division of Continuing Studies, Human Resources Management, IU Publications, Auditorium, Musical Arts Center, and the IMU Bookstore.

These offices were contacted three times to arrange for interviews of information providers -- first, to find out who the key information providers were; second, to send a letter explaining the project and providing a copy of the interview questionnaire; and third, by phone to arrange for on-site interviews of the identified contacts.

After these arrangements were made, the Design-Research Team members were assigned individual interviews, which took place over a two-week period in late June. Two offices simply returned the completed questionnaire because their schedules would not permit an interview. The Bursar's Office was reluctant to participate and was not interviewed.

Overall response was very positive, as we stressed that the reason for conducting these interviews was to help reduce the number of calls these office personnel must field on a regular basis. Interviewers asked office personnel to report their most frequently asked questions, how often, when, and by whom they were asked, and how the questions were answered.

See Appendix A for the cover letter and interview form. Appendix B contains the lists of frequently asked questions, which are divided into groups:

It is interesting to note that the large majority of the questions that are asked more than 1,000 times per week occur at the Office of Admissions and the Halls of Residence. Also, students and their parents often want specific information about degree programs that IU offers.

Needs Analysis: Organizing the Data

After all of the data were collected, we conducted a card sort of the frequently asked questions. We put each question and its associated data on an index card and sorted the cards by topics that emerged. Over thirty discrete topic areas were created from the 330+ questions that were generated from the interviews. As the sorting progressed further, six broader categories emerged as larger groups, under which our numerous stacks could be placed. These groups are currently named:

From these data, we designed the first draft of the new home page, using the six topic areas as the primary organization for the information structure. We decided that the information needed to be as compact as possible, and that the number of options at the top level should be kept to seven or fewer, due to human short-term memory limitations. We also took advantage of the hypertext environment to allow several paths to some information (e.g., the library can be reached via academic departments and research as well as through services). With the home page and the second-level (subsidiary) pages in place, usability testing of the new information design began in mid-July.
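
One way to picture the resulting structure is as a small hypertext graph in which a page can be reached from more than one top-level category. The sketch below is illustrative only; apart from "Attending IUB" and the library example mentioned above, the category and page names are placeholders, since the full list of six groups is not reproduced in this section.

# Minimal sketch of a hypertext structure in which one page (the Main Library)
# is reachable by more than one path. Category names other than "Attending IUB"
# are hypothetical placeholders for illustration.
structure = {
    "Attending IUB": ["Admissions", "Halls of Residence"],
    "Academics & Research": ["Academic Departments", "Main Library"],
    "Services": ["University Computing Services", "Main Library"],  # second path to the library
}

def paths_to(page, tree):
    """Return every top-level category from which the given page can be reached."""
    return [top for top, children in tree.items() if page in children]

assert len(structure) <= 7   # keep top-level choices to seven or fewer
print(paths_to("Main Library", structure))   # ['Academics & Research', 'Services']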

It is also important to note that the ordering of categories on subsidiary pages was based on frequency of occurrence of questions asked. Thus, those areas where questions were asked most often were at the top of the list (e.g., admissions and housing under "Attending IUB"). See Appendix D for details.
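
The ordering rule itself is easy to state precisely. In the sketch below, the weekly question counts are invented for illustration; only the principle, that the categories asked about most often appear first on a subsidiary page, comes from the study.

# Sketch of the ordering rule for subsidiary pages: list categories in
# descending order of how often their questions were asked.
weekly_questions = {        # invented counts for illustration only
    "Admissions": 1200,     # e.g. "How do I apply to IUB?"
    "Housing": 1000,        # e.g. "How do I get a room in the residence halls?"
    "Financial Aid": 400,
    "Orientation": 150,
}

ordered = sorted(weekly_questions, key=weekly_questions.get, reverse=True)
print(ordered)   # most frequently asked-about categories appear first on the page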

Usability Testing

The Design-Research Team members were given three printed documents: a numbered and ranked list of 339 questions from the interviews, a copy of the existing IUB Web home page and its subsidiary pages, and a copy of the first draft of the new design. Team members evaluated both the old and new versions of the home page through usability testing; after all, if the existing page performed as well as the new material, then there would be no reason to make changes. Each team member was assigned questions so that nearly all of the questions would be tested.

Each usability test took about 30 minutes, at a time and place convenient for the user. The user was instructed to try to answer each question by looking one page at a time through our printed information structure (see Appendix D). Users were asked to "think aloud" in order to tell us what they were looking at, what confused them, what they did not understand, and so on. The purpose of this kind of testing is to find problems with the information structure in order to improve it; there were no "right" or "wrong" ways to use the information structure.

Testers were asked to note the path the participant took in finding the information and any comments made (see the recording form in Appendix C). Testers also recorded how long it took the user to find an answer, or to quit trying. We also wanted to ensure that the population of question-askers was well represented, so we made every effort to include a balance of prospective students, parents, faculty and staff, as well as current undergraduate and graduate students.
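
For readers who want a concrete picture of what each tester recorded, the sketch below models a single observation in roughly the way the paper form in Appendix C did: the question posed, the path of pages visited, the time to an answer (or to giving up), and the think-aloud comments. The field names and sample values are our own shorthand for illustration, not the wording on the form.

# Sketch of one usability-test observation, loosely mirroring the paper
# recording form. Field names and sample values are illustrative shorthand.
from dataclasses import dataclass, field

@dataclass
class Observation:
    question_id: int                     # which of the frequently asked questions was posed
    user_group: str                      # e.g. "prospective student", "parent", "staff"
    path: list = field(default_factory=list)       # pages visited, in order
    seconds_elapsed: float = 0.0         # time to an answer, or time spent before quitting
    found_answer: bool = False
    comments: list = field(default_factory=list)   # "think aloud" remarks

obs = Observation(question_id=42, user_group="parent")
obs.path += ["IUB Home Page", "Attending IUB", "Admissions"]
obs.seconds_elapsed = 75.0
obs.found_answer = True
obs.comments.append("Expected housing to be listed under Admissions at first.")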

After the first round of paper tests, several findings emerged:

A second round of paper testing on the revised draft of the new design was needed. During the first round, we had concluded that the existing "old" IUB Web pages were less effective and less efficient for information searches than the new design. Thus, there was clear evidence to support revising the IUB Home Page design, and no need to test the old one further. This additional round of paper testing was needed to evaluate the changes made to the initial version of the new information design.

Nearly 40 people participated in these two rounds of paper usability tests that occurred during late July and early August.

At about the same time, we created and tested prototypes of the corresponding on-line Web documents. These allowed us to identify usability problems arising from the computer implementation that could not be detected in the paper usability tests of the information structure. These on-line tests provided even more important data:

Although this testing was done with a very small sample, the problems inherent in the design were so evident that we revised the computer design immediately. Considerations in our redesign included:

The resulting redesigns were labelled Versions X and Y. Later, as a result of further on-line testing, we added Version Z.

Further Usability Testing of the Electronic Design

The last round of usability testing reported here was done with IUB students, their parents, and IUB staff during early August. We tested three different versions of the page on three browsers: Netscape, Mosaic and Lynx. We tested 11 people over about 10 days, with about the same number of students, parents and staff (3 or 4 each). The usability tests each took between 60 and 90 minutes to complete. The questions used are typical of those from the 330+ derived from the needs analysis. The fifteen questions and versions tested can be viewed at URL:

The usability test questions and hardcopy approximations of how the pages appeared in Version Z with Netscape can be found in Appendix E. The IUB Home Page, all second-level pages, and a sample of third-level pages are provided. The entire information structure and its links out to other Web sites at IUB can be viewed electronically at the above URL.

During each usability test, the user was asked to find the answer to a question, and the path that was chosen was observed. If the user backtracked and tried another path, this was also observed and recorded. We observed how long it took for each page to load on the screen, how long it took the user to read the screen while searching for an answer, and any problems the user had in determining the path to follow. We asked users to "think aloud" as we did in the paper tests; the only important difference was that a computer was used instead of paper documents.
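
Once the same observations are collected on computer, it is straightforward to compare versions and browsers on a summary measure such as median search time. The sketch below assumes records like the observation sketched earlier, extended with version and browser labels; the numbers are invented and do not report actual results.

# Sketch of summarizing on-line test results by version and browser.
# Records and times are invented for illustration; only the comparison
# itself reflects the study design.
from collections import defaultdict
from statistics import median

records = [
    {"version": "X", "browser": "Lynx",     "seconds": 48},
    {"version": "Y", "browser": "Lynx",     "seconds": 95},
    {"version": "Z", "browser": "Netscape", "seconds": 52},
    {"version": "Z", "browser": "Lynx",     "seconds": 60},
]

times = defaultdict(list)
for r in records:
    times[(r["version"], r["browser"])].append(r["seconds"])

for (version, browser), secs in sorted(times.items()):
    print(f"Version {version} on {browser}: median {median(secs)} s over {len(secs)} trials")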

Descriptions of the Versions Tested

Versions X and Y were designed and tested early on. Later, as a result of the preliminary on-line testing, Version Z was created and also tested based on what was observed with X and Y.

Version X has home page menu items by name only (no exemplars). Version Y has the same home page menu items by name, plus a few exemplars, all of which are links (hot). There are a few minor word variations in the menu items of Versions X and Y. Version Z is similar to Y with exemplars listed, but they are not hot. Wording in Version Z is consistent with X. Versions Y and Z have "double spacing" between the choices. Version X is largely "single-spaced". Versions X, Y and Z are the same at the 2nd and 3rd levels. They differ only at the top level -- the IUB Home Page.

Results of Computer Usability Tests

By the end of usability testing, users were able to "find" answers to most of the 15 questions in test.html -- at least they went down the paths that we would expect them to, and if not, usually did find the "right" path on the second or third try.

The information structure seems to work a little better with those users who know something about IUB -- e.g., that you need to register for classes before you can take them, that you need a permit to register, etc. IU is in many ways like a foreign culture, even to those parents and students from the U.S. We should keep that in mind in promoting IU to the outside with our Web site.

Interestingly, there are no clear preferences thus far for X, Y or Z.

Some users like X because it is clean and simple. The screen is not crowded and you can see all the choices at a glance. It requires minimal arrow-key pressing in Lynx, since the first choice on the menu is one arrow-key press away. Some users do not like X because it does not have the exemplars, so "you don't know what you're choosing until you get there" (meaning the second-level menu).

Some users like Y because they can see the exemplars, and can click on them at the first level to go to one of the choices (e.g., advising). Some users do not like Y, particularly in Lynx, because it's too slow (too many arrow-key presses to make, and they often down-arrow past the choice they want because of lag time, and then need to up-arrow back to what they want). Some users do not like Y because it is too busy or crowded, does not provide all the exemplars, and appears rather "ugly" in Lynx, though it looks fine in Netscape or Mosaic to these same people.

Some users like Z because you can see the exemplars, although some would prefer them to be hot in Netscape or Mosaic (as they are in Y). On the other hand, Lynx users think Z is faster because fewer arrow-key strokes are needed to make a choice.


We recommend that we go with Version Z, substituting the final version of graphics being developed by another group.

Further Needs Identified from the Usability Testing

In order to carry out the usability tests on the Web, we had to create some "pages under construction" because such pages did not yet exist (e.g., for parking, transportation, complaints, facilities reservations, graduation and the job market, student personal records, etc.). As can be seen by comparing Appendix D with Appendix E, some categories, such as complaints and facilities, were omitted. Furthermore, there was a paucity of general information about IUB, particularly graphics of campus and building maps that would help answer the frequent "where is ___?" questions.

Most significant of all, there are relatively few academic program descriptions on-line at this time, yet questions about the degree programs offered at IUB were reported to occur more than 1,000 times per week. While much of this information appears in the printed academic bulletins published by the College of Arts and Sciences and the professional schools and divisions, it is currently not accessible in electronic form on the World Wide Web at Indiana University Bloomington. Nor could we find any Web site at IU that includes current tuition rates, fees, etc. (The Registrar's Web site would seem to be a good place for this information, under "Attending IUB -- the basics.")

It is not surprising that prospective students and their parents often want to know what you can learn at IUB as an undergraduate or graduate student, and how much it costs. IUB should address these basic user needs on the World Wide Web.


Appendices

A. Interview cover letter and questionnaire

B. List of frequently asked questions at IUB (over 330)

C. Usability test form (for recording observations)

D. First generation of usability testing pages (printed versions)

E. Computer usability test Web pages

The full research article is now available:

Corry, M., Frick, T., & Hansen, L. (1997). User-centered design and usability testing of a Web site: An illustrative case study. Educational Technology Research and Development, 45(4), 65-76.