Computer-based user interface evaluation by analysis of repeating usage patterns in transcripts of user sessions
Abstract
It is generally acknowledged that producing quality user interfaces requires a thorough understanding of the user, and that this involves evaluating the system by observing users as they work with it or by performing human-factors experiments. Such methods traditionally involve videotape, protocol analysis, critical incident analysis, and the like. They require time-consuming analyses and may be invasive. In addition, the data obtained through such methods represent a relatively small portion of the use of a system. An alternative approach is to record all user input and system output to a file, i.e., to log the user session. Such transcripts can be collected automatically and over a long period of time. Unfortunately, this produces voluminous amounts of data. There is therefore a need for tools and techniques that allow an evaluator to extract potential performance and usability problems from such data. It is hypothesized that repetition of user actions is an important indicator of potential user interface problems.
This research reports on the use of the repetition indicator as a means of studying user-session transcripts in the evaluation of user interfaces. The dissertation discusses the algorithms involved, the interactive tool constructed, the results of an extensive application of the technique in the evaluation of a large image-processing system, and extensions and refinements to the technique. The evidence suggests that the hypothesis is justified and that the technique is genuinely useful.
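The core idea of the repetition indicator can be illustrated with a minimal sketch. The dissertation does not specify its algorithms here; the function below is a hypothetical illustration that counts repeated contiguous subsequences (n-grams) of user actions in a logged transcript, on the assumption that a session is represented as a simple sequence of action tokens.

```python
from collections import Counter

def repeated_patterns(events, min_len=2, max_len=5, min_count=2):
    """Count contiguous action subsequences (n-grams) that repeat.

    events: a user-session transcript as a list of action tokens.
    Returns a dict mapping each repeated pattern (tuple of actions)
    to its occurrence count, for pattern lengths min_len..max_len.
    This is an illustrative sketch, not the dissertation's algorithm.
    """
    counts = Counter()
    for n in range(min_len, max_len + 1):
        for i in range(len(events) - n + 1):
            counts[tuple(events[i:i + n])] += 1
    # Keep only patterns that actually repeat.
    return {pat: c for pat, c in counts.items() if c >= min_count}

# Toy transcript: the user repeatedly opens, zooms, then closes an image.
log = ["open", "zoom", "close", "open", "zoom", "close", "save"]
patterns = repeated_patterns(log)
```

On this toy log, the triple `("open", "zoom", "close")` is reported twice; an evaluator might read such a cycle as a hint that the interface forces users to re-open images to inspect them, the kind of candidate usability problem the repetition indicator is meant to surface.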