
Computer Science Engineers Improve Video Game Testing By Analyzing The User

Date:
June 8, 2006
Source:
University of Southern California
Summary:
USC engineers are perfecting a games user testing tool that captures and analyzes play experience to automatically detect weakness and flaws -- and it may soon gauge player emotional involvement.

User testing is a crucial element of creating a new game; books have been written about it. But it remains a highly subjective and largely unstructured exercise. "Traditionally," says Tim Marsh, a post-doctoral researcher at the USC Viterbi School of Engineering's Integrated Media Systems Center, "game companies hire teenagers, and turn them loose trying to find flaws and gaps in the game," which the testers then report either verbally or in writing, along with their impressions.

This is neither systematic nor scientific, says Marsh, who will present what he believes is a better way in a conference presentation entitled "Continuous and Unobtrusive Capture of User-Player Behavior and Experience to Assess and Inform Game Design and Development," to be given at the Fun 'n Games 2006 conference in England on June 26, 2006.

Marsh's method analyzes "immersidata." USC Viterbi School computer scientist Cyrus Shahabi, one of the researchers on the project, coined the term several years ago to refer to the machine-readable record of commands sent to the computer by keyboards, joysticks and other controls, collected in parallel with a videotape recording of the player during the game session.
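The article does not specify how immersidata is actually stored; purely as an illustration, here is a minimal Python sketch of what a timestamped record of controller commands might look like. All field and object names are hypothetical, not ISIS's real format.

```python
from dataclasses import dataclass

@dataclass
class ImmersidataEvent:
    """One machine-readable input event, captured in parallel with the session video."""
    timestamp: float   # seconds since session start, shared with the video timeline
    device: str        # e.g. "keyboard", "joystick", "mouse"
    command: str       # the command that device sent to the game
    position: tuple    # hypothetical (x, y, z) player location in the game world

# A hypothetical fragment of one captured session; the shared timestamps are
# what later allow events in the data stream to be indexed into the videotape.
session_log = [
    ImmersidataEvent(12.4, "mouse", "select_object:heart_model", (3.0, 0.0, 7.5)),
    ImmersidataEvent(14.1, "keyboard", "move_forward", (3.0, 0.0, 8.1)),
    ImmersidataEvent(31.9, "joystick", "rotate_view", (3.0, 0.0, 8.1)),
]
```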

An IMSC-developed tool called "ISIS" (Immersidata AnalySIS) can "identify data of interest and index events within the videotape." For the game development application, ISIS can return indexed examples of six kinds of occurrences, or "points," in the immersidata/video record (two of these are sketched in code after the list):

* Activity completion points, when the player has finished a final task associated with a mission.
* Task completion points, a subset of these, allowing a researcher to go back over the performance of a single task.
* Break points, times when nothing seems to be happening: the player isn't moving and no events occur. This can be a distraction or a break, but "break is a very important concept … because it provides clues to what interrupts players."
* Wandering points, somewhat similar times when the user-player is moving but doesn't select any objects.
* Critical events. Certain elements of the game are the hardest, and these can be pre-selected so that the action leading up to accomplishment or non-accomplishment can be studied.
* Navigation errors. Collisions with a wall or object potentially point to poor design causing user disorientation.
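
How ISIS actually computes these points is not described in the article; as a rough illustration only, break points and wandering points could be derived from the timestamped event stream sketched earlier. The threshold values and helper names below are invented for the example.

```python
def find_break_points(events, gap_threshold=10.0):
    """Candidate break points: spans where no input arrives for longer than
    gap_threshold seconds (the player isn't moving and no events occur)."""
    return [(prev.timestamp, curr.timestamp)
            for prev, curr in zip(events, events[1:])
            if curr.timestamp - prev.timestamp > gap_threshold]

def find_wandering_points(events, window=30.0):
    """Candidate wandering points: stretches where the player keeps moving
    but never selects an object."""
    wandering, start = [], None
    for ev in events:
        if ev.command.startswith("select_object"):
            start = None                  # a selection ends any wandering
        elif ev.command.startswith(("move", "rotate")) and start is None:
            start = ev.timestamp          # movement with no selection begins
        elif start is not None and ev.timestamp - start > window:
            wandering.append((start, ev.timestamp))
            start = None
    return wandering
```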

By backtracking from these points, investigators can see how each point developed. Similar patterns preceding parallel points can be clear indications of a problem in the game.
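Under the same assumptions as the sketches above, backtracking from a detected point could be as simple as slicing out the immersidata that precedes it; because the events share a timeline with the video, the same window also locates the matching footage.

```python
def backtrack(events, point_time, lookback=60.0):
    """Return the events in the window leading up to a detected point.
    The window [point_time - lookback, point_time] also identifies the
    corresponding segment of the session videotape."""
    return [ev for ev in events
            if point_time - lookback <= ev.timestamp <= point_time]

# Example: review the minute of play leading up to a break point at t = 31.9 s.
context = backtrack(session_log, point_time=31.9)
```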

For their tests, Marsh and Shahabi used a "serious" (i.e., teaching) game designed to instruct students in human anatomy and physiology. The study analyzed sessions by 16 undergraduate students, with the sessions of 13 of them studied intensively.

Though Marsh and the group tested the technique on a serious game, "the techniques are for use testing all game genres, entertainment and non-entertainment," Marsh said.

The system already works effectively at finding problems in the areas it is set to look for, Marsh reports. Improvements already in the works will add functionality to find and identify other potential problem areas: recognizing repetition patterns by players, for example, and replacing and/or supplementing the video capture with a replay of the game from the player's point of view.

Marsh is also working on ways to use immersidata to capture more aspects of the game experience, particularly the emotional/empathetic elements. Marsh recently wrote a chapter on this subject, entitled "Vicarious Experience: Staying There Connected With and Through Our Own and Other Characters," in a new book, Gaming as Culture (McFarland Press, 2006).

In addition to Marsh and Shahabi, USC computer science doctoral candidate Kiyoung Yang played a key role on the project, Marsh said. Shamus Smith of the University of Durham also participated.

The research was funded by the National Science Foundation and by Professor Shahabi's Presidential Early Career Award for Scientists and Engineers (PECASE) grant.


Story Source:

Materials provided by University of Southern California. Note: Content may be edited for style and length.


