Useless Comparisons

Which is more effective: training games or instructional videos?

This is a frequently asked question. It is also a frustrating one.

This question is very much like asking, Which is more popular: apples or oranges? The answer obviously depends on a number of factors.

  • What specific type of fruit are we talking about? I like Golden Delicious apples but hate the Granny Smith variety. I like navel oranges but not the thick-skinned varieties.
  • What purpose are we talking about? Oranges are popular for making breakfast juice but not for making pies.
  • Who are we talking about? Some people prefer oranges and some prefer apples. Some people actually don’t like one or the other of these fruits. And some people are allergic to one or the other.

The comparison between training games and videos (or lectures, or textbooks, or e-learning, or webinars, or any other training technique) is as meaningless as comparing apples and oranges.

What Type of Game?

When we say game, what exactly do we mean? I am familiar with 20 interactive experiential tools that fall within the category of game. Obviously, there is an enormous difference between a simulation game that authentically reflects workplace processes and an icebreaker that requires participants to match the lines of a limerick. Similarly, what do we mean by the term video? Are we referring to a talking-head video of a lecture by an expert, an award-winning documentary of creative behaviors, or a segment of a feature movie used to illustrate some instructional content? How about a video that presents critical customer-relations vignettes, pauses after each vignette, requires teams to analyze the situation and come up with recommendations, and awards points based on how closely those recommendations match the ones from a panel of experts? Do we classify this hybrid technique as a training game or a video?

Two decades ago, Richard Clark at the University of Southern California conducted a meta-analysis of controlled research studies that compared different training media, such as educational film and classroom instruction. Not surprisingly, Clark concluded that media don't make any difference. What makes the difference are the design elements. For example, we can use a discovery-learning design approach in an educational film or in a classroom lesson. The critical factor is not the medium but some specific feature of the medium. If we extrapolate Clark's findings to our initial question, we will conclude that it is not games but specific features of games (such as active participation, scoring of points, interaction among team members, and competition among teams) that make a difference. It is not videos in general, but their critical features (such as realism, motion, and audiovisual capabilities) that make a difference.

What Is the Purpose?

Whether a game is more effective than a video also depends on the purpose for which it is being used. A simulation game is effective for helping participants acquire certain analytical skills. However, it would be ineffective for helping participants get acquainted with one another at the beginning of a training session. An icebreaker built around the lines of a limerick effectively serves the purpose of getting acquainted, but it will be perceived as silly when used in the middle of a workshop on cost-benefit analysis. Whether we compare a training game with a video, one training game with another, or one video with another, it is important to specify the purpose for which the training technique is being used.

Who Are the Participants?

Different people react differently to the same training game. For example, a game that highly motivates a typical US group may be perceived as irrelevant fluff by a typical Canadian group and as downright threatening by a typical Japanese group. Similarly, a video that excites a group of millennials can confuse a group of baby boomers. The opposite could also be true: a traditional video that appeals to senior citizens may bore members of the twitch-speed generation.

Toward the Answer

Comparing training games with other training techniques is a meaningless exercise unless we specify these elements exactly:

  • The specific features of the game and the other training technique
  • The purpose for which the game and the other training technique are used
  • The participants who use the game and the other training technique

Here’s how our initial question could be rephrased:

Which of these two techniques is more effective in helping a group of experienced hotel employees acquire customer-service skills: an authentic simulation game that incorporates critical incidents and includes a lengthy debriefing by an expert facilitator, or a documentary-format video with workplace vignettes, graphics, and captions, followed by a group discussion?

The answer to this version of the question is obvious: Either technique could be effective.

Useful Questions

Here are three questions that avoid useless comparisons and provide useful answers:

  1. How can we select the most appropriate training technique, one that effectively matches a specific instructional purpose and the preferences of a specific group of participants?
  2. How can we modify a training game (or any other training technique) to better suit the instructional purpose and participant preferences?
  3. How can we combine different training techniques to increase their joint impact?

These questions are definitely worth looking into.