Rule 2. Traipse Through the Instructional-Design Steps

I use (and train others to use) a dozen powerful principles for faster-cheaper-better instructional design.
Here’s one of these principles:

Keep your training session as close to on-the-job training as possible.

A long time ago, I was indoctrinated into the instructional-design model with the steps of analysis, design, development, implementation, and evaluation. Since it was my first course in graduate school, and since I was unnerved by the professor, I followed all the steps faithfully, always in the correct sequence. For about a year, I felt guilty whenever I scrimped on an ID step. I fantasized that a mythical doctoral committee and a pool of blind jurors were carefully scrutinizing all my design activities. I even wrote a confessional, Help, I Am Trapped Inside an ID Model.

Nowadays, I use alternative models for designing training materials and methods. Even when the context and the client steer me toward the traditional ADDIE, I cheat by meandering through the steps with devious contortions. I omit, simplify, combine, and rearrange the instructional-design steps.

All instructional-design steps share the same intent: to produce training materials rapidly and inexpensively and to improve learning and application. This is the justification I use to rationalize messing with the steps.

Design and Development. I frequently combine steps to form larger processes. For example, I combine design and development into a bigger activity (which I call prototyping). This successfully conceals the fact that I could never logically differentiate between design and development. Combining them also gives me greater flexibility to move back and forth between the two.

Analysis and Design. I combine analysis and design activities to escape this confusing catch-22: I cannot conduct useful analyses until I select the training medium and method, and I cannot select the medium and method without completing my analyses. So I dance around the two steps by making tentative selections and retrograde analyses.

Evaluation and Everything Else. Evaluation has never been an isolated step in my design activities. For example, needs analysis is simply evaluation in the absence of a training intervention. The twofold purpose of evaluation is to improve the training and to prove its effectiveness. I repeatedly evaluate (through expert reviews and participant tryouts) and improve the latest version of the training module until I consistently get the desired results. Part of evaluation involves constructing performance test items, which is also the initial part of design. Evaluation is likewise an essential part of implementation: we need to collect data continuously to improve our implementation system and to prove its effectiveness.

I used to agonize over omitting, simplifying, combining, and re-sequencing instructional-design steps. But lately I have noticed that the ID gods have not struck me blind. So, I plan to continue playing with the steps with audacity and chutzpah.