22 October, 2008


With the proliferation of computers in classrooms nationwide, music educators often find themselves with a computer (or several) at their disposal, along with numerous software programs of widely varying usefulness. In response, a number of educators have attempted to create their own programs for their classrooms and may even have offered those programs to other music teachers. This phenomenon is not limited to the K-12 population: distance learning and multimedia programs have been adopted and implemented at a number of universities and colleges, so professors, along with public school and private music teachers, may now find themselves having to create websites and multimedia programs for courses such as music appreciation and theory. However, knowledge of how to do this effectively–based on the instructional design criteria used by professional educational software designers–is often not available to music educators working from the ground up. The purpose of this paper is to help fill this void.

"ADDIE" is the acronym for the Analysis, Development, Design, Implementation and Evaluation Instructional Systems Design (ISD) process, currently accepted as an industry standard for the instructional design of educational media. This paper presents ADDIE as proposed by Walter Dick and Lou Carey (1976). Generally speaking, this process has not been effectively implemented in the creation of instructional computer media for music education: many musicians who create such media work from personal knowledge, skills, and instinct, and may end up with programs that work well–or with programs full of great ideas that fail. The ADDIE process can help the music educator interested in software development create a well-planned, well-organized, and effective presentation that is more beneficial in the long run. Although the ADDIE process is intensive and time-consuming at first for those not familiar with it, it gets easier with experience, and the information collected in the first few forays into ADDIE can facilitate the quick, efficient development of new software. The reader also needs to know that the most effective software development generally occurs in a team situation, with specialists involved at all levels. An individual can implement ADDIE alone, but should be prepared to share the program with others to gain the necessary feedback.
The remainder of this paper will present the ADDIE process and how it may be implemented at an elementary yet operational level by music educators. However, the reader is strongly urged to become familiar with the process as presented in detail in Dick and Carey's The Systematic Design of Instruction, now in its fifth edition.
As stated earlier, ADDIE can be broken down into five distinct areas of inquiry and activity in the instructional materials creation process. Each area will be discussed at length later in the paper.
The first area, Analysis, is probably the most intellectually demanding and research-oriented component, in that the designer needs to conduct a number of analyses to determine the content of the program; the needs of the learner; the learner's psychological, affective, and psychomotor development; the outcomes; any special considerations; and the type of program format to be used. Fortunately, much of the information obtained in this area can be applied to future software programs, with additional information added as it is discovered or needed.
Development is also an intensive step in which the designer needs to investigate the kinds of instructional strategies that are to be used, the kinds of audio and visual media required, and the actual form and function of the computer interface. Knowledge of the process used for effective interface design is crucial for success at this level of ADDIE. Generally, a rapid-prototype of the program is created as a model for the final program.
The Design process deals with the testing of the rapid-prototypes using a limited number of subjects who represent the target population of learners and by sharing it with content area specialists. Based on what is revealed, the developer can tweak problem spots and then finally create the full program.
The Implementation level occurs when the entire product is mass-tested under various or specific circumstances, with data on the program's efficiency collected and analyzed. The final Evaluation process occurs when the developer critically assesses the data concerning the program's effectiveness and makes changes as needed.
In Detail: Analysis
The analysis component deals specifically with a number of analyses, with most emphasis placed on Learner Analysis, Needs Analysis, Goal Analysis, and Content Analysis.
Learner Analysis refers to the application of knowledge concerning psychomotor skills, intellectual skills, verbal information, attitudes, motivational strategies, context, and relevance. The developer needs information about all of these areas at the learner's entry level to avoid making too many assumptions. Glaring failures to apply intellectual and verbal learner analysis can be seen, for example, in some internet-available programs supposedly geared to preschool children: when the youngster selects the "?" button, a screen full of words pops up.
Attitudes of the learners are important as well. As music educators, becoming familiar with research on the preferences of current–and future–learners will be of use. Knowledge of the learners' psychomotor skills is exhibited in the selection of tasks the program requires, such as clicking the mouse at either a leisurely rate or a fast, timed rate, or clicking on very small targets rather than larger ones. Learners also need to be analyzed in other areas, such as their entry behaviors, prior knowledge of the topic, attitudes toward the content and the potential delivery system, motivation (the ARCS model–attention, relevance, confidence, satisfaction; Keller, 1987), learning preferences, and learning environment. However, analyses are not restricted to these particular areas.
Needs Analysis is conducted to ascertain 1) the optimal desired level of performance or achievement and 2) the actual level that already exists. The difference between what is actual and what is optimal is considered the "need." For a music teacher, the optimal level of performance may be for a student to accurately identify all the notes of the bass and treble clefs, while the actual level may be no knowledge of either clef's lines and spaces. The need is indicated by the gap in between, and it is generally used to derive the instructional goals.
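For readers who will be building software anyway, the optimal-minus-actual logic above can be sketched in a few lines of code. This is purely illustrative–the function name and skill names are invented for this example, not part of Dick and Carey's model:

```python
# Hypothetical sketch: the "need" is the gap between the optimal and the
# actual performance level for each skill (skill names are invented).

def needs_analysis(optimal, actual):
    """Return the skills whose actual level falls short of the optimal level."""
    return {skill: optimal[skill] - actual.get(skill, 0)
            for skill in optimal
            if actual.get(skill, 0) < optimal[skill]}

optimal = {"treble clef note names": 100, "bass clef note names": 100}
actual = {"treble clef note names": 40}   # no bass clef knowledge yet

needs = needs_analysis(optimal, actual)
print(needs)  # {'treble clef note names': 60, 'bass clef note names': 100}
```

Each entry in the result is a candidate instructional goal; a skill already at its optimal level simply drops out of the analysis.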
According to Dick and Carey (1996), goals are usually stated as the skills, knowledge, and attitudes that some group of learners must acquire to satisfy the identified need. Such goal statements usually use action verbs–identify, create, perform, write, defend, and, for music, appreciate, enjoy, etc. Their focus is on what the learner will be able to do at the completion of the instruction. Unlike designers in many other areas of instructional design, music education designers also need to address "fuzzy" goals, such as appreciate, like, and understand. To approach fuzzy goal analysis systematically, Mager (1997) suggests that the goal first be written down, and that the designer then identify and write down the behaviors the learners will demonstrate to indicate that the goal has been met. These behaviors can be written as behavioral objectives, which cumulatively result in the attainment of the fuzzy goal. Since computer educational programs are strongly stimulus-response oriented, behavioral objectives fit right in. For example, a student will demonstrate mastery of the notes of the bass and treble clefs by selecting the correct on-screen buttons for 100% of the items.
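Because a behavioral objective like the one above is stated in all-or-nothing terms, it translates directly into a program check. The following is a hedged sketch only–the function, item names, and answer key are hypothetical examples, not prescribed by the model:

```python
# Hypothetical sketch of a stimulus-response mastery check: the behavioral
# objective is met only when every on-screen answer is correct (100%).

def mastery_attained(responses, answer_key):
    """True if the learner selected the correct button for every item."""
    return all(responses.get(item) == correct
               for item, correct in answer_key.items())

answer_key = {"bass clef line 2": "B", "treble clef line 1": "E"}

print(mastery_attained({"bass clef line 2": "B", "treble clef line 1": "E"},
                       answer_key))  # True
print(mastery_attained({"bass clef line 2": "D", "treble clef line 1": "E"},
                       answer_key))  # False
```

A "fuzzy" goal such as appreciation would be handled the same way, once it has been decomposed into observable behaviors that can each be checked.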
Another area of inquiry concerns the content of the proposed program. What particular activities, skills, and cognitive knowledge will most effectively represent and address mastery of the content? This area can be particularly tricky for music educators, who may want to crowd as much material as possible into a program, which in turn may cloud the view of the instructional goal. An example of this kind of problem may be evident in a program that claims to concern solely Brahms' Requiem, but also includes recordings of other Brahms pieces, recordings of pieces by Brahms' contemporaries, a glossary of musical terms, recipes for Brahms' favorite meals, and pictures of his pets. Effective content analysis will help the designer avoid this kind of problem. In addition, a content analysis will reveal crucial steps in the instruction that must be included lest confusion result. Content analysis may also be approached within the area of context; concerning computer instruction, this analysis will affect the type of program format to be adopted (drill and practice, edutainment, informational, database, tool application, etc.).
Other areas of analysis include identifying the subordinate skills that must be covered in the instruction, as well as the entry behaviors and knowledge that learners need before instruction begins, such as the ability to discriminate between high and low pitches before reading notes on a treble clef.
Overall, Dick and Carey suggest that a complete goal description should include the following:
• statement identifying the learners
• statement of what the learners will be able to do
• description of the context in which the skills will be applied
• description of the tools that will be available to the learners
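For designers who keep their planning documents in code, the four components above can be held in a simple record so that no component is forgotten. The contents below are invented examples; only the four component names come from Dick and Carey:

```python
# Illustrative only: Dick and Carey's four components of a complete goal
# description captured as a simple data structure (the statements are
# invented examples, not part of the model).

goal_description = {
    "learners": "second-grade general music students",
    "behavior": "identify the letter names of notes on the treble clef",
    "context": "during computer-based drill sessions in the music classroom",
    "tools": "on-screen staff, mouse, and answer buttons",
}

for component, statement in goal_description.items():
    print(f"{component}: {statement}")
```

Writing the goal out in this fixed shape makes it easy to spot a missing component before development begins.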
In Detail: Development
The development process includes the creation of the instructional materials, the selection of the program delivery format, the tests, and the manual. The instructional materials contain the information that the student will use to achieve the objectives. This includes the materials for the major objectives as well as remedial or enrichment materials. According to Dick and Carey, all instructional materials should include tests. Both pre- and post-tests can be of great use. In a computer program, instead of a formal testing situation, the learner can be tested via games that require the use of skills and knowledge presented in the instructional materials. Finally, an instructor's manual, even an abbreviated one, would be useful. However, some manuals for computer programs have become very dense and wordy, negating the often-made claim that the computer program is "intuitive."
Decisions made concerning the above include the creation of the instructional design, communication/writing style, and the format of the design and selection of materials.
Overall, Dick and Carey suggest the following outline of components be addressed in depth during the process of developing an Instructional Strategy:
1. Preinstructional activities
a. Motivation–to continue working through the program (ARCS)
b. Objectives–focus of instruction explained at the beginning
c. Entry Behaviors–what knowledge or skills are required prior to the new learning
2. Information Presentation
a. Instructional Sequence–small manageable steps, systematic
b. Information–relevant and in context; appropriate language
c. Examples–relevant and in context
3. Learner participation
a. Practice
b. Feedback/Reinforcement--provided after each practice activity
4. Testing
a. Pretest–to check for previous knowledge
b. Posttest–to assess learning of the new knowledge
5. Follow-through activities
a. Remediation
b. Enrichment
c. Memorization and transfer
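The outline above also implies branching: the pretest (4a) can route a learner past material already mastered, and the posttest (4b) decides between remediation and enrichment (5a, 5b). A hedged sketch of that routing, with an invented function name and an assumed 90% mastery threshold:

```python
# Hypothetical sketch: using pretest and posttest scores to choose a path
# through the five strategy components (threshold and names are assumptions).

def plan_lesson(pretest_score, posttest_score, mastery=0.9):
    """Return the sequence of strategy components this learner would see."""
    steps = ["preinstructional activities"]
    if pretest_score < mastery:             # 4a: pretest gates the new material
        steps += ["information presentation", "learner participation"]
    steps.append("posttest")                 # 4b: assess the new learning
    # 5: follow-through depends on whether mastery was reached
    steps.append("enrichment" if posttest_score >= mastery else "remediation")
    return steps

print(plan_lesson(0.3, 0.95))
# ['preinstructional activities', 'information presentation',
#  'learner participation', 'posttest', 'enrichment']
```

A learner who already scores above the threshold on the pretest would skip straight from the preinstructional activities to the posttest.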
The ability to communicate within the computerized format of instruction is also of paramount importance. Designers are cautioned to make sure that the directions (either on the screen or stated out loud) are:
• straightforward
• concise
• written in appropriate vocabulary (e.g., avoiding musical terms that have not yet been introduced in the program)
• free of jargon
• humorous where the context allows
Finally, the largest area category is that concerned with the design and format of the instructional materials. Concerning the design of the interface, instructional designers and researchers have strongly suggested that the following be included:
• headings at the top of each screen to guide the learners
• unused space on each screen (avoid clutter)
• key words and definitions highlighted
• various visual and aural clues to facilitate memory and transfer
• appropriate supporting graphics
• fictional characters to guide younger learners
• a "save" option if applicable (say, a drill program)
• "Help" and "Exit" buttons on each screen
• sans serif headers, serifed text in body
• areas that look clickable or insinuate that they are clickable should in fact be clickable (e.g., doors)
• consistent orientation of repeating graphics
• left to right orientation
Computer interface designers often go through a four-step process to design a functional program. First, the designer creates a mock-up of each screen. Each mock-up can be accompanied by a form and function grid that explains precisely what each graphic and audio file on the screen will do and why it is there. Then the mock-ups are assembled into a flow chart to indicate sequence and program navigation. At this point, a rapid-prototype technique may be used to see if the program works, not only for the designer, but for other professionals involved in the program design and also for a test group of learners (see Development). Finally, the entire program is completed. By following this process, many problems can be isolated and fixed early on in the creation of the program, and hence are not included in the final program.
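The flow-chart step above can be done in code before any artwork exists: the screens and their navigation targets form a simple graph that can be checked for dead ends and unreachable screens. The screen names below are invented for illustration:

```python
# Hypothetical sketch: a flow chart of screen navigation held as an
# adjacency map, checked for dead ends and unreachable screens before
# any mock-up art is produced (screen names are invented).

navigation = {
    "title":   ["menu"],
    "menu":    ["lesson", "drill", "exit"],
    "lesson":  ["drill", "menu"],
    "drill":   ["results", "menu"],
    "results": ["menu", "exit"],
    "exit":    [],
}

def dead_ends(nav):
    """Screens (other than the exit) that the learner cannot leave."""
    return [screen for screen, targets in nav.items()
            if not targets and screen != "exit"]

def unreachable(nav, start="title"):
    """Screens that no path from the start screen ever reaches."""
    seen, stack = set(), [start]
    while stack:
        screen = stack.pop()
        if screen not in seen:
            seen.add(screen)
            stack.extend(nav.get(screen, []))
    return sorted(set(nav) - seen)

print(dead_ends(navigation))    # []
print(unreachable(navigation))  # []
```

Catching a screen with no way out, or one that can never be reached, at this stage is far cheaper than discovering it after the graphics and audio have been produced.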
In addition, reinforcement and assessment strategies also need to be addressed. Concerning the reinforcement or feedback, will the program provide answers and move on, will it allow for multiple tries for a correct answer, or will the program move on whether the answer selected is correct or not? Will assessment be based on attaining mastery level, or will there be activities that shape or fade the behavior, or provide summative or formative evaluation? Including these concerns in the design process can greatly strengthen the quality of the program and its educational value.
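One of the feedback policies mentioned above–multiple tries followed by moving on–can be sketched as a small decision function. The action strings and the three-try limit are assumptions made for this example only:

```python
# Hypothetical sketch of one feedback policy: allow a fixed number of tries,
# reinforce a correct answer, then move on either way (limit is an assumption).

def respond_to_answer(is_correct, tries_used, max_tries=3):
    """Return the program's next action under a multiple-try feedback policy."""
    if is_correct:
        return "reinforce and advance"
    if tries_used < max_tries:
        return "prompt to try again"
    return "show correct answer and advance"

print(respond_to_answer(True, 1))   # reinforce and advance
print(respond_to_answer(False, 1))  # prompt to try again
print(respond_to_answer(False, 3))  # show correct answer and advance
```

Deciding on such a policy explicitly, rather than leaving it to whatever the authoring tool does by default, is exactly the kind of design decision this section argues for.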
For more information concerning interface design strategies, the reader is encouraged to consult the books in the reference list at the end of this paper.
In Detail: Design
The Design process generally refers to the formative evaluation of the software program at this point in its development. This means that the developer takes the rapid-prototype and allows other specialists in the field to try it out, as well as a small group of learners that are representative of the intended audience. Dick and Carey recommend that three learners of three different types be used in this process: a low-level learner, a middle-level learner, and a high-level learner. All specialists and learners are interviewed to gain descriptive feedback and suggestions, which the designer uses to further "tweak" the program and present it again to the specialists and learners. When the program is at an acceptable level for all those involved, then the final steps of the ADDIE process can begin.
In Detail: Implementation
The implementation of the program usually occurs on a large scale. As much as possible, the designer or a group of trained evaluators observes the learners as they interact with the program, often collecting quantitative and descriptive data for further analysis. A music teacher creating his or her own program, for example, can ask other teachers of the same target group of learners to try the program and provide feedback. Programs can remain in the field for up to a year undergoing this process.
In Detail: Evaluation
Finally, the analysis of the data is undertaken, conclusions are drawn, and revisions are begun. Often, this requires visits back to the first two ADDIE processes (Analysis and Development). In addition, patches, bug fixes, resolutions of system incompatibilities, and updates are undertaken. It is important to note that, at this point, constant evaluation is part of the continuing development of the materials.
This paper presented the Instructional Systems Design procedure of ADDIE, based on its interpretation by Walter Dick and Lou Carey. The purpose of the ADDIE model (Analysis, Development, Design, Implementation, and Evaluation) is to provide a systematic sequence of instructional design that can have a strong positive effect on the success of the instruction. The author of this paper suggests that this procedure be taken into consideration by a teacher who wishes to create software for use in an instructional situation. Although it may be a bit tedious at first, as the designer gains familiarity with it and also with the learners, it becomes easier and will ultimately provide the teacher with more effective computer programs for teaching.
References
Briggs, L.J., Gustafson, K.L., & Tillman, M.H. (1981). Instructional design: Principles and applications. Englewood Cliffs, NJ: Educational Technology Publications.
Castellan, N.J. (1983). Strategies for instructional computing. Behavior Research Methods and Instrumentation, 15, 270-279.
Dick, W., & Carey, L. (1996). The Systematic Design of Instruction (4th ed.). New York: Harper-Collins.
Heinich, R., Molenda, M., Russell, J., & Smaldino, S. (1996). Instructional Media and Technologies for Learning (6th ed.). Upper Saddle River, NJ: Prentice Hall.
Kristof, R., & Satran, A. (1995). Interactivity by Design. Mountain View, CA: Adobe Press.
Lopuck, L. (1996). Designing Multimedia. Berkeley, CA: Peachpit Press.
Mager, R.F. (1997). Goal Analysis (3rd ed.). Atlanta, GA: The Center for Effective Performance.
Sales, G.C., & Williams, M.D. (1988). The effect of adaptive control of feedback in computer-based instruction. Journal of Research on Computing in Education, 21(1), 97-111.
Williams, R. (1994). The Non-Designer's Design Book. Berkeley, CA: Peachpit Press.

Creating Effective Multimedia Programs for Teaching (excerpted from Valerie L. Trollinger, Case Western Reserve University)

