"Which software should I choose?" is a question that more and more teachers are asking themselves. Computer labs are becoming common in many schools. Software companies, and even mainstream textbook publishers now bombard teachers with a plethora of new language learning software, making evaluation criteria essential. In order to help teachers and those responsible for software purchases to make sense of this skillfully marketed software, this paper proposes a set of evaluation criteria structured along the lines of Richards and Rodgers (1986) Approach, Design, Procedure model which was used by Hubbard (1992) as a basis for the design of effective language teaching software.
Hubbard, a Senior Lecturer in Linguistics at Stanford University as well as a software designer, uses the Richards and Rodgers model for comparing language teaching methodologies as a framework for courseware development. This paper attempts to extend this model to include criteria for the evaluation of CALL software. As an example, I shall use the model to evaluate the commercial computer game Where in Space is Carmen Sandiego? as a component of an advanced reading program.
Approach
Richards and Rodgers consider Approach to "refer to theories about the nature of language and language learning that serve as the source of practice and principles in language teaching" (Richards and Rodgers, 1986, p. 16). While the focus of this paper is the practical application of the framework to courseware evaluation, a quick summary of Schema Theory and Parallel Distributed Processing (PDP) is useful for a deeper understanding of the cognitive principles underlying reading comprehension.
Schema Theory
Traditional approaches to L2 comprehension focused on the meaning "in" the language, but Schema Theory highlights the importance of background information (Carrell and Eisterhold, 1987, pp. 218-221). Reading has been described as a "psycholinguistic guessing game" (Goodman, 1967) in which the reader engages in a cyclic process of picking and choosing relevant parts of the total information. In Schema Theory, reading skill depends on the interaction between world knowledge and linguistic skill. The written text does not carry meaning in itself; it only provides direction for the retrieval and construction of meaning from previous knowledge. This mental model is called a schema.
Parallel Distributed Processing (PDP): Johnson-Laird (1988) describes PDP, or "connectionism," as a cognitive theory whose rules do not have an explicit structure. This is in contrast to the Production Theory of mental architecture, which has a linear sequence governed by an explicit rule system. Johnson-Laird likens PDP to a hologram in which long-term memory is distributed over a number of processing units simultaneously. People recognize printed words from cues that are matched to all the contents of memory at once, and the pattern is recognized by finding the best fit. One of the most important concepts in PDP is that symbols do not necessarily represent separate entities; instead, they can be described as the parallel processing of distributed representations created by the merging of many separate experiences. Thus, understanding text involves all of the cognitive levels simultaneously, both the "bottom-up" processing of vocabulary, grammar, and syntax, and the "top-down" processing of general knowledge of the world. The individual text triggers the mental model of a scenario or "script" and produces expectations about what will happen next.
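Johnson-Laird's "best fit" idea can be made concrete with a short sketch. The following Python fragment is purely illustrative and is not drawn from Johnson-Laird or the connectionist literature: the toy word list, the crude letter-count encoding, and the overlap score are my own assumptions, and real PDP models learn their weights rather than use hand-built features. A degraded cue is compared with every stored word "at once" and recognized by the closest match:

    # Toy illustration of best-fit retrieval over distributed representations.
    # The word list, encoding, and similarity score are invented for this example.
    def letter_vector(word, alphabet="abcdefghijklmnopqrstuvwxyz"):
        """Encode a word as a crude distributed representation: letter counts."""
        return [word.lower().count(ch) for ch in alphabet]

    def best_fit(cue, memory):
        """Compare the cue with every stored pattern and keep the best match."""
        cue_vec = letter_vector(cue)
        def overlap(candidate):
            return sum(min(a, b) for a, b in zip(cue_vec, letter_vector(candidate)))
        return max(memory, key=overlap)

    memory = ["planet", "comet", "asteroid", "galaxy", "nebula"]
    print(best_fit("plnet", memory))   # the degraded cue still retrieves "planet"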
Chun and Plass (1997) write that these schemata interact with each other in a non-linear and non-sequential manner, even though the mental model is built sequentially, word by word and sentence by sentence. They go on to point out that the formation of mental schemata, i.e. text comprehension, occurs more rapidly and with greater depth if it is aided by the simultaneous use of sound and graphics, as is common with multimedia computer software.
Approach: Evaluation of CALL Software
How does Approach-based design relate to the evaluation of CALL software? To effectively teach ESL/EFL reading, a program must conform to a theory of learning and language. Hubbard proposes that good language learning software should:
- give meaningful rather than mechanical practice with discourse larger than a single sentence.
- provide various hints to lead students to the correct answers.
- accept alternative correct answers.
- give optional explanations for why correct answers are correct.
- anticipate incorrect answers and give explanations.
Where in Space is Carmen Sandiego? conforms to the first three criteria in that it is an example of the problem-solving genre of adventure games. The language is meaningful and authentic, and must be understood in order to extract the clues necessary to proceed to the next step, or to backtrack if a mistake is made. To process the information containing the clues, students must access a database. Early CALL software focused on audiolingual pattern-practice techniques, but that behaviorist approach violates Hubbard's first criterion.
Another feature which makes the Carmen Sandiego software more appropriate for EFL/ESL students than other game simulations is the science fiction setting. Many games designed for native speakers assume a restricted cultural context. Since this game is set in outer space, the required background knowledge of astronomy puts all students on an equal footing. In fact, the original purpose of the game was to provide an incentive for junior and senior high school students to learn astronomy. Thus all the necessary background information is provided by the computer database, and searching for and retrieving the proper facts is the key to solving the puzzle. This problem-solving activity provides the opportunity to make mistakes, to correct them, and to improve both world knowledge and reading skills simultaneously.
Hubbard's fourth and fifth criteria deal with the explanations of correct and incorrect answers. In my opinion, this is where the teacher should be actively involved. The language teacher should act as a resource and facilitator, and in a reading program involving CALL software, especially an authentic adventure game, the task of explaining mistakes and pointing out the correct path is best left to the instructor. This keeps students from becoming frustrated with dead ends and keeps their motivation high.
Design
Not all of Hubbard's criteria for software production are relevant to the evaluation of this specific example of CALL software. The design features which aid evaluation are: learner variables, language difficulty, program difficulty, content, learning style, program focus, and hardware considerations.
Learner Variables: Hubbard lists the following six learner variables, and I shall evaluate the Carmen Sandiego software's design for each in turn.
- Age
- Native language
- Proficiency level
- Sex
- Learner needs
- Learner interests
The software was originally designed for native English speakers aged twelve to adult, and I feel it would probably be too difficult for the general L2 student below high school age. The Japanese learner's native language should pose no interference to the task of teaching reading with this program. The lexical, grammatical, and sociolinguistic levels of this program preclude efficient use by beginners; I would recommend it for high-intermediate or advanced students only, thus precluding most high school students in Japan as well. In terms of learner needs, the student requires a software package which can teach reading, and Where in Space is Carmen Sandiego? provides ample opportunities to improve this skill. Finally, this software will appeal to those learners who are interested in science, technology, detective stories, and computer games.
Language Difficulty: Hubbard proposes four areas of language difficulty--variety, transparency, familiarity, and length. The variety of registers ranges from the colloquial to the academic. The clues are embedded in four different contexts--a witness to the crime, an informant, a wiretap, and an interstellar message. These clues employ the first and third person, and reported speech, while the database is taken from an astronomy textbook. This gives the student more than one chance to understand a clue if one or more modes are beyond his comprehension level. This feature helps to overcome problems related to all four areas of language difficulty, and is a point in favor of using this particular software package.
Program Difficulty: A control panel on the screen is mouse-activated. The database is composed of a system of hierarchical menus common to most computer software, and is quickly learned. The game's instructions can be explained by the teacher in five to ten minutes. Thus the program difficulty of Carmen Sandiego is minimal.
Program Focus: The focus is on improving L1/L2 reading skills, but the software also includes listening practice.
Hardware Considerations: Carmen Sandiego uses the exploratory principles of hypermedia, especially "hypertext." To accommodate the sophistication of this program, a computer with a color monitor, mouse control, and audio speakers is necessary.
Procedure
Richards and Rodgers state that procedure "encompasses the actual moment-to-moment techniques, practices, and behaviors that operate... It is the level at which we describe how a method realizes its approach and design" (1986, p. 26).
Activity Type: Where in Space is Carmen Sandiego? is an example of a problem-solving adventure game. As educational software, it incorporates the tutorial format into its problem-solving framework. Tutorial format, of course, indicates that the software functions in the capacity of a personal teacher. Such programs promote the reading skills of skimming, scanning, and culling the desired information.
Presentational Scheme: The goal of Carmen Sandiego is to identify the correct suspect from a group of fifteen aliens, each with a different appearance, sex, favorite food, favorite author, and favorite astronomer. The player follows clues which lead to various planets and moons in the solar system, and gathers information about the criminal. This information is embedded in various styles of text. Thus, the main computer output consists of text clues embellished with NASA photographs, plus entertaining multimedia graphics and audio. The learner's task is to scan and cull this information, and he must comprehend the output to proceed to the next clue. This makes Where in Space is Carmen Sandiego? an excellent source of comprehensible textual input which will improve the student's reading skills.
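The deduction mechanic that drives the game can be sketched in a few lines of Python. The sketch below is only an illustration of the process of elimination described above; the suspect names, attributes, and values are invented for the example and are not data from the actual game.

    # Illustrative sketch of the elimination mechanic; all values are invented.
    suspects = [
        {"name": "Zor",  "sex": "female", "food": "asteroid chips", "astronomer": "Kepler"},
        {"name": "Blip", "sex": "male",   "food": "comet cones",    "astronomer": "Kepler"},
        {"name": "Nova", "sex": "female", "food": "comet cones",    "astronomer": "Halley"},
    ]

    def narrow_down(suspects, known_facts):
        """Keep only the suspects consistent with every clue gathered so far."""
        return [s for s in suspects
                if all(s[attribute] == value for attribute, value in known_facts.items())]

    clues = {"sex": "female", "astronomer": "Kepler"}
    print(narrow_down(suspects, clues))   # only "Zor" remains, so the warrant names Zor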
Input Judging: Hubbard lists four basic considerations for input judging, and I shall deal with each in turn:
- Is there only one acceptable answer to an item, or more than one?
- If the input takes the form of a word or phrase, how are misspellings and inflectional/derivational errors dealt with?
- If the input takes the form of a sentence, how are grammatical errors dealt with?
- How are other anticipated errors (e.g. word choice) dealt with?
The Carmen Sandiego program allows only one correct answer for each maze branching (see the Feedback section below), which encourages the students to make an effort to understand the clues. To access the database computer "VAL 9000," a name must be spelled correctly; otherwise the search for the key term returns the words "NONE FOUND." This feature encourages accurate spelling. As the software is focused on reading and understanding, all student input consists of personal and place names for the database; no sentences are input. The answer to the fourth question is similar: input is limited to names, so word-choice errors also receive a "NONE FOUND" reply from the database.
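The spelling-sensitive lookup can be imitated with a tiny simulation. The Python fragment below is a hypothetical sketch of the behavior described above, not the program's actual code; the two sample entries are standard astronomical facts chosen only for illustration.

    # Hypothetical simulation of the exact-match database lookup described above.
    database = {
        "titan": "Largest moon of Saturn; it has a thick nitrogen atmosphere.",
        "io": "Innermost of Jupiter's Galilean moons; volcanically active.",
    }

    def val_9000_search(term):
        """Return an entry only when the search term is spelled exactly."""
        return database.get(term.strip().lower(), "NONE FOUND")

    print(val_9000_search("Titan"))    # the entry is returned
    print(val_9000_search("Titaan"))   # a misspelling produces "NONE FOUND"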
Feedback: Feedback is given after each action or decision by the learner. If the student takes an incorrect branch of the maze, i.e. goes to a wrong planet or moon, there will be no additional clues waiting, and the student must backtrack. After too many wrong turns he "runs out of fuel," and the game is lost. If a wrong suspect is arrested, then the student loses the game, and can start another one at the same skill level.
Positive feedback is given for correctly identifying the criminal. Winning allows the student to proceed to a game requiring a higher skill level. Each complete game is short, between fifteen minutes and half an hour. In this way, the student's attention span is not overly strained.
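The feedback loop itself is simple enough to sketch. The Python fragment below is a schematic illustration of the behavior described above; the fuel allowance, destination names, and messages are my own assumptions, not values taken from the game.

    # Schematic sketch of the travel-and-feedback loop; all values are assumptions.
    def travel(destination, correct_destination, state):
        """Update the game state after the learner chooses where to go next."""
        if destination == correct_destination:
            state["clues"] += 1
            return "A new clue is waiting here."
        state["fuel"] -= 1                      # every wrong turn costs fuel
        if state["fuel"] == 0:
            return "Out of fuel -- the game is lost."
        return "No clues here; backtrack and reread the last clue."

    state = {"fuel": 3, "clues": 0}
    print(travel("Mars", "Europa", state))      # a wrong branch: no clue, fuel is spent
    print(travel("Europa", "Europa", state))    # the correct branch: a new clue appears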
Control Options: The two extremes of control are automatic control imposed by the developer and complete control left in the student's hands. The Carmen Sandiego program falls between these two extremes: on the one hand, the language is fixed by the computer program, but, on the other hand, control of how the clues are processed is left up to the learner, with the teacher acting as the facilitator. I feel that this is a proper balance for an advanced reading program of this type.
Help Options: There are two basic types of "Help" options: "review" and "hints." The Carmen Sandiego game does not provide an optional review of the instructions, for they are basic and simple to learn, but it does provide additional hints in the guise of "launching a probe." Each game has a maximum of two probes to provide extra clues if the player is having trouble deciding on the next step.
Screen Layout: Screen layout covers such variables as print size and spacing, the use of color, the quality and relative position of the graphics, the presence of animation, etc. Rather than tediously detail the exact specifics of the software, I shall merely evaluate the screen layout as a whole. This game is part of a series of popular educational games by Broderbund, a very successful software company. Since the series is designed for a large native-speaker market, the professionalism and quality of the layout and multimedia graphics are far superior to most of those found in the smaller EFL/ESL CALL software market. This is a reason in favor of adapting software originally designed for native-speaking young adults as an adjunct to a language learning program.
Conclusion
Although evaluating CALL software is more complex than just using a checklist, having an evaluation framework will focus attention on the key points necessary to make an informed decision. Keep in mind that computers and software are merely tools to help teachers and students towards their goal of more efficient language learning, and a tool is no better than the hand that wields it.
References
Broderbund. Where in Space is Carmen Sandiego? <http://www.broder.com/education/programs/science/carmenspace/#bot>
Carrell, P. & Eisterhold, J. (1987). Schema theory and ESL reading pedagogy. In Richards, J.C. & Long, M. (Eds.), Methodology in TESOL: A book of readings. New York: Newbury House.
Chun, D.M. & Plass, J.L. (1997). Research on text comprehension in multimedia environments. Language Learning & Technology, 1(1), 60-81. <http://polyglot.cal.msu.edu/llt/>
Goodman, K.S. (1967). Reading: A psycholinguistic guessing game. Journal of the Reading Specialist, 6(4), 126-135.
Hubbard, P. (1992). A methodological framework for CALL courseware development. In Pennington, M.C. & Stevens, V. (Eds.), Computers in Applied Linguistics. Clevedon, England: Multilingual Matters.
Johnson-Laird, P. (1988). The computer and the mind. London: Fontana.
Richards, J.C. & Rodgers, T.S. (1986). Approaches and methods in language teaching. Cambridge: Cambridge University Press.