This dissertation has been microfilmed exactly as received. 67-2113

MELARAGNO, Ralph James, 1931-
A COMPARISON OF TWO METHODS OF ADAPTING SELF-INSTRUCTIONAL MATERIALS TO INDIVIDUAL DIFFERENCES AMONG LEARNERS.

University of Southern California, Ph.D., 1966
Psychology, general

University Microfilms, Inc., Ann Arbor, Michigan

A COMPARISON OF TWO METHODS OF ADAPTING SELF-INSTRUCTIONAL MATERIALS TO INDIVIDUAL DIFFERENCES AMONG LEARNERS

by
Ralph James Melaragno

A Dissertation Presented to the
FACULTY OF THE GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
(Psychology)

August 1966

UNIVERSITY OF SOUTHERN CALIFORNIA
THE GRADUATE SCHOOL
UNIVERSITY PARK
LOS ANGELES, CALIFORNIA 90007

This dissertation, written by Ralph James Melaragno under the direction of his Dissertation Committee, and approved by all its members, has been presented to and accepted by the Graduate School, in partial fulfillment of requirements for the degree of DOCTOR OF PHILOSOPHY.

Dean

Date: September 3, 1966

DISSERTATION COMMITTEE

Chairman

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

Chapter
I. THE PROBLEM
   Introduction
   The Problem
   Setting of the Problem
   Scope of the Investigation
   Definitions of Terms
   Organization of the Remaining Chapters
II. REVIEW OF THE LITERATURE
   Studies of Branching
   Studies Related to Prediction
   Experimental Procedures for Programmed Instruction
III. METHODS OF PROCEDURE AND SOURCES OF DATA
   Phase I: Empirical Trials
   Phase II: Experimental Investigation
IV. ANALYSIS OF FINDINGS
   Tests of Hypotheses
   Methods of Data Analysis
   Analysis of Data
   Additional Analyses of the Data
V. SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS
   Summary
   Conclusions
   Recommendations
BIBLIOGRAPHY
APPENDIX A. Samples of Questions from the Six Pretests
APPENDIX B. Posttest and Answer Sheet
APPENDIX C. Branching Structure for Geometry Program
APPENDIX D. Sample Items from the Geometry Program
APPENDIX E. Instructions to Subjects

LIST OF TABLES

1. Prediction Criteria for Decision Points in the Program
2. Pretest Scores for the Three Treatment Conditions
3. Posttest Scores and Training Times for the Three Treatment Conditions
4. Posttest Scores and Training Times for the Three Treatment Conditions (Means and Standard Deviations)
5. Box's Test for Homogeneity of Multivariate Dispersions
6. Rao's Test for Equality of Multivariate Mean Vectors
7. Values of F for Comparisons of Pairs of Bivariate Centroids
8. Bartlett's Test for Homogeneity of Variances, and Analysis of Variance, for Posttest Scores
9. Bartlett's Test for Homogeneity of Variances, and Analysis of Variance, for Training Times
10. Values of t for Comparisons of Mean Training Times

LIST OF FIGURES

1. Structure of Intellect Model
2. The Laboratory Facility
3. The Response Device
4. Graphical Representation of Training Times and Posttest Scores
CHAPTER I

THE PROBLEM

Introduction

Scientific psychology has tended to develop in two separate directions, sometimes referred to as "experimental" and "correlational" (15). A major distinction between these two approaches centers around the importance each gives to individual differences in behavior. As Cronbach points out, "Individual differences have been an annoyance rather than a challenge to the experimenter. . . . [The] goal in the experimental tradition is to get those embarrassing differential variables out of sight. . . . [However,] the correlational psychologist is in love with just those variables the experimenter left home to forget" (15:674). Cronbach goes on to say that it has been the experimental psychologist who has done most of the psychological theory developing, and his theories have also ignored individual differences in a manner similar to their avoidance in experimentation. Since so much of psychological theory has been devoted to learning, it is not surprising for Eckstrand to conclude that ". . . it may be said that the present day psychology of learning presents a model of the organism which permits little in the way of individuality in the learning process" (17:408).

To a considerable extent, theories of teaching and instruction have evolved from theories of learning. While the fields of education and training recognize that individual differences in the learning process exist, little is done about individuality because of the dearth of factual information derived from experimentation (17:411).

Numerous psychologists have called for a closing of the gap between the experimental and the correlational traditions of studying behavior (15, 17, 28, 32, 38).
In particular, these calls have been for an approach to human learning that is not limited to either the study of variance among organisms or the study of variance among treatments.

There is, however, one approach to human learning that has paid at least nominal attention to individual differences, that of programmed instruction. From its earliest formalization, programmed instruction has been described as an instructional technique capable of taking into account individuality in the learning process. In 1932, Pressey pointed out this benefit in his ". . . simple machines [that] might both score tests and--with certain types of material--actually teach" (41:668), while Skinner's now classic paper, "The Science of Learning and the Art of Teaching" (50), mentions the individualization of instruction inherent in his prototype teaching machine.

The early forms of programmed instruction provided for individual differences only in the learners' rate of progress through the instruction. A basic assumption in early programmed instruction was that the content of the instruction (the "program") was appropriate for all learners, so that the only variation in performance would be in terms of the rate variable. This form of programmed instruction has come to be called "linear," reflecting the fact that all learners progressed through the program in a straight-line fashion, i.e., with all learners receiving the identical sequence of instruction. (It should be mentioned that both Pressey [40] and Skinner [51] also developed features in their earliest machines which would allow for variability in the amount of instruction by causing the learner to repeat material which had been responded to erroneously.
Such repetition was seldom practiced, however, and linear programmed instruction has continued to develop without the inclusion of the repetition feature, largely due to the method of program development employed, which attempts to reduce erroneous responses to a minimum.)

A second form of programmed instruction, developed independently from the linear form, provides for another aspect of individual differences. Crowder's "intrinsic" form of programmed instruction provides varying amounts and kinds of instruction, depending on the performance of the learner (16:89). When the learner responds correctly, he progresses to a new unit of instruction; when the learner responds erroneously, he is presented with instruction designed to remedy his error. The decision as to whether the student should progress or should receive remediation is a function of the learner's response to a single question. In addition to the varying amounts and kinds of instruction involved in intrinsic programmed instruction, individual differences in rate of progress are also considered. Thus, some learners may receive a certain minimal set of instructional material, and others may receive amounts of instruction up to a maximum; in all cases, rate of progress is simultaneously varying. This has resulted in strikingly wide variations in both amount of instruction and training time.

A third, and more recent, form of programmed instruction has evolved through the introduction of the digital computer into the physical makeup of the teaching machine. Computer-based instruction (or computer-assisted instruction) uses a form of programmed instruction called "branching," in which learners are branched, at frequent times during training, to the instruction which is supposed to be most appropriate to their individual needs.
While numerous physical arrangements and instructional strategies have evolved (3, 8, 10, 52, 53), the basic branching system involves the computer keeping track of all learner performance data and selecting the next sequence of instruction on the basis of prior performance. As with intrinsic programmed instruction, branching programmed instruction takes into account individuality in amount of instruction, kind of instruction, and rate of progress; the major difference between these two forms of programmed instruction rests on the greater degree of complexity that is available in computer-based instruction for making decisions about learner performance. That is, in computer-based instruction it is possible to record all student performance, and to base decisions on what to do with a particular learner on such criteria as responses to sets of questions (adjacent or remote from each other) and latencies in responding to individual questions or sets of questions, as well as on responses to individual questions.

Psychologists concerned with programmed instruction have departed from traditional experimental practice in that they have attempted to treat individual differences in rate of learning, with subsidiary emphasis on performance variations as criteria for sequencing of instruction. Recently, there have been proposals that the preinstructional characteristics of the learner be used as criteria for the selection of appropriate types of instruction. Saltzman (45) has suggested that "bright students . . . could go through the program at an accelerated pace by skipping some of the practice frames, but not skipping any of the statement [i.e., factual] frames and consequently not missing any of the course content."
Stolurow and Davis, in discussing computer-based teaching machines, have gone further and indicated that the entry behavior of the learner (such as ability and performance measures) be input to the machine and used for specifying the instruction appropriate for each student (52).

Two distinct procedures, then, have been proposed for the modification of instruction as a function of the individual differences among learners. In one case, adaptation of a basic sequence is made on the basis of learner performance on prior instruction; the sequence of instruction any learner receives is dependent upon his own unique characteristics while performing on the learning task. In the second case, adaptation is made on the basis of the entry behavior of the student; the instructional sequence for any learner is determined by his own unique measured capabilities. The questions then arise: Which, if either, of these procedures is the more effective method of adapting programmed instruction to individual learner differences? And, is either of them more effective than the typical programmed instruction procedure in which the sequence is fixed and only rate of learning varies among learners?

The Problem

The purpose of this study was to investigate the relative effectiveness of two alternative methods of adapting self-instructional materials to individual differences among learners in a computer-based instructional system, and to compare the effectiveness of each method with a third method in which there was only minimal adaptation to learners' individual differences.

One adaptation procedure, hereafter referred to as the branching condition, used data from learner performance on the instructional materials as a basis for adjusting subsequent instruction to individual differences.
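In present-day notation, the kind of performance-based decision the branching condition relies on (advance on mastery, branch to remediation otherwise) can be sketched as follows. The mastery criterion and unit numbering here are invented for illustration only; the study's actual decision criteria were derived empirically in Phase I.

```python
# Toy sketch of a performance-based branching decision.
# The error-rate threshold is a hypothetical criterion, not the study's.

def next_unit(current_unit, errors, responses, error_threshold=0.25):
    """Advance the learner on mastery; branch to remediation otherwise."""
    error_rate = errors / responses if responses else 0.0
    if error_rate <= error_threshold:
        return ("advance", current_unit + 1)   # mastery: present new instruction
    return ("remediate", current_unit)         # branch to remedial material

print(next_unit(current_unit=3, errors=1, responses=10))  # low error rate
print(next_unit(current_unit=3, errors=6, responses=10))  # high error rate
```

The same decision shape applies whether the criterion is a single response (as in Crowder's intrinsic form) or an accumulated record of responses and latencies (as in computer-based branching); only the inputs to the rule change.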
The second adaptation procedure, the predicted condition, used data from learners' entry behavior, as measured by five tests administered prior to instruction, for predicting the best sequences of instructional materials for each learner individually. The third procedure, the linear condition, provided only for individual differences in rate of learning by giving the identical instructional sequence to all learners.

The study was conducted in two phases. During the first phase, empirical trials of a self-instructional program in geometric inequalities were conducted with small groups of learners (from one to five at a time) in order to determine strategies to be employed for the branching sequence and the predicted sequence during the experiment in the second phase. In these first-phase trials, data were collected from thirty-two subjects to discover appropriate criteria that indicated whether or not a learner had mastered an instructional unit, and to discover what adjustments in the sequencing of instruction were appropriate for learners who had or had not achieved this mastery. In addition, data were also collected on the relationship between entry behavior of learners (as measured by pretests) and the behavior of learners on the instructional materials. As a result of the first-phase trials, a branching structure for the self-instructional program, and a method of predicting sequences of the program, were both derived empirically.

In the second phase of the study, the relative effectiveness of the branching sequence, the predicted sequence, and the linear sequence was experimentally investigated in a computer-based instructional laboratory. Three randomly assigned groups of high school students (N = 44) were studied in the experiment.
One group was presented instruction using the branching condition, the second group was presented instruction using the prediction condition, and the third group was presented the common linear condition. At the conclusion of each sequence a test of the material presented was administered; in addition, training time for each learner was recorded. The scores on the test, and the training times, were compared simultaneously. Answers were sought to the following questions:

1. In terms of both amount learned and training time, will either the branching condition or the prediction condition be more effective than the other in a computer-based instructional system?

2. In terms of both amount learned and training time, will either the branching condition or the prediction condition be more effective than the linear condition in a computer-based instructional system?

Setting of the Problem

The first phase of the investigation was conducted in unused classrooms of two high schools. Paid volunteer subjects came to these classrooms at the end of the normal school day and were administered the instructional program by the investigator.

The second, experimental phase of the investigation was carried out in an experimental laboratory, containing a computer-based instructional system, maintained by the System Development Corporation, Santa Monica, California. Paid volunteer subjects were selected from a local high school (not the same as those involved in the first phase) and brought to the laboratory at the end of normal school hours by bus.

Scope of the Investigation

The study was limited to the analyses of learning and training time of subjects taking part in the second-phase experiment. Forty-four subjects were randomly assigned to the three treatment conditions, with fourteen subjects in one condition and fifteen subjects in each of the other two conditions.
Equality of the three groups was determined by comparing scores on five pretests.

The study was limited further by the nature of the subject matter of instruction and the characteristics of the subjects. Geometry tends to be a well-integrated subject matter, with considerable interdependence among elements of topics. And, learners who have reached the level of enrolling in geometry represent a fairly restricted population within a high school, due to their success in prior mathematics and to the fact that geometry is not a required subject.

In addition to the limitations above, the variables under study were limited by their definitions. Definitions of critical terms appear in the next section.

Definitions of Terms

The following definitions were employed in this study:

Computer-based instructional system. A computer-based instructional system is a physical arrangement for carrying out programmed instruction (q.v.) which includes a digital computer. The computer controls the sequence of the program (q.v.), provides the learner with knowledge of the correctness of his responses, and records all pertinent data during the training period.

Program. A program is a set of verbal, figural, and symbolic units presented to the learner. A program is a set of instructional materials designed to produce learning in subjects using them.

Programmed instruction. Programmed instruction is a method of instruction carried out without involving a human instructor. It is characterized by a lengthy set of brief units, most of which call for a response from the learner, and a means for informing the learner of the correctness of his responses.

Linear sequence. A linear sequence of a program (q.v.) is a common set of instructional materials presented to all learners in the identical order. Since all learners receive the same sequence, only the rate of progression through the program varies.

Prediction sequence. A prediction sequence of a program (q.v.)
is one in which a unique sequence of instructional materials is determined for each learner, based on results of pretests (q.v.).

Branching sequence. A branching sequence of a program (q.v.) is one in which each learner generates his own unique sequence of instructional materials, with branches from one unit to another being a function of the learner's performance on the learning task.

Pretest. A pretest is a test, administered to a learner prior to the learning task, which measures some ability of the learner.

Posttest. A posttest is a test, administered to a learner at the conclusion of the learning task, which measures mastery of the content of the instructional materials.

Organization of the Remaining Chapters

A review of the literature pertinent to this investigation is presented in Chapter II. The methods and procedures for the two phases of the investigation are found in Chapter III. Chapter IV contains an analysis of the findings from the investigation. The summary, conclusions, and recommendations appear in Chapter V.

CHAPTER II

REVIEW OF THE LITERATURE

In Chapter I, sources were cited which dealt with the need for the study of individual differences in the learning process. While it was pointed out that programmed instruction is one area in which the relationship between individual differences and learning has been investigated, some researchers (7, 25) have argued that the issue of individual differences has not received sufficient study even in the design and utilization of programmed instruction. Literature surveyed in this chapter is devoted to three areas: (1) studies of branching as a means for adapting to individual differences; (2) studies related to prediction as an adaptation procedure; and (3) requirements for special experimental procedures in studying programmed instruction.
Studies of Branching

The use of learner performance data for the modification of the sequencing of instructional materials has followed three different approaches. The first involves minimal modification, in which subjects are recycled through the same materials. In the second, a moderate degree of modification is carried out by providing alternate routes through the materials, based upon relatively simple criteria. The third utilizes elaborate criteria for sequence modification, and a complex mechanism (usually a digital computer) is used to record and monitor learner performance.

Minimal Adaptation

Holland and Porter (31) had subjects repeat items of a program which they responded to incorrectly on the first presentation. When compared with subjects who only received the program once, with no repetition, the subjects who did repeat scored significantly higher on a posttest. The repetition group also spent more time in training, but no statistical comparisons of training times were carried out. A more gross repetition procedure was used by Silberman (47), who had subjects repeat entire units when their error rate on the unit was high. Another version of the program did not allow for repetition (and also differed along other dimensions). The repetition version caused significantly more learning to occur.

A related study of minimal adaptation was performed by Silberman, Melaragno, Coulson, and Estavan (49:Exper. I) using three treatment groups. Members of the first received a linear sequence, subjects in the second received the same sequence but were allowed to back up in the materials at their own option, while the same materials cast into textbook format were given to the third group.
The textbook procedure produced learning superior to that of the linear program; the back-branching procedure did not differ in learning from the linear program; and no differences among the three groups' training times were found.

Moderate Adaptation

A number of studies have employed a technique known as "bypassing," in which subjects are allowed to skip certain portions of the materials when they respond correctly to critical questions. The usual procedure has the subject receive a minimal amount of instruction on a topic, then respond to a question; if the response is correct, the subject skips the remaining material on that topic. Coulson and Silberman (14) took an existing linear program and provided for skipping of redundant materials. No significant difference was obtained in posttest scores, but the branching group took significantly less time to complete the program than did a nonbranching group. In a similar study (23), a branching program was created by combining items of a linear program; when subjects answered correctly the first portion of the combined item they were skipped past the remainder of the item. Once again, when compared with the linear program made up of all the items, the bypassing group was not different from the linear group.

A more involved bypassing study was conducted by Campbell (6). Three program versions were developed: a short linear, with 15 basic-step items; a long linear, with 100 items; and a bypassing version, with the same 100 items but the provision for skipping if the subject answered one of the 15 basic-step items correctly. On the posttest the long linear version was superior to the bypassing, which in turn was superior to the short linear. In terms of training time the exact opposite ordering of program versions was found.

A second type of moderate adaptation involves the use of an alternate version of a program, sometimes referred to as a parallel version.
When learning from such an arrangement, the subject responds to critical questions at the end of a topic and, if he is incorrect, he receives the alternate version for that topic. Campbell (5) used a single question to determine learning of each topic, while Roe (44) used two questions per topic. In neither case did the use of alternate sequences result in higher scores on a posttest, or in differences in training times, when compared with a linear sequence.

In a somewhat different type of study, Beane (2) compared four programmed instruction conditions with a control condition in which "conventional instruction" was used. The four experimental conditions were: an all-linear sequence, an all-branching sequence, a one-half linear and one-half branching sequence, and a one-half branching and one-half linear sequence. Beane found no significant differences among the four experimental groups in either posttest scores or training time. (He also found no differences for posttest scores of experimental groups and the control group, but did find that the experimental groups used significantly less time to complete instruction than did the control condition.)

Complex Adaptation

While a large number of computer-based instructional systems have been developed for carrying out complex adaptation to individual differences (see, for example, Stolurow and Davis [52:196-197]), few evaluative studies of such systems have been conducted. One study compared a computer-based procedure with a conventional lecture condition (4). The nine subjects who received instruction in the computer-based system were presented with a linear sequence, and could obtain alternate sequences if they requested them. Posttest scores showed the computer-based group equivalent to the lecture group; training times of the former group were less than those of the lecture group, but no statistical evaluation of these times was made.
The two major studies of complex adaptation using a computer-based system are those of Silberman, Melaragno, Coulson, and Estavan (49:Exper. II), and Coulson, Estavan, Melaragno, and Silberman (12). In each study, a complex branching procedure was compared with a fixed sequence (e.g., linear sequence), but different procedures were followed in the two experiments. In the first study (49), subjects received the branching program, and the sequence generated by each subject was recorded. Then, subjects in the fixed-sequence condition were randomly paired with those in the branching condition, and the sequence generated by the branching member of the pair was given to his mate as a linear sequence. Training time was used as a covariate, and posttest score was used as the dependent variable. No difference was obtained between the two groups. For the second study (12), gross revisions were made to the branching procedure; in particular, instructional material prepared for the remediation of learning difficulties was greatly different from that used in the first study. During this second experiment a linear sequence, common for all subjects, was administered to one group, while a branching sequence was generated by each subject in the second group. Training time and posttest scores were analyzed separately; no difference was found for training time, but a significant difference in favor of the branching condition was found for posttest scores.

Studies Related to Prediction

There are no known published studies of the use of predicted sequences of instruction. The possibility of predicting a subject's sequence from knowledge of his preinstructional characteristics appears to be based on findings of relationships between learning and such subject characteristics as intelligence (33), verbal ability (35), and under- and over-achievement in mathematics (48).
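By way of contrast with performance-based branching, a prediction rule fixes each learner's sequence before instruction begins, using only pretest data. A toy sketch in modern notation follows; the unit names and cutoff score are invented for illustration and bear no relation to the prediction criteria this study derived empirically.

```python
# Toy sketch of a pretest-based prediction rule (hypothetical cutoffs;
# the study's actual criteria came from Phase I empirical trials).

def predict_sequence(pretest_scores, cutoff=20):
    """Choose, before instruction begins, which version of each unit a
    learner will receive: learners at or above the cutoff on the relevant
    pretest get the short version; others get the full material."""
    return [(unit, "short" if score >= cutoff else "full")
            for unit, score in sorted(pretest_scores.items())]

scores = {"unit_1": 25, "unit_2": 12, "unit_3": 20}
print(predict_sequence(scores))
```

The essential difference from the branching sketch is the input: entry behavior measured once, rather than a running record of performance during instruction.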
A related study by Rigney (43) involved the classification of learners as low IQ, average IQ, and high IQ, and the administration of two repetitions of a program to all subjects. In terms of errors on the programmed material, the dullest group's mean after the second trial approached that of the brightest group's for one trial. One interpretation of this result (19:388) is that variations on a repetition sequence could be based on measured intelligence.

One important study that showed the possibility of prediction was conducted by Shay (46). Three versions of a program were developed, varying in length and degree of difficulty; each version was given to subjects of high, average, and low IQ in a balanced-design experiment. Results, in terms of posttest scores and training times analyzed separately, were interpreted as showing that ". . . the bright student ought to be allowed to skip entire sections of the program . . ." (20:144).

Experimental Procedures for Programmed Instruction

It should be noted that many of the previously cited studies of programmed instruction have involved the use of both posttest scores and training times as dependent variables. Lumsdaine (34:309-310) has indicated that both variables are of major interest, but that it is difficult to interpret the results of comparisons showing one procedure producing higher achievement scores but also requiring more time. The most typical procedure has been to report both sets of facts separately.

One solution to this problem has been the use of efficiency scores, i.e., amount learned per unit time, computed from a ratio of posttest score to training time (2, 24, 55). A number of arguments have been offered against the use of such "efficiency scores," because they violate principles of measurement.
For example, it has been held that posttest scores form only an ordinal scale rather than the required interval scale (30:102); that the two variables are not known to be linearly related (34:310); and that alternate methods of calculating such ratios lead to varying "efficiencies" (54:546-555).

A more satisfactory solution to the problem of two dependent variables is the use of multivariate analysis of variance (more specifically, the bivariate analysis of variance-covariance model). Multivariate analysis of variance provides for the simultaneous evaluation of a number of dependent variables, and the bivariate case is uniquely useful for evaluating posttest scores and training times (48:Pilot Study No. 1).

CHAPTER III

METHODS OF PROCEDURE AND SOURCES OF DATA

Phase I: Empirical Trials

Subjects

For the empirical trials conducted during the first phase of the investigation, thirty-two subjects were obtained from two high schools. Subjects were recruited from first-semester geometry classes, and were paid $3.00 for their voluntary participation. Subjects were drawn from classes in the first semester of geometry because the instructional program dealt with geometric inequalities, which is taught normally some time during the second semester of geometry, and it was necessary to have subjects who were naive to the content of the program, yet possessed the necessary mathematical prerequisites.

Materials and Apparatus

The program in inequalities was prepared originally for another study (11), and was used subsequently in other investigations (13). In its original form the program was made up of six units: the first five units treated
theorems, axioms, and postulates of inequality; the sixth unit contained instruction on methods for preparing to prove theorems. A brief quiz was presented at the end of each unit, covering the material contained in the unit, and an additional set of materials devoted to discussions of questions in the quiz followed each quiz.

The program was written in multiple-choice format, with from two to five options for each question. Many of the steps in the program did not call for a response, but contained material which was only to be read by the learner. Each step of the program was presented on a separate page. For the first phase of the present investigation, the correct answer to each question in the program was written on the back of the page containing the question.

Measuring Instruments

In order to obtain a wide sampling of relevant learner entry behaviors, seven tests were selected as potential pretests. These prospective pretests were studied for their validities in predicting instructional sequences during Phase I. One of these tests concerned geometry fundamentals, and the other six were found by using Guilford's Structure of Intellect model (27) as a means for determining intellectual abilities that should bear a relationship to the learning of geometry.

The Structure of Intellect Model

The Structure of Intellect is

. . . a unified theory of human intellect, which organizes the known, unique or primary intellectual abilities into a single system. . . . The discovery of the components of intelligence has been by means of the experimental application of the method of factor analysis. . . . Each intellectual component or factor is a unique ability that is needed to do well in a certain class of tasks or tests. (27:469-470)

In developing this theory of human intellect, Guilford has come to classify the components along three dimensions: operations, contents, and products.
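The model's cells can be made concrete by enumerating the cross-product of the three dimensions, whose members (spelled out in the paragraphs that follow) are five operations, four contents, and six products, giving 5 x 4 x 6 = 120 hypothesized abilities. A minimal sketch:

```python
# Enumerate the cells of Guilford's Structure of Intellect model as the
# cross-product of its three dimensions; each cell names one ability.
from itertools import product

operations = ["cognition", "memory", "divergent production",
              "convergent production", "evaluation"]
contents = ["figural", "symbolic", "semantic", "behavioral"]
products = ["units", "classes", "relations", "systems",
            "transformations", "implications"]

abilities = [f"{o} of {c} {p}"
             for o, c, p in product(operations, contents, products)]

print(len(abilities))   # 5 x 4 x 6 = 120
print(abilities[0])     # "cognition of figural units"
```

The six pretests chosen for this study correspond to six particular cells of this enumeration, such as "cognition of semantic systems."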
Operations are the major kinds of intellectual activities that a person carries out with raw materials of information. Five operations are specified in the theory: cognition, the comprehension or understanding of information; memory, the retention of information; divergent production, the generation of a variety of information from a single source; convergent production, the generation of information from a number of sources; and evaluation, the making of decisions about information.

Contents are general varieties of information, and four types are known: figural, with information in the form of images; symbolic, with information in the form of signs; semantic, with information in the form of meanings to which words become attached; and behavioral, with essentially nonverbal information that is involved in human interactions.

Products are the results of a person's processing of information. Guilford has identified six products: units, circumscribed items of information; classes, aggregates of items of information with common properties; relations, connections between units of information based on applicable variables; systems, structured aggregates of information; transformations, changes in existing or known information; and implications, extrapolations of information.

The Structure of Intellect model has been represented frequently as a three-dimensional rectangular solid, shown in Figure 1. (The brief treatment of the Structure of Intellect contained in the preceding section is based on 27:4-5. Appreciation is due Dr. Ralph Hoepfner, Aptitudes Research Project, University of Southern California, for his assistance in examining the Structure of Intellect and in selecting the six tests from the model.)

Pretests from the Structure of Intellect

In examining the Structure of Intellect for its relationship to the learning of geometry, six intersections of operations, contents, and products seemed appropriate. They were:

1.
Cognition of semantic systems. This has been defined as the apprehension or comprehension of relatively complex semantic materials, and has come to be known as a general reasoning factor (26, 36, 37, 39). This factor was felt to be related to understanding the verbal nature of the statement of geometry problems. A test of this factor is Problem Solving. (See Appendix A for an example of test items for this and other tests used in Phase I.)

[Figure 1. Structure of Intellect model: a three-dimensional rectangular solid crossing the five operations (Cognition, Memory, Divergent Production, Convergent Production, Evaluation), the four contents (F: Figural, S: Symbolic, M: Semantic, B: Behavioral), and the six products (U: Units, C: Classes, R: Relations, S: Systems, T: Transformations, I: Implications).]

2. Cognition of symbolic implications. This has been defined as the discovery or awareness of symbolic consequents (26, 39). This factor was felt to be related to the ordering of symbolic steps in the development of a geometry proof. A test of this factor is Symbolic Grouping.

3. Divergent production of figural transformations. This has been the factor of adaptive flexibility with figural components, and was felt to be related to the manipulation of figural elements of a geometry problem (21, 27). A test of this component is Match Problems.

4. Convergent production of symbolic systems. This factor has been defined as convergently producing a system starting with a few symbolic units (26, 39), and was felt to be related to the development of a geometry proof from the limited set of "given" information. A test of this factor is Operations Sequence.

5. Convergent production of figural transformations. This factor has been defined as figural redefinition and, sometimes, as flexibility of closure (27). It was felt that the factor was related to the redefinition of the figural component in a geometry problem. A test of this component is Hidden Figures.

6. Evaluation of symbolic implications.
This factor has been defined as sensitivity to the adequacy of consequents of symbolic content (29, 39), and was felt to be related to the ability to judge the adequacy of the steps in the proof of a geometry theorem. A test of this factor is Symbolic Reasoning.

Geometry Pretest

The seventh test selected as a potential pretest was a test of fundamentals of geometry which was developed previously and used in earlier investigations carried out with the program in inequalities (11, 13). The test is made up of thirty-one questions, and treats material typically taught in the first semester of geometry as well as algebraic calculations needed in geometry. For example, test questions assess measurement of angles and line segments, equality theorems ("If things that are equal are subtracted from things that are equal, the differences are equal"), and solving for an unknown in a linear equation. Samples of test questions are found in Appendix A.

Posttest

The posttest used to evaluate subjects' learning after receiving the program in geometry inequalities is a modification of the test used in previous investigations (11, 13). Modifications to the earlier version of this instrument were made to clarify ambiguities and to eliminate irrelevant questions.
The posttest was composed of thirty-three questions, arranged as follows: seventeen questions in multiple-choice format which called for responses either identical with or highly similar to the responses called for in the instructional program; ten constructed-response questions calling for responses similar to responses made to multiple-choice items in the program; four problems cast in the form of geometry proofs, in which the subject was required to write out the reasons for given steps in the proof in a fashion similar to portions of the program; and two problems in which the subject was required to take a given geometry statement (which was never presented in the program), draw a figure appropriate to the statement, and develop a proof of the statement.

The six problems in the latter portion of the test called for multiple answers, so that the total score possible for the test was 57. The multiple-choice questions and the four problems in which the student supplied reasons for given statements directly sampled material from the program; the constructed-response questions and the two proofs to be developed represented transfer tasks. In terms of the 57 points possible on the test, 29 points were directly from the program and 28 points were transfer situations. The complete posttest is found in Appendix B.

Methodology

Lists of subjects willing to participate in the Phase I trials were obtained from two high schools. The subjects came from five classes in one school, and one class in the second. The experimenter met with the subjects in small groups, explained the nature and purpose of the investigation, and administered the seven prospective pretests. Thirty-five subjects received the pretests, and thirty-two of them completed their participation in the investigation. Following the pretesting, subjects were contacted individually and arrangements made for the empirical trials of the program.
The first twelve subjects were administered the program individually, the next ten in pairs, and the final ten received the program in two groups of five subjects each.

In administering the program, the experimenter observed each subject's progression through the materials, responded to questions asked by subjects, and kept detailed records of the kinds and specific locations of unusual performances. In general, the experimenter observed subjects to ascertain where in the program branching was called for, and what indications there were of the need for branching. In particular, two distinct types of branching were determined from the trials in Phase I: locations in the program where subjects could afford to skip past redundant instruction when their performance was satisfactory; and locations in the program where subjects' performance indicated the necessity for additional, remedial instruction.

The branching structure that finally evolved from these empirical trials had the following characteristics: nine points in the program at which satisfactory performance allowed the subject to branch past redundant instruction; and fourteen locations at which poor performance on quiz questions at the end of each of the six units of the program would cause the subject to be branched to remedial instruction. In summary form, the final branching structure was made up of 248 items in the main portion of the program, twenty items devoted to end-of-unit quizzes and feedback to the quizzes, and thirty-eight remedial items, for a total of 306 items. A subject who performed so well that he branched past all redundant instruction and never branched to remedial instruction would receive 225 items, while the subject who performed so poorly that he never branched past redundant instruction and branched to all the remedial instruction would receive 306 items.

N.B. No subject in the Phase I trials reached either of these extremes.
The best subject received 233 items, and the poorest subject received 293 items. Appendix C contains a complete specification of the branching structure derived empirically during Phase I.

After the empirical trials, the experimenter scored all seven pretests for the participating subjects and explored ways in which scores on the pretests were related to performance on the program, in order to develop a method of predicting instructional sequences from pretests. A number of possible strategies were considered, and abandoned because of their poor predictive validity, before the final one was determined.

The technique used was that of multiple cutoff scores for decisions made at each of the branch points in the program. That is, for each of the branch points subjects were divided into those who had or had not branched; then a pattern of pretest scores which would most effectively discriminate between the two groups was determined. Pretest raw scores per se did not prove useful, so the scores on each test were dichotomized at the mean value and these dichotomies were then related to passing or failing at a branching decision. In essence, this involved a 2x2 table for each decision point; the paradigm is given below.

                              Branching Decision
                               Pass        Fail
     Prediction      Met
     Criterion       Not Met

By applying this technique, it was possible to predict whether or not any given subject should branch on the basis of a pattern of high (above the mean) or low (below the mean) scores on certain pretests. This method resulted in the elimination of two pretests which showed no relationship to branching decisions, the Symbolic Reasoning and Match Problems tests. For the experimental investigation of predicted sequences carried out in Phase II, then, the following pretests were used: Hidden Figures, Symbolic Grouping, Problem Solving, Operations Sequence, and geometry fundamentals.
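As an illustration of the procedure just described, the test of independence for one 2x2 table might be sketched as follows. The counts here are hypothetical (they are not the study's data); only the form of the table follows the text.

```python
# Hypothetical sketch of the 2x2 independence test applied at each
# branch point: rows = prediction criterion met / not met,
# columns = passed / failed the branching decision.
from scipy.stats import chi2_contingency

# Hypothetical counts for one decision point (not the study's data).
table = [[10, 2],   # criterion met:     10 passed, 2 failed
         [3, 9]]    # criterion not met:  3 passed, 9 failed

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```

Note that scipy applies Yates' continuity correction to 2x2 tables by default; the dissertation does not say whether a correction was used in the original calculations.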
Table 1 contains the prediction structure for the decision points in the program, including tests for independence for each 2x2 table. As can be seen in Table 1, some of the prediction procedures were not highly effective in discriminating between subjects who did and did not branch at certain points.

TABLE 1
PREDICTION CRITERIA FOR DECISION POINTS IN THE PROGRAM

Decision
Point   Operation         Prediction Criterion*                       χ²       p
  1     Skip 2 items      None: too few Ss failed to skip
  2     Skip 2 items      None: too few Ss failed to skip
  3     Receive 7 items   Low PS                                     7.33    <.01
  4     Receive 1 item    None: too few Ss received item
  5     Receive 1 item    None: too few Ss received item
  6     Receive 1 item    More than 1 low of HF, PS, SG, G           5.22    <.05
  7     Receive 3 items   More than 1 low of HF, PS, SG              6.13    <.05
  8     Skip 5 items      High G                                     2.08    >.05
  9     Receive 3 items   More than 2 low of all 5 tests             1.37    >.05
 10     Receive 2 items   More than 1 low of HF, PS, OS, G           6.54    <.05
 11     Receive 3 items   More than 1 low of HF, PS, OS, G           2.56    >.05
 12     Skip 3 items      High HF and high PS                        0.98    >.05
 13     Skip 2 items      None: too few Ss failed to skip
 14     Skip 7 items      More than 2 high of all 5 tests           10.12    <.01
 15     Receive 3 items   More than 1 low of OS, SG, G               5.49    <.05
 16     Receive 5 items   More than 2 low of all 5 tests             2.11    >.05
 17     Skip 11 items     More than 2 high of all 5 tests            7.97    <.01
 18     Skip 14 items     More than 2 high of all 5 tests            7.97    <.01
 19     Skip 7 items      More than 2 high of all 5 tests            7.97    <.01
 20     Receive 5 items   None: too few Ss failed to receive items
 21     Receive 2 items   Low OS                                    10.14    <.01
 22     Receive 2 items   Low SG and low on 2 other tests            4.10    <.05
 23     Receive 2 items   More than 0 low of HF, SG                  3.39    >.05

*HF = Hidden Figures; PS = Problem Solving; OS = Operations Sequence; SG = Symbol Grouping; G = Geometry Fundamentals

Phase II: Experimental Investigation

The trials conducted in Phase I provided empirical evidence for branching procedures and prediction procedures with the inequalities program. In Phase II, the two questions of the investigation were studied through an experiment comparing a branching condition, a prediction condition, and a linear condition in a computer-based instructional laboratory.

Subjects

Subjects for the experiment conducted in Phase II were recruited at a different local high school from those used in Phase I. The subjects were obtained from seven separate classes in the second semester of geometry. In order to assure naivete on the part of the subjects with respect to geometric inequalities, teachers of these seven classes postponed any treatment of inequalities until after the experiment was completed. The experimenter obtained subjects by indicating that the material covered by the program would be useful to them later in the semester, and by agreeing to pay $5.00 for their participation.

Fifty-three students agreed to participate, and were administered the pretests. Subjects were randomly assigned to three treatment groups, with eighteen assigned to two groups and seventeen to the third. Due to an influenza epidemic at the time of the experiment, only forty-four subjects completed the study: fifteen in each of two treatment groups and fourteen in the third.

Treatment Groups

The three treatment groups hereafter will be referred to as Linear, Prediction, and Branching. Subjects in the Linear group (N = 14) all received a common set of instructional materials, composed of 285 items. Subjects in the Prediction group (N = 15) had their sequence of instructional materials predicted on the basis of their performance on the five pretests.
The Branching group (N = 15) had their own unique sequences determined on the basis of their responses to certain items in the program, including their responses to the quizzes located at the end of each of the six units of the program.

In order to test the effectiveness of the random assignment of subjects to the three groups, each of the five pretests was subjected to analysis of variance for differences among means. Table 2 summarizes the pretest scores for the three groups, and includes the F values obtained for each pretest. As can be seen, all differences among means were nonsignificant except that for the Operations Sequence pretest. With that one exception, the three groups do not differ materially from one another.

TABLE 2
PRETEST SCORES FOR THE THREE TREATMENT CONDITIONS

Pretest                        Linear   Prediction   Branching      F       p
Hidden Figures        Mean      7.000       7.933       8.533    1.599   >.10
                      SD        2.541       2.344       2.065
Problem Solving       Mean      6.785       4.933       5.400    2.306   >.10
                      SD        2.665       2.491       2.028
Operations Sequence   Mean     12.071       8.933      11.933    3.857   <.05
                      SD        3.361       3.453       3.594
Symbol Grouping       Mean      6.000       5.400       5.733    0.191   >.10
                      SD        2.745       2.261       2.814
Geometry Fundamentals Mean     22.357      21.800      23.600    0.790   >.10
                      SD        5.153       3.233       3.459

Design of the Experiment

The design of the experiment was a one-way, random-effects multiple analysis of variance, with the three treatment conditions as independent variables, and with scores on the posttest and training times as dependent variables. The following hypotheses were tested in the experiment:

1. There is no significant difference, in terms of the joint effects of posttest score and training times, between the Prediction condition and the Branching condition.

2. There is no significant difference, in terms of the joint effects of posttest score and training times, between the Prediction condition and the Linear condition.

3.
There is no significant difference, in terms of the joint effects of posttest score and training times, between the Branching condition and the Linear condition.

Materials and Apparatus

The final version of the program in inequalities, as described earlier in this chapter, was used as the experimental vehicle. The program was contained in loose-leaf notebooks, with each item on a separate page. At the end of each of the six units in the program a quiz was presented, covering the topics included in that unit. The subjects were directed to answer questions in the quiz directly on the quiz sheet, then to tear out the quiz and go on to the next item in the program. The next item(s) contained the correct answer(s) to quiz questions, and asked the subject to evaluate the adequacy of his own answers by choosing a multiple-choice option that most closely described his performance. Appendix D contains sample items from the program, including a quiz and an evaluation item.

The experiment was conducted in CLASS, a 20-station, computer-based instructional laboratory at the System Development Corporation, Santa Monica, California. Each station was composed of a desk, with walls and a front to form an isolated carrel, a notebook containing the program, a device for the student's responses to questions in the program, scratch paper, and a pencil. Figure 2 is a view of CLASS, which is a temperature-controlled, acoustically-treated laboratory of approximately 27 feet by 31 feet. All three treatments were administered in CLASS, to control for novelty of the environment.

The student's response device, shown in Figure 3, was connected to a Philco 2000 computer. Each student was directed to a page in the program by a digital readout in the four windows near the top of the response device. The student responded by pressing one of the five buttons labeled a, b, c, d, and e, then pressing the ENTER bar.
(This enabled students to change answers, since the computer accepted the last answer button depressed before the pressing of the ENTER bar as the student's response.) Depressing the ENTER bar activated a light beneath the selected response button, acknowledging the student's selection. Feedback to the student was provided in two ways. A green or red light at the top of the device was lighted if the selected response was correct or incorrect, respectively; in addition, a light immediately to the left of the correct response also went on. Thus, the student was informed both that he was right (or wrong) and also what the correct response was. In case the student made an illegal response (e.g., failed to respond to an item, or responded with a choice that was not one of the available ones), all three lights at the top were lighted. Appropriate action by the student would turn off the illegality indication. Other buttons on the response device were not operational during this experiment.

[Figure 2. The laboratory facility.]

[Figure 3. The response device.]

When a subject had completed an item, the ENTER bar was again depressed. This turned off all lights, released the response button, and caused the number of the subsequent item to appear in the four windows. When a subject encountered an item which did not call for the selection of a response, he read the item and depressed the ENTER bar once to advance.

Three computer programs were used for the experiment. One, MENTOR, monitored and controlled the student response device, and recorded on binary tape all actions taken by each subject. The second, the Lesson Assembler Program, took punched-card inputs describing the instructional program and created a binary tape to be used by MENTOR during experimental runs. The third, GRADER, read the binary tape of recorded student actions and printed out a complete history for each student, containing:

1. the numbers of all items seen by the student;

2.
the responses given by the student;

3. the correctness of each response;

4. latency in responding, once an item was presented; and

5. a summary of items, time, and errors.

The posttest, described earlier in this chapter and reproduced in Appendix B, was used to measure the subject's learning. The experimenter recorded the time taken by each subject to complete the instructional program.

Methodology

Prior to the beginning of the study, subjects were administered the pretests in four separate testing sessions in an unused classroom of their high school. The subjects were brought to the laboratory for experimental runs by chartered bus at the end of normal school days.

Each treatment condition was run independently. The Linear group was run first, then the Branching group, and finally the Prediction group. On the first day a treatment group came to the laboratory, a standard set of directions was read to the subjects, explaining the operation of the equipment and the procedures to be followed. The text of the instructions appears in Appendix E.

After receiving directions, the subjects began working on the instructional program, with each subject progressing at his own rate. Questions about the equipment were answered by the experimenter, but when a subject asked about the content of the instructional program he was told that all necessary instruction was to be found in the materials. When a subject had completed the program he remained at his desk, studying from his school textbooks; a subject was not allowed to look over the inequalities program once he had completed it.

The posttest was administered to subjects in a group, for each condition. It was given the day following the completion of the program by all subjects in the group, and was administered in an unused classroom of the subjects' high school.

Linear Condition

For the Linear condition, all fourteen subjects received the same 285 items from the program.
This included 248 main instruction items, twenty items of quizzes and feedback on quizzes, and seventeen remedial instruction items.

Prediction Condition

Prior to the first day of instruction, the five pretest scores for the fifteen subjects in the Prediction condition were stored in the MENTOR control program. At each decision point in the inequalities program MENTOR would examine these pretest scores and direct each subject to the next item that had been predicted as appropriate for him.

Branching Condition

At each decision point in the inequalities program, the MENTOR control program would examine certain prior performances of each of the fifteen Branching condition subjects and determine whether or not to branch the student. Most of the decisions concerning branching were made as a function of subjects' evaluations of their own performance on the end-of-unit quizzes. (Evaluations of quiz performances were ignored in the Linear and Prediction conditions.)

CHAPTER IV

ANALYSIS OF FINDINGS

Tests of Hypotheses

The scores on the posttest and the training times were used to test the hypotheses under investigation. The hypotheses tested were:

H1: There will be no difference between the Linear condition and the Prediction condition, in terms of the joint effects of posttest score and training time.

H2: There will be no difference between the Linear condition and the Branching condition, in terms of the joint effects of posttest score and training time.

H3: There will be no difference between the Prediction condition and the Branching condition, in terms of the joint effects of posttest score and training time.

Methods of Data Analysis

Individual posttest scores and training times (in minutes), by groups, are presented in Table 3. The use of two dependent variables suggested that the data be analyzed by multivariate analysis of variance techniques. A generalization of Student's t-ratio technique was used for the tests of the individual hypotheses.
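The generalization of the t-ratio referred to here is Hotelling's T², described later in this chapter. A minimal sketch of the two-sample computation and its conversion to F, using randomly generated data rather than the study's scores, might look like this:

```python
# Sketch of two-sample Hotelling's T-squared and its F transformation,
# for two dependent variables (posttest score, training time).
# The data below are randomly generated, not the study's scores.
import numpy as np

def hotelling_t2(x, y):
    """x, y: (n_i, p) arrays of observations for two groups."""
    n1, p = x.shape
    n2 = y.shape[0]
    diff = x.mean(axis=0) - y.mean(axis=0)
    # Pooled within-group covariance matrix
    s_pooled = ((n1 - 1) * np.cov(x, rowvar=False) +
                (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(s_pooled, diff)
    # Transform T-squared to F with (p, n1 + n2 - p - 1) degrees of freedom
    f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    return f, (p, n1 + n2 - p - 1)

rng = np.random.default_rng(0)
group_a = rng.normal([37, 195], [9, 28], size=(14, 2))  # Linear-sized group
group_b = rng.normal([41, 147], [6, 38], size=(15, 2))  # Branching-sized group
f, df = hotelling_t2(group_a, group_b)
print(f"F = {f:.3f} with df = {df}")
```

With group sizes 14 and 15 and p = 2 variables, the degrees of freedom come out to 2 and 26, matching those reported for the Linear-versus-Branching comparison; the study itself obtained F by multiple regression procedures, which is algebraically equivalent.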
In addition, each of the two dependent variables was analyzed by techniques of analysis of variance, including multiple comparisons among treatment conditions.

Analysis of Data

TABLE 3
POSTTEST SCORES AND TRAINING TIMES FOR THE THREE TREATMENT CONDITIONS

    Linear Condition       Prediction Condition     Branching Condition
  Posttest   Training      Posttest   Training      Posttest   Training
   Score      Time*         Score      Time*         Score      Time*
     50        210            33        197            46        157
     38        190            42        227            42        131
     34        184            41        148            35        127
     14        184            29        163            32        149
     30        232            41        202            45        122
     26        212            39        160            32        122
     37        147            37        143            38        176
     52        166            35        141            44        136
     40        174            26        141            35        226
     41        231            48        171            43        147
     40        187            42        158            47        120
     40        186            42        137            37        118
     40        181            43        144            45        159
     41        247            36        197            50        169
                              26        200            46        148
*In minutes

The means and standard deviations for both dependent variables, by groups, are presented in Table 4.

TABLE 4
POSTTEST SCORES AND TRAINING TIMES FOR THE THREE TREATMENT CONDITIONS (MEANS AND STANDARD DEVIATIONS)

                        Linear      Prediction    Branching
                       Condition    Condition     Condition
POSTTEST SCORES
  Means                  37.357       37.333        41.133
  Standard Deviations     9.483        6.510         5.804
TRAINING TIMES*
  Means                 195.071      168.600       147.133
  Standard Deviations    27.841       28.729        38.550
*In minutes

To show the relationships among the three conditions, Figure 4 contains a bivariate plot of all data points for the two variables and the three treatment conditions; the bivariate dispersions and centroids are also shown in the figure.

[Figure 4. Graphical representation of training times and posttest scores: posttest score plotted against training time (minutes) for the Linear, Prediction, and Branching groups, with each group's centroid marked and an ellipse enclosing its dispersion.]

As a preliminary step, Box's test of homogeneity of dispersions was performed to test the hypothesis of randomness of samples from a population with a common dispersion (see Cooley and Lohnes [9:62-63]). (This preliminary step is analogous to performing a test for homogeneity of variance when computing univariate analysis of variance.) The resulting calculations are shown in Table 5. An F value of 0.622 was obtained. For 6 and 4700 degrees of freedom, this value has a probability of occurring by chance between .50 and .75, so it is not significant. Therefore, the hypothesis of sampling from a population with common dispersion could not be rejected. Failure to reject the hypothesis supports the belief that these samples are not heterogeneous in bivariate dispersions, and that if the multivariate analysis of variance were to yield a significant value, the rejection of the hypothesis would most probably be due to differences in mean vectors.

The multivariate analysis of variance was computed using the technique presented by Rao (42:258-264). The results of the analysis are given in Table 6. The analysis yielded an F value of 11.571, with 4 and 82 degrees of freedom, which is significant beyond the .0005 level of confidence. (An F of 5.67 is significant at the .0005 level, with 4 and 82 degrees of freedom.)
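The dispersion-matrix computations underlying Box's test can be sketched in a few lines. The sketch below computes only the M statistic from the group dispersion matrices and the pooled dispersion; the scaling that converts M to the approximate F reported in the tables is omitted, and the data are randomly generated rather than the study's scores.

```python
# Minimal sketch of Box's M for homogeneity of dispersion matrices.
# Only the M statistic is computed; the factor that converts M to an
# approximate F is omitted. Data are randomly generated, not the study's.
import numpy as np

def box_m(groups):
    """groups: list of (n_i, p) observation arrays."""
    ns = [g.shape[0] for g in groups]
    covs = [np.cov(g, rowvar=False) for g in groups]  # unbiased S_i
    n_total, k = sum(ns), len(groups)
    # Pooled dispersion: weighted average of the group dispersions
    pooled = sum((n - 1) * s for n, s in zip(ns, covs)) / (n_total - k)
    m = (n_total - k) * np.log(np.linalg.det(pooled))
    m -= sum((n - 1) * np.log(np.linalg.det(s)) for n, s in zip(ns, covs))
    return m

rng = np.random.default_rng(1)
groups = [rng.normal(size=(n, 2)) for n in (14, 15, 15)]  # group sizes as in the study
print(f"M = {box_m(groups):.4f}")  # M >= 0; near 0 when dispersions agree
```

Because the pooled dispersion is a weighted average of the group dispersions and the log-determinant is concave, M is never negative; large values signal heterogeneous dispersions.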
Since the hypothesis of homogeneity of dispersions could not be rejected, and the hypothesis of random sampling from a common population was rejected (by the multivariate analysis of variance above), it seems probable that the variation in the mean vectors of the experimental treatments is responsible for the significance of the obtained F. Therefore, tests of the three hypotheses were performed.

TABLE 5
BOX'S TEST FOR HOMOGENEITY OF MULTIVARIATE DISPERSIONS

D1 (Linear Condition):
    [  83.515306   -14.954081 ]
    [ -14.954081   719.780612 ]     |D1| = 59889.07353

D2 (Prediction Condition):
    [  39.555556    -6.000000 ]
    [  -6.000000   770.373333 ]     |D2| = 30436.54551

D3 (Branching Condition):
    [  31.448889    -8.551111 ]
    [  -8.551111   760.782222 ]     |D3| = 23852.63415

M = 3.993191; F(6, 4700) = .62170; .50 < p < .75

TABLE 6
RAO'S TEST FOR EQUALITY OF MULTIVARIATE MEAN VECTORS

Within-groups:
    [ 2234.28095    -427.62381 ]
    [ -427.62381   33044.26190 ]    determinant = 73647302.747101

Total:
    [ 2376.18182   -1705.72727 ]
    [-1705.72727   49713.15909 ]    determinant = 115217999.324804

Lambda = 73647302.747101 / 115217999.324804 = .639199
F = 11.571, df = 4 and 82, p < .0005

The hypotheses under investigation were tested using Hotelling's T², which is an extension of Student's t-ratio for univariate samples (42). T² is evaluated for significance by transforming it to F and calculating the appropriate degrees of freedom. Multiple regression procedures were used to obtain F; this is identical to computing F from T² (see Anderson [1:86-103]). The values of F obtained to test the three hypotheses of this study are presented in Table 7.

TABLE 7
VALUES OF F FOR COMPARISONS OF PAIRS OF BIVARIATE CENTROIDS

                               F        df         p
Linear vs. Prediction        3.057     2,26    .10 > p > .05
Linear vs. Branching        10.593     2,26       p < .001
Prediction vs. Branching     3.263     2,27    .10 > p > .05

For the first hypothesis, that the Linear condition and the Prediction condition would not differ with respect to the joint effects of posttest score and training time, an F of 3.057 was found.
For 2 and 26 degrees of freedom, this value was not significant (for 2 and 26 degrees of freedom, an F of 3.37 is significant at the .05 level of confidence). Therefore, the first hypothesis was not rejected.

To test the second hypothesis, that the Linear condition and the Branching condition would not differ with respect to the joint effects of posttest score and training time, an F of 10.593 was found. This F, with 2 and 26 degrees of freedom, exceeded the 9.12 required for significance at the .001 level of confidence. Thus, the second hypothesis was rejected.

For the third hypothesis, that the Prediction and the Branching conditions would not differ with respect to posttest score and time evaluated simultaneously, an F of 3.263 was obtained. For 2 and 27 degrees of freedom, an F of 3.35 is necessary for significance at the .05 level; the obtained F was not significant, so the third hypothesis was not rejected.

Additional Analyses of the Data

In order to gain further information on the effects of the treatment conditions, the two dependent variables were also subjected to univariate analyses. As preliminary steps, Bartlett's test for homogeneity of variance (18:125-128) was performed on each variable; then each variable was subjected to analysis of variance (see Tables 8 and 9).

TABLE 8
BARTLETT'S TEST FOR HOMOGENEITY OF VARIANCES, AND ANALYSIS OF VARIANCE, FOR POSTTEST SCORES

Bartlett's Test: χ² = 3.618, df = 2, p > .10

Analysis of Variance
Source     df    Sums of Squares    Mean Squares      F
Between     2          141.901           70.950     1.301*
Within     41         2234.281           54.494
Total      43         2376.182
*p > .10

TABLE 9
BARTLETT'S TEST FOR HOMOGENEITY OF VARIANCES, AND ANALYSIS OF VARIANCE, FOR TRAINING TIMES

Bartlett's Test: χ² = 0.014, df = 2, p > .10

Analysis of Variance
Source     df    Sums of Squares    Mean Squares       F
Between     2        16668.906         8334.453    10.341*
Within     41        33044.265          805.957
Total      43        49713.171
*p < .005
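The univariate checks summarized in Tables 8 and 9 can be sketched with scipy. For illustration, the sketch below uses only the first seven training times per condition from Table 3 (a subset of the data, so the statistics will not reproduce the tabled values), and runs one of the pairwise t-tests alongside the overall tests.

```python
# Sketch of the univariate follow-up analyses: Bartlett's test for
# homogeneity of variance, a one-way analysis of variance, and one
# pairwise t-test. Uses only the first seven training times per
# condition from Table 3, so the results will not match the full-data
# values reported in the tables.
from scipy.stats import bartlett, f_oneway, ttest_ind

linear     = [210, 190, 184, 184, 232, 212, 147]
prediction = [197, 227, 148, 163, 202, 160, 143]
branching  = [157, 131, 127, 149, 122, 122, 176]

stat_b, p_b = bartlett(linear, prediction, branching)   # variance check
stat_f, p_f = f_oneway(linear, prediction, branching)   # overall ANOVA
stat_t, p_t = ttest_ind(linear, branching)              # one pairwise test
print(f"Bartlett p = {p_b:.3f}, ANOVA p = {p_f:.4f}, t p = {p_t:.4f}")
```

scipy's `ttest_ind` assumes equal variances by default, matching the classical t-test used in the study; with the full data the pooled degrees of freedom would be those shown in Table 10.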
Analysis of Posttest Scores

Bartlett's test for homogeneity of variance yielded a χ² of 3.618 which, for 2 degrees of freedom, is not significant at the .05 level. Since the hypothesis of variance homogeneity could not be rejected, an analysis of variance was performed. This yielded an F of 1.301; with 2 and 41 degrees of freedom, this value was not significant.

Analysis of Training Times

In applying Bartlett's test for homogeneity of variances, a χ² of 0.014 was obtained which, for 2 degrees of freedom, is not significant at the .05 level. The analysis of variance was then performed, and an F of 10.341 was found. For 2 and 41 degrees of freedom, this F was significant beyond the .005 level (an F of 6.07 is significant at .005, with 2 and 41 degrees of freedom).

Having found a significant difference among the three treatment conditions, the t-test was then applied to evaluate differences between pairs of treatment means (18:140-144). As shown in Table 10, all comparisons yielded significant values for t. That is, the Prediction condition had a mean training time that was significantly less than that for the Linear condition (t = 2.510, p < .05); mean training time for the Branching condition was significantly less than that for the Linear condition (t = 4.545, p < .01); and the Branching condition had a significantly lower training time than did the Prediction condition (t = 2.071, p < .05).

TABLE 10
VALUES OF t FOR COMPARISONS OF MEAN TRAINING TIMES

                               t      df       p
Linear vs. Prediction       2.510     41    p < .05
Linear vs. Branching        4.545     41    p < .01
Prediction vs. Branching    2.071     41    p < .05

CHAPTER V

SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS

Summary

A two-phase study of methods of adapting self-instructional materials to individual differences among learners was performed.
During the first phase, an adaptation procedure based on subjects' performance on the learning task (Branching condition) and an adaptation procedure based on subjects' pretraining abilities (Prediction condition) were determined empirically. In the second phase, an experiment was conducted in a computer-based instructional laboratory in which comparisons were made among the Branching condition, the Prediction condition, and a control condition involving only minimal adaptation to individual differences (Linear condition).

The Branching and Prediction procedures were developed in Phase I by administering an existing program in geometric inequalities to thirty-two high school subjects individually or in small groups. Performances of these subjects on the program were observed and recorded, and twenty-three locations in the program were found at which subjects either could receive additional remedial instruction when they demonstrated a lack of learning, or could be skipped past redundant instruction when they demonstrated adequate learning. Criteria for remediation and skipping were established, as well as the amount of remediation necessary or the amount of skipping possible; these criteria and amounts of instruction made up the Branching condition. For the Prediction condition, subjects were administered seven pretests which were then related to the decisions made at each of the twenty-three adaptation locations in the program. Five of the pretests were found to be useful predictors of subjects' performances; four of the pretests came from Guilford's Structure of Intellect model and the fifth was a test of geometry fundamentals.

For the experimental comparisons of the two adaptation procedures, forty-four subjects were randomly assigned to three treatment groups: fourteen subjects to the Linear condition, fifteen to the Branching condition, and fifteen to the Prediction condition.
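A branching decision of the kind developed in Phase I can be sketched as a simple routing rule: at an adaptation location, the learner's quiz performance determines whether he receives remedial items, continues in the main sequence, or skips ahead. The item numbers and cutoff scores below are hypothetical illustrations, not the criteria actually established in the study.

```python
# Hypothetical routing rule for one adaptation location. The cutoffs and
# item numbers are invented for illustration only.

def next_item(quiz_correct, skip_cutoff=3, continue_cutoff=2,
              skip_to=30, continue_at=26, remediate_at=19):
    if quiz_correct >= skip_cutoff:      # adequate learning demonstrated:
        return skip_to                   # bypass the redundant instruction
    if quiz_correct >= continue_cutoff:  # neither mastery nor failure:
        return continue_at               # proceed with the main sequence
    return remediate_at                  # lack of learning: remedial items
```

In the experiment proper this decision was made by the computer at each of the twenty-three locations, using the Phase I criteria and the learner's recorded responses.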
Each treatment was administered separately in a computer-based instructional laboratory. Subjects in the Linear condition were all presented 285 items of instruction, with variations in training time as the only possible adaptation to individual differences. The computer kept track of the performances of subjects in the Branching condition, and at each of the twenty-three adaptation locations the computer decided what instruction was then appropriate using the pre-established criteria. For the Prediction condition, the five pretest scores for each subject were stored in the computer, and decisions were made at the twenty-three adaptation locations as a function of these pretest scores. At the conclusion of training for all members of a given treatment condition, a posttest was administered to that group. Time spent in training by each subject was also recorded.

The scores on the posttest and the training times were analyzed simultaneously by using them as two dependent variables in a multivariate analysis of variance. The resulting F was significant at the .0005 level, leading to the rejection of the null hypothesis of no difference in mean vectors among the three treatment conditions.

Comparisons of pairs of treatment condition centroids showed that the Linear condition and the Prediction condition were not significantly different, that the Prediction condition and the Branching condition were not significantly different, and that the Linear and Branching conditions were significantly different at the .001 level. Subsequent analyses of each dependent variable by univariate analysis of variance demonstrated that there was no significant difference among the three treatment conditions on posttest score means, and that a significant difference at the .005 level occurred among the training time means.
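A pairwise comparison of treatment-condition centroids on the two dependent variables can be illustrated with a two-group Hotelling's T-squared statistic. This is a sketch of the general form such a centroid comparison takes, computed on hypothetical data; it is not a reproduction of the dissertation's own multivariate computation.

```python
# Two-group Hotelling's T^2 on (posttest score, training time) pairs.
# All data below are hypothetical and chosen only to make the arithmetic clean.

def mean_vector(rows):
    n = len(rows)
    return [sum(r[0] for r in rows) / n, sum(r[1] for r in rows) / n]

def hotelling_t2(a, b):
    """Return (T^2, equivalent F) for two samples of bivariate observations."""
    ma, mb = mean_vector(a), mean_vector(b)
    # Accumulate the pooled 2x2 sums-of-cross-products matrix
    s = [[0.0, 0.0], [0.0, 0.0]]
    for rows, m in ((a, ma), (b, mb)):
        for r in rows:
            d0, d1 = r[0] - m[0], r[1] - m[1]
            s[0][0] += d0 * d0; s[0][1] += d0 * d1
            s[1][0] += d1 * d0; s[1][1] += d1 * d1
    df = len(a) + len(b) - 2
    c = [[s[i][j] / df for j in range(2)] for i in range(2)]   # pooled covariance
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    inv = [[c[1][1] / det, -c[0][1] / det],
           [-c[1][0] / det, c[0][0] / det]]
    d = [ma[0] - mb[0], ma[1] - mb[1]]
    quad = sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))
    t2 = len(a) * len(b) / (len(a) + len(b)) * quad
    # With p = 2 variables, T^2 converts to an F with (2, n_a + n_b - 3) df
    f = (len(a) + len(b) - 3) / (2.0 * (len(a) + len(b) - 2)) * t2
    return t2, f

group_a = [(0, 0), (2, 0), (0, 2), (2, 2)]   # hypothetical (score, time) pairs
group_b = [(3, 3), (5, 3), (3, 5), (5, 5)]
t2, f = hotelling_t2(group_a, group_b)
```

The overall three-group multivariate test reported above involves a generalization of this statistic (of the kind treated in references 1, 9, and 42); the sketch shows only the centroid-difference idea behind the pairwise comparisons.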
Further analysis of the training times showed that the Linear condition took significantly more time than either the Prediction condition (p < .05) or the Branching condition (p < .01), and that the Branching condition took significantly less time than the Prediction condition (p < .05).

Conclusions

Examination of the centroids presented in Figure 4 shows a relatively clear ordering of the three treatment conditions. The Linear condition is the least effective, the Branching condition is the most effective, and the Prediction condition falls in between. Rejection of the null hypothesis of no difference between centroids of the Branching and Linear groups implies that an adaptation procedure based upon subject performance on the learning task is a useful strategy to follow. However, the finding that the Prediction condition does not differ from either the Branching or the Linear conditions tends to cloud the issue of whether adaptation to individual differences is necessary, and, if it is, what approach is to be preferred.

Some clarification of this situation is possible when the results of the training times analysis are considered. Here, the ordering of the three conditions is significant. This implies that, when a self-instructional program produces essentially equivalent learning among subjects regardless of whether or not individual differences are considered, attending to such individuality in learning can result in significant savings of training time. In addition, it seems that when individual differences are to be considered, a method using performance on the learning task (branching) is preferable to a method based on pretraining assessment (prediction).

Implications for Adapting to Individual Differences

When an existing self-instructional program is to be used, with no important revisions to the program being carried out, this study shows that branching is superior to a linear presentation of the instruction.
This superiority is evident when both amount learned and time spent on training are considered.

Further, when using an existing program and when training time is critical, this study has shown that prediction will result in significant savings of learners' time over the time spent with a linear presentation. The results also show that additional savings in training time, beyond those found for prediction, are possible when branching is carried out.

In general, then, three conclusions relevant to adapting instruction to individual differences are possible from the results of this investigation:

1. The strategy of varying instruction on the basis of learners' abilities can reduce training times significantly. This substantiates suggestions that prediction can be more effective than linear presentation.

2. A branching strategy can take significantly less training time than either prediction or linear strategies of programmed instruction.

3. When both amount learned and training time are of interest, branching is significantly superior to a linear presentation. This supports some earlier research comparing branching and linear programmed instruction.

These conclusions, and their implications for adapting to individual differences, must be evaluated in light of the restrictions in this study. As indicated in Chapter I, geometry represents a unique subject matter because of the interrelationships among topics and the orderly development of instruction. And, learners who are enrolled in geometry are a restricted sample of a high school population, due to their successes with prior mathematics. Such restrictions imply that the conclusions and implications set forth above may well be limited to highly organized subject matter areas (e.g., mathematics) and to learners who have achieved some success in prior, related areas.
Need for Further Research

While the results of this investigation yielded some conclusions applicable to the employment of strategies for adaptation to individual differences, a more important conclusion would be that further research in the area is necessary. Three kinds of needed research have been recognized: research on branching, on prediction, and on adaptation in general.

On Branching

In this study, a completely developed self-instructional program was used, and a branching structure was added to it. During the empirical trials in Phase I, it became clear that the most effective possible branching procedure could not be developed within the restriction of using an existing program. That is, these trials showed the necessity for developing the branching structure as an integral part of the program, while the program was being prepared.

Specifically, two closely related research activities concerned with branching are needed: the study of appropriate criteria for determining whether or not a given learner has mastered a given task, and the study of appropriate amounts and kinds of instruction to be presented as the result of a given learner's demonstration of mastery or lack of it. Answers to two questions integrally involved in branching are lacking: (1) what data on learner performance should be recorded, and how should it be used to evaluate the learner's mastery; and (2) when a learner demonstrates mastery, how much and what kind of instruction can he afford to skip, and when a learner demonstrates a lack of mastery, how much and what kind of remediation is appropriate. Until more is known about methods of assessing learning and about methods of responding to the assessment, branching programmed instruction will not reach its maximum capabilities.

On Prediction

In this study, a theoretical model of human intellectual abilities was examined to locate measures related to the learning of a particular content.
While this approach resulted in a method of predicting instructional sequences for different learners, the predictions were less than optimal. The particular problem which needs to be resolved is the empirical determination of measurable learner characteristics that are related to the mastery of some task.

Ideally, within the limits of measurement error and chance error, it should be possible to measure certain characteristics of learners and to designate which ones will and will not achieve certain objectives from the presentation of certain instruction. In such an ideal situation, it would be possible to specify prior to training which learners would need only the minimal amount of instruction, which ones would need the maximal amount, and gradations of amounts of instruction between these extremes.

At present, this ideal cannot be attained, and research is needed. One fruitful source might be the application of factor analytic techniques to batteries of measures and learner performance data. Factor analytic investigations would provide useful information on measures related to the achievement of objectives of instruction.

On Adaptation

This study suggests the possibility of a third method of adapting programmed instruction to individual differences, in which prediction and branching techniques are combined. It would appear to be useful to assess learner characteristics prior to instruction and to make predictions of appropriate sequences of material, then to modify these predictions as a function of learner performance while proceeding with the learning task. Research evaluating the relative effectiveness of prediction, branching, and prediction modified by branching might demonstrate the utility of a third method for adapting to individual differences.

Recommendations

The following specific recommendations are made on the basis of the results of this study:

1.
When a computer-based instructional system is available, a method of branching should be determined empirically and used during instruction.

2. In the absence of a computer-based instructional system, a method of prediction should be developed empirically and used during instruction.

3. Further research should be carried out in three areas: (a) methods of assessing learner performance and methods of branching resulting from such assessments; (b) determination of adequate measures of pretraining learner characteristics and of the relations between these characteristics and appropriate amounts and kinds of instruction; and (c) strategies for combining prediction and branching procedures in order to improve adaptation to individual differences.

To apply the first recommendation, it would be necessary to administer a program to subjects individually, so that branching criteria could be located and means of varying the presentation of instruction as a function of subjects' performances could be ascertained. Once a branching procedure has been developed from observations of a few subjects, it should be verified and refined by using it with very small samples of subjects who are under continuous observation.

In applying the second recommendation, it seems appropriate to specify the behavior to be exhibited by learners when proceeding with the program, and to locate ability tests which measure the same or highly related behaviors. Using a model of human intelligence, like the Structure of Intellect, appears to be a useful way of locating tests related to behavior to be exhibited on the program. Once a set of tests has been isolated, the relationship between scores on these tests and performance with the program should be determined, and a method of predicting instructional sequences from pretest scores established.

BIBLIOGRAPHY

1. Anderson, T. W. An Introduction to Multivariate Statistical Analysis. New York: Wiley, 1958.

2. Beane, D.
G. A Comparison of Linear and Branching Techniques of Programed Instruction in Plane Geometry. Technical Report No. 1. Urbana: University of Illinois, 1962.

3. Bitzer, D. L., Braunfeld, P. G., and Lichtenberger, W. W. "PLATO II: A Multiple-Student, Computer-Controlled, Automatic Teaching Device." In J. E. Coulson (ed.), Programmed Learning and Computer-Based Instruction. New York: Wiley, 1962.

4. Braunfeld, P. G., and Fosdick, L. D. "The Use of an Automated Computer System in Teaching," IRE Transactions on Education, E-5:156-167, 1962.

5. Campbell, V. N. Adjusting Self-Instructional Programs to Individual Differences: Studies of Cueing, Responding and Bypassing. San Mateo: American Institute for Research, 1961.

6. ________. "Bypassing as a Way of Adapting Self-Instruction Programs to Individual Differences," Journal of Educational Psychology, 54:337-345, 1961.

7. Carroll, J. B. "Programed Instruction and Student Ability," Journal of Programed Instruction, 2:7-11, 1963.

8. Chapman, R. L., and Carpenter, J. T. "Computer Techniques in Instruction." In J. E. Coulson (ed.), Programmed Learning and Computer-Based Instruction. New York: Wiley, 1962. Pp. 240-253.

9. Cooley, W. W., and Lohnes, P. R. Multivariate Procedures for the Behavioral Sciences. New York: Wiley, 1962.

10. Coulson, J. E. "A Computer-Based Laboratory for Research and Development in Education." In J. E. Coulson (ed.), Programmed Learning and Computer-Based Instruction. New York: Wiley, 1962.

11. ________. Research with a Program on Geometric Inequalities. TM-895/102/00. Santa Monica: System Development Corp., 1964.

12. Coulson, J. E., Estavan, D. P., Melaragno, R. J., and Silberman, H. F. "Effects of Branching in a Computer Controlled Autoinstructional Device," Journal of Applied Psychology, 46:389-392, 1962.

13. Coulson, J. E., Melaragno, R. J., and Silberman, H. F. Non-Program Variables in the Application of Programmed Instruction.
Santa Monica: System Development Corp., 1965.

14. Coulson, J. E., and Silberman, H. F. "Effects of Three Variables in a Teaching Machine," Journal of Educational Psychology, 51:135-143, 1960.

15. Cronbach, L. J. "The Two Disciplines of Scientific Psychology," American Psychologist, 12:671-684, 1957.

16. Crowder, N. A. "Intrinsic Programing: Facts, Fallacies, and Future." In R. T. Filep (ed.), Prospectives in Programing. New York: Macmillan, 1963. Pp. 84-115.

17. Eckstrand, G. A. "Individuality in the Learning Process: Some Issues and Implications," The Psychological Record, 12:405-416, 1962.

18. Edwards, A. L. Experimental Design in Psychological Research. New York: Rinehart, 1960.

19. Evans, J. L. "Programming in Mathematics and Logic." In R. Glaser (ed.), Teaching Machines and Programed Learning, II. Washington, D.C.: National Education Assn., 1965. Pp. 371-440.

20. Fry, E. B. Teaching Machines and Programmed Instruction. New York: McGraw-Hill, 1963.

21. Gershon, A., Guilford, J. P., and Merrifield, P. R. "Figural and Symbolic Divergent-Production Abilities in Adolescent and Adult Populations," Reports from the Psychological Laboratory, No. 29. Los Angeles: University of Southern California, 1963.

22. Glaser, R. Teaching Machines and Programed Learning, II. Washington, D.C.: National Education Assn., 1965.

23. Glaser, R., Reynolds, J. H., and Harakas, T. An Experimental Comparison of a Small-Step Single Track Program with a Large-Step Multi-Track (Branching) Program. Pittsburgh: University of Pittsburgh, 1962.

24. Goldbeck, R. A., and Campbell, V. N. "The Effects of Response Mode and Response Difficulty in Programed Learning," Journal of Educational Psychology, 53:110-118, 1962.

25. Gotkin, L. G. "Cognitive Development and the Issue of Individual Differences," Programed Instruction, 4:1, 1964.

26. Guilford, J. P., Christensen, P. R., Merrifield, P. R., and Frick, J. W.
"An Investigation of Symbolic Factors of Cognition and Convergent Production," Reports from the Psychological Labor atory. Ho. 23. Los Angeles: University of Southern California, 1960. Guilford, J. P., and Merrlfleld, V. R. "The Structure of Intellect Model: Its Uses and Implications," Reports from the Psychological Laboratory. Ho. 24. Los Angeles: ifoiversity of Southern California, 1960. Hirsh, J. "Individual Differences in Behavior and Their Genetic Basis." In E. L. Bliss (ed.), Roots1 of Behavior. New York: Harper, 1962. Hoepfner, R., Guilford, J. P., and Merrlfleld, P. R. "A Factor Analysis of the Symbolic-Evaluatlon Abilities," Reports from the Psychological Labora tory. No. 33. Los Angeles: bnlverslty of Southern Californ£a,”1964. Holland, J. G. "Research on Programing Variables." In R. Glaser (ed.), Teaching Machines and Pro gramed Learning. II. Washington: National Educa- | tion Assn., 1^55. Pp. 66-117. 31. 32. 33. 34. 35. 36. 37. 38. 39. 75 Holland, J. G., and Porter, D. "The Influence of Repetition of Incorrectly-Answered Items in a Teaching-Machine Program,** Journal of the Experi mental Analysis of Behavior. 4;305-307. 1^61. Jenkins, J. J. "Comments on Professor Noble's Paper.** In C. N. Cofer (ed.), Verbal Learning and Verbal Behavior. New Tork: Mcd»raw-ttill. 19ol. Pp7“"T5F- T31T---- Lambert, P., Miller, D. M., and Wiley, D. £. "Experi mental Folklore and Experimentation," Journal of Educational Research. 55:485-494, 1962. Lumsdalne, A. A. "Assessing the Effectiveness of Instructional Programs." In R. Glaser (ed.), Teaching Machines and Programed Learning. II. Washington, D.d.: National Education Assn., 1965. Pp. 267-320. McDonald, F. J., and Allen, D. W. "An Investigation of Presentation, Response, and Correction Factors in Programmed Instruction," Journal of Educa tional Research. 55:502-507, 1962. Merrlfleld, P. R., Christensen, P. R., Guilford, J. P., and Frick, J. W. 
"A Factor-Analytic Study of Problem-Solving Abilities," Reports from the Psychological Laboratory. No. 22. Los Angeles: University of Southern California, 1960. Nihlra, K., Guilford, J. P., Hoepfner, R., and Merrlfleld, P. R. "A Factor Analysis of the Semantic-Evaluation Abilities," Reports from the Psychological Laboratory. No. 32. Los Angeles: Oniverslty ok Southern California, 1964. Noble, C. E. "Verbal Learning and Individual Differ- I ences." In C. N. Cofer (ed.), Verbal Learning and Verbal Behavior. New York: McGraw-Hill, T961. Pp. 132-146. Petersen, H., Guilford, J. P., Hoepfner, R., and Merrlfleld, P. R. "Determination of 'Structure- of-Intellect' Abilities Involved in Ninth-Grade Algebra and General Mathematics," Reports from thej Psychological Laboratory. No. 31. los Angeles: University of Southern California, 1963. 40. 41. 42. 43. 44. 45. 46. 47. 48. 49. 50. 51. 76 Pressey, S. L. "A Machine for Automatic Teaching of Drill Material," School and Society. 25:549-552, 1927. . "A Third and Fourth Contribution toward the Coning 'Industrial Revolution* In Education," School ana Society. 36:668-672, 1932. Rao, C. R. Advanced Statistical Methods In Biometric Research. New York: Wiley. 1^52. Rlgney, J. W. "Potential Uses of Computers as Teach ing Machines." In J. E. Coulson (ed.), Programmed Learning and Computer-Based Instruction. New ITorkT yiley, ISrfj.PpT 155-170"------- Roe, A. "A Comparison of Branching Methods for Pro grammed learning," Journal of Educational Research. 55:407-416, 1962. Saltzman. I. J. Letter to the Editor, Contemporary Psychology. 9:238, 1964. Shay, C. B. "Relationship of Intelligence to Step Size on a Teaching Machine Program," Journal of Educational Psychology. 52:98-103, 1961. Sllberman, H. F. Exploratory Research on a Beginning Reading Program. 1^-6^5/100/00. Santa fefonlca: System Development Corp., 1964. Sllberman, H. F., Coulson, J. E., Gunn, E., and Melaragno, R. J. 
Development and Evaluation of Self-Instructional Materials for Underachieving and Overachieving Students. Santa Monica: System Development Corp., 1962.

49. Silberman, H. F., Melaragno, R. J., Coulson, J. E., and Estavan, D. "Fixed Sequence Versus Branching Autoinstructional Methods," Journal of Educational Psychology, 52:166-172, 1961.

50. Skinner, B. F. "The Science of Learning and the Art of Teaching," Harvard Educational Review, 24:86-97, 1954.

51. ________. "Teaching Machines," Science, 128:969-977, 1958.

52. Stolurow, L. M., and Davis, D. "Teaching Machines and Computer-Based Systems." In R. Glaser (ed.), Teaching Machines and Programed Learning, II. Washington, D.C.: National Education Assn., 1965. Pp. 162-212.

53. Uttal, W. R. "On Conversational Interaction." In J. E. Coulson (ed.), Programmed Learning and Computer-Based Instruction. New York: Wiley, 1962. Pp. 171-190.

54. Wallis, W. A., and Roberts, H. V. Statistics: A New Approach. Glencoe, Ill.: Free Press, 1956.

55. Williams, T. P. "A Comparison of Several Response Modes in a Review Program," Journal of Educational Psychology, 54:253-260, 1963.

APPENDIX A

SAMPLES OF QUESTIONS FROM THE SIX PRETESTS

SAMPLE QUESTIONS FROM SYMBOLIC REASONING TEST

Note: Directions indicate that examinee is given a statement and a conclusion. The task is to determine whether each conclusion is definitely true, definitely false, or impossible to determine definitely. The following symbols are defined:

= means "is equal to"
> means "is larger than"
< means "is smaller than"
"is not equal" means "and so is larger or smaller"
"is not larger" means "and so is equal or smaller"
"is not smaller" means "and so is equal or larger"

5. A > B < C, therefore, A < C

11. A is not equal to B > C, therefore, A > C

SAMPLE QUESTIONS FROM HIDDEN FIGURES TEST

Note: Directions indicate that five basic figures are given, and examinee is to locate one of them hidden within another figure.
[Basic figures not reproduced.]

SAMPLE QUESTIONS FROM OPERATIONS SEQUENCE TEST

Note: Directions indicate the examinee is to arrange arithmetic operations in an order that will give a specified result. The examinee is given a number, a set of operations, and a specified result; he is to obtain the result by performing all operations. The task is to determine the correct order in which the operations are to be performed. The following sample is given:

Start with 5: A. x2, B. divide by 7, C. -3; and obtain 1. The correct sequence is shown to be: A, C, B.

4. Start with 10: A. +2, B. -8, C. x5; and obtain 12.

10. Start with 25: A. x4, B. square root, C. -9; and obtain 16.

SAMPLE QUESTIONS FROM PROBLEM SOLVING TEST

Note: Directions instruct examinee to work out problems on scratch paper, then to select the answer that is nearest to his own solution.

4. An underwater mine will explode if a submarine passes within 30 feet of it. What is the greatest distance apart (in feet) the mines can be placed if a submarine (20 feet wide) is to be sunk as it travels through the mine field at 6 miles per hour?
A. 5   B. 12   C. 30   D. 60   E. 180

8. One gun can fire 40 shells while another fires 50 shells. How many shells could the second gun fire while the first one is firing 60 shells?
A. 55   B. 60   C. 65   D. 70   E. 75

SAMPLE QUESTIONS FROM MATCH PROBLEM TEST

Note: Directions indicate that a pattern of headless matches is given, and the examinee is to take away a certain number of matches to leave a particular new pattern. Matches are "taken away" by marking through them. Additional solutions using different logic are to be given. The following example is given (match patterns not reproduced): TAKE AWAY 3 MATCHES, LEAVING 4 SQUARES.

2. Take away 6 matches, leaving 6 squares.

4. Take away 7 matches, leaving triangles.

SAMPLE QUESTIONS FROM SYMBOL GROUPING TEST

Note: Directions indicate that examinee is given a set of symbols and is to arrange them so that all the X's are first, then all the -'s, then all the O's.
The examinee may move any group of one, two, three, or more adjacent symbols at a time. The object is to arrange the symbols correctly in the least number of steps. For any step, the examinee is to circle the symbols to be moved, and to put an arrowhead at the position to which the group is moved. [Worked example not reproduced.]

2. X - O X X -

11. - X - O - X O X X

SAMPLE QUESTIONS FROM GEOMETRY FUNDAMENTALS TEST

Here are some axioms:
1. If things that are equal are added to things that are equal, the sums are equal.
2. If things that are equal are subtracted from things that are equal, the differences are equal.
3. Doubles of equals are equal.
4. Halves of equals are equal.
5. Things equal to the same things are equal to each other.

19. If j = k, then 5 - j = 5 - k. This is true because of:
1. Axiom 1   2. Axiom 2   3. Axiom 3   4. Axiom 4   5. Axiom 5

2. In Fig. A, angle 1 = 30 degrees. [Figure and remainder of item not reproduced.]

22. In the figure opposite, the angle included by YX and ZX is:
1. angle 1   2. angle 2   3. angle X   4. angle Y   5. angle Z

26. If RS = ST, which angles are equal?

APPENDIX B

POSTTEST AND ANSWER SHEET

GEOMETRY TEST

1. Using the sketch, angle x = 120 degrees because __?
1. Exterior angle of a triangle > either remote interior angle.
2. Whole = sum of its parts.
3. Sum of angles of a triangle = 180 degrees.
4. Exterior angle of a triangle = sum of all the interior angles.
5. Exterior angle of a triangle = sum of its remote interior angles.

2. How long is the hypotenuse of a right triangle?
1. Always the longest side.
2. Sometimes the longest side.
3. Usually next to longest side.
4. There is always a longer side.
5. There is no definite answer.

3. If angle MPO > angle ONM and angle 1 = angle 4, then angle 3 _?_ angle 5.
1. >   2. <   3. =   4. Can't say definitely.

4. Consider the following statements when only positive numbers are used. Circle the number of each statement that is false.
1. = plus unequals gives unequals of same order.
2. = minus unequals gives unequals of same order.
3. = times unequals gives unequals of same order.
4. Unequals minus = gives unequals of same order.
5. All statements are true.

5. Which statement is false?
1. angle 3 > angle 1
2. angle 2 < angle 3
3.
angle 1 + angle 2 > angle 3
4. angle 1 + angle 2 = angle 3
5. All statements are true.

6. Given: the numbers A, B, C, and D are positive. If A = B and C > D, then A - C < B - D, because __?
1. = minus unequals gives unequals of opposite order.
2. = minus = gives unequals of same order.
3. Unequals minus unequals gives unequals of opposite order.
4. = plus unequals gives unequals of opposite order.
5. = minus unequals gives unequals of same order.

7. If a is not equal to y, and a is not less than y, then __?
1. a > y   2. a < y   3. a = y   4. Can't say definitely.

8. If m + n + p < 5, what is the value of m + n (when m, n, p are all positive)?
1. m + n = 5 + p
2. m + n > 5 - p
3. m + n < 5p
4. m + n < 5 - p
5. Can't say definitely.

9. Which of the following statements is the most complete and also true description of these inequalities? a > b; b < a
1. They have different meanings and they have the opposite order.
2. They have different meanings and they have the same order.
3. They have different meanings.
4. They have the same meaning.
5. They have the same order.

10. In drawing a sketch for a problem starting with the phrase, "In a triangle ..." what kind of triangle would be best to draw?
1. right   2. scalene   3. isosceles   4. equilateral   5. obtuse

11. What is the name of the axiom that says: If m < p and p < r, then m < r?
1. Substitution Axiom   2. Trichometry Axiom   3. Transitive Axiom   4. Trichotomy Axiom   5. Inequality Axiom

12. What is always true about an exterior angle of a triangle and its adjacent interior angle?
1. Exterior angle is greater than adjacent interior angle
2. Exterior angle is equal to adjacent interior angle
3. Exterior angle is less than adjacent interior angle
4. No statement is always true

13. What statement means the same as this statement: CD < RS?
1. RS < CD   2. DC = SR   3. DC > SR   4. SR > DC   5. CD > RS

14. What is the name of the axiom that says: k > s, or k = s, or k < s?
1. Substitution Axiom   2. Trichotomy Axiom   3. Transitive Axiom   4. Trichometry Axiom   5. Inequality Axiom

Answer the next three questions using this sketch (not reproduced; in it, angle 3 is a right angle).

15.
Apply this theorem: exterior angle of a triangle > either remote interior angle.
1. [illegible]
2. angle 4 > angle 1
3. angle 8 > angle 6
4. angle 3 > angle 5
5. There is no application possible.

16. Apply this theorem: hypotenuse > either leg of a right triangle.
1. TW > UV   2. TX > TU   3. TY > TX   4. UX > TU   5. There is no application possible.

17. Apply this theorem: whole > any of its parts.
1. angle TXW > angle 7   2. TY > TV   3. TW > TV   4. angle 3 > angle 1

18. What is the shortest distance from point X to point Z? Why?

19. FT = 22". Why?

20. Write, in English, the meaning of these three statements: (1) c > f   (2) r > v   (3) b < w

21. Apply the Transitive Axiom to these statements: [first statement illegible]; C > D.

22. Why is RS > SU?

23. Name all exterior angles of triangle MSK.

24. What is the hypothesis of the Trichotomy Axiom: "A quantity is equal to, greater than, or less than a second quantity of the same kind"?

25. State the conclusion of the following Idea: "The diagonals of a rhombus are perpendicular to each other."

26. Idea: If 2 sides of a triangle are unequal, then the opposite angles are unequal, with the larger angle opposite the larger side. Application: If ____, then angle J > angle K.

27. State the following Idea in If . . . Then . . . form: "The exterior angle of a triangle is equal to the sum of its remote interior angles."

28. Complete the following proof. GIVEN: Triangle ABC is isosceles; AB = AC; angle 4 > angle 3. PROVE: angle ACD > angle ABD.

29. Complete the following proof. GIVEN: Triangle VST, with altitudes QT, PV, and RS. PROVE: VS + ST + VT > VP + QT + SR.

30. Complete the following proof. GIVEN: Triangle PQR; PQ = QS. PROVE: angle P > angle R.

31. Complete the following proof. GIVEN: Triangle ABC, with sides a, b, c. PROVE: a < (a + b + c)/2.

32. In triangle ABC, AB = AC. Extend AB beyond point B. Next, choose "D" as some point on that extension. Then, draw line CD. Finally, prove that angle ACD > angle D.

Directions: Draw and label a sketch in the provided box. Next, complete the proof. It requires exactly six steps, including step 4, which is a clue.
Problem: ABC is a A in which side AB I s greater than side AC. The bisectors of /B and JC are drawn, extending just to their point of intersection, 0. Prove that OB f OC. Directions: (1) Draw and label an accurate sketch in the box. (2) In order to prove that OB + OC, you oust prove either that OB > OC or else that OB < ■ OC. By examining your sketch, decide which should be correct and write it in the PROVE blank• (3) Complete the formal proof. GIVEN: A ABC; OB bisects /B; OC bisects JC. PROVE: ? GEOMETRY ANSWER SHEET 95 1. 1 2 3 4 5 10. 1 2 3 4 5 2. 1 2 3 4 5 11. 1 2 3 4 5 3. 1 2 3 4 12. 1 2 3 4 4. 1 2 3 4 5 13. 1 2 3 4 5 5. 1 2 3 4 5 14. 1 2 3 4 5 6. 1 2 3 4 5 15. 1 2 3 4 5 7. 1 2 3 4 16. 1 2 3 4 5 8. 1 2 3 4 5 17. 1 2 3 4 9. 1 2 3 4 5 18. Shortest distance: Reason: 19. 20. (1) (2) (3) i 21. 22. Reason « • 23. Angles 24. Hypothesis: 25. Conclusion: 26. If then /I > /K. ! 27. If Then 96 28. Complete the following proof. 1. /4 > /3. 1• Given. 2. J2 - /l. 2. 3. 12 + /4 > /3 + /l. 3. 4. /ACD > /ABD. 4. 29. Complete the following proof. 1. AVQT. AVSP, ASRT are right triangles. 1. Perpendiculars from right Js at P, Q. R. Any A with a right / is a right A . 2. VS > VP; ST > SR; VT > QT. 2. 3. VS + ST + VT > VP + SR + QT. 3. 30. Complete the following proof 97 1. APQR; PQ - QS. 1. Given. 2. Jl > /R. 2. 3. /I - /R. 3. 4. /P > /R. 4. 31. Complete the following proof. 1. A ABC, with sides a, b, c. 1. Given. 2. b + c > a. 2. b + c „ a 3. b + c . a ^ a . a 4. T - + 1 1 + 7. 4. 5 b + c + a >a 5. 32. Draw and label a sketch. Complete the proof. SKETCH 1. A ABC; AB - AC; D is an extension of AB. 1. Given. 2. 2. 3. 3. 4. /ACB > /D. 4. 5. 5. 6. J_ACD > Jp. 6. 99 33. Draw and label a sketch. Write the correct PROVE. Couplete the proof. SKETCH PROVE: 1. A ABC: AB > AC; OB bisects /B; OC bisects JC. 1. Given. 2. 2. 3. 3. APPENDIX C BRANCHING STRUCTURE FOR GEOMETRY PROGRAM 101 PART 1 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 15, 16, 18 correct? 
[The remainder of Appendix C is a set of flowcharts, one per part of the program, flattened in this transcription. The recoverable structure: Part 1 covers frames 1–52; Part 2 begins at frame 53, Part 3 at frame 90, Part 4 at frame 136, Part 5 at frame 202, and Part 6 at frame 271, ending at frame 306. Each part is a linear sequence of frames broken by decision points, either quiz problems ("Problem T1 correct?", "Problem T2 correct?", "Problem T3 correct?") or proof criteria ("2 of 4 proofs correct?", "3 of 5 proofs correct?"); at each decision point the Yes and No outcomes lead through different frame sequences before rejoining the main path.]
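The branching structure in Appendix C, linear frame sequences interrupted by checkpoints whose outcome determines whether remedial frames are presented, can be sketched as a simple frame-sequencing loop. This is an illustrative reconstruction only: the function name, the checkpoint table, and the frame numbers below are invented for the example and are not the program's actual tables.

```python
# Minimal sketch of intrinsic branching: a linear frame sequence with
# checkpoints. A correct answer at a checkpoint jumps past the remedial
# frames that follow it; an incorrect answer falls through into them.
# All names and numbers here are illustrative assumptions.

def run_program(frames, checkpoints, answers):
    """Return the list of frames a learner actually sees.

    frames      -- ordered list of frame numbers
    checkpoints -- {checkpoint_frame: frame_to_skip_to_if_correct}
    answers     -- {checkpoint_frame: True/False} checkpoint results
    """
    seen = []
    i = 0
    while i < len(frames):
        frame = frames[i]
        seen.append(frame)
        if frame in checkpoints and answers.get(frame, False):
            # Correct: branch forward past the remedial frames.
            i = frames.index(checkpoints[frame])
            continue
        i += 1  # incorrect (or no checkpoint): take the next frame
    return seen

# Example: frames 1-8, checkpoint at frame 4, remedial frames 5-6.
frames = [1, 2, 3, 4, 5, 6, 7, 8]
checkpoints = {4: 7}

fast = run_program(frames, checkpoints, {4: True})   # skips 5 and 6
slow = run_program(frames, checkpoints, {4: False})  # sees every frame
```

A fast learner's path omits the remedial frames entirely, which is what makes total frame counts (and therefore completion times) differ between learners.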
APPENDIX D
SAMPLE ITEMS FROM THE GEOMETRY PROGRAM

EXAMPLE OF A MULTIPLE-CHOICE ITEM
By the theorem, "An exterior angle of a triangle equals the sum of the two remote interior angles," we see that ∠ZYW = ?
a. 160°
b. 180°
c. 130°

EXAMPLE OF A NO-RESPONSE ITEM
To summarize the theorem just presented, an exterior ∠ of a △ is greater than either remote interior ∠. In the figure below, ∠2 is an exterior ∠, and ∠A and ∠B are the two remote interior ∠s. ∠2 > ∠B and ∠2 > ∠A.

EXAMPLE OF A QUIZ
QUIZ 4
T1. List all exterior angles of △DEG.
T2. Complete the following proof.
GIVEN: △ABC; straight lines AF and DE.
PROVE: ∠1 > ∠4.
1. △ABC; st. lines AF, DE.   1. Given.
2. ∠1 > ∠DAE.                2. ________
3. ∠4 < ∠DAE.                3. ________
4. ∠1 > ∠4.                  4. ________

EXAMPLE OF A QUIZ FEEDBACK ITEM
T2. Reason 2: Exterior ∠ of a △ > either of its remote interior ∠s.
Reason 3: A whole > any one of its parts.
Reason 4: Transitive Axiom.
How well did you answer T2?
a. I had all three reasons correct.
b. I had some of the reasons correct.
c. I was completely wrong.

EXAMPLE OF A REMEDIAL ITEM
PROBLEM T2
GIVEN: △ABC; straight lines AF and DE.
PROVE: ∠1 > ∠4.
1. △ABC; st. lines AF, DE.   1. Given.
2. ∠1 > ∠DAE.                2. Ext. ∠ > either remote int. ∠.
3. ∠DAE > ∠4.                3. Whole > any one of its parts.
4. ∠1 > ∠4.                  4. Transitive Axiom.
#2: ∠1 is an exterior ∠ of △DAE. Then, ∠1 > remote int. ∠DAE.
#3: ∠DAE is composed of ∠s 3 and 4. Then, whole ∠DAE > its part, ∠4.
#4: We have shown that ∠1 > ∠DAE, and ∠DAE > ∠4. Then, ∠1 > ∠4, by the Transitive Axiom.

APPENDIX E
INSTRUCTIONS TO SUBJECTS

You are about to begin your participation in a study of computer-based instruction. The purpose of this study is to find out how best to teach geometry to high school students.

You are now in a 20-station automated classroom. Each one of you will be communicating with a large computer, which is downstairs, while you are learning some geometry.
Let me tell you how you will proceed. On each of your desks is a notebook with instruction in geometry. All your instruction is contained in the notebook. Also on your desk is a response device, which you will use to communicate with the computer. Here is how:
a. A number will appear in the four windows at the top of the response device.
b. Turn to that page in your notebook and read it.
c. If the page asks a question, it will be in multiple-choice form, with possible answers "a" through "e". Choose your answer, and press the button with the letter of your chosen answer.
d. When you are ready to have the computer evaluate your answer, press the large ENTER bar on the side of the response device.
You can change your answer if you want to. For example, if you had chosen "d" and had pressed the "d" key, then read the question again and decided that "b" was a better answer, just press the "b" button. Once you have pressed the ENTER bar, however, you cannot change your mind. The last answer button you press before pressing the ENTER bar is taken to be your answer.
After you have entered your answer, the computer will accept it and tell you if you were right or wrong by the lights on the top of the response device. A green light means "right," and a red light means "wrong." Also, a light will come on next to the correct answer button. Then, you will be told both whether you are right or wrong and what the correct answer is, for every question you answer.
When you are ready to go to the next page, press the ENTER bar again. All the lights will go off, all buttons will be released, and a new page number will appear in the four windows. Turn to the new page, and continue. If a page does not contain a question, simply read the page. Then press the ENTER bar when you are ready to continue.
Let's go over that again.
a. First, turn to the page number given in the four windows, and read it.
b. Answer questions by pressing an answer button and the ENTER bar.
c.
After you have found out if you are right or wrong, and what the correct answer is, press the ENTER bar again.
d. If there is no question on a page, read the page and press the ENTER bar when you are ready to go on.
Do you have any questions about how to use the equipment? (Answer any questions by rereading the appropriate directions.)
If you have any trouble with the equipment, raise your hand. Do not ask for help with geometry. All the instruction is contained in the notebook.
There are six quizzes contained in the materials. When you come to a quiz, write your answers right on the quiz page. Then, tear the quiz page out of the notebook and press the ENTER bar. The next pages will give you the correct answers to the questions in the quiz, and will ask you how well you answered them.
9. Scratch paper and a pencil are also on the desk. Use them as you see fit.
10. You will be coming here each afternoon for the next three or four days. It takes different students different amounts of time to complete this notebook, so I cannot tell you exactly how long you will be here. After all of you have finished, you will be given a test covering the material in the notebook. After you have taken the test, you will be paid your five dollars.
11. Now, are there any more questions? (Answer questions by rereading the appropriate parts of the instructions.) OK, let's begin.
12. (If any subject has difficulty with the equipment, show him how to use it.)
13. (After all subjects have completed Quiz 1.) Let me interrupt you for a minute. You have all gone through one of the quizzes. That is how you will proceed with the other five quizzes. Write your answers on the quiz sheet, tear out the quiz, press the ENTER bar, and compare your quiz answers with the correct ones. Go ahead now.
14. (When a session is to be ended.) That's all for today. Leave everything as it is on your desk, and take the same seat tomorrow.
15. (When a subject is finished.) Good work.
Now study your school work until we are through. (Remove materials from subject's desk.)
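The response-device protocol in the instructions above (answers may be changed freely until the ENTER bar is pressed; the first ENTER locks and evaluates the answer with a green or red light; the second ENTER advances to the next page) can be sketched as a small state machine. The class name, method names, and page numbers below are illustrative assumptions, not the actual equipment's interface.

```python
# Sketch of the response-station protocol: change answers until ENTER,
# first ENTER evaluates (green = right, red = wrong), second ENTER
# advances. All identifiers here are illustrative, not the real device.

class ResponseStation:
    def __init__(self, correct_answers):
        self.correct_answers = correct_answers  # {page: correct letter}
        self.page = None
        self.choice = None
        self.entered = False
        self.light = None  # "green" or "red" once evaluated

    def show_page(self, page):
        """Display a new page number; reset buttons and lights."""
        self.page = page
        self.choice = None
        self.entered = False
        self.light = None

    def press(self, letter):
        """Answer buttons only register before the ENTER bar is pressed."""
        if not self.entered:
            self.choice = letter  # last press before ENTER wins

    def press_enter(self):
        """First press locks and evaluates; second press advances."""
        if not self.entered:
            self.entered = True
            right = self.choice == self.correct_answers.get(self.page)
            self.light = "green" if right else "red"
            return self.light
        return "advance"

station = ResponseStation({12: "b"})
station.show_page(12)
station.press("d")
station.press("b")              # changed mind before ENTER: allowed
first = station.press_enter()   # locks "b" and evaluates it
station.press("a")              # ignored: answer already entered
second = station.press_enter()  # advances to the next page
```

The key behavior the instructions emphasize (that only the last button pressed before ENTER counts, and that nothing can be changed afterward) falls out of the `entered` flag guarding both `press` and the evaluation step.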
Asset Metadata
Creator: Melaragno, Ralph James (author)
Core Title: A Comparison Of Two Methods Of Adapting Self-Instructional Materials To Individual Differences Among Learners
Degree: Doctor of Philosophy
Degree Program: Psychology
Publisher: University of Southern California (original), University of Southern California. Libraries (digital)
Tag: OAI-PMH Harvest, psychology, general
Language: English
Contributor: Digitized by ProQuest (provenance)
Advisor: Ruch, Floyd L. (committee chair), Allen, William H. (committee member), Slucki, Henry (committee member)
Permanent Link (DOI): https://doi.org/10.25549/usctheses-c18-108958
Unique identifier: UC11360223
Identifier: 6702113.pdf (filename), usctheses-c18-108958 (legacy record id)
Legacy Identifier: 6702113.pdf
Dmrecord: 108958
Document Type: Dissertation
Rights: Melaragno, Ralph James
Type: texts
Source: University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the au...
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus, Los Angeles, California 90089, USA
Tags: psychology, general