DEVELOPMENT AND VALIDATION OF THE COOPER QUALITY OF IMAGERY SCALE: A MEASURE OF VIVIDNESS OF SPORTING MENTAL IMAGERY

by

Casey Laura Cooper

A Dissertation Presented to the FACULTY OF THE GRADUATE SCHOOL, UNIVERSITY OF SOUTHERN CALIFORNIA, in Partial Fulfillment of the Requirements for the Degree DOCTOR OF PHILOSOPHY, EDUCATION (COUNSELING PSYCHOLOGY)

August 2004

Copyright 2004 Casey Laura Cooper

Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.

UMI Number: 3145185. UMI Microform 3145185. Copyright 2004 by ProQuest Information and Learning Company. All rights reserved. This microform edition is protected against unauthorized copying under Title 17, United States Code. ProQuest Information and Learning Company, 300 North Zeeb Road, P.O. Box 1346, Ann Arbor, MI 48106-1346.

ACKNOWLEDGMENTS

I would first like to thank Dr. Rodney Goodyear. Your support of me personally and professionally to complete my Ph.D. and subspecialty in Sport Psychology is something for which I will be eternally grateful. So few scholars receive such unconditional encouragement. You and Karen are dear friends. I would also like to recognize Dr. John Callaghan, whose guidance and expertise in the field of Sport Psychology uniquely enhanced this instrument and my research of mental imagery. Seminars in your office were a great gift! Dr.
Richard Clark, you are a very special professor to our program. And my deepest thanks to Dr. Kaaren Hoffman, who so selflessly donated her expertise in instrument development and scaling to see me through this process. To my daughter, Avery, who inspired me to complete this project: you remind me that anything is possible! To my husband, who encouraged and challenged me throughout this process: I love you. To my mom, who gave her time in so many ways by proofreading, babysitting, and recording: you are amazing. To my father, this is for you. If it weren't for Tommie and your dreams for me, I may not have pushed so hard for all I have achieved. To my brother and mother-in-law, who donated their love, time, and support: thank you so much. A very special thanks to Coach Ron Allice, whose repeated backing of my career made this project possible. And to everyone else who supported this project, the coaches, meet organizers, and IT guys: thank you!

TABLE OF CONTENTS

Acknowledgements
List of Tables
List of Figures
Abstract
Chapter I (Introduction)
Defining Vividness of Mental Imagery
Memory and Mental Imagery
Encoding
Retrieving
Summary
Current Imagery Questionnaires
Theoretical Support
Criticisms of Current Measures
Construct Validity
Response Sets
Imagery Scripts
Summary
Methodological Problems in Sports Imagery Literature
Proposed Instrument
Research Questions
Chapter II (Method)
Participants
Measures
Instrument Development
Overview
Imagery Scripts
Development
Expert Review
Pilot Study
Photographs and Audiotapes
Photographs
Audio Clips
Computer Programming
Software Development
Psychophysical Scaling of the Video Progressions
Direct-Estimation Method
Participants
Focus Group
Focus Group Outcomes
Final Version of the CQIS
Procedure
Chapter III (Results)
VVIQ Scores
Demonstration of CQIS Scale Validity
Demonstration of Discriminant Validity
Demonstration of CQIS Scale Variance
Chapter IV (Discussion)
CQIS Development
VVIQ
Demonstration of CQIS Scale Validity
Demonstration of Discriminant Validity
Demonstration of CQIS Scale Variance
Delimitations
Limitations
Future Implications
References
Appendices
A. Transcribed Vignette Imagery Scripts
B. Questionnaire Used for the Peer Review of Imagery Vignettes
C. Recruitment Letter
D. General Instructions for Participants
E. Consent Form
F. Instrument Instruction Screens
G. Direct-Estimation Questionnaires

LIST OF TABLES

Table 1. A summary of mental imagery assessment tools commonly used with athletes.
Table 2A. Geometric mean of color scale from direct-estimation task, group 1.
Table 2B. Geometric mean of color scale from direct-estimation task, group 2.
Table 2C. Geometric mean of clarity scale from direct-estimation task, group 1.
Table 2D. Geometric mean of clarity scale from direct-estimation task, group 2.
Table 3A. Correlations between odd and even photos for the color scale.
Table 3B. Correlations between odd and even photos for the clarity scale.
Table 4A. Correlations between adjusted direct-estimation and physical values for the color scale.
Table 4B. Correlations between adjusted direct-estimation and physical values for the clarity scale.
Table 5. Color and clarity scale ratings, adjusted direct-estimation and physical values.
Table 6. Correlations between the CQIS indices and the VVIQ mean.
Table 7. CQIS audio scale frequencies.
LIST OF FIGURES

Figure 1. Distribution of VVIQ scores.
Figure 2. Distribution of CQIS color scale scores.
Figure 3. Distribution of CQIS clarity scale scores.
Figure 4. Distribution of CQIS audio scale scores.

ABSTRACT

Although psychologists and coaches frequently use mental imagery to enhance athletes' performances, the instruments currently available to measure a person's imagery ability have been inadequate. This study reports the development and validation of a new assessment tool, the Cooper Quality of Imagery Scale (CQIS), for measuring the vividness of mental imagery of athletes along three dimensions: color, clarity, and audio. Improvements include implementation of computer technology, magnitudinal scaling, and detailed imagery scripts. The CQIS and the Vividness of Visual Imagery Questionnaire (VVIQ) were administered to 47 track and field and cross-country athletes. Participants were recruited from Southern California universities and community colleges and from track and field competitions. The sample consisted primarily of collegiate athletes (85%), who were diverse with respect to gender (28 men, 19 women) and ethnicity (49% Caucasian, 28% African American, 9% Multiracial and Other, 4% Hispanic, and 2% Asian American). Level of competition, years involved in sport, scholarship status, and experience with and utilization of mental imagery were assessed. The CQIS color and clarity scales demonstrated construct validity through positive and significant (.01 level) Pearson correlations observed between two independent samples completing direct-estimation tasks (color = .85, clarity = .95). Different groups of people reached the same judgments about these scales' representation of their defined dimensions.
The scaling method was confirmed through positive and significant Pearson correlations (.01 level) observed between the direct-estimation ratings and assigned physical values (color = .88, clarity = .76). The perceived magnitude of change supports the physical changes estimated. The audio scale was not validated by direct-estimation. The overall VVIQ mean was 2.57 with a standard deviation of .88. Negative and nonsignificant Pearson correlations between all CQIS indices and the VVIQ scores (color = -.10, clarity = -.17, audio = -.23) indicate the instruments are measuring different constructs. The range of scores observed (color M = 89.49, SD = 21.69; clarity M = 87.02, SD = 10.28; audio M = 48.89, SD = 12.16) confirms that the CQIS is a meaningful instrument to measure the mental imagery of athletes. The CQIS was subject to several levels of conceptual and empirical review throughout its development. The result is a set of magnitudinally scaled indices validated through direct-estimation ratings.

CHAPTER 1
INTRODUCTION

Finding the formula for athletic success is the goal of almost every athlete. At the very least, the essential elements are proper physique, commitment, talent, and specialized training. However, state-of-the-art training facilities that are now available to many young and promising players have minimized the importance of initial individual differences in skill and abilities. In addition, many athletes are finding that they can obtain further advantage through the help of psychologists. Sport psychology has developed during the last two decades into a legitimate area of practice and inquiry. One indicator was the 1986 development of the American Psychological Association's (APA) Division 47 (Exercise and Sport Psychology).
More recently, Division 47 has petitioned the APA's Commission for the Recognition of Specialties in Professional Psychology (CRSPPP) to designate sport psychology an official psychological proficiency. Sport psychology is the study of the "effect of psychological factors on behavior in sport or the psychological effect that participation in sport or physical activity had on the performer" (Silva & Weinberg, 1984, p. 6). Within this domain of practice, mental imagery has been a primary intervention. In fact, Murphy, Jowdy, and Durtschi (1989) reported that 90 percent of athletes, 94 percent of coaches, and 100 percent of sport psychologists in the United States identified mental imagery techniques as a standard fixture in their training regimens. One therefore would expect a robust literature concerning the theory, implementation, and effect of imagery on athletic performance. Unfortunately, however, existing research on mental imagery's effect on sport performance has been compromised by two factors: (1) the instruments detect statistically insignificant differences in imagery ability, and (2) researchers have assumed that all participants are equal in their mental imagery ability. This latter assumption has become difficult to dispel because of the very limited variance that the two predominant assessment tools (the Vividness of Visual Imagery Questionnaire [VVIQ] and the Vividness of Movement Imagery Questionnaire [VMIQ]) have been able to detect. This limited variance has precluded measured imagery ability from serving as a meaningful predictor of sporting performance (Kihlstrom, Glisky, Peterson, Harvey, & Rose, 1991). Of equal concern has been the questionable validity of the currently used measures. For example, much of the 1995 Fall/Winter issue of the Journal of Mental Imagery focused on the VVIQ and its continued usefulness.
Katz (1995) went so far as to suggest, "Rather than trying to save the test, we should go back to the drawing board and take the lessons learned from our use with the VVIQ to construct a better instrument of imagery vividness" (p. 144). Katz surmised that the VVIQ's continued use, despite years of critical review, is due to its quick administration and ease of scoring. Surprisingly, no new instruments have surfaced to answer this call. In order to improve the body of literature on this commonly used intervention, our assessment of mental imagery must be modernized so that research on the impact of mental imagery in sport training and performance has the increased potential to reach consistent and methodologically sound conclusions. The purpose of this validation study therefore was to develop a psychometrically sound, magnitudinally scaled instrument that utilizes computer technology. To more fully establish the need for this instrument, the following review addresses how mental images are produced and describes the development of previous measures of imagery. The review also presents methodological concerns with currently available mental imagery instruments and then concludes with a discussion of the proposed instrument's importance, purpose, and rationale.

Defining Vividness of Mental Imagery

Mental imagery refers to an individual's ability to represent in the mind experiences that are not physically present (Matlin, 1989). Sports imagery, a specific type and application of mental imagery, is the rehearsal of physical actions without deliberate muscular activity (Gilmore, 1973). Denis (1985) expanded this explanation to "any psychological activity which evokes the physical characteristics of an absent object" (p. 4). This absent object may be either temporarily or permanently undetectable.
A simplified definition of sports imagery was suggested by Singer (1980), who contended that mental imagery and imagery rehearsal are essentially the same concept, for both are "task rehearsal in which there are no observable movements" (p. 426). Athletes utilize imagery to learn new tasks, enhance a skill that has already been physically practiced or taught, rehearse a performance, strategize for competition, narrow focus, maintain attention, minimize anxiety, and evaluate technique and performance (Onestak, 1991; Suinn, 1996). The vividness with which a person experiences a mental image depends on a multidimensional set of processes and includes many sensory experiences, such as sounds, muscular sensations, and other visceral feelings (Ahsen, 1995). This definition of vividness contradicts the assumption of a unidimensional, stable mental imagery ability that undergirds the VVIQ and VMIQ. Debate over the past two decades has encouraged a fresh look at the vividness construct, with clear implications for a revised assessment of mental imagery. The aforementioned 1995 Journal of Mental Imagery issue was dedicated to this debate and focused on redefining vividness of mental imagery and the continued use of the VVIQ. The featured article (McKelvie, 1995) has been considered the most thorough review of the VVIQ to date and has stimulated considerable discussion. McKelvie (1995) noted that the VVIQ defines vividness of mental imagery as an ability, which implies its stability as a trait. This definition is no longer accepted as valid within the current literature.
McKelvie (1995) argued instead that "perception is not a fixed stimulus-driven automatic process, but a joint function of sensory input and interpretation by higher-order factors such as stored knowledge" (p. 36). Kaufmann, as cited in McKelvie (1995), calls for an assessment of vividness that extends beyond a simple representation and incorporates imagery as a multisensory experience. Ahsen (1995) also agreed that "other senses commingle with the visual cue" (p. 114). Redefining vividness of mental imagery as multidimensional and multisensory allows assessment to change from measuring memory recall ability to measuring both memory and imagination. Kunzendorf (1995) encourages understanding mental imagery as a personal construction that is not measured by its likeness to an object or past events. By understanding it as an interplay of external history and internal perception, mental imagery becomes a different construction that is no longer related to accuracy. It instead focuses on intensity of sensory experience during the mental imagery exercise. The assessment tool that was developed as part of this study is grounded in the following definition of mental imagery: the ability to symbolically rehearse in the mind a physical activity or experience that is not physically present, absent of overt muscular movements (Moran, 1991; Onestak, 1991). This requires that mental imagery be assessed in terms of its vividness. Vividness of mental imagery was defined for this instrument development as generating from memory and/or imagination an internal experience that is rich in visual color and clarity, audition, and physical sensation in response to verbal instruction (Ahsen, 1995; Campos, 1995; Kunzendorf, 1995; McKelvie, 1995).

Memory and Mental Imagery

The creation of a mental image is based on what we have seen and experienced in the external world.
The mechanics of vision and the act of perception allow us to interpret these images and sensory experiences. The next step to understanding the elements that create quality of mental imagery is to explore our ability to store and retrieve relevant memories. When a mental image is stimulated, we pull from our mind the images, feelings, sounds, and smells that correspond with the prompt suggested. The images recalled are connected to sensory awareness according to the time they were perceived, utilized, and potentially stored in long-term memory. For example, when a song triggers a memory from a high school dance, the experiences that begin to flood one's mind are coupled with that specific melody. Context therefore helps to create the meaningfulness of the experience as a whole. This comes together when a vivid mental image is formed.

Encoding

Once a sensory stimulus has been perceived, it is compared with past experiences in the hippocampus (Lester, 2000) and typically stored for later use. The storage begins with comparison, or pattern recognition. Pattern recognition takes many variables into account in order to make an accurate assessment of a stimulus's meaning (Bruning, Schraw, & Ronning, 1999). This recognition includes the qualities of the stimulus, our body of knowledge pertaining to the stimulus, and the contextual factors present during its detection. Templates represent objects that have been previously stored in memory. If a new experience matches the properties of a known template, it is recognized. However, if there is too much discrepancy between the template and the incident, the brain will utilize stored prototypes. A prototype is the essence, or set of basic elements, of an object (Bruning, Schraw, & Ronning, 1999).
The next level in memory structures is a schema, which relates perceived information to a preexisting structure, guiding additional information gathering. Schemata provide a framework to organize incoming data. As all of the contextual cues are processed, we become better able to detect other relevant information from the environment based on our expectations (Bruning, Schraw, & Ronning, 1999). As exposure to stimuli increases, our processing speed also increases because the demand on our brain's resources is reduced. Therefore, the more often athletes are exposed to the elements of their sport, the more quickly they will be able to recognize the situation at hand. Their schemata related to their sports activities are more advanced due to practice; efficiency becomes a priority as the processes are automated (Bruning, Schraw, & Ronning, 1999). Contextual cues are more integrated into their framework due to the number of times they have been detected, recognized, and stored.

Retrieving

The ability to develop a mental image is contingent upon one's memories and their recollection. Working memory enables the current processing of information, or the current contents of consciousness (Bruning, Schraw, & Ronning, 1999). There are three parts to working memory. The first is the executive control system, which governs what enters our consciousness and organizes the information. The second, the articulatory loop, allows for the rehearsal of auditory information. Finally, and most important to mental imagery, the visual-spatial "sketchpad" makes possible the manipulation of objects and other visual data. During an imagery exercise, stored images are pulled from long-term memory into the visual-spatial
sketchpad for further manipulation and processing (Bruning, Schraw, & Ronning, 1999). Interaction with current information also occurs as the imagery script or prompt influences the remembrance. Dual coding theory (Paivio, 1985) explains that when information is stored in both the verbal and nonverbal systems, memory recall is improved. Because concrete words are immediately matched with an internal image, the result is faster memory recall for them. Thus, imaging the word "cat" is easier than imaging something more intangible (e.g., "progress"). Much research (Williams, Davids, & Williams, 1999) suggests that subjects' recall ability for pictures is superior to that for words. This visual recall advantage has been evidenced in both short-term and long-term memory trials. Delays in recall occurred only when responses were requested in verbal form. Paivio (1985) contends that because imagery rehearsal is based on the memories of past performances, the result is only as accurate as the performer's skill level. More specific to the quality of the images created, Paivio (1985) suggests that "language provides the retrieval cues for memories expressed as images" (p. 27). Thus, the instructions for the imagery requested must be precise enough to trigger recall, and the performer must have a significant memory base from which to create the imagery experience described in a script.

Summary

Mental imagery relies on the strength and automaticity of a memory framework and the capacity of the working memory systems. The quality of the mental image is contingent upon this complex series of steps. Comparisons with stored memories in the hippocampus result in the development and refinement of schemata, creating a meaningful context for the stimuli originally identified. Stored memories can be further enhanced and manipulated within working memory.
The vividness of a mental image cannot be assessed without recognizing the role of our perception, storage, and retrieval of the experiences we are hoping to generate.

Current Imagery Questionnaires

Instruments measuring imagery ability originated with Galton's "breakfast table questionnaire" in 1880. Galton's method offered the first means of quantitative measurement of visual imagery ability (White, Sheehan, & Ashton, 1977). Later, Betts' Questionnaire upon Mental Imagery (QMI) paved the way for current self-report measures. Originally 150 questions, the QMI evaluated individual differences on seven sensory modalities: visual, auditory, tactile, kinesthetic, gustatory, olfactory, and organic (White, Sheehan, & Ashton, 1977). Sheehan (1967) shortened the QMI to 35 items, and that became the more commonly used form. The QMI utilizes a 7-point categorical scale that ranges from "Perfectly clear and as vivid as the actual experience" to "No image present at all, only 'knowing' that you are thinking of the object." The QMI is a measure of imagery vividness and has a high internal consistency value of .95 (Juhasz, 1972) and a seven-month test-retest reliability of .78 (Sheehan, 1967). The QMI contributed greatly to the development of Marks's (1973) Vividness of Visual Imagery Questionnaire (VVIQ) and Isaac, Marks, and Russell's (1986) Vividness of Movement Imagery Questionnaire (VMIQ), the two most widely used inventories for sports-related imagery. The VVIQ consists of 16 items that are an extension of the QMI's visual subscale. Marks (1973) also based his 5-point categorical scale on the QMI, utilizing the same end points.
The choices of the 5-point scale are: (1) Perfectly clear and as vivid as the actual experience; (2) Clear and reasonably vivid; (3) Moderately clear and vivid; (4) Vague and dim; and (5) No image present at all, only 'knowing' that you are thinking of the object. Subjects taking the VVIQ are asked to image the item requested (a friend or relative, a rising sun, a familiar shop, a country scene) first with their eyes open and then with their eyes shut. The prompts to stimulate the image are very short and no more descriptive than the examples cited. It is not clear why participants are required to visualize under both conditions; no theoretical explanation has been provided, and the instrument has not yielded scores that are significantly different in the two conditions (White, Sheehan, & Ashton, 1977). The overall score is the mean of the 16 ratings. The reliability reported for the VVIQ has ranged from .67 to .87 (McKelvie, 1990). McKelvie (1995) identified an overall mean score of 2.31 with an overall standard deviation of .69 in his review of 38 research studies utilizing the VVIQ. Isaac, Marks, and Russell (1986) believed that a movement-related imagery assessment tool was needed and created the VMIQ to fill that need. The same 5-point scale from the VVIQ is used in the VMIQ for 24 items. Like the VVIQ, the VMIQ asks participants to imagine the cues under two different conditions: (1) as though you are "watching someone else," and (2) "doing it yourself." However, those taking the VMIQ are instructed to keep their eyes closed during the visualizations.
The movement prompts are categorized into six groups: (1) basic body movements; (2) basic movements with more precision; (3) movement with control but some unplanned risk; (4) movement controlling an object; (5) movements which cause imbalance and recover; and (6) movements demanding control in aerial situations (Isaac, Marks, & Russell, 1986). Unsurprisingly, given its roots, the VMIQ exhibits high convergent validity with the VVIQ (.81). Its test-retest reliability, .76, is also consistent with its predecessor (Isaac, Marks, & Russell, 1986). Unfortunately, like the VVIQ, the prompts employed to stimulate the movement imagery are insufficient, oftentimes consisting of a single word (standing, walking, running, etc.). The rationale to support attempts to assess imaging under the different conditions is also lacking, and it is unclear why this line of thinking was continued in the VMIQ. Self-report questionnaires have employed two main approaches to assessing imagery ability. One approach has been to measure the vividness, or sensory richness (Richardson, 1988), of an image. This approach is evident in the QMI, VVIQ, and VMIQ and so far has been the focus of this review. The other approach has been to focus on a person's control over his or her imagery experience: his or her ability to change and direct imagery (Kosslyn, 1990). This ability is most often assessed using Gordon's (1949) Test of Visual Imagery Control (TVIC). The 12-item TVIC originally employed a dichotomous item format. It was later revised by Richardson (1969) to include "unsure" among the choices. Now the TVIC employs a 5-point scale with the same end points of "yes" and "no." During the assessment, participants are asked to image an automobile and manipulate its position and movements (stopped, turned over, climbing a hill).
The reported test-retest reliability is .84 (McKelvie & Gingras, 1974). The TVIC has been criticized for its brevity and for its high correlations with the QMI (.47; Moran, 1991) and the VVIQ (.45; Kihlstrom et al., 1991). This criticism seems warranted because the TVIC is a test of imagery controllability, whereas the QMI and VVIQ are vividness assessments. These findings of convergent validity have supported arguments that controllability and vividness instruments are essentially measuring the same factor and are not distinguishably different dimensions (Ernest, 1977; Kihlstrom et al., 1991; Moran, 1991; White, Sheehan, & Ashton, 1977). The Imagery Use Questionnaire (IUQ), a tool to assess the utilization of mental imagery by athletes, was introduced by Hall, Rodgers, and Barr (1990) and has gone through several revisions. Although it is not intended to assess imagery ability, the IUQ is noteworthy due to its imagery use assessment properties and its use in conjunction with the QMI, VVIQ, VMIQ, and TVIC. Also, the authors utilized Paivio's (1985) motivational/cognitive and specific/general framework to guide their assessment of imagery use by athletes. The IUQ is the only instrument designed specifically for use with athletes to assess any aspect of imagery. The IUQ can also be adjusted to fit a particular sport and varies in its number of questions. The questionnaire employs a 7-point categorical scale ranging from "never" to "always" and from "very difficult" to "very easy." Questions target when, where, and how an athlete utilizes imagery. The IUQ can be a helpful guide to acquiring pertinent demographic and sport-specific information.

Table 1. A summary of mental imagery assessment tools commonly used with athletes.
Scale | Author/Year | Purpose | Scale format | Items | Reliability
Questionnaire upon Mental Imagery (QMI) | Betts 1909; Sheehan 1967 | Vividness | 7-point | 150; 35 | Internal consistency .95; test-retest .78
Gordon Test of Imagery Control (GTIC) | Gordon 1949; Richardson 1969 | Controllability | Yes/No; 5-point | 12 | Split-half .76; test-retest .84; correlation with QMI .47
Vividness of Visual Imagery Questionnaire (VVIQ) | Marks 1973 | Vividness | 5-point | 16 | Test-retest range .67-.87
Vividness of Movement Imagery Questionnaire (VMIQ) | Isaac, Marks, & Russell 1986 | Vividness | 5-point | 24 | Test-retest .76; convergent validity with VVIQ .81
Imagery Use Questionnaire (IUQ) | Hall, Rodgers, & Barr 1990 | Use by athletes | 7-point | 37 |

Theoretical Support

Imagery instruments have been criticized for their lack of theoretical support. Two primary and competing theories discussed in the mental practice literature are the symbolic learning theory and the psychoneuromuscular theory. The symbolic learning theory (Savoyant, 1988) suggests that the deliberate use of imagery improves performance by providing athletes with a blueprint of sorts to pattern their movements into symbolic components, thereby improving the cognitive component of motor activity. The psychoneuromuscular theory, on the other hand, contends that imagery of motor tasks results in minute innervation of the muscles that will later be activated during the deliberate performance of the same task (Hale, 1982). Feltz and Landers (1983) performed a meta-analysis of 60 studies. Using enhanced performance as the outcome, they found an overall effect size of .48 for mental practice, which was superior to no practice. They concluded that mental practice enhances performance by enabling the athlete to better focus attention. This conclusion has been interpreted to support the symbolic learning theory.
However, Savoyant (1988) has suggested that these two theories should be linked rather than separated; performance of a task has both a cognitive and a motor component. For example, a person might plan to throw a baseball (i.e., which grip to use, where to throw, route of the throw, etc.) and then execute the throw (i.e., holding the baseball, rotation of the arm, extension and release, etc.). To label sporting tasks as solely cognitive or motor becomes nearly impossible and is not productive.

Paivio's (1985) analytic framework for imagery effects is consistent with this idea of integration. He suggested that when imagery is used to improve human performance it has both a motivational and a cognitive role. Both of these operate on either a general or a specific level. Motivational imagery is connected with emotive arousal at the general level and goal-related activities at the specific level. Cognitive imagery that is general corresponds to strategizing for success, and specific cognitive imagery relates to skill acquisition, mastery, and execution. Motivational specific imagery is related to a positive performance outcome. The critical element of this imagery is visualizing the winning moment. Motivational general arousal imagery centers on emotive arousal, relaxation, and control. Athletes image the emotions of competition and see themselves controlling their physical and emotional responses. Motivational general mastery images are about focus, confidence, and positive self-talk. These images are centered on maintaining attention, often in the face of distractions that may or may not be a part of their sport's competition (Munroe, Giacobbi, Hall, & Weinberg, 2000). There is research to support the framework suggested by Paivio.
In particular, Salmon and Hall (1994) and Munroe, Giacobbi, Hall, and Weinberg (2000) found support for Paivio's framework through use of the IUQ and qualitative interviews with different athletic groups. When asked about their imagery, athletes' responses consistently supported the breakdown of cognitive and motivational imagery on general and specific levels.

Criticisms of Current Measures

Unfortunately, all of the imagery instruments previously reviewed have been criticized for their lack of construct validity; susceptibility to response sets and social desirability; lack of divergent validity; absence of current norms for interpretation; inadequate scripts to prompt vivid images; and poor standardization. There have been several reviews of imagery questionnaires (Hall, 1985; Hiscock, 1978; Kihlstrom et al., 1991; McKelvie, 1995; Moran, 1991; Murphy, 1990; White, Sheehan, & Ashton, 1977). The problems identified by these authors are reviewed below.

Construct Validity

Self-report imagery instruments face many challenges. The most significant of these is their ability to measure accurately what these questionnaires are believed to measure. Like many psychological constructs, mental imagery is an internal, individual experience that is not directly observable (Reisberg & Heuer, 1988). Bringing a person's subjective and private event into the external realm for a shared examination has proven to be challenging. In the case of mental imagery assessment, Hiscock (1978) concluded that, "It is not clear what imagery questionnaires really measure or what criteria are appropriate for validating them" (p. 223). The imagery measures listed in Table 1 have demonstrated sufficient reliability. However, although reliability is a prerequisite for validity, it does not guarantee it (Moran, 1991).
Predictive validity, using results to predict a range of observable effects, is one form of validity that has been used to validate quality of imagery tests. Attempts to predict performance on spatial ability, mental rotation, image scanning, and memory tasks using imagery ability scores have yielded inconsistent results (Hiscock, 1978; Reisberg & Heuer, 1988). Another way to demonstrate construct validity is to compare a new instrument to already accepted tests of the same construct to show that it correlates highly with them (convergent validity) and with measures of dissimilar constructs to show no correlation (divergent validity). This strategy has been employed to validate the GTIC. Because it measures controllability whereas the QMI measures vividness, it was expected that there would be no correlation or a low correlation. This result would show that they are indeed measuring different constructs. However, this has not been the case. Positive correlations as high as .47 and a shared variance of 22 percent have been reported between the QMI and GTIC (McKelvie & Gingras, 1974). The GTIC also correlates significantly with the VVIQ (also a measure of vividness) at .45 (Kihlstrom et al., 1991). This similarity suggests that the concepts of vividness (QMI and VVIQ) and controllability (GTIC) are based on the same general factor and are not measuring separate constructs (White, Sheehan, & Ashton, 1977). The vividness instruments used more commonly by sport psychologists, the VVIQ and the VMIQ, have demonstrated a convergent validity of .81 (Isaac, Marks, & Russell, 1986), which is expected given their equivalent designs and methodology. The VVIQ also correlates positively with the QMI, most likely because it is based on a subscale of the QMI, with four of the 16 questions being identical (Kihlstrom et al., 1991).
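The shared-variance figures reported here follow directly from squaring the correlation coefficient; a minimal sketch (the helper name is ours, for illustration only):

```python
def shared_variance(r: float) -> float:
    """Proportion of variance two measures share: the square of their
    Pearson correlation coefficient."""
    return r ** 2

# A QMI-GTIC correlation of .47 implies roughly 22 percent shared variance;
# the GTIC-VVIQ correlation of .45 implies roughly 20 percent.
qmi_gtic = shared_variance(0.47)   # 0.2209
gtic_vviq = shared_variance(0.45)  # 0.2025
```

This is why a correlation of .47 between supposedly distinct constructs is troubling: nearly a quarter of the variability in one score is accounted for by the other.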
A major concern with the VVIQ's and VMIQ's ability to accurately measure vividness of imagery is their implication that vividness is uniform across situations (Reisberg & Heuer, 1988). These tests do not have the ability to assess vividness on multiple dimensions. With these measures, the person's image is assessed as a whole, without consideration of variation in experience or differences within the same image. This all-or-nothing approach may explain the minimal range that the VVIQ and VMIQ are able to detect. Although the tests use a 5-point scale, Kihlstrom et al. (1991) found that only 3 percent of their sample (730 college students) reported having vague or dim images or no imagery at all (i.e., the first two of the five points on the scale). McKelvie (1995) identified an overall mean score of 2.31 with an overall standard deviation of .69 in his review of 38 research studies utilizing the VVIQ. The researchers concluded that "few subjects lack the ability to produce the images requested by these questionnaires" (Kihlstrom et al., 1991, p. 139) and that the VVIQ should be considered an "easy" test. In short, although the VVIQ and the VMIQ do seem to measure vividness on an intuitive level, the restricted range they detect compromises the meaningfulness of their scores. Adding to the problem of ascertaining meaning from the scores provided by the VVIQ and other imagery assessment tools is a lack of standardization (Moran, 1991). This lack of attention to proper scientific method can also be seen in the absence of contemporary norms to provide a meaningful interpretation of imagery scores (Kihlstrom et al., 1991).

Response Sets

There are two types of response sets by which scores on imagery questionnaires have been influenced: social desirability and acquiescence.
In the sporting community it is especially critical to consider the expectations, the setting, or the persons who will be reviewing scores as possible sources of error (White, Sheehan, & Ashton, 1977). For some athletes, there is a considerable amount at stake when exposing themselves to a battery of tests that may be interpreted for coaches, managers, or agents. Imagery testing is not an uncommon practice when recruiting players for collegiate or professional level sport organizations. Given the simple nature of the imagery questionnaires that often are used, it would not be difficult for participants to rate themselves as having a vivid imagination (Moran, 1991), which is believed throughout the sport community to be a desirable asset. Acquiescence occurs because it is easier to say "yes" than "no" (Moran, 1991). It is common practice to employ reverse scoring in an attempt to control for such contamination. Unfortunately, this procedure has not been utilized in the QMI, VVIQ, or VMIQ. The VVIQ's susceptibility to response sets has been noted for decades. Hiscock (1978) noted his concern with the VVIQ's rating of "images using a numerical scale on which one pole always represents 'good' imagery" (p. 224). In addition to being "easy" tests that may be particularly vulnerable to the respondents' desire to score as high as possible, the QMI, VVIQ, and VMIQ all are plagued by an inconsistency in subjects' ratings (Moran, 1991). This inconsistency exists because there is "no way of knowing whether or not Ss are applying the same standard in making their ratings" (Anderson, 1981, p. 157).
This is because discriminating between item responses can be very difficult (e.g., "perfectly clear and vivid" versus "clear and reasonably vivid," or "moderately clear" versus "vague and dim"). For these measures, there are no examples to aid subjects in differentiating between the ratings on these scales. It would be difficult to assign these values to tangible images, let alone to an internal image that is being remembered.

Imagery Scripts

For all available measures, the imagery "scripts," the instructions meant to stimulate the intended imagery, "are appallingly brief" (Murphy, 1990, p. 161). At times, only a single word is used to stimulate an image. This is a crucial source of error: "instructions must contain sufficient detail to ensure that all subjects...are imaging the task in the same manner" (Paivio, 1985, p. 20). A lack of detail leaves low-level imagers to fill in an abundance of gaps if they are to generate a quality image. Also, the simplicity of the task is likely to constrain variation. Without detail in the script, it is more likely that subjects will perceive themselves as high-level imagers. For example, for a low-level imager the script "running" might stimulate a shadow figure with legs and arms positioned as if running, but without movement, color, interaction with the ground, or sound. Yet when asked how vivid their imagery was, these imagers may feel very successful about their shadow figure and rate themselves somewhere toward the middle of the scale. The same script could also stimulate a much more complex image, one that includes the elements of setting, movement, sound, and texture, for a high-level imager. When asked to rate their level of vividness, these imagers may be likely to lower their rating because they saw room for improvement in their imagery.
Paivio (1985) states, "The goal is to find ways of augmenting the richness and accuracy of the memory base for imagery and maximizing the efficiency with which the relevant information can be retrieved" (p. 26). What if, instead of asking a participant to image "running," they were provided the following script?

Imagine a person running. See their body moving in rhythm, their arms and legs driving them forward. See the expression on their face. Are they pushing themselves to their limit or staying in their comfort zone? Hear them breathing. Are they gasping for air, or is their breath in rhythm with their body?

Now when the participants are asked to rate the vividness of the image, they have more information upon which to base their assessment. They will be able to better understand the qualities of a vivid image. A high-level imager already may be filling in the dimensions requested above, but it is very possible that a low-level imager may have no idea how complex mental imagery can become until he or she is given a more in-depth script. In addition to providing the foundation for a quality mental image, scripts provide the retrieval cues for stored experiences that will be incorporated into imagery (Paivio, 1985). Consider, for example, the pool player who does not run on even a casual basis. For this person, the one-word script "running" is unlikely to spark a vivid image. But by providing additional cues, the pool player will be better able to piece together memories that match the request of the script and develop a more vivid image. A quality of imagery assessment tool should maximize the extent to which participants are on equal footing.

Summary

Although they demonstrate high reliability and convergent validity, the VVIQ and VMIQ have been deemed "easy" tests; "easy" both in their assumption
about the imagery experience as a whole, without dimensions, and in their lack of statistical variance. It is likely that the low variance has made it impossible to make meaningful predictions with statistically significant success (Kihlstrom et al., 1991). Imagery measures that assess controllability also are of suspect value due to their inappropriately high correlations with measures that assess vividness. Another major limitation is the measurement error caused by response sets. In particular, athletes may have considerable incentive to perform well on imagery assessments, or may be unaware of what truly high-quality imagery "looks like." Without test development strategies that include reverse scoring, a more straightforward means of rating the imagery experience, or scripts that are realistic and sufficiently detailed, such contamination will continue.

Methodological Problems in Sports Imagery Literature

Most research on the impact of imagery on sports performance has utilized a before-and-after approach to compare performances of athletes who have trained with mental rehearsal, physical practice, mental rehearsal and physical practice, and no practice (Murphy, 1990). These studies have primarily examined differences between novice and elite athletes, gross motor and fine motor tasks, and internal and external imagery perspectives. The findings have been inconsistent. This inconsistency is quite possibly related to the limited variance in imagery scores obtained by the imagery instruments utilized, because those scores are not meaningful predictors of performance (Kihlstrom et al., 1991). Some authors contend that mental imagery practice is superior to no practice (Feltz & Landers, 1983).
Others have found superior performance on varying tasks when combined mental and physical practice is compared to physical practice alone (Onestak, 1991). Unfortunately, confidence in these results is questionable because the majority of researchers have assumed that their subjects' abilities to perform the required imagery tasks were equal. Because of this assumption of equal skill, it has been noted repeatedly that mental imagery cannot be stated with confidence to be an effective treatment for enhancing performance. "Sport psychology researchers have also been comparatively neglectful of individual differences in imagery ability and styles in their assessment efforts" (Murphy, 1990, p. 163). Hall (1985) discussed a typical example describing this particular problem in sports imagery research:

When subjects are requested to image a motor skill in an experiment, generally there has been little concern with how subjects might differ in their ability to comply with this request. Yet there are individual differences in imagery and these differences may influence the results (p. 18).

These assumptions regarding imagery ability have created a major methodological flaw that permeates the body of sport psychology research. "What happens if the treatment effects are confounded by individual differences in imagery ability?" (Moran, 1991, p. 157). Furthermore, the lack of consideration of individual differences in imagery ability, and the potential impact of such variability on treatment effects, has been discussed but not addressed with new research designs (Hall, 1985; Moran, 1991; Murphy, 1990). The conclusions of these studies must be scrutinized. As long as the sport psychology community does not have faith in the ability of its quality of imagery assessment tools, the likelihood that problematic methodologies will be addressed and corrected is low.
Proposed Instrument

The purpose of this study was to develop and validate a quality of imagery assessment tool that has fewer of the limitations present in the current imagery instruments, especially the VVIQ and VMIQ. Specifically, the first goal of this study was to determine the accuracy of a measure that employs standardized, computer-delivered color, clarity, and audition stimuli in measuring vividness of mental imagery. Additionally, this study intended to test this methodology's capacity to differentiate a meaningful range of performance on sport-related mental imagery tasks. To assess the vividness of mental imagery is not innovative. However, redefining vividness as generating from memory and/or imagination an internal experience that is rich in visual color and clarity, audition, and physical sensation in response to verbal instruction is a departure from prior instruments. This theoretical understanding of vividness, as an interplay of history, imagination, and bodily sensation, drove the instrument's development. The measure was designed to assess imagery by referring to video and audio depictions, allowing the subject's vividness to be made tangible for measurement purposes. Reisberg and Heuer (1988) state that "Imagery, by its nature, is a subjective, private phenomenon...This clearly creates difficulties for the scientific study of imagery" (p. 89). Depiction of these external experiences was accomplished through the use of computer technology. The virtual absence of technology in prior measures is both noteworthy and surprising. The most recently developed imagery questionnaire, the VMIQ, was published in 1986 and directly followed in the footsteps of its predecessors by utilizing a paper-and-pencil format, single-word prompts, and a 5-point categorical scale.
Computers provide the opportunity to present stimuli that permit a more sensitive, nuanced measurement. This capacity was utilized in the development of this measure, which presented test takers with digital images and audio sound bites as stimuli to rate. Major criticisms of other imagery tests have been the brevity of the scripts used to stimulate sporting images (Murphy, 1990), their failure to incorporate surrounding stimuli (Munroe, Giacobbi, Hall, & Weinberg, 2000), and an overall need to be more attentive to the use of language to formulate an image (Hall, 1985; Moran, 1991; Paivio, 1985). Therefore, another feature of this measure is the use of audio clips to present scripts that are detailed and descriptive, and that adopt both an internal and an external perspective. The use of an audio presentation was intended to permit participants to become more fully involved with their imagery experiences than would be the case when reading the script. By remaining relaxed with their eyes shut while listening to the script, participants' working memory demands are reduced, allowing them to become deeply involved in their imagery.

To determine vividness, this measure was designed to assess several sensory experiences rather than simply asking the participant, "How vivid was your image?" Reasonably, the more sensorily "rich" the imagery, the more meaningful it will be to the athlete and therefore the more transferable to sport performance (Munroe, Giacobbi, Hall, & Weinberg, 2000). The senses assessed by this new measure are visual color and clarity, and auditory intensity. Ideally, kinesthetic experience would be assessed as well, but this was precluded by the technology available.
The visual and auditory assessments are accomplished by progressively distorting digital images and audio clips so that subjects can very directly express when the computer image or sound best matches their mental imagery. This measure also was designed to minimize a problem present in other available instruments: response sets. The scoring employed makes it difficult to respond the same way to every item. Participants are presented with a graded series of video or audio scene distortions from which to choose. Changes are very gradual, allowing ample opportunity for the athlete to stop the clip when it best matches his or her mental imagery. No previous imagery vividness assessment tool has been designed specifically for athletes. The Imagery Use Questionnaire (IUQ) developed by Hall, Rodgers, and Barr (1990) targets the what, when, and where of athletic imagery and has been tailored to specific sports, but it does not measure ability. The VMIQ attempts to assess vividness of imagery related to performing specific types of movements (Isaac, Marks, & Russell, 1986), but has many drawbacks. Among those are the one-word script prompts, such as "standing" and "walking," which may or may not stimulate images that are relevant to the individual's sport practice or competition. The present measure was designed specifically to assess the mental imagery of athletes. The vignettes were carefully written to stimulate images related to sport preparation and competition, but are general enough to allow an athlete of any level in any sport to take the inventory. Paivio's framework guided the creation of the scripts. To improve the range of detectable imagery ability, this instrument utilizes magnitudinal scaling. This required creating a continuum as opposed to a set of choices.
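The idea of a graded continuum of distortions can be sketched in a few lines. This is only an illustration of the principle, not the actual CQIS implementation: pixels are treated as (r, g, b) tuples, and each step restores a little more of the original color saturation.

```python
def desaturate(pixel, t):
    """Blend an (r, g, b) pixel toward its gray value; t = 1 keeps
    full color, t = 0 is fully desaturated."""
    r, g, b = pixel
    gray = (r + g + b) // 3
    mix = lambda c: round(gray + (c - gray) * t)
    return (mix(r), mix(g), mix(b))

def graded_series(pixels, steps=10):
    """Return `steps` versions of a pixel list, running from fully
    desaturated to the original colors, so a respondent can stop the
    sequence where it best matches their mental image."""
    return [[desaturate(p, i / (steps - 1)) for p in pixels]
            for i in range(steps)]
```

A respondent's stopping point along such a series yields a graded, continuous response rather than a choice among a handful of verbal labels, which is the design rationale described above.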
Direct estimation, or numeric estimation, was utilized to validate the continuums created for the three scales of this new instrument. Specifically, a direct-estimation task has a participant compare a set of stimuli to a reference, in this case a photograph, by assigning a number to express the strength or magnitude of the difference they perceive. The obtained scores are correlated against physical values for the same stimuli. The result is a set of psychophysical, magnitudinally scaled indices that are corroborated by human perception. In short, this measure was designed to respond to the more prevalent criticisms of available imagery inventories. It employs computers to deliver digital images and sound to stimulate a more sensorily rich image. It utilizes magnitudinal scaling. As well, it is theoretically driven, embracing vividness as a complex interplay of memory and imagination and a multisensory experience that interacts with the visual field.

A valid measure of the vividness of mental imagery is greatly needed in the sport psychology community. This measure was developed to provide researchers with such an instrument. As well, it should provide practitioners, coaches, and athletes the opportunity to accurately assess athletes' strengths and weaknesses in performing their mental rehearsal. Doing so will enable the development of targeted interventions to improve their mental practice.

Research Questions

This dissertation reports the development and validation of a vividness of sporting mental imagery instrument, the Cooper Quality of Imagery Scale (CQIS). Three scales were created, each to measure a separate aspect of the vividness construct: (1) color, (2) clarity, and (3) audition. The research questions were as follows:

1. Will the amount of color and degree of clarity scales demonstrate satisfactory validity?
Specifically, will construct validity be evidenced by: (a) a statistically significant relationship between independent groups completing a direct-estimation rating task of the psychophysical stimuli that make up the scales; and (b) statistically significant relationships between the direct-estimation results and the corresponding physical values representing actual change in the scales' pictures?

2. Will the CQIS scales demonstrate discriminant validity with a commonly used measure of vividness, the VVIQ? Validity will be evidenced by the absence of a relationship between participants' CQIS scale and VVIQ scores.

3. For this sample, will sufficient score variance meaningfully differentiate vividness capability across people? Variance will be evidenced by descriptive analysis demonstrating statistically appropriate means, standard deviations, and an even distribution of scores.

CHAPTER II

METHOD

The method chapter reflects the stages required to develop a new instrument to measure the vividness of athletes' mental imagery. After presentation of the participants and measures used, discussion of the Cooper Quality of Imagery Scale (CQIS) describes the development and the expert and peer reviews of the imagery scripts. The photographs and audio clips used in the development of the scales are then described. The computer programming and software created to administer the CQIS are presented. A procedure to validate the psychophysical stimuli through direct estimation, resulting in the CQIS's magnitudinal scales, is outlined. A final focus group that previewed the CQIS is then discussed. Finally, the completed CQIS is described in detail along with the procedure for data collection.
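The correlation of direct-estimation judgments against physical stimulus values follows the logic of psychophysical magnitude scaling, in the tradition of Stevens' power law, psi = k * phi**n, which can be fit by linear regression in log-log coordinates. A sketch with made-up, noise-free values (the function name and data are ours, not from this study):

```python
import math

def fit_power_law(physical, judged):
    """Least-squares fit of Stevens' power law, psi = k * phi**n,
    via linear regression in log-log coordinates. Returns (k, n)."""
    xs = [math.log(p) for p in physical]
    ys = [math.log(j) for j in judged]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return math.exp(intercept), slope

# Noise-free example: judged magnitude grows as 2 * phi ** 0.5
physical = [1, 2, 4, 8, 16]
judged = [2 * p ** 0.5 for p in physical]
k, n = fit_power_law(physical, judged)
```

With real ratings, the strength of the linear fit in log-log space is what a statistically significant relationship between judged and physical values (research question 1b) would reflect.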
Participants

Participants were 47 (28 male; 19 female) athletes from 8 universities, 2 community colleges, and several professional training groups. Their modal age group was 18 to 21 years old; only 2 were older than 28. The largest ethnic group was Caucasian (23; 49%), followed by African-American (13; 28%), Multiracial (4; 9%), "Other" (4; 9%), Hispanic/Latino (2; 4%), and Asian/Asian American (1; 2%). Almost all (45; 96%) were track and field athletes; the remaining two indicated their sport to be cross-country. Years of involvement in their sport were: less than 1 year (1; 2%); 1-2 years (2; 4%); 3-4 years (7; 15%); 5-7 years (17; 36%); 8-10 years (14; 30%); and 11 or more years (6; 13%). Years competing in their sport were quite similar to years involved. Most (40; 85%) had achieved a collegiate level of competition, with the rest having achieved professional (4; 9%) and recreational (3; 6%) levels. The majority (24; 56%) were scholarship athletes. Most athletes (36; 77%) also reported prior experience with the use of imagery to enhance athletic performance.

Measures

Two measures were used in this study. The first was the Cooper Quality of Imagery Scale (CQIS), which was the focus of the study itself. The second was the Vividness of Visual Imagery Questionnaire (VVIQ). Data were collected for both instruments in the same administration, with randomized order of presentation.

The VVIQ (Marks, 1973) consists of 16 items rated on a 5-point categorical scale. The choices of the 5-point scale are: 1) Perfectly clear and as vivid as the actual experience; 2) Clear and reasonably vivid; 3) Moderately clear and vivid; 4) Vague and dim; and 5) No image present at all, only 'knowing' that you are thinking of the object. The overall score is the mean of the 16 ratings. The reliability reported for the VVIQ has ranged from .67 to .87 (McKelvie, 1990).
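The VVIQ scoring rule just described, the mean of the 16 item ratings, is simple enough to state as code (a hypothetical helper, not an official scoring program):

```python
def vviq_score(ratings):
    """Overall VVIQ score: the mean of the 16 item ratings, where
    1 = perfectly clear and vivid and 5 = no image at all, so lower
    scores indicate more vivid imagery."""
    if len(ratings) != 16:
        raise ValueError("The VVIQ has exactly 16 items")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("Ratings fall on a 1-5 categorical scale")
    return sum(ratings) / 16
```

On this scale, the review mean of 2.31 reported by McKelvie (1995) sits between "clear and reasonably vivid" and "moderately clear and vivid," consistent with the earlier observation that respondents cluster toward the vivid end.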
McKelvie (1995) identified an overall mean score of 2.31 with an overall standard deviation of .69 in his review of 38 research studies utilizing the VVIQ.

Instrument Development Overview

The final version of the CQIS is a computer-delivered measure consisting of six orally presented (via audio clips) vignettes, each approximately 1.5 minutes in length. The first is of a beach scene. The five that follow are each designed to assess one aspect of Paivio's framework: Cognitive General; Cognitive Specific; Motivational Specific; Motivational General Arousal; and Motivational General Mastery. Each vignette is followed by two video segments and one audio segment. The video segments are used to obtain ratings of vividness of color and of clarity; the audio clip is used to assess audio clarity. The development of the CQIS occurred in the following steps:

1. Development of Imagery Scripts: The initial versions of the vignettes were conceptualized using Paivio's framework and written to strike a balance between being sufficiently descriptive to stimulate an athlete's imagery while also being general enough to be useful to athletes from multiple sports.

2. Expert Review: Vignettes were reviewed by an expert in hypnosis and imagery. His feedback was used to revise them.

3. Pilot Study: The vignettes then were reviewed and rated by a panel recruited for that purpose. They were further refined on the basis of that feedback.

4. Photographs and Audio Clips: Photographs related to track and field and to cross-country were chosen and edited to create the new continuums for this instrument. As well, audio clips were made.

5. Computer Programming: A computer programmer was hired to develop the measure as it was to be delivered.
6. Psychophysical Scaling: Given the critical importance of these progressions for the magnitudinal scaling approach of the CQIS, a separate validation study was conducted.

7. Focus Group: A panel of master's students in counseling took the nearly final version of the CQIS and provided feedback, which then was used to develop the final version of the measure.

These steps are elaborated in the material that follows.

Imagery Scripts

The scripts written for the CQIS were designed by this researcher and are intended to stimulate an imagery experience for any athlete. To add theoretical support for their content, Paivio's framework for sports-related imagery was used as a guideline as the vignettes were outlined and edited. Given the importance of the vignettes in the CQIS, they were put through expert and peer review to ensure that they were descriptive and vivid, corresponded to Paivio's guidelines, used clear and basic language, were read at an appropriate speed, and could stimulate an imagery experience.

Development

The scripts are each approximately 1.5 minutes in length. A total of six vignettes are used in the CQIS. The first vignette is a beach scene, which provides a baseline for the instrument. The other vignettes are meant to stimulate an imagery experience for each of the five imagery categories in Paivio's framework: Cognitive General, Cognitive Specific, Motivational Specific, Motivational General Arousal, and Motivational General Mastery. To control for gender and vocal bias, recordings of each vignette by 2 men and 2 women were used in the final version of the CQIS, from which the program chooses randomly.

Expert Review

An expert was employed to review the vignettes.
The reviewer was a White male in his 40s who holds a doctorate and a master’s degree in social work, specializes in hypnosis, and held joint faculty appointments in the schools of Dentistry and Social Work at the University of Southern California. In addition to his faculty appointments, his expertise is substantiated by his license as a clinical social worker, membership in the Southern California Society of Clinical Hypnosis (SCSCH), and 25 publications in the field of hypnosis. He reviewed and critiqued the vignettes. His suggestions were used to expand the prompts to deepen the imagery exercise (e.g., the use of such instructions as “Feel it fully” and “Allow this day to come to mind”). His suggestion that the prompts to image environmental descriptors be enhanced also was used.

Pilot Study

The next step was to subject the vignettes to a more systematic review to verify that they accurately represented Paivio’s framework of sport imagery. Also of interest was reviewers’ feedback on the speed of delivery, the language used, and their ability to create an image based on their experience of each vignette. The vignettes were audiotaped to ensure standardization. Each reviewer received a tape, instructions, a questionnaire, and an envelope in which to return the tape with their answers (See Appendix B). The instructions led the reviewers through a series of steps, including instructions regarding body position and relaxation, in order to create a similar listening environment for everyone. After hearing each vignette, the reviewers were asked to provide feedback using the questionnaires. Most items on the questionnaire employed a numeric rating scale, but an additional “General Comments” section was also provided to encourage reviewers to share any reactions to the vignettes. For each of the six vignettes, respondents were asked to respond to four questions:
1.
Were you able to create a mental image based on this vignette?
2. How was the speed or pace during the reading of the vignette?
3. How well did this vignette meet Paivio’s description of (type identified) imagery? (Definition provided)
4. Was the wording in the vignette confusing or complicated?

Ten colleagues, family members, and friends (5 male, 5 female) were recruited to review the vignettes. Their ages ranged from 20 to 56 years. They completed the task independently and returned their questionnaires anonymously. The general feedback was that the wording of the vignettes was consistently clear and easy to understand, and that the speed of delivery was comfortable. More specifically, the average ratings for Question 1 (Were you able to create a mental image based on this vignette?) ranged from 3.11 to 4, with an overall mean of 3.52, where 1 is “No Image At All” and 5 is “Clear as TV Image.” This indicates that the participants were able to create an image that was “Pretty Clear” during the majority of the vignettes. The scores for vignette 3, however, indicate that the participants had the most difficulty generating a vivid image: a 3 rating, “In and Out,” was selected more often than a 4 rating. The free-response data for this vignette prompted changes: more descriptive queries were added to encourage imaging specific distractions and obstacles in greater detail. The participants were asked to image a particular day on which they were challenged in a specific manner and were led through the memory of how they overcame that challenge. Participants were relatively unanimous about the pacing of the vignettes (Question 2). The average score ranged from 2.89 to 3.33, with an overall mean of 3.18. The speed of delivery was not adjusted because it was experienced as “Just Right” the majority of the time.
Responses to Question 3 (consonance of the vignette with Paivio’s description of the identified imagery type) were gathered for vignettes 2 to 6 because the beach scenario was not written to correspond to Paivio’s framework. The average responses ranged from 3.44 to 4.22, with an overall mean of 3.82. Vignette 3 was the only script to receive an overall score below 3.5 (3.44). All other vignettes were believed to “Moderately” meet the description provided. Written feedback suggested that vignette 3 might benefit from the changes already discussed; suggestions related to a specific challenge or obstacle, and the resulting adjustments, should make the cognitive general framework more concrete. It is also possible that this weaker connection to Paivio’s framework hindered the participants’ ability to generate a vivid image, as reported for Question 1. Because a response of “Perfectly” was the only higher rating available (vignettes 2, 4, 5, and 6 all received at least one 5-point rating), the feedback did not suggest that other revisions were necessary. The average scores for Question 4 (concerning confusing or complicated wording in the vignette) ranged from 1.0 to 1.56, with an overall mean of 1.13. The surveys therefore clearly suggest that the wording of all of the vignettes was “Not at All” confusing or complicated, and it was not changed for purposes of clarity.

Photographs and Audiotapes

Photographs. Digital photographs of a sunset and of varying track and field settings were taken to provide a bank of images from which the program randomly selects for the sports-related video progressions. Sporting images include a starting block on a track, two different track stadiums, a long and triple jump runway, and a close-up of track lanes.
All photographs are free of people or other potentially discrepant details that might distract participants during their assessment. Using general sporting images allowed some sport-specific information to be included without attempting to match an athlete’s exact mental imagery experience, which would clearly be extremely improbable if not impossible. Relatable images are, however, intended to enhance the recall ability of the participants. The visual progressions appear as a slide show through the use of Flash technology. This was accomplished by generating multiple versions of the same photograph and compiling them into a video. To create progressions representing changes in the color and clarity of pictures, this researcher altered all photographs using Camedia, image-editing software for PC computer systems. The Flash videos required the creation of 36 different versions of each original, or real-life, photograph. These versions were saved as .jpeg files, 388 pixels wide by 291 pixels high. To portray the most complete range of clarity, original photographs were edited in Camedia using the Blur and Sharpen filters, the most extreme versions being blurred 20 times at one end of the continuum and sharpened 10 times at the opposite end. To soften the endpoints of the progression, a zoom feature was applied 6 times to the most blurred version and 9 times to the most sharpened version. To express varying levels of color, many filters were used during editing to achieve a progression that included Black & White, 16 Colors, and 256 System Colors. The filters used in Camedia included Bilevel, Edge, Black & White, 16 Color, 256 System Colors, Gamma, Brightness, and Contrast. Multiple combinations of these filters were used to achieve a gradual change in the amount of visible color.
Combinations included Gamma at a 2.2 level followed by Bilevel, Edge, or 16 Color. Gamma at a .8 level was also followed by repeated applications of Contrast at negative 50. For the clarity and color dimensions, the application of the filters was replicated for each subsequent photograph to create multiple videos; each endpoint could therefore be recreated with a similar application of filters, and the unedited photo occurred at a similar point in each continuum.

Audio Clips. Sounds used for the beach vignette include crashing waves, seagulls calling, and the ebb and flow of an ocean tide. Sport-related sounds include background crowds, cheers and disappointed sighs of an audience at an outdoor track meet, running on pavement and a treadmill, clanking of weight plates at a gym, and cheers from an audience in an auditorium. Sounds of nature include storming weather, a running creek bed, and birds chirping. The audio progressions play as a continuous sound track. They are a compilation of multiple .wav files either purchased from AudioSparx.com or recorded by this researcher. To create the auditory progressions, .wav files were layered together using Magix Audio Studio software for PC systems. This program allowed each layer of audio to fade in and out independently to create a seamless audio track. The result was a set of audio files that fade in with vignette-appropriate sounds, peak in intensity midway through the progression, and fade back out to silence. No randomization of audio files occurs because only one audio clip was created for each vignette.

Computer Programming

The assistance of a professional computer programmer was necessary to make the CQIS accessible and functional.
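As an illustration of the filter-replication idea described above, the sketch below shows how applying an identical sequence of edits to every photograph yields continuums with comparable endpoints and a comparably placed unedited frame. Camedia is not available for scripting, so this is a hedged stand-in: a simple box blur substitutes for the actual Blur filter, and a grayscale image is represented as a list of lists. None of this is the study’s actual code.

```python
# Illustrative sketch only: build a "clarity continuum" by repeatedly applying
# a simple box blur (a stand-in for Camedia's Blur filter, which is assumed,
# not reproduced) to a grayscale image stored as a list of lists.

def box_blur(img):
    """One pass of a 3x3 box blur over a 2-D grayscale image."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def clarity_continuum(img, max_blur=20):
    """Return frames ordered from most blurred to the unedited original,
    mirroring one half of the CQIS clarity progression."""
    frames = [img]
    for _ in range(max_blur):
        frames.append(box_blur(frames[-1]))
    frames.reverse()  # frame 0 = most blurred ... last frame = original
    return frames

if __name__ == "__main__":
    # A tiny hypothetical 4x4 "photograph" with one bright pixel.
    photo = [[0, 0, 0, 0], [0, 255, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
    frames = clarity_continuum(photo, max_blur=5)
    print(len(frames))  # 6 frames: 5 blur levels plus the original
```

Because the same filter is applied the same number of times to every photograph, any two continuums built this way place their unedited frame at the same index, which is the standardization property the text describes.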
Through conversations with specialists, it became apparent that the software needed to make the CQIS function did not already exist. Therefore, a professional software programmer was hired to develop it.

Software Development. For the programmer to become familiar with the rationale, purpose, and needs of this instrument, meetings were held to conceptualize how the CQIS would need to function. The CQIS went through several revisions before a working prototype was created that fit both the researcher’s needs and the parameters established by the programmer. The result was a unique program that was then installed on a PC desktop and a laptop computer for data collection. The programmer also established the modem connections required to communicate with the secure server he maintains. The program takes participants through all portions of the CQIS and VVIQ (See Appendix H), beginning with the consent form (See Appendix E). Once the athlete has acknowledged consent (a mandatory step), the program immediately takes the athlete to the demographics portion of the instrument. After that is completed in full (a mandatory step), the program randomly launches the assessment phase by presenting either the CQIS or the VVIQ first. When the assessment is finished (a mandatory step), the program automatically connects with the server to send the data to be stored. The participant then receives a unique code to enable later retrieval of their scores. Creating new software allows very detailed and nuanced data to be collected in an efficient and secure manner. For example, the video continuums play at a predetermined speed of one frame per second. This presentation does not vary, which leaves very little room for error in identifying the exact moment at which the participant stops the clip to best represent their mental imagery. This precision makes analysis more accurate and scoring more meaningful to the researcher and the participants. The use of this computer technology also enables multiple randomizations, including the order of administration, the gender of the vocal recordings of the vignettes, and the images used in the video progressions.

Psychophysical Scaling of the Video Progressions

To improve the range of detectable imagery ability, this instrument utilizes magnitudinal scaling. This required creating a continuum rather than a set of discrete choices. The video progressions of the CQIS allow participants to express their level of imagery by stopping a video. To develop these videos, a photograph was altered using software editing filters. The software program, Camedia, allowed this researcher to develop a computer-generated progression along the visual dimensions of color and clarity of a photograph. However, for data gathered by the CQIS to be meaningful, the progressions needed human confirmation. Only then could the athletes’ responses during the video progressions of their CQIS administration confidently convey a level of color and clarity for their own mental imagery.

Direct-Estimation Method. To ensure that the color and clarity dimensions were being scaled accurately, this researcher utilized the direct-estimation method of psychophysical scaling. This allowed for a comparison between perceived change in the stimuli, or photographs, in terms of their quality of clarity and color, and the actual physical change from altering the pictures.
The direct-estimation task was accomplished by creating sets of photographs from the videos and having subjects rate their impressions of the photographs. To standardize the task, the participants were shown a picture that they could refer to throughout the task. This anchor, or reference slide, was given a score of 100 and was labeled A. The remaining pictures were labeled B through S, so that half of the pictures from each slide show were rated at a time. The sunset photographs in the clarity and color video progressions were chosen because they were the only non-sporting pictures, and the participants targeted for this validation were not necessarily involved in athletics. Photographs from the video progressions were separated by their slide number into odd- and even-numbered groups. Because each video continuum was created in a standardized manner with the same editing process, it was not necessary to validate each video individually. Each picture was reduced to a size of 1.63 inches high by 2.17 inches wide to fit onto 2 standard-sized pages (See Appendix G). Direct estimation, or numeric estimation, is considered the “most direct and widely used method for constructing ratio scales of sensation” (Lodge, 1981, p. 7) and was used for this psychophysical scaling exercise. Specifically, a direct-estimation task has the participant compare a set of stimuli, in this case photographs, to a reference by assigning a number to express the strength or magnitude of the difference they perceive. The reference photograph from the video continuum was chosen at random from the 5 mid-point slides. The reference was labeled slide A and was given a score of 100.
The task for the participants in this scaling exercise was to compare the other 17 pictures to the reference on the basis of the color or clarity definitions provided and to express their comparison by giving whatever score seemed appropriate. The directions gave examples of possible scoring (See Appendix G):

Here is a set of photographs, labeled A-S. Note that some of the pictures are more clear as real life than the first picture, A, and some are less realistically clear. The first picture is your reference. We will give it the number 100. Your task is to say how clear as real life the other pictures are in comparison to the first picture by giving each photo a number compared to 100. If you feel the photo is twice as clear it would receive a score of 200. If you feel the photo is half as clear it would receive a 50. The more clear as life a photo is in comparison to the first picture, the bigger the number you should give it compared to 100. The less clear as life a picture appears to be in comparison to the first picture, the smaller the number you should give it compared to 100. Just give each picture a number that seems appropriate.

Here is a set of photographs, labeled A-S. Note that some of the pictures are more colorful as real life than the first picture, A, and some are less realistically colorful. The first picture is your reference. We will give it the number 100. Your task is to say how colorful as real life the other pictures are in comparison to the first picture by giving each photo a number compared to 100. If you feel the photo is twice as colorful it would receive a score of 200. If you feel the photo is half as colorful it would receive a 50. The more colorful as life a photo is in comparison to the first picture, the bigger the number you should give it compared to 100. The less colorful as life a picture appears to be in comparison to the first picture, the smaller the number you should give it compared to 100.
Just give each picture a number that seems appropriate.

For the purposes of this study, questionnaires were limited to 2 pages to enable quick visual scanning of all pictures. Printing was done on a high-quality color Xerox machine using laser paper to ensure the highest color resolution and consistency of printing.

Participants

Data were collected at a community college in the general quad during a lunch hour. This researcher approached 20 people, all of whom offered their time to complete the exercise, which took between 3 and 5 minutes. The ages of participants ranged from 18 to 45 years. Twelve men and 8 women of varying racial and ethnic backgrounds participated.

Focus Group

When the CQIS program was near completion, a class of approximately 20 master’s counseling graduate students at the University of Southern California reviewed it and provided feedback. In the interest of time, the administration was done in a group format by connecting the computer to a conference room projection system. Unfortunately, the audio did not connect with the room’s speakers and played only from a laptop at the front of the room. This deficiency of the audio component is of concern because of the large size of the room and the degradation in audio quality when the volume was raised to 100%. Several panel members had to move closer to hear the vignettes. The panel was instructed to imagine that they were individually taking the instrument, even though this researcher was the only person interacting with the program. The program was allowed to run through the consent form, demographics, instruction screens, all vignettes, and the video and audio progressions. Once finished, an interactive question-and-answer session occurred between the principal researchers and the panel.
Questions posed to the panel were, “Were the screens easy to understand, or were you ever confused by the language or presentation?,” “Were you ever confused by what the task was asking you to do?,” and “Were you able to express yourself using the video and audio continuums?” The panel was also given time to share any other thoughts or concerns. Most feedback centered on the panel’s ability to accurately express their imagery experience using the video continuums. Three reviewers indicated that the depicted image was incompatible with their own imagery. For example, one member repeatedly stated, “My beach had people on it,” and did not feel he could have selected any point during the sunset image continuum because the beach photograph used was of an empty shoreline at sunset. The majority of the panel did not share this concern, however, and it should be noted that a former track and field athlete stated she had “No problem” completing the assessment tasks. No suggestions were offered regarding the presentation, wording, or format of the individual screens. Several members did have comments on additions to the audio clips. Specifically, one member suggested adding heartbeat and breathing sounds to the audio file related to sounds of training. The researcher shared that these sounds were already incorporated into that sound file but were inaudible in this instance because of the sound quality from the laptop speakers. The consensus was that this researcher would independently check the volume of those additional sounds prior to the final version of the program. Sound quality is not a concern for the actual administrations to athletes because they will listen through good-quality headphones provided by this researcher.

Focus Group Outcomes. Two major changes resulted from this panel review.
The first was more explicit instructions prior to the video and audio progressions. The instructions during the panel review were, “You are about to see an image that is going to change on its own. When the image best matches the quality of the mental imagery you just experienced, press the space bar...” Because a few members did not understand what the word “quality” meant in these instructions, it became evident that more detailed instructions were needed. Also, one of the panel members observed that her mental imagery changed, moving in and out of focus; she was confused about which moment to stop the video because she could have selected both a blurry and a focused slide. The resulting instructions included very precise language to define the word “quality” and clarified when to stop the progression for each dimension being assessed. In addition, instead of a single instruction for all assessment tasks, three different instructions were created for the clarity, color, and auditory continuums. The second change was the addition of a larger selection of photographs for the video progressions. Initially, the program randomly selected from only 2 different images for the 5 sports vignettes. The number of photographs was doubled to reduce the number of duplicate images per administration. The reason provided by the panel was fatigue and monotony from seeing the same images repeatedly, and fear that participants would tune out the task at hand. Photographs of different track and field images (a triple and long jump runway and a starting block) were added, in addition to distant photos of the entire track and stadium, to keep participants engaged and anticipating the video progressions.

The Final Version of the CQIS

These multiple steps resulted in the version of the CQIS that was employed in the validation study.
After acknowledging their consent and providing demographic information, participants proceed through the measure as follows. Participants are given written instructions about the imagery task they are about to attempt, along with specific steps to enable a successful and standardized imagery experience (See Appendix F). Specifically:

You are about to hear a scenario. Try to imagine the situation described. You will be asked about any imagery you may have experienced immediately after the scenario is done. Put your headphones on at this time. Make sure there are no distractions around you. When you are sitting with your legs uncrossed, feet on the floor, and your arms resting at your sides, select “Ready to Listen” to begin. Close your eyes after making this selection as the situation plays.

Once the participant has acknowledged readiness to begin, he or she is presented with an audio reading of the first script (See Appendix A). After the vignette is completed, the participant is verbally instructed to “Open your eyes,” as the program transitions to the assessment phase. Prior to each assessment task, an instruction screen appears. The instruction screen for the first continuum reads (See Appendix F):

You will now see an image that is going to change on its own. The picture is not intended to perfectly match what you may have seen during the scenario. The task is for you to stop the picture when it best resembles the most colorful moment of your imagery experience. When it does, select “That’s It!” If you miss the exact moment when the image is changing, don’t worry because the image will change back a few more times.

For the second continuum (See Appendix F):

You will now see another image that is going to change on its own. The picture is not intended to perfectly match what you may have seen during the scenario.
The task is for you to stop the picture when it best resembles the clearest moment of your imagery experience. When it does, select “That’s It!” If you miss the exact moment when the image is changing, don’t worry because the image will change back a few more times.

And for the third continuum (See Appendix F):

You will now hear sounds changing. This audio is not intended to perfectly match what you may have heard during the scenario. The task is for you to stop the audio when it best resembles the clarity of sounds during your imagery experience. When it does, select “That’s It!” If you miss the exact moment when you hear the sound, don’t worry because it will play back a few more times.

Once participants read each instruction, they cued the program to begin playing a video or audio progression. At this time, the participant saw an image or heard a sound track that changed back and forth along a continuum. The videos are distortions along two dimensions: clarity (the image changed from fuzzy and indefinable to perfectly clear to overly sharp) and color (the image changed from black and white to 16 colors to full-spectrum color to unnatural or exaggerated color). The auditory progressions play distorted sound files (the sound changed from inaudible and muffled to perfectly clear and back to muffled). Once participants had assessed their mental imagery on all three continuums by stopping the image or sound to express the most vivid moment, they were instructed to proceed to the next vignette and repeat the same assessment tasks. This process continued until all six vignettes and queries were completed. Different images were randomly chosen to appear in the video window for each distortion. A blank video box was present during the audio evaluation; the participant heard only the sound clip.
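Because each progression plays at the fixed rate of one frame per second and sweeps back and forth along its continuum, the frame a participant selects can be recovered deterministically from the elapsed time at the key press. The sketch below illustrates that mapping; the triangular back-and-forth sweep and the 36-frame count follow the description in this chapter, but the function itself is a hypothetical reconstruction for illustration, not the CQIS program’s actual code.

```python
# Hypothetical reconstruction: map a participant's stop time onto a continuum
# frame. The CQIS plays one frame per second, and the progression "changes
# back" (oscillates), so elapsed time folds into a triangular sweep.

def frame_at_stop(elapsed_seconds, n_frames, fps=1.0):
    """Return the continuum frame index on screen when the space bar was hit."""
    pos = int(elapsed_seconds * fps)
    cycle = 2 * (n_frames - 1)      # one forward sweep plus one backward sweep
    pos %= cycle
    return pos if pos < n_frames else cycle - pos

if __name__ == "__main__":
    # With the 36-frame continuums described above, a press 40 seconds in
    # lands on the backward sweep:
    print(frame_at_stop(40, 36))    # frame 30
```

This deterministic mapping is what makes the scoring precise: as the text notes, an unvarying playback rate leaves very little room for error when identifying the exact moment the participant stopped the clip.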
Procedure

Forty-seven athletes from 8 NCAA Division I universities throughout the United States, 2 community colleges in Southern California, and several professional training groups were recruited to participate. Recruitment was done through e-mail, mail, phone calls, and in person. Coaches were contacted and informed of the purpose, benefits, and importance of this validation study with the hope that they would encourage their athletes to participate (See Appendix C). No monetary or other reimbursement was offered as an incentive. In order to access a high number of athletes from a wide geographic area, athletes who were participating in the 2004 Mt. SAC Relays, a prestigious track and field meet held annually at Mount San Antonio College, were recruited to participate. Athletes who had finished competing for the day were welcome to take the assessments. More than half (29; 60%) of the total sample was recruited at the Mt. SAC Relays. Prior to setting up computers at the college campuses, the researcher was invited to discuss the study with the athletic teams involved during team meetings. This presentation consisted of handing out the consent form and an instruction sheet for accessing the computer program (See Appendix D). Participants also were informed that they would not receive a score at the time of their assessment, but that the researcher would inform them when their scores would become available at a later date, accessible through the unique code provided at the conclusion of the assessment. The computer-based instrument was made available on PC computers at track and field facilities for the athletes to complete. A desktop and a laptop were used.
To minimize differences in presentation between the 2 computers, the monitor purchased to accompany the desktop, a 17-inch ViewSonic, was chosen to be of similar size, shape (flat screen), and color quality as the laptop. The researcher also provided identical headphones for each of the two computer stations, along with pen and paper for participants to write down their unique code when they completed the instruments. The athletes took the instrument in a semi-private room with minimal distractions. Because the CQIS was made available on a computer, it was not necessary for this researcher to be present to administer the instrument or collect data. A hard copy of instructions for activating the program was left for the participants, and contact information was taped to the side of the computer monitor in case any problems accessing or using the program occurred. For security reasons, the researcher was present while the athletes completed the CQIS on the laptop. Participants began by completing a consent form (See Appendix E) that explained the purpose, potential benefits and risks, and procedures of the study. They were asked to read it and acknowledge their voluntary participation in the study. They were also informed that they would not receive a “score” immediately after completing the instrument, but were instructed to write down the code assigned at the completion of the instrument in order to review their performance at a later date. Participants then were asked to provide information about their sex, age, ethnicity, primary sport of participation, level of sport participation and competition, athletic scholarship status, and prior experience with imagery instruction and use. The participants were instructed to base their imagery during the administration on their experiences in the sport they had identified as primary.
The CQIS and VVIQ were presented in random order. Like the CQIS, the VVIQ was presented via computer, in text form, with the instructions, questions, and categorical scales available for viewing as they would be in printed form. The participants were able to complete the VVIQ at their own pace and marked their answers using checkboxes. Once the athlete completed both instruments, the program submitted their answers via modem to a secure server, where the data were stored with a randomly assigned identification number. Several safeguards are used to protect the program and data. Only those athletes who have been given access to the computer and a password are able to take the CQIS; the password is provided to the individuals and teams recruited to take the CQIS. The assigned code allows only the participant to access his or her performance profile at a later date. The data collected from each athlete’s test are stored on a server accessible only by the principal researchers involved.

CHAPTER III

RESULTS

This chapter reports results bearing on the validation of the Cooper Quality of Imagery Scale (CQIS). A description of participants’ overall performance on the VVIQ and CQIS is followed by results pertaining to the remaining research questions of this dissertation.

VVIQ Scores

The scores obtained by this sample are very similar to those obtained in prior research, suggesting both that this was a valid administration of the VVIQ and that the sample was typical of those used in other research. McKelvie (1995) reported an overall mean of 2.31, with an average standard deviation of .69 and an average sample size of 51, in his review of 38 research studies using the VVIQ. For this sample, the mean was 2.57 with a standard deviation of .88.
The overwhelming majority (55%) of participants in this study endorsed a 2 rating, “Clear and Reasonably Vivid,” when scoring their mental imagery (see Figure 1). Nearly all of the remaining scores (40%) were split evenly between ratings of 3, “Moderately Clear and Vivid,” and 4, “Vague and Dim.” No athletes in this sample received a 5 rating, “No Image Present at all, Only ‘Knowing’ That you are Thinking of the Object,” and only two athletes recorded a 1 rating, “Perfectly Clear and as Vivid as the Actual Experience.” These response patterns did not differ by gender, age, ethnicity, level of competition, or scholarship status. These results suggest that the VVIQ was a relatively easy instrument for the athletes in this sample and did not serve as a meaningful tool for separating mental imagers into multiple categories of vividness ability. These findings are consistent with earlier critical reviews of mental imagery measures.

Figure 1. Distribution of VVIQ scores. [Histogram of number of athletes by rating not reproduced.]

Demonstration of CQIS Scale Validity

Research question 1: Will the scales, amount of color and degree of clarity, demonstrate satisfactory validity? Specifically, will validity be evidenced by: (a) a statistically significant relationship between independent groups completing a direct-estimation rating task of the psychophysical stimuli that create the scales; and (b) statistically significant relationships between the direct-estimation results and corresponding physical values representing actual change in the scales' pictures.

The validity of the CQIS's magnitudinally scored psychophysical visual scales, color and clarity, was confirmed through a direct-estimation task.
The pictures used to create the scales were rated by subjects according to how well they depicted visual change along the color and clarity continuums as those continuums were defined. These ratings were compared between separate subject pools, and their geometric means were then compared to physical values representing the difference between each photograph and the original, unedited picture. The audio scale was scored through physical estimation only, because the audio clips were created differently from the picture videos. The audio clips consist of a series of sounds spliced on top of one another, varying in intensity, and do not lend themselves to the same form of isolated, individual comparison as the color and clarity videos. Although the degree of change from one second to the next was estimated for scoring, it would have been impossible to isolate each moment of the audio for direct-estimation ratings by groups of subjects to confirm that this scoring matched what listeners actually perceived. To ensure that the color and clarity dimensions were scaled accurately and represented the continuums described to participants taking the CQIS, this researcher used a direct-estimation method for psychophysical scaling and validation. This allowed a comparison between the perceived change in the stimuli (the photographs) in terms of their color and clarity and the actual physical change introduced by altering the pictures. The photographs were separated into odd and even groups of pictures, resulting in color A and B, and clarity A and B, questionnaires. These questionnaires were completed by different groups of raters, groups 1 and 2, for both color and clarity.
The responses to each photograph were averaged by first taking the common log (base 10) of each participant's rating. A geometric mean for each photograph was then calculated by averaging these log values and exponentiating the average (10^x) (see Tables 2A, 2B, 2C, and 2D). The geometric mean was used instead of the arithmetic mean because it is the most appropriate measure of central tendency for magnitude-estimation data (Lodge, 1981). However, original ratings of 0 were adjusted to 1, because the common log of zero is undefined. The geometric means were rounded to the nearest whole number for practicality of scoring. The even slides were compared to a reference, slide 16. Because slide 16's reference value was 100 but it received a score of 94 in the direct-estimation task, each even photo's value was multiplied by .94 to adjust for the difference.

Table 2A. Geometric mean of color scale from direct-estimation task, group 1.

Picture Label   Mean of Logs   10^x   Slide Number
A               2.00           100    16
B               0.86             7    36
C               2.02           135    23
D               1.97           135    19
E               1.38            24     8
F               2.00           124    21
G               1.97           109    15
H               0.33             2     2
I               0.92            30    31
J               1.23            17     6
K               1.00            17    33
L               1.56            15    25
M               1.53            87    27
N               1.95            99    13
O               1.18             1    35
P               1.98           122    17
Q               0.84             7     4
R               1.93            85    10
S               1.55            19    29

Table 2B. Geometric mean of color scale from direct-estimation task, group 2.

Picture Label   Mean of Logs   10^x   Slide Number
A               2.00           100    15
B               1.23            15    34
C               2.13            37    24
D               2.13            99    20
E               2.00            93    14
F               1.94            35    28
G               2.04            95    16
H               0.99            10     5
I               1.98            89    12
J               1.28             8    30
K               1.18            34    26
L               2.09            93    18
M               0.23             2     1
N               1.46            29     7
O               0.12             7    36
P               2.09           104    22
Q               1.84            69     9
R               1.48            10    32
S               0.75             6     3

Table 2C.
Geometric mean of clarity scale from direct-estimation task, group 1.

Picture Label   Mean of Logs   10^x   Slide Number
A               2.00           100    16
B               1.02            10    31
C               2.05           113    25
D               1.89            78     9
E               1.80            34     7
F               2.00           100    27
G               0.88             8     3
H               2.05           112    21
I               1.96            90    15
J               1.99            98    17
K               1.39            25     5
L               0.30             2    35
M               2.10           125    23
N               1.90            80    11
O               1.96            92    19
P               1.49            31    29
Q               0.00             1     1
R               1.84            69    13
S               0.61             4    33

Table 2D. Geometric mean of clarity scale from direct-estimation task, group 2.

Picture Label   Mean of Logs   10^x   Slide Number
A               2.00           100    17
B               1.89            78     8
C               1.93            86    14
D               1.10            12    34
E               2.12           130    22
F               0.78             6     2
G               2.02           104    20
H               1.85            72     6
I               1.92            83    28
J               1.30            20    32
K               2.00            99    18
L               1.93            85    12
M               0.80             6    36
N               1.63            43     4
O               1.97            94    16
P               1.65            45    30
Q               2.15           140    24
R               2.13           134    26
S               1.88            76    10

To determine whether participants in the direct-estimation task ranked the pictures similarly across groups, Pearson correlations between the odd and even groups were examined. The color scale photographs received a Pearson correlation of .85, significant at the .01 level (see Table 3A). The clarity scale photographs received a Pearson correlation of .95, also significant at the .01 level (see Table 3B). This evidence shows that two independent samples rated the pictures similarly in terms of how well they represented the dimensions described for the scales. Thus, different groups of people reached the same judgments about the photographs, adding evidence that the scales have construct validity and measured what they were intended to measure.

Table 3A. Correlations between odd and even photos for the color scale.

                                   Odd Slides   Even Slides
Odd Slides    Pearson Correlation    1.00          .85**
Even Slides   Pearson Correlation     .85**        1.00
Note: N = 18. ** p < .01 (2-tailed)

Table 3B. Correlations between odd and even photos for the clarity scale.
                                   Odd Slides   Even Slides
Odd Slides    Pearson Correlation    1.00          .95**
Even Slides   Pearson Correlation     .95**        1.00
Note: N = 18. ** p < .01 (2-tailed)

To demonstrate the construct validity of the color and clarity scales, the values for each photograph from the direct-estimation task were compared to physical values for each photograph. The physical values were determined by adding to or subtracting from a base score of 100 depending on the number of imaging filters that had been applied. The following image filters were applied in various combinations to achieve the color and clarity changes of each photograph: Black & White, 16 Colors, 256 System Colors, Bilevel, Edge, Gamma, Brightness, and Contrast (color); and Blur and Sharpen (clarity). For pictures altered with a single filter, the corresponding value was adjusted by +/-2. For pictures altered with more than one filter, the corresponding value was adjusted by +/-5. The resulting physical values ranged from a low of 50 to a high of 118 for the clarity scale, and from a low of 48 to a high of 111 for the color scale (see Table 5). The Pearson correlation between the log scores of the color scale's geometric means and the corresponding physical values is .88, significant at the .01 level (see Table 4A). The Pearson correlation between the log scores of the clarity scale's geometric means and the corresponding physical values is .76, also significant at the .01 level (see Table 4B). These findings show that the independent direct-estimation ratings are supported by a physical equivalent: the perceived magnitude of change tracks the physical continuums. This adds credence to the use of these psychophysical scales in assessing vividness of mental imagery.
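The scoring arithmetic described above (log-transforming each rating, averaging, exponentiating, rescaling even slides by the reference factor, then correlating with physical values) can be sketched as follows. This is a minimal illustration with hypothetical ratings; the function names and example data are illustrative, not part of the actual CQIS software.

```python
import math

def geometric_mean(ratings):
    """Geometric mean of direct-estimation ratings via the mean of common logs.
    Ratings of 0 are adjusted to 1, since log10(0) is undefined."""
    logs = [math.log10(max(r, 1)) for r in ratings]
    mean_log = sum(logs) / len(logs)
    return round(10 ** mean_log)  # rounded to a whole number for scoring

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical ratings for one photograph from a group of five raters:
gm = geometric_mean([100, 90, 110, 95, 105])

# Even-numbered slides were rescaled because the reference slide
# (assigned a value of 100) received a mean score of 94 in the task:
adjusted = gm * 0.94

# Construct-validity check: correlate direct-estimation values with the
# physical values assigned from the filter adjustments (hypothetical data):
r = pearson_r([100, 7, 85, 24], [102, 53, 90, 68])
```

The geometric mean damps the influence of a single extreme rating, which is why it is preferred over the arithmetic mean for magnitude-estimation data.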
Table 4A. Correlations between adjusted direct-estimation and physical values for the color scale.

                                                Direct-Estimation Values   Physical Values
Direct-Estimation Values   Pearson Correlation    1.00                        .88**
Physical Values            Pearson Correlation     .88**                      1.00
Note: N = 36. ** p < .01 (2-tailed)

Table 4B. Correlations between adjusted direct-estimation and physical values for the clarity scale.

                                                Direct-Estimation Values   Physical Values
Direct-Estimation Values   Pearson Correlation    1.00                        .76**
Physical Values            Pearson Correlation     .76**                      1.00
Note: N = 36. ** p < .01 (2-tailed)

Table 5. Color and clarity scale ratings: adjusted direct-estimation and physical values.

Slide   Clarity Value   Color Value   Clarity Physical Rating   Color Physical Rating
1         1               2             50                        48
2         6               2             55                        53
3         8               6             60                        58
4        43               7             65                        63
5        24              10             70                        68
6        72              17             75                        73
7        60              28             80                        78
8        78              24             82                        83
9        73              66             84                        88
10       76              85             86                        90
11       75              90             88                        92
12       85              89             90                        94
13       65              94             92                        96
14       86              93             94                        98
15       85             104             96                       100
16       94              95             98                       102
17       92             116            100                       104
18       99              93            102                       106
19       86             128            104                       108
20      104              99            106                       110
21      105             118            108                       112
22      130             104            110                       114
23      118             128            112                       116
24      140              37            114                       111
25      106              14            116                       106
26      134              34            118                       101
27       94              83            113                        96
28       83              35            108                        91
29       29              18            103                        86
30       45               8             98                        81
31        9              29             93                        76
32       20              10             88                        71
33        4              16             83                        66
34       12              15             78                        61
35        2               1             73                        56
36        6               7             68                        51

Demonstration of Discriminant Validity

Research question 2: Will the CQIS scales demonstrate discriminant validity with a commonly used measure of vividness, the VVIQ? Validity will be evidenced by the absence of a significant relationship between participants' CQIS scale and VVIQ scores.
Although discriminant validity is typically established by correlating a new instrument against an already accepted and established instrument that measures a different construct, this form of validity is relevant here. These instruments, although both claiming to measure vividness of mental imagery, actually operationalize the vividness and mental imagery constructs differently and assess performance with different scaling methods. The VVIQ is negatively and nonsignificantly correlated with all CQIS indices: the CQIS color scale and VVIQ have a Pearson correlation of -.10, the CQIS clarity scale and the VVIQ -.17, and the CQIS audio scale and the VVIQ -.20 (see Table 6). The CQIS scales are positively and significantly correlated with one another at the .05 level (see Table 6): the color and clarity scales have a Pearson correlation of .31, the color and audio scales .33, and the clarity and audio scales .30.

Table 6. Correlations between the CQIS indices and the VVIQ mean.

                              Color    Clarity   Audio    VVIQ
Color    Pearson Correlation   1.00
Clarity  Pearson Correlation   0.31*    1.00
Audio    Pearson Correlation   0.33*    0.30*     1.00
VVIQ     Pearson Correlation  -0.10    -0.17     -0.23    1.00
Note: N = 47. * p < .05 (2-tailed)

Demonstration of CQIS Scale Variance

Research question 3: For this sample, will sufficient score variance meaningfully differentiate vividness capability across people? Variance will be evidenced by descriptive analysis demonstrating statistically appropriate means, standard deviations, and an even distribution of scores.

The CQIS measures several dimensions using individual scales: color, clarity, and audio.
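The pattern of correlations in Table 6 can be reproduced in miniature by computing a correlation matrix over the four score vectors. The sketch below uses small, hypothetical score vectors (the study's actual N was 47); `numpy.corrcoef` returns the full Pearson matrix in one call.

```python
import numpy as np

# Hypothetical per-participant scores; illustrative only, not the study data.
color   = np.array([89.0, 95.0, 70.0, 110.0, 85.0, 60.0])
clarity = np.array([87.0, 90.0, 75.0, 100.0, 80.0, 70.0])
audio   = np.array([49.0, 60.0, 30.0,  70.0, 45.0, 35.0])
vviq    = np.array([ 2.5,  2.0,  3.5,   2.0,  3.0,  3.5])  # lower = more vivid

# Each row of the stacked array is one variable; corrcoef returns a 4 x 4
# symmetric Pearson correlation matrix with 1.0 on the diagonal.
r = np.corrcoef(np.vstack([color, clarity, audio, vviq]))

# Discriminant validity would appear as weak (here, negative) entries in the
# VVIQ row/column, alongside modest positive correlations among the three
# CQIS indices.
print(np.round(r, 2))
```

Because the VVIQ is scored so that lower values mean more vivid imagery, even a valid shared construct would show up as a negative correlation with the CQIS indices; the key discriminant evidence is that the magnitudes are small.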
The mean score of the athletes in this sample on the color scale was 89.49, with a standard deviation of 21.69 (see Figure 2). Athletes' scores fell between 24 and 124 on a possible range of 1 to 128. The scores on the color dimension were well distributed: eight athletes scored between 24 and 69 (17%), a larger cluster of 21 athletes scored between 85 and 93 (45%), and a final grouping of 18 athletes scored between 95 and 124 (38%). This distribution is evidence that the athletes expressed a distinctive range of ability to see vividness of color in their mental imagery.

Figure 2. Distribution of CQIS color scale scores. [Histogram not reproduced.]

The mean score of the athletes in this sample on the clarity scale was 87.02, with a standard deviation of 10.28 (see Figure 3). Athletes' scores ranged between 64 and 104 on a possible range of 1 to 140. Scores on the clarity dimension were also well distributed, but within a narrower range than on the color scale: 15 athletes scored between 64 and 80 (32%), a cluster of 14 athletes scored between 85 and 90 (30%), and a final grouping of 18 athletes scored between 95 and 104 (38%). This is evidence that the athletes expressed a definite range of ability to see vividness of clarity in their mental imagery.

Figure 3. Distribution of CQIS clarity scale scores. [Histogram not reproduced.]

The audio scale was scored according to physical values. Because these values were not validated by the direct-estimation method, interpretation of audio scores should be made with caution. Given the inability to state with confidence that the magnitudinal scoring is valid, the audio scale was not averaged with the empirically validated visual scales.
The mean score of the athletes in this sample on the audio scale was 48.89, with a standard deviation of 12.16 (see Figure 4). Athletes' scores fell between 10 and 97 on a possible range of 0 to 100. The scores on the audio dimension were the most evenly distributed of the CQIS indices, falling into five separate clusters: six athletes scored between 10 and 20 (13%), 8 athletes between 25 and 35 (17%), 16 athletes in the middle range of 40 to 50 (34%), 12 athletes between 55 and 77 (25.5%), and the remaining 5 athletes between 82 and 97 (10.5%) (see Table 7). This distribution is evidence that the athletes expressed a wide range of ability to hear sound clearly during their mental imagery.

Figure 4. Distribution of CQIS audio scale scores. [Histogram not reproduced.]

Table 7. CQIS audio scale frequencies.

Score   Frequency   Percent   Valid Percent   Cumulative Percent
10      1           2.1       2.1               2.1
15      2           4.3       4.3               6.4
17      2           4.3       4.3              10.6
20      1           2.1       2.1              12.8
25      1           2.1       2.1              14.9
30      4           8.5       8.5              23.4
32      1           2.1       2.1              25.5
35      2           4.3       4.3              29.8
40      1           2.1       2.1              31.9
42      4           8.5       8.5              40.4
45      6          12.8      12.8              53.2
47      2           4.3       4.3              57.4
50      3           6.4       6.4              63.8
55      1           2.1       2.1              66.0
57      1           2.1       2.1              68.1
60      2           4.3       4.3              72.3
62      1           2.1       2.1              74.5
65      1           2.1       2.1              76.6
67      1           2.1       2.1              78.7
70      3           6.4       6.4              85.1
77      2           4.3       4.3              89.4
82      1           2.1       2.1              91.5
87      2           4.3       4.3              95.7
92      1           2.1       2.1              97.9
97      1           2.1       2.1             100.0
Total   47        100.0     100.0

CHAPTER IV

DISCUSSION

This dissertation reviewed the development and validation of a new measurement tool designed to assess the vividness of sporting mental imagery. The resulting instrument, the Cooper Quality of Imagery Scale (CQIS), demonstrated an ability to assess mental imagery performance on several dimensions of vividness (color, clarity, and audition) by redefining the vividness construct and seeking out new ways of assessing its properties.
This chapter will review the development of the CQIS: its purpose, rationale, scaling methods, and computer technology. A discussion of the overall VVIQ and CQIS performance of the final athletic sample will follow, addressing the findings as they pertain to this study's research questions. The limitations of the development and validation stages will then be presented. Finally, the future implications of the CQIS's potential use in sport psychology research will be offered.

CQIS Development

The CQIS was put through expert and peer review throughout several phases of its construction. The development of this new instrument began with a formal definition of vividness of mental imagery: generating from memory and/or imagination an internal experience that is rich in visual color and clarity, audition, and physical sensation in response to verbal instruction (Ahsen, 1995; Campos, 1995; Kunzendorf, 1995; McKelvie, 1995). Defining differences in mental imagery performance based on the vividness of a dynamic experience distinguished this effort from past attempts to restrict vividness to a single dimension. To accurately assess the vividness of mental imagery of athletes, detailed scripts designed to elicit an imagery experience rich in color, clarity, and sound were developed. The CQIS scripts were designed with the training and performance needs of the athlete in mind; Allen Paivio's (1985) framework outlining several forms of sporting mental imagery was used to guide this phase of development. Once the scripts were complete, confidence in their use was confirmed through peer and expert review assessing use of language, pacing, ease of image generation, and connection to Paivio's framework. To enable the CQIS to measure vividness along multiple, dynamic dimensions, a different method of scaling was sought.
The magnitudinal scaling of psychophysical stimuli was chosen to enhance the CQIS's ability to accurately and consistently measure these internal processes. The color and clarity scales were developed by altering photographs related to, but not attempting to match, the mental imagery experiences of participants. Using computer programs, the photographs were altered in a standardized manner to create continuums along the color and clarity dimensions. The audio scale was developed by splicing together a series of sound clips that increase and decrease in audibility and intensity. Validating the color and clarity scales was accomplished through direct estimation of the photographs, testing whether the changes along each construct's dimension were identifiable by the human eye and perceived as accurately
demonstrating the range for color and clarity as it was defined. Human perception of these changes was compared to assigned physical values for each photograph. The scales were scored using the geometric means obtained from the direct-estimation task for each photograph in the video continuums. These values were supported by evidence that the changes perceived by two samples were in line with the actual physical values of the scales. It should be noted that the audio scale was not put through this type of validation; its scoring was based solely on physical estimation. Computer technology became integral to the CQIS's development, completion, and use. To enable participants to interact with the magnitudinal scales and listen to the imagery scripts, a computer program was designed. A programmer developed the CQIS as Windows-based software that could be installed on any PC. The specification requirements are a recent edition of Microsoft Windows, a color monitor, sound capabilities, and a modem connection to a server hosted by the programming expert. The modem connection is critical so that the server hosting the image and audio files and the computer running the program software can communicate. The server stored all gathered data immediately after each administration.

VVIQ

The VVIQ is a 16-item, categorically scaled instrument that measures vividness of mental imagery as a unidimensional construct, assuming mental imagery is a static ability. This quick assessment has the advantage of saving time in administration and scoring. Unfortunately, its use of a non-standardized 5-point scale raises questions about the VVIQ's validity, and the brevity of the scripts used to stimulate “vivid” imagery is also grounds for concern. The VVIQ results suggest that this was a valid administration. The overall mean (2.57) and standard deviation (.87) are typical of published research findings: McKelvie's (1995) review of 38 studies (average N = 51) that used the VVIQ observed an average mean of 2.307 and an average standard deviation of .692, with sample sizes comparable to this study's (N = 47). There were no significant differences in performance when controlling for gender, age, ethnicity, level of competition, prior experience with mental imagery, or utilization of mental imagery. Given the limited range detected by the VVIQ in this and other samples, its usefulness as a research tool in predictive studies is limited.
The VVIQ separated the athletes in this sample into two groups: those who scored 1's (N = 2), “Clear as Normal Vision,” and 2's (N = 26), “Clear and Reasonably Vivid”; and those who scored 3's (N = 9), “Moderately Clear and Vivid,” and 4's (N = 10), “Vague and Dim.” Prior reviews have described this separation as groups of “vivid” and “non-vivid” imagers (McKelvie, 1995).

Demonstration of CQIS Scale Validity

The CQIS's dimensions of mental imagery vividness were scaled and validated through direct estimation, a task used to assess the accuracy of psychophysical stimuli. Four groups of five to six people were asked to rate the photographs used to create the color and clarity continuums. The task required subjects to rate each photograph individually against a reference slide. The reference slide was a photograph from the same continuum, chosen from the midpoint and assigned a score of 100. Each color and clarity video comprised 36 photos, which were separated into two groups of 18 slides by odd and even numbers. Once the ratings for each photo were collected, the geometric mean of the averaged log scores was calculated. The means for the even slides were later adjusted based on the reference slide's rating of 94 on this same task. The validity of the color and clarity scales is established through the significant (.01 level) and positive (.85 and .95, respectively) Pearson correlations between two independent samples. These correlations were confirmed by comparing the geometric means from the direct estimation to the physical values (color = .88, clarity = .76). It can therefore be stated with a high level of confidence that the color and clarity magnitudinal scales measure what they were intended to measure: the photos change along continuums expressing varying amounts of realistic and unrealistic color, and fuzzy and sharp clarity.
Demonstration of Discriminant Validity

To offer additional evidence of the CQIS's accuracy, discriminant validity between the color, clarity, and audio indices and the VVIQ was established through nonsignificant, negative Pearson correlations. These instruments operationalize the vividness and mental imagery constructs differently and assess performance with different scaling methods. The VVIQ had negative, nonsignificant Pearson correlations with all CQIS indices: color scale, -.10; clarity scale, -.17; and audio scale, -.20. It is quite clear that the VVIQ and the CQIS are measuring different constructs. Since the VVIQ has never been put through a validation study of this kind, it is possible that the VVIQ is not measuring vividness of mental imagery as it is assumed to do. The VVIQ has a history of correlating with instruments intended to measure controllability of mental imagery rather than vividness, which has added to the confusion about when it is appropriate to use the VVIQ as opposed to instruments measuring controllability. The low, albeit statistically significant, Pearson correlations observed among the CQIS indices suggest that they are measuring related but distinct dimensions: the color scale correlated modestly with the clarity (.31) and audio scales (.33), and the clarity scale correlated modestly with the audio scale (.30). Since the CQIS conceptualized vividness of mental imagery as a dynamic, internal experience rich in color, clarity, and audition, these Pearson correlations are encouraging. The CQIS intended to measure richness of visual color, clarity, and audition as separable dimensions of the mental imagery experience, and the correlations support that this independent assessment occurred.
Demonstration of CQIS Scale Variance

The CQIS index score distributions were compared to the VVIQ's to determine their usefulness in distinguishing groups of imagers by vividness of sporting mental imagery. The trends show a broader range of scores, with more distinctive clusters, identified by the CQIS than by the VVIQ. The descriptors used to identify these clusters are for discussion purposes only and are not intended as formal labels for the CQIS scales. The color, clarity, and audio index scores break into separate groups when examined independently. The ranges of scores along the color and clarity continuums were narrower than expected: the lowest scores obtained averaged about 51, with 1 the lowest score possible, while scores did reach near the highest possible values (124 out of a maximum of 128). Scoring trends in these scales reveal three clusters each, separated by concentrations of scores and based on participants' answers relative to the sample for each scale. The color scale differentiated responses into low scorers, 24 through 67 (17%); moderate scorers, 85 through 93 (45%); and high scorers, 95 through 124 (38%). The clarity scale likewise separated participants into low scorers, 64 through 80 (32%); moderate scorers, 85 through 90 (30%); and high scorers, 95 through 104 (38%). The audio index, the only scale not validated through direct estimation, detected the broadest distribution of scores, between 10 and 97 out of a possible 0 to 100. Scores fell into five groups of audio clarity: poor scorers, 10 through 20 (13%); low scorers, 25 through 35 (17%); moderate scorers, 40 through 50 (34%); above-average scorers, 55 through 77 (25.5%); and high scorers, 82 through 97 (10.5%).
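The sample-specific cluster labels above amount to a simple bucketing rule over ascending cutoffs. The sketch below illustrates this with the color-scale clusters reported for this sample; the cutoffs and function name are illustrative only, not fixed CQIS norms.

```python
def cluster_label(score, cutoffs, labels=("low", "moderate", "high")):
    """Return the descriptive label for a score given ascending upper bounds.
    `cutoffs` holds the upper bound of every cluster except the last."""
    for bound, label in zip(cutoffs, labels):
        if score <= bound:
            return label
    return labels[-1]

# Color-scale clusters observed in this sample: low (24-67),
# moderate (85-93), high (95-124). No scores fell in the gaps, so the
# boundary placement between clusters is arbitrary here.
COLOR_CUTOFFS = (67, 93)

print(cluster_label(45, COLOR_CUTOFFS))   # a low scorer
print(cluster_label(89, COLOR_CUTOFFS))   # a moderate scorer
print(cluster_label(120, COLOR_CUTOFFS))  # a high scorer
```

Because the clusters were derived from this sample's distribution rather than from norms, any future use of such labels would need cutoffs re-derived from the population of interest.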
This scoring distribution is not itself evidence of the CQIS's validity, but it is potentially useful for future assessment and research of mental imagery. Given the concern that predictive research studies have been hindered by the restricted range of performance detected by prior vividness inventories, the CQIS may help address this methodological concern.

Delimitations

It is assumed that the CQIS, although limited to a track and field population in this development and validation study, could be applied equally well to other sports. The imagery scripts were written with many sports in mind, including contact, non-contact, individual, and team qualities. Although the photographs for the color and clarity scales, as well as the sound clips for the audio scale, would need to be revised for each sport population, it is expected that the CQIS would not otherwise need to be changed for use with other sporting groups.

Limitations

There were definite opportunities for improvement and refinement throughout the development and validation stages of the CQIS, beginning with the images used for the color and clarity scales. These photos were not tested with an athletic sample for their relatability. Some concern was raised by the final focus group that the images were not “what they had in mind,” and that they had difficulty expressing their vividness of color and clarity because the content of the image did not match the content in their mind's eye. However, this focus group was not composed of athletes, the target group. It is unknown whether athletes in the final validation sample experienced similar issues concerning content, or whether the images elicited a negative reaction hindering their ability to accurately express their experience.
Although qualitative feedback was not formally requested, no such difficulty was mentioned in conversations with participants after their administrations. Unfortunately, it was not possible to retest athletes in order to demonstrate test-retest reliability. The performances by athletes on the CQIS scales are a snapshot of their mental imagery experiences for that particular sitting and cannot be compared to other mental imagery exercises. The computer software used to distort the images that created the videos for the color and clarity scales was not directly intended for this purpose. The software's filters sometimes had to be used in combination to create the effect this researcher desired; no publicly available software could be located that would change still photos along the continuums as they were defined for the CQIS. For example, gradual distortions in amount of color without using multiple filters were possible only by removing some or all of one color, such as red, green, or blue, and this change did not adjust the vividness of the color but rather the saturation of individual colors. Therefore, the values used to determine the correlation between the direct-estimation values and their physical counterparts were estimates and were not assigned by the computer program. Had software been available to create these continuum changes directly, without multiple distortions, the validation process would have been less subjective. The computer program designed to administer the CQIS and VVIQ was also hampered by technology. Problems with the modem connections to the secured server were encountered for two reasons. The most prominent difficulty was maintaining the stationary computer.
There were instances of athletes being unable to access the program because the host server, in this case a university’s, was not recognized by the secure server hosting the program and data collection, and the connection was therefore denied. Working directly with university information services allowed their server to “talk” with our secure server, solving this problem. Other unforeseen complications were reception difficulties when using the laptop. Although the speed of the modem connection was adequate, stormy weather slowed the program for a few participants. Power reserves for the laptop were the final issue encountered when gathering data in remote locations.

The settings where athletes took the CQIS and VVIQ were not as controlled as initially planned. Since athletes from varying universities participated, access to office space for a more private administration was not as readily available as this researcher had hoped. Some athletes were forced to take the CQIS and VVIQ in less than ideal surroundings with background noise and other distractions. During these times, the volume of the headphones was raised to drown out other noise, reducing the athletes’ likelihood of becoming distracted. Nearly all athletes reported that this did not hinder their ability to concentrate during their administration.

The sample population, although diverse in ethnicity and evenly split by gender, was not very diverse in age. Given the accessibility of collegiate athletes and the difficulty of accessing professional athletes, this lack of age differential is not surprising. However, it is unknown how more senior or younger athletes would have affected the results of this validation study. In addition, the total N of 47 was considerably smaller than the target range of 75-100. Unfortunately, limited accessibility and the total time of each testing (30 minutes) precluded a larger sample.
Future Implications

This development and validation dissertation demonstrated the level of involvement required to develop a theoretically supported, psychometrically sound instrument to measure the internal experience of a dynamic, psychophysical construct like vividness of mental imagery. It is no longer surprising to this researcher that no new approach to measuring the vividness of mental imagery experience had been published since 1986. The CQIS was developed to meet a specific need within the sport psychology community: a valid assessment tool that would provide a meaningful ability to differentiate between groups of mental imagers. By drawing on the theoretical debate regarding the vividness of mental imagery construct and critical reviews of past assessment tools, and by taking advantage of technological and scaling advances, the CQIS emerged as a new tool for use in scientific inquiry. Through changes in the presentation of imagery scripts and psychophysical stimuli, the CQIS offers the sporting community a computerized, interactive experience to accurately assess vividness of mental imagery.

A major accomplishment of the CQIS is the ability of the color, clarity, and audio indices to differentiate a meaningful range of performance on sporting mental imagery tasks. This variance will enable future research to test predictive hypotheses regarding the impact of mental imagery interventions on sports training and performance by detecting statistically significant results. In the past, such predictive studies have produced discrepant findings that have hindered the sport psychology literature. Mental imagery use is commonplace in the sports world, and yet we do not know enough about its effectiveness with different groups of athletes.
Hopefully, the CQIS will rejuvenate this field of study and encourage the design and validation of other instruments measuring varying aspects of mental imagery.

References

Ahsen, A. (1995). Self-report questionnaires: New directions for imagery research. Journal of Mental Imagery, 19, 107-123.

Anderson, M. P. (1981). Assessment of imaginal processes: Approaches and issues. In T. V. Merluzzi, C. R. Glass, & M. Genest (Eds.), Cognitive assessment (pp. 149-187). New York: Guilford.

Bruning, R. H., Schraw, G. J., & Ronning, R. R. (1999). Cognitive psychology and instruction (3rd ed.). Upper Saddle River, NJ: Merrill.

Campos, A. (1995). Twenty-two years of the VVIQ. Journal of Mental Imagery, 19, 123-129.

Campos, A., Lopez, A., & Gonzales, M. A. (1999). Effect of eyes open/closed and order of rating on VVIQ scores. Journal of Mental Imagery, 23, 35-44.

Cornoldi, C. (1995). Imagery and meta-imagery in the VVIQ. Journal of Mental Imagery, 19, 131-136.

Denis, M. (1985). Visual imagery and the use of mental practice in the development of motor skills. Canadian Journal of Applied Sport Science, 10, 4-16.

Ernest, C. H. (1977). Imagery ability and cognition: A critical review. Journal of Mental Imagery, 2, 181-216.

Eton, D. T., Gilner, F. H., & Munz, D. C. (1998). The measurement of imagery vividness: A test of the reliability and validity of the vividness of visual imagery questionnaire and the vividness of movement imagery questionnaire. Journal of Mental Imagery, 23, 125-136.

Feltz, D. L., & Landers, D. M. (1983). The effects of mental practice on motor skill learning and performance: A meta-analysis. Journal of Sport Psychology, 5, 25-57.

Gilmore, G. B. (1973). An experimental study to determine which of five different practice procedures is more effective in the acquisition of a complex motor skill. Dissertation Abstracts International, 34, 605.

Gordon, R. (1949).
An investigation into some of the factors that favour the formation of stereotyped images. British Journal of Psychology, 39, 156-167.

Hale, B. D. (1982). The effects of internal and external imagery on muscular and ocular concomitants. Journal of Sport Psychology, 4, 211-220.

Hall, C. R. (1985). Individual differences in the mental practice and imagery of motor skill performance. Canadian Journal of Applied Sport Science, 10, 17-21.

Hall, C. R., Rodgers, W. M., & Barr, K. A. (1990). The use of imagery by athletes in selected sports. The Sport Psychologist, 4, 1-10.

Hiscock, M. (1978). Imagery assessment through self-report: What do imagery questionnaires measure? Journal of Consulting and Clinical Psychology, 46, 223-230.

Isaac, A., Marks, D. F., & Russell, D. G. (1986). An instrument for assessing imagery of movement: The vividness of movement imagery questionnaire (VMIQ). Journal of Mental Imagery, 10, 23-30.

Juhasz, J. B. (1972). On the reliability of two measures of imagery. Perceptual and Motor Skills, 35, 874.

Katz, A. N. (1995). What we need is a good theory of imagery vividness. Journal of Mental Imagery, 19, 143-146.

Kaufmann, G. (1995). Stalking the elusive image. Journal of Mental Imagery, 19, 146-150.

Kihlstrom, J. F., Glisky, M. L., Peterson, M. A., Harvey, E. M., & Rose, P. M. (1991). Vividness and control of mental imagery: A psychometric analysis. Journal of Mental Imagery, 15, 133-142.

Kosslyn, S. M. (1990). Mental imagery. In D. N. Osherson, S. M. Kosslyn, & J. M. Hollerback (Eds.), Visual cognition and action: Vol. 2 (pp. 74-97). Cambridge, MA: MIT Press.

Kunzendorf, R. G. (1995). VVIQ construct validity: Centrally excited sensations versus analog representations and memory images. Journal of Mental Imagery, 19, 150-153.

LeBoutillier, N., & Marks, D. F. (1999).
The factorial validity and reliability of the eyes-open version of the vividness of visual imagery questionnaire. Journal of Mental Imagery, 23, 107-113.

Lester, P. M. (2000). Visual communication: Images with messages (2nd ed.). Belmont, CA: Wadsworth.

Marks, D. F. (1973). Visual imagery in the recall of pictures. British Journal of Psychology, 64, 17-24.

Marks, D. F. (1995). New directions for mental imagery research. Journal of Mental Imagery, 19, 153-167.

Matlin, M. W. (1989). Cognition (2nd ed.). New York: Holt, Rinehart, & Winston.

McKelvie, S. J. (1990). The Vividness of Mental Imagery Questionnaire: Commentary on the Marks-Chara debate. Perceptual and Motor Skills, 70, 551-560.

McKelvie, S. J. (1995). The VVIQ as a psychometric test of individual differences in visual imagery vividness: A critical quantitative review and plea for direction. Journal of Mental Imagery, 19, 1-104.

McKelvie, S. J., & Gingras, P. P. (1974). Reliability of two measures of visual imagery. Perceptual and Motor Skills, 39, 417-418.

Milner, A. D., & Goodale, M. A. (1995). The visual brain in action. Oxford: Oxford University Press.

Moran, A. (1991). Conceptual and methodological issues in the measurement of mental imagery skills in athletes. Journal of Sport Behavior, 16, 156-170.

Munroe, K. J., Giacobbi, P. R., Jr., Hall, C., & Weinberg, R. (2000). The four Ws of imagery use: Where, when, why, and what. The Sport Psychologist, 14, 119-137.

Murphy, S. M. (1990). Models of imagery in sport psychology: A review. Journal of Mental Imagery, 14, 153-172.

Murphy, S. M., Jowdy, D. P., & Durtschi, S. K. (1989). Report on the United States Olympic Committee survey on imagery use in sport: 1989. Unpublished research report, U.S. Olympic Committee, Colorado Springs, CO.

Onestak, D. M. (1990).
The effects of progressive relaxation, mental practice, and hypnosis on athletic performance: A review. Journal of Sport Behavior, 14, 247-282.

Ostrow, A. C. (Ed.). (1990). Directory of psychological tests in the sport and exercise sciences. Morgantown, WV: Fitness Information Technology.

Paivio, A. (1985). Cognitive and motivational functions of imagery in human performance. Canadian Journal of Applied Sport Science, 10, 22-28.

Pearson, D. G. (1995). The VVIQ and cognitive models of imagery: Future directions for research. Journal of Mental Imagery, 19, 167-170.

Reisberg, D., & Heuer, F. (1988). Vividness, vagueness and the quantification of visualizing. Journal of Mental Imagery, 12, 89-102.

Richardson, A. (1969). Mental imagery. New York: Springer.

Richardson, A. (1995). Guidelines for research with the VVIQ. Journal of Mental Imagery, 19, 171-177.

Richardson, J. T. E. (1988). Vividness and unvividness: Reliability, consistency and validity of subjective imagery ratings. Journal of Mental Imagery, 12, 115-122.

Salmon, J., Hall, C., & Haslam, I. (1994). The use of imagery by soccer players. Journal of Applied Sport Psychology, 6, 116-133.

Sheehan, P. W. (1967). A shortened version of Betts’ questionnaire upon mental imagery. Journal of Clinical Psychology, 23, 386-389.

Silva, J. M., & Weinberg, R. S. (Eds.). (1984). Psychological foundations of sport. Champaign, IL: Human Kinetics.

Suinn, R. M. (1996). Imagery rehearsal: A tool for clinical practice. Psychotherapy in Private Practice, 15, 27-31.

White, K., Sheehan, P. W., & Ashton, R. (1977). Imagery assessment: A survey of self-report measures. Journal of Mental Imagery, 1, 145-170.

Williams, A. M., Davids, K., & Williams, J. G. (1999). Visual perception and action in sport. New York: Routledge.
Appendix A

(Transcribed Vignette Imagery Scripts)

VIGNETTE 1: Beach Scene

You are walking along the beach in the middle of a warm and sunny day. The sun is getting low on the horizon, a golden yellow. The sky is bright blue and the sand glistens in the sunlight. Feel the cold, wet, packed sand beneath your feet. Smell the salt in the air. Hear the waves crashing into the shore, one after another in perfect sequence. In the distance, you hear the call of a sea gull gliding overhead. You find a spot to sit down and take in the sights, sounds, and smells of the ocean. The colors of the sky are changing as the sun says goodbye to another day. When you look closely you can see every color of the rainbow in the reflection of the water. You sit peacefully and calmly. You are content and soothed by the ocean.

VIGNETTE 2: Cognitive Specific

Imagine a typical practice. Look around... Notice where you are... What are you wearing? What else is going on around you? Is there anyone else there, or are you alone? Can you notice familiar smells in the air? Try to hear the sounds of your training. Now focus on a specific task or routine that would be common for you during this practice. Any routine will do. As you are preparing, recall the things you typically do to set up a successful practice. Perhaps you make adjustments to your uniform, or maybe you have to set up your materials. Now, focus on the task you are about to perform... your technique... proper form. You are fully prepared to begin the practice. See yourself performing the activity to perfection. Can you see it clearly? Feel the action within your muscles and throughout your body. Allow yourself to feel it fully. Are you tense or relaxed? How does your body respond to
the task? Notice your body interacting with the ground... obstacles... objects... others. Allow yourself to flow through the activity. See an accurate completion to your practice.

VIGNETTE 3: Cognitive General

Imagine an upcoming competition. This could be any time when your athletic skills are going to be tested or challenged. What are your goals for this competition? Are you aiming for a new time, distance, or score? Perhaps you are focused on preventing your opponent from accomplishing their time or score? What do you want to accomplish? Now, picture the obstacles you will encounter. Will you face a specific person or group of people? Is there something unique about the setting? Perhaps getting to the competition, or the time of day you will compete, is challenging. Now that you are aware of some obstacles, focus on how you will overcome them. Will you adjust your training? Perhaps you’ll study a competitor or setting to find opportunities for success. Are there changes to your pre-game routine or warm-up that will help you be better prepared? Imagine a revised practice or warm-up. See yourself making these adjustments. Picture the winning game plan for this competition.

VIGNETTE 4: Motivational Specific

Imagine a competition scenario. This could be any time when your athletic skills are being tested or challenged. Picture the task you are facing. Remember it clearly. You are prepared mentally, emotionally, and physically. You have a plan of action, a strategy for success. Take a moment to recall this well-thought-out plan. What are the key elements, the ingredients? The competition approaches. You are ready for the challenge and your performance begins. Your execution is precise and going according to plan. What challenges are you faced with? See yourself responding with successful outcomes.
What strategies did you use to prevail? Now imagine you are nearing the end of the competition. Notice your body’s form and your facial expressions as the moment of triumph takes place. Success is yours as you take in the responses of those around you. Maybe you see familiar faces celebrating your accomplishment. Can you hear these final moments? If so, what does victory sound like? Allow your feelings to run through you and savor this moment.

VIGNETTE 5: Motivational General, Arousal

Recall your best performance. Allow your memory to wander back... letting the day come to mind. Now that you have a time in mind, see yourself getting ready to perform. How are you feeling in your stomach... your muscles? What is your breathing like? As you are preparing, remember your mental state. Are you calm... focused... energized? Is there that small amount of nervous energy, or is it very strong, anxious? Think of what your arousal level was like on this successful day. Feel it completely. Notice your heart rate. If you are not feeling up enough, what do you do to achieve your maximal level of arousal? How did you maintain this perfect mental state? Perhaps you had to bring yourself up and down during your performance. How up or down are you as you perform? Maybe your performance was greatest in a state of relaxation. If so, what calmed you on this exceptional day? See yourself controlling your state of mind as you finish your routine.

VIGNETTE 6: Motivational General, Mastery

Recall a time when you performed expertly and there were distractions you overcame. Remember the day intensely. See yourself... the setting... any competitors... Maybe there’s a crowd... teammates... coaches? You remember it all with clarity. In any performance there can be many distractions. It could be that you are performing in an unfamiliar place, an away game or site.
Notice where you are on this successful day. Now, become aware of the distractions you overcame. Perhaps you tuned out the sounds of a taunting crowd or competitors. Maybe the conditions at this competition were particularly challenging: high altitude, severe weather, poor equipment, footing, or injury. Or are your distractions personal, worries and concerns about life issues that only you may know about? Remember these threats to your concentration. Remember them fully. How did you maintain your focus? What strategies did you use to block things out? See yourself narrowing your focus. See yourself maintaining attention on your performance and achieving a quality performance.

Appendix B

(Questionnaires used for the peer review of imagery vignettes)

Please respond to the following questions after vignette 1.

1. Were you able to create a mental image based on this vignette?
(1 = Not at All, Just Thoughts; 2 = Somewhat, In and Out; 3 = Image Was Fuzzy; 4 = Pretty Clear; 5 = Clear as a TV Screen Image)

2. How was the speed or pace during the reading of the vignette?
(1 = Way Too Slow; 2 = Could Be Faster; 3 = Just Right; 4 = Needed More Time; 5 = Way Too Fast)

4. Was the wording in the vignette confusing or complicated?
(1 = Not at All; 2 = Occasionally; 3 = Often; 4 = Repeatedly; 5 = Throughout)

General Comments: (Feel free to provide any additional feedback.)

Please respond to the following questions after vignette 2.

1. Were you able to create a mental image based on this vignette?
(1 = Not at All, Just Thoughts; 2 = Somewhat, In and Out; 3 = Image Was Fuzzy; 4 = Pretty Clear; 5 = Clear as a TV Screen Image)

2. How was the speed or pace during the reading of the vignette?
(1 = Way Too Slow; 2 = Could Be Faster; 3 = Just Right; 4 = Needed More Time; 5 = Way Too Fast)

3. How well did this vignette meet Paivio’s description of cognitive specific imagery? This is imagery that focuses on skill development and execution for sport participation. Proper form and execution are listed as important elements during the imagery exercise.
(1 = No Connection; 2 = Vaguely; 3 = Somewhat; 4 = Moderately; 5 = Perfectly)

4. Was the wording in the vignette confusing or complicated?
(1 = Not at All; 2 = Occasionally; 3 = Often; 4 = Repeatedly; 5 = Throughout)

General Comments: (Feel free to provide any additional feedback.)

Please respond to the following questions after vignette 3.

1. Were you able to create a mental image based on this vignette?
(1 = Not at All, Just Thoughts; 2 = Somewhat, In and Out; 3 = Image Was Fuzzy; 4 = Pretty Clear; 5 = Clear as a TV Screen Image)

2. How was the speed or pace during the reading of the vignette?
(1 = Way Too Slow; 2 = Could Be Faster; 3 = Just Right; 4 = Needed More Time; 5 = Way Too Fast)

3. How well did this vignette meet Paivio’s description of cognitive general imagery? This is imagery that focuses on strategy development and execution. Here an athlete would use imagery to prepare for a competition to help ensure a winning outcome.
(1 = No Connection; 2 = Vaguely; 3 = Somewhat; 4 = Moderately; 5 = Perfectly)

4. Was the wording in the vignette confusing or complicated?
(1 = Not at All; 2 = Occasionally; 3 = Often; 4 = Repeatedly; 5 = Throughout)

General Comments: (Feel free to provide any additional feedback.)

Please respond to the following questions after vignette 4.

1. Were you able to create a mental image based on this vignette?
(1 = Not at All, Just Thoughts; 2 = Somewhat, In and Out; 3 = Image Was Fuzzy; 4 = Pretty Clear; 5 = Clear as a TV Screen Image)

2.
How was the speed or pace during the reading of the vignette?
(1 = Way Too Slow; 2 = Could Be Faster; 3 = Just Right; 4 = Needed More Time; 5 = Way Too Fast)

3. How well did this vignette meet Paivio’s description of motivational specific imagery? This is imagery that is related to a positive performance outcome. The critical element of this imagery is visualizing the winning moment.
(1 = No Connection; 2 = Vaguely; 3 = Somewhat; 4 = Moderately; 5 = Perfectly)

4. Was the wording in the vignette confusing or complicated?
(1 = Not at All; 2 = Occasionally; 3 = Often; 4 = Repeatedly; 5 = Throughout)

General Comments: (Feel free to provide any additional feedback.)

Please respond to the following questions after vignette 5.

1. Were you able to create a mental image based on this vignette?
(1 = Not at All, Just Thoughts; 2 = Somewhat, In and Out; 3 = Image Was Fuzzy; 4 = Pretty Clear; 5 = Clear as a TV Screen Image)

2. How was the speed or pace during the reading of the vignette?
(1 = Way Too Slow; 2 = Could Be Faster; 3 = Just Right; 4 = Needed More Time; 5 = Way Too Fast)

3. How well did this vignette meet Paivio’s description of motivational general arousal imagery? This is imagery that centers on emotive arousal, relaxation, and control. Athletes image the emotions of competition and see themselves controlling their physical and emotional responses.
(1 = No Connection; 2 = Vaguely; 3 = Somewhat; 4 = Moderately; 5 = Perfectly)

4. Was the wording in the vignette confusing or complicated?
(1 = Not at All; 2 = Occasionally; 3 = Often; 4 = Repeatedly; 5 = Throughout)

General Comments: (Feel free to provide any additional feedback.)

Please respond to the following questions after vignette 6.

1. Were you able to create a mental image based on this vignette?
(1 = Not at All, Just Thoughts; 2 = Somewhat, In and Out; 3 = Image Was Fuzzy; 4 = Pretty Clear; 5 = Clear as a TV Screen Image)

2. How was the speed or pace during the reading of the vignette?
(1 = Way Too Slow; 2 = Could Be Faster; 3 = Just Right; 4 = Needed More Time; 5 = Way Too Fast)

3. How well did this vignette meet Paivio’s description of motivational general mastery imagery? This imagery is about maintaining focus, having confidence, and using positive self-talk. These images are centered on maintaining attention, often in the face of distractions that may or may not be a part of the athlete’s sport competition.
(1 = No Connection; 2 = Vaguely; 3 = Somewhat; 4 = Moderately; 5 = Perfectly)

4. Was the wording in the vignette confusing or complicated?
(1 = Not at All; 2 = Occasionally; 3 = Often; 4 = Repeatedly; 5 = Throughout)

General Comments: (Feel free to provide any additional feedback.)

Appendix C

(Recruitment Letter)

June 28, 2004

To Whom It May Concern:

As you are well aware, mental imagery is widely taught and used to enhance performance in competitive sports. It has been found that 90 percent of athletes, 94 percent of coaches, and 100 percent of sport psychologists in the United States utilize mental imagery in their training (Murphy, Jowdy, & Durtschi, 1989). However, the ability to test for differences in imagery ability among athletes has advanced little. To further the study of sport and performance, I am proposing as my dissertation project a method of assessing imagery through computers, more detailed imagery scripts, and video in order to more accurately test for a range of ability among adult athletes. A critical step in developing a testing instrument is to put it through a validation study.
This will allow for the comparison of many individuals’ scores to determine whether the test truly measures what it claims to and to allow the instrument to yield meaningful results in relation to others who have taken the same instrument. Adult athletes who train and participate in baseball, basketball, and track are currently being invited to take this test. There is no cost to you or the athlete, and there are no foreseeable risks to participating in this study. During the assessment, general demographic information such as sport of participation, age, level of competition, gender, and ethnicity will be gathered. All information is highly confidential. At no time will a participant be asked for his or her name. Instead, the athlete will be assigned a code and password to retrieve results at a later date. Information and answers cannot be connected back to any individual athlete, and reporting of results will be kept anonymous.

The measure takes approximately 17 minutes to complete. During the test, participants will be asked to listen to a scenario or vignette. After each vignette they will be asked questions about their mental imagery. To take this test the following are needed: 1) a computer with a CD or DVD drive, 2) a modem or other Internet connection, and 3) a color monitor.

Participating in this study may enhance an athlete’s understanding of their mental imagery. If you know of athletes who would be suitable to take this assessment, please direct them to the website www.qualityofimagery.com for more details. They will be able to request a copy of the instrument, and it will be sent to them or your department free of charge.

Thank you for your support in this process. Should you have questions, please contact Casey Cooper at cackerman01@cox.net or (949) 306-3603, or Dr. Rodney Goodyear at rgoodyea@usc.edu or (213) 740-3267. Dr. Goodyear is the dissertation chair and principal researcher of this project.

Sincerely,

Casey Cooper, MA
USC Counseling Psychology, Ph.D.
Candidate

Appendix D

(General Instructions for Participants)

Imagery Test Instructions

This instrument will be available on a designated computer that will be installed next week, February 9th-13th. The specific location will be announced. After it is installed, you will have until March 12th to take the instrument.

The test will have an icon named: Visualization Test. Double-click the icon. You will be asked for a password. Enter: Victory

The test will begin by itself. Follow the instructions on the screen. The test will take approximately 20 minutes to complete. When you are finished, the test will close itself.

Please do not give this password to other athletes or teams. This is a special screening of the CQIS for USC Track and Field athletes ONLY. Also, please leave the headphones provided for your teammates. If you would like to use your own headphones, just detach the ones provided, but make sure you reconnect them when you are finished.

After everyone has completed the instrument, we will have a seminar on using mental imagery to enhance your performance goals. Thank you for your participation. Should you have any questions or problems with the test or the computer, please call Casey Cooper at (949) 306-3603.

Appendix E

(Consent Form)

University of Southern California
Department of Counseling Psychology

INFORMED CONSENT FOR NON-MEDICAL RESEARCH

CONSENT TO PARTICIPATE IN RESEARCH

Validation of a Quality of Mental Imagery Instrument

You are asked to participate in a research study conducted by Casey Cooper, M.S.,
and Rodney Goodyear, Ph.D., from the Department of Counseling Psychology at the University of Southern California. This validation study will become a doctoral dissertation. You were selected as a possible participant in this study because of your ability to perform mental imagery in the context of athletics. A total of 50-75 subjects will be selected from a variety of sporting groups to participate. Your participation is voluntary.

□ PURPOSE OF THE STUDY

The purpose of this research study is to validate a new measure of mental imagery. This measure attempts to assess your ability to create a mental image.

□ PROCEDURES

If you volunteer to participate in this study, we would ask you to do the following things: The measure will take you approximately 17 minutes to complete. During the test you will be asked to listen to a scenario or vignette that may or may not stimulate a mental image. After each vignette you will be asked questions about your potential mental imagery. There are no right or wrong answers in this test. Make sure that your computer’s sound is turned on and that you have headphones ready for use. You will also be asked to complete another measure of imagery ability, the VVIQ, which is an accepted test to assess mental imagery ability.

□ POTENTIAL RISKS AND DISCOMFORTS

There are no foreseeable risks to participating in this study.

□ POTENTIAL BENEFITS TO SUBJECTS AND/OR TO SOCIETY

Participating in this study may enhance your understanding of your experience of your own mental imagery.

□ CONFIDENTIALITY

Any information that is obtained in connection with this study and that can be identified with you will remain confidential and will be disclosed only with your permission or as required by law. To participate, you will be asked for some general demographic information, but this information is confidential. You will not be asked for your name.
Instead, you will be assigned a code and password. Your information and answers cannot be connected back to you, and reporting of results will be kept anonymous. When the results of the research are published or discussed in conferences, no information will be included that would reveal your identity. The data will be destroyed in accordance with APA guidelines (Ethical Principles of Psychologists and Code of Conduct, 2002): "6.01 Documentation of Professional and Scientific Work and Maintenance of Records. Psychologists create, and to the extent the records are under their control, maintain, disseminate, store, retain, and dispose of records and data relating to their professional and scientific work in order to (1) facilitate provision of services later by them or by other professionals, (2) allow for replication of research design and analyses, (3) meet institutional requirements, (4) ensure accuracy of billing and payments, and (5) ensure compliance with law."

The only people who will have access to the results are the investigators, Casey Cooper, M.A., and Rodney Goodyear, Ph.D., and the computer programmer of the instrument, Bryan Walker, who will maintain the security of the database.

□ PARTICIPATION AND WITHDRAWAL

You can choose whether to be in this study or not. If you volunteer to be in this study, you may withdraw at any time without consequences of any kind. Should you become uncomfortable during any of the vignettes, please remove your headphones. You may also refuse to answer any questions you don’t want to answer and still remain in the study. The investigator may withdraw you from this research if circumstances arise which warrant doing so.

□ IDENTIFICATION OF INVESTIGATORS

If you have any questions or concerns about the research, please feel free to contact Rodney Goodyear, Ph.D., at (213) 740-3267 or Casey Cooper, M.S., at (213) 220-3822.

□ RIGHTS OF RESEARCH SUBJECTS

You may withdraw your consent at any time and discontinue participation without penalty.
You are not waiving any legal claims, rights, or remedies because of your participation in this research study. If you have questions regarding your rights as a research subject, contact the University Park IRB, Office of the Vice Provost for Research, Bovard Administration Building, Room 300, Los Angeles, CA 90089-4019, (213) 740-6709 or upirb@usc.edu.

SIGNATURE OF RESEARCH SUBJECT, PARENT OR LEGAL REPRESENTATIVE

I understand the procedures described above. My questions have been answered to my satisfaction, and I agree to participate in this study. I have been given a copy of this form.

Name of Subject

Name of Parent or Legal Representative (if applicable)

Signature of Subject, Parent or Legal Representative    Date

SIGNATURE OF INVESTIGATOR

I have explained the research to the subject or his/her legal representative, and answered all of his/her questions. I believe that he/she understands the information described in this document and freely consents to participate.

Name of Investigator

Signature of Investigator    Date (must be the same as subject’s)

SIGNATURE OF WITNESS (If an oral translator is used.)

My signature as witness certifies that the subject or his/her legal representative signed this consent form in my presence as his/her voluntary act and deed.

Name of Witness

Signature of Witness    Date (must be the same as subject’s)

Appendix F (Instrument Instruction Screens)

Instructions prior to hearing each vignette:

You are about to hear a scenario. Try to imagine the situation described. You will be asked about any imagery you may have experienced immediately after the scenario is done. Put your headphones on at this time.
Make sure there are no distractions around you. When you are sitting with your legs uncrossed, feet on the floor, and your arms resting at your sides, select “Ready to Listen” to begin. Close your eyes after making this selection as the situation plays.

Instructions prior to completing each assessment task for vividness of imagery on the dimension of color:

You will now see an image that is going to change on its own. The picture is not intended to perfectly match what you may have seen during the scenario. The task is for you to stop the picture when it best resembles the most colorful moment of your imagery experience. When it does, select “That’s It!” If you miss the exact moment when the image is changing, don’t worry, because the image will change back a few more times.

Instructions prior to completing each assessment task for vividness of imagery on the dimension of clarity:

You will now see another image that is going to change on its own. The picture is not intended to perfectly match what you may have seen during the scenario. The task is for you to stop the picture when it best resembles the clearest moment of your imagery experience. When it does, select “That’s It!” If you miss the exact moment when the image is changing, don’t worry, because the image will change back a few more times.

Instructions prior to completing each assessment task for vividness of imagery on the dimension of sound:

You will now hear sounds changing. This audio is not intended to perfectly match what you may have heard during the scenario. The task is for you to stop the audio when it best resembles the clarity of sounds during your imagery experience. When it does, select “That’s It!” If you miss the exact moment when you hear the sound, don’t worry, because it will play back a few more times.
Appendix G (Direct-Estimation Questionnaires)

Questionnaire for Direct-Estimation of Clarity, Labeled Clarity 1.
[Questionnaire screenshot; text not legible in this reproduction.]

Questionnaire for Direct-Estimation of Clarity, Labeled Clarity 2.
[Questionnaire screenshot; text not legible in this reproduction.]

Questionnaire for Direct-Estimation of Color, Labeled Color 1.
[Questionnaire screenshot; text not legible in this reproduction.]

Questionnaire for Direct-Estimation of Color, Labeled Color 2.
[Questionnaire screenshot; text not legible in this reproduction.]