WHAT ABOUT THE CHILDREN LEFT-BEHIND?
AN EVALUATION OF A READING INTERVENTION PROGRAM

by

Giuliana Milagros Klijian

________________________________________________________________

A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION


August 2009





Copyright 2009 Giuliana Milagros Klijian

DEDICATIONS
With heartfelt love, warmth, and deepest appreciation, I dedicate this
dissertation and the journey traveled to reach this goal to my husband, Michael; my
three little girls, Faith, Felicia, and Gabriela; my two angels, Nathaniel and Natalie;
and my Mami.
To Michael, the love of my life, I am so lucky to have you. You have been
there every step of the way, through the ups and downs of life, supporting me,
encouraging me, and loving me unconditionally. Through our life challenges and
loss, you were right there beside me, carrying me when I felt drained and shattered.
You never lost faith, and you gave me the strength and hope that I needed. In this
doctoral journey that I embarked on three years ago, you cheered me on and stepped
up to undertake additional parenting and household roles, every time with a smile.
You always believed in me and gave me the strength when I most needed it. I am so
grateful to have you in my life. You are my soul-mate and I love you so much.
To my beautiful little girls, Faith, Felicia, and Gabriela, you are my
inspiration and my strength. You are my adorable darlings that I love so much. You
bring life to my life, and are truly the sparkle in my eyes. All I want for you is to
embrace your dreams, reach for the stars, be the best that you can be, and never give
up. I hope to encourage you in furthering your education, no matter what career path
you decide on. Remember, life will always be a learning experience.
To my Nathaniel and Natalie, my two little angels that I miss so much, there
is not one day that goes by that I do not think about you. You have been my

inspiration. Your loss has made me stronger and more determined in my endeavors
and aspirations. I will always love you!  
You left your footprints on our hearts forever.
- Anonymous

To my Mami, I am so grateful for the values and morals you have instilled in
me, the educational opportunities you provided me with, and all the sacrifices you
endured to be able to provide betterment for your family. You have been an
extraordinary role model, always guiding me and supporting me. Through this
doctoral journey, your support has been unparalleled. I am so thankful and lucky to
be your daughter. I love you very much.

ACKNOWLEDGMENTS
First and foremost, I would like to thank my wonderful chair, Dr. Dennis
Hocevar, for his continuous commitment, support, and confidence in me throughout
this dissertation process. Without your dedication, encouragement, and guidance, the
completion of this dissertation would not have been possible. I also extend my
sincere gratitude to my committee members, Dr. Richard Brown and Dr. Robert
Keim, whose expertise and wisdom were invaluable. Thank you so much for your
time and support invested in this process. I also want to recognize and express my
thankfulness to Dr. John Zimmer, who was my original third committee member and
unfortunately passed away during this process. All said, it was an honor to have
worked under such great mentors.
To my colleagues and friends of my thematic dissertation group, Josephine
Bixler and Marvin Horner, thank you for all your help and encouragement
throughout this dissertation process. Special appreciation also goes to my dear
classmates of my cohort, Ingrid Jaimes, Liz Blanco, Laura Hernandez-Flores, and
Seema Gaur. Your friendship, support, and encouragement were extremely
instrumental throughout this program. I will always cherish the moments we shared
and endured during this program. You girls were always there 24/7. Special thanks
to my close and long-time friend and colleague Ingrid; together we began right from
the start of the application process, including studying for the GRE, and completed
this journey jointly at the commencement ceremonies. Congratulations
EdD Class of 2009!

I wish to thank Tia Hilda, Tia Chabela, and Nana for all of their
unconditional love, assistance, and support in helping take care of my daughters
during this journey. You were always there willing to help. My family and I are very
fortunate to have such dedicated individuals. You truly brought to life the proverb,
“It takes a village to raise a child.” In my case, it was my three little girls. I could
not have done it without you.
To my dearest and close friends, Deanna, Ingrid A., Neda, Roxana, Sharon,
Sonia, and Sylvia, thank you for your understanding, patience, and support during
these past three years, and for enduring my limited availability. I am really
grateful for all your encouragement and kind words, and for cheering me on,
especially during those stressful moments. But most of all, I am very thankful for
your friendship, which is priceless.
Many thanks also to the Rolling Hills staff who participated in providing the
collected data for this study. In particular, special thanks to the intervention teachers,
Mrs. Afsharian and Ms. Myrick, who provided the bulk of the data. Your input was
important.
To my local Starbucks, you were my home away from home, where many
weekends and evenings were spent studying and writing in that quiet corner, from
the early hours in the morning to the late hours at night. Thank you for your
hospitality and for providing me with that special place.

Last but not least, I want to thank my editor, Dr. Shantanu Duttaahmed. Your
assistance and support through this process have been invaluable. Thank you for your
encouragement and caring, and for believing in me. It was great working with you.

TABLE OF CONTENTS
Dedication  ……………………………………………………………………….. ii
Acknowledgments  ………………………………………………………………. iv
List of Tables  …………………………………………………………………… viii
List of Figures  ……………………………………………………………………. x
Abstract …………………………………………………………………………… xi
Chapter One:  The Problem of Practice ………………………………………….. 1
Chapter Two:  Literature Review  ……………………………………………….. 22
Chapter Three:  Methodology …………………………………………………… 49
Chapter Four:  Results …………………………………………………………… 74
Chapter Five:  Summary, Discussion, and Recommendations  …………………. 86
References  …………………………………………………………………….... 106

LIST OF TABLES
Table 1:  Percentage of Performance Bands by Grade Level, California Standards Test Scores – 2007 for English Language Arts ………… 4
Table 2:  Student Enrollment by Ethnicity in 2006-2007 ………… 7
Table 3:  Student Enrollment by Other Subgroups in 2006-2007 ………… 8
Table 4:  Special Education Enrollment by Ethnicity in 2006-2007 ………… 8
Table 5:  Percentage of 2nd Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups ………… 10
Table 6:  Percentage of 3rd Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups ………… 11
Table 7:  Percentage of 4th Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups ………… 13
Table 8:  Percentage of 5th Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups ………… 14
Table 9:  Percentage of 6th Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups ………… 16
Table 10:  Dependent Variables Measuring Reading Achievement ………… 52
Table 11:  Experimental Group by Grade Levels ………… 60
Table 12:  Experimental Group by Gender ………… 61
Table 13:  Experimental Group by Ethnicity ………… 61
Table 14:  Experimental Group by EL and SPED Statuses ………… 61
Table 15:  Total Participants in Experimental and Control Groups ………… 74
Table 16:  Between-Subjects Factors ………… 75
Table 17:  Pre- versus Post-Intervention English Language Arts (ELA) Achievement Paired Samples Statistics ………… 77
Table 18:  Pre- versus Post-Intervention English Language Arts (ELA) Achievement Paired Samples Correlations ………… 78
Table 19:  Pre- versus Post-Intervention English Language Arts (ELA) Achievement Paired Differences Samples Test ………… 78
Table 20:  Pre- versus Post-Intervention English Language Arts (ELA) Achievement Practical Significance ………… 80
Table 21:  ANCOVA – Tests of Between-Subjects Effects, Dependent Variable: CST ELA 2008 with Covariate: CST ELA 2007 ………… 82
Table 22:  Group Effect on the CST ELA 2008 Estimated Marginal Means ………… 82
Table 23:  Grade Effect on the CST ELA 2008 Estimated Marginal Means ………… 83
Table 24:  Post hoc t-tests of Grade Differences and their Effect Sizes ………… 83
Table 25:  Experimental versus Control Groups Practical Significance ………… 84

LIST OF FIGURES
Figure 1:  2nd Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School ………… 9
Figure 2:  3rd Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School ………… 11
Figure 3:  4th Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School ………… 12
Figure 4:  5th Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School ………… 14
Figure 5:  6th Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School ………… 15

ABSTRACT
Scholars and educators in the field of education continue to express concern
about the “underachievement” of minority students, and suggest that the data reflect
the educational inequity that exists with regard to certain ethnic student populations.
Given that the diversity of our student population continues to grow, the issues
relevant to educational inequity are particularly acute. Current research projects that
by the year 2026, 25 percent of the total student enrollment will be limited English
proficient, and that 70 percent of the total student population will be non-White and
Latino.
The focus of this dissertation study is on Rolling Hills Elementary (a
pseudonym) in Los Angeles County, California. Rolling Hills School faces the
problem of having approximately 58 percent or more students performing below
proficiency levels in English Language Arts (ELA) in the upper grades (fourth, fifth,
and sixth grades). The concern is that, given their grade levels, these students will
soon be moving on to secondary schools without being proficient readers, which will
impact their learning and future success in school and in society.
In this study, I address the problem of poor performance in ELA by students
enrolled in the upper grades at Rolling Hills. The objective of this research is
fourfold: first, to determine whether the Language! intervention program currently
taking place with the fourth, fifth, and sixth grade identified at-risk students
improves their achievement in reading, particularly their performance on the
California Standards Test (CST) ELA; second, to provide data to the school’s
leadership data team on the effectiveness of the implementation of this intervention
program; third, to identify strengths and weaknesses in the implementation of
Language! and to assess if the program should be modified; and lastly, to add to the
research literature on the use of Language!, as this research is currently limited. This
study utilized a pre-post quasi-experimental design. The data were gathered via both
quantitative and qualitative methodologies. Both formative and summative
evaluations were combined in examining the research questions pertaining to this
study. Two groups were used for comparison in evaluating the reading intervention
program, Language!, in this study: an experimental group and a non-equivalent
control group.
The results of this research study enlighten us as to the overall effects of the
reading intervention program, Language!, on the reading achievement of the at-risk
students at Rolling Hills School. Given the findings, the overall intervention effect of
Language! was positive. Although the results did not support the inference that this
intervention program has a positive effect on CST ELA achievement, the reading
intervention program did provide those at-risk students with stronger basic reading
skills.

CHAPTER ONE
THE PROBLEM OF PRACTICE
In 1983, the United States was shocked to hear of the poor quality of our
educational system as described in the report A Nation at Risk (The National
Commission on Excellence in Education, 1983). Even today, the data from the field
research continues to illustrate not only the inadequacy of the education being
offered in our schools, but also the alarming inequality that exists among the diverse
racial and ethnic student populations in our schools.  In the areas of student
achievement and specifically in terms of reading proficiency, the problem is
particularly acute. For instance, in 2007, the National Assessment of Educational
Progress (NAEP) reported that 69% of eighth graders in the nation are reading below
the proficiency level (National Center for Education Statistics [NCES], 2007b). In
breaking down this statistic further, we find that 59% of Whites, 87% of Blacks, 85%
of Latinos, 82% of Native Americans, 85% of socioeconomically disadvantaged students, and
95% of English Language Learner (ELL) students in the eighth grade nationwide are
not proficient readers (NCES, 2007b).
These data reiterate and validate what scholars have been saying for some
time now, that the “underachievement” of minority students reflects the educational
inequity that exists with regard to minorities (Bennett, 2001, 2002; Cummins, 1986;
Gallimore & Goldenberg, 2001; Garcia, 2002). The diversity of our student
population is an ongoing and intricate problem with which urban education continues
to contend. The complexity of this problem is exacerbated by issues of poor
education, educational inequity, desegregation, and the integration of minority and
diverse students into the classrooms. It is imperative that we as a society understand
that the composition of our population is changing dramatically and that these
demographic changes will only continue to increase. As reported by Garcia (2002),
“nearly one-fifth of Americans live in a household in which a language other than
English is spoken” (p. 8). He projects that by the year 2026, 25 percent of the total
student enrollment will be limited English proficient. Moreover, 70 percent of the
total student population will be non-White and Latinos (Garcia, 2002).
Meeting proficiency standards in reading is a problem that students face in
schools across the nation. For example, based on the 2007 California Standards Test
(CST) results, 41% of eighth grade students statewide performed at or above
proficiency levels in reading (California Department of Education [CDE], 2007a).
However, the NCES reported that on the 2007 NAEP, about 22% of eighth graders in
California performed at or above proficiency levels in reading (NCES, 2007a). While
the CST and NAEP represent different tests of reading proficiency, they can provide
insight on how students are performing in reading. Evidently, these numbers indicate
that in the areas of deficient student achievement, lack of reading proficiency is a
specific problem that is being faced across the nation. This is of foremost concern as
“children who struggle with reading in the early grades often remain behind their
peers throughout school, and academic progress in all subject areas suffers”
(McIntyre et al., 2005, p. 99).

Problem Identification
The focus of this dissertation study is on Rolling Hills Elementary. Rolling
Hills School faces the problem of having approximately 58% or more students
performing below proficiency levels in English Language Arts (ELA) in the upper
grades (fourth, fifth, and sixth grades). The concern is that, given their grade levels,
these students will soon be moving on to secondary schools without being proficient
readers, which will impact their learning and future success in school and in society.
Reading is an essential
foundation that students must have, not only to be successful in their education but
also in their lives. Additionally, the data for the school show that on the 2007 CST
ELA, 58% of fourth graders scored below proficiency, while 79% and 69% of fifth
and sixth graders, respectively, also performed below proficiency levels
(Table 1). These percentages are quite alarming, and are of concern to the principal
and her leadership-school data team as these groups of students may not meet
proficiency literacy standards by 2014 as mandated by the No Child Left Behind Act
of 2001 (NCLB) (United States Department of Education [USDE], 2002). Thus,
there is urgency for the implementation of reading interventions within the regular
education program to address this achievement gap.

Table 1
Percentage of Performance Bands by Grade Level
California Standards Test Scores – 2007 for English Language Arts
Performance Bands Grade 2 Grade 3 Grade 4 Grade 5 Grade 6
% Advanced 15% 4% 16% 7% 2%
% Proficient 27% 17% 26% 14% 29%
% Basic 26% 30% 28% 45% 45%
% Below Basic 11% 30% 10% 13% 19%
% Far Below Basic 21% 19% 20% 21% 5%
% Proficient and Above 42% 21% 42% 21% 31%
% Below Proficient 58% 79% 58% 79% 69%


According to the 2007 CST Summary in ELA for Rolling Hills School, the
percentage of students scoring at the proficient or above performance levels in grades 2
through 6 is 31.6%. This percentage indicates that Rolling Hills School is
encountering difficulties in its students reaching proficiency levels as stated by
NCLB, which states that “all children [will] have a fair, equal, and significant
opportunity to obtain a high-quality education and reach, at a minimum, proficiency
on challenging State academic achievement standards and state academic
assessments” (USDE, 2002, p. 1439).
With the accountability movement being driven by NCLB, administrators
and teachers are being called to use data-driven strategies to rigorously improve
student achievement. The importance of data has been growing with the concept of
“data-driven decision-making” (DDDM) or what has been called “continuous
instructional improvement.” Consequently, at the school level, the principal is not
only accountable for being the leader, supervising instruction and student learning,
providing professional development opportunities, and monitoring curriculum
implementation, but also for collecting student data, reviewing the data with teachers,
and having teachers adapt their instruction accordingly. In the classroom, teachers
are accountable for providing a high-quality education and students are accountable
for learning. Both principal and teachers are accountable to identify those struggling
students and provide interventions to them to support their learning.  In short,
schools today must operate within, what has been called in the field literature, “the
era of accountability” (Booher-Jennings, 2006).
In addition, the school already has an overidentification of students in special
education (26.1%), especially a high percentage of Latinos (41.9%). Based on
the school’s data and the directive of NCLB of “hold[ing] schools, local educational
agencies, and States accountable for improving the academic achievement of all
students, and identifying and turning around low-performing schools that have failed
to provide a high-quality education to their students, while providing alternatives to
students in such schools to enable the students to receive a high-quality education”
(USDE, 2002, p. 1440), the school principal along with her leadership-school data
team put an intervention plan, Language!, in place to address the needs of these
students who demonstrate low performance in reading. This plan was

implemented for the school year 2007-2008. My role as researcher in this study was
to evaluate the effectiveness of this intervention at the school site.
Study Setting
Rolling Hills School (a pseudonym) is located in Los Angeles County, about
15 minutes from downtown Los Angeles, in Southern California. The district is in an
urban area with wide diversity within its population. Rolling Hills Elementary
School is one of 20 elementary schools in the Valley Unified School District
(VUSD). This district also consists of three middle schools and five high schools,
and has a total enrollment of 20,826 students (California Department of Education
[CDE], 2007b).
Rolling Hills is a K-6 elementary school with a student enrollment of 402
pupils. Its student population is primarily Latino (63.7%), with 25.9% of the student
population being ELLs (Tables 2 and 3). The second highest represented ethnic
group at Rolling Hills is African American (21.1%) followed by White (10.2%). The
remaining distribution of student ethnicity is as follows:  1% Asian, 0.5% Filipino,
0.2% American Indian/Alaska Native, and 3.2% Multiple/No Response (Table 2).  
Moreover, 85.1% of the student population receives free or reduced-price lunches.  
With regard to the educational level of the parents, this information had a 0%
response rate as reported on the October 2006 California Basic Educational Data System
(CBEDS) data collection and the 2007 Standardized Testing and Reporting (STAR)
Program student answer document. In addition, Rolling Hills School is both a Title I
and a Reading First School.

Table 2
Student Enrollment by Ethnicity in 2006-2007
Ethnic Subgroups No. of Students Percentage
White (not Hispanic) 41 10.2%
Hispanic/Latino 256 63.7%
African American 85 21.1%
Asian 4 1%
Filipino 2 0.5%
American Indian/Alaska Native 1 0.2%
Multiple/No Response 13 3.2%
Total Enrollment 402 n/a


As mentioned before, Rolling Hills’s student population is overidentified in
special education (26.1%) as compared to the accepted 10% (Table 3). Specifically,
41.9% of the students identified for special education are Latino, with White and
African American students each constituting 28.6% (Table 4). Rolling Hills School is
unique when compared to other schools within the district as it provides a wide range
of special education services, and these include blended inclusion, modified
inclusion, resource specialist program (RSP), and special day classes.

Table 3
Student Enrollment by Other Subgroups in 2006-2007
Other Subgroups No. of Students Percentage
Special Education 105 26.1%
ELL 104 25.9%
Re-designated FEP 18 13.8%
Low SES 342 85.1%


Table 4
Special Education Enrollment by Ethnicity in 2006-2007
Ethnic Subgroups No. of Students Percentage
White (not Hispanic) 30 28.6%
Hispanic/Latino 44 41.9%
African American 30 28.6%
Asian 0 –
Filipino 0 –
American Indian/Alaska Native 1 0.9%


Rolling Hills School has a total of 23 fully-credentialed teachers, including
four special education teachers, all meeting the qualification criteria specified by the
No Child Left Behind Act of 2001 (NCLB). The ethnic breakdown of the teaching
staff is as follows:  56% White, 22% Latino, 13% African American, 4% Asian and
Multiple/No Response. The gender distribution of the teaching staff is 78% females
and 22% males.

Analyzing the school’s California Standards Test (CST) English Language
Arts (ELA) data reveals that the second grade has been performing more strongly than
the other grades (Figure 1 and Table 5). For example, in 2004, 44% of second graders
scored at or above proficient levels, with an increase to 55% of second grade
students performing at proficient or above in 2005. But the subsequent years, 2006
and 2007, showed a slight decline, with 43% and 42% of second graders scoring at
proficient or above in reading, respectively. The data show that the overall shift in
reading proficiency from 2004 to 2007 was -2%.

 
Figure 1:  2nd Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School

Table 5
Percentage of 2nd Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups
Subgroups 2004 2005 2006 2007
Special Education 18% 17% * 13%
African American * * * 19%
Hispanic/Latino 51% 56% 44% 58%
ELLs 50% 47% 48% 52%
Low SES 41% 50% 40% 42%
Total 44% 55% 43% 42%



A different picture is depicted when looking at the third grade CST ELA data
(Figure 2 and Table 6). From 2004 to 2007 there was an overall growth of 1% of
third grade students performing at proficient or above. In 2004, 20% of third graders
scored proficient or above in reading, and in 2005, the number went up to 24%. But
then, in 2006, there was a decrease to 16% of third grade students performing at proficient or
above. In 2007, the percentage of third grade students scoring at proficient or above
was 21%.

Figure 2:  3rd Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School
Performance Bands 2004 2005 2006 2007
% Advanced 2% 5% 2% 4%
% Proficient 18% 19% 14% 17%
% Basic 29% 33% 40% 30%
% Below Basic 27% 32% 14% 30%
% Far Below Basic 24% 11% 29% 19%
Source: CDE (2004-2007).


Table 6
Percentage of 3rd Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups
Subgroups 2004 2005 2006 2007
Special Education * 14% * 23%
African American 7% 31% * 21%
Hispanic/Latino 24% 23% 23% 21%
ELLs 4% 5% 15% 7%
Low SES 19% 21% 17% 21%
Total 20% 24% 16% 21%

Looking at the CST ELA data for the fourth grade at Rolling Hills School indicates
that students have been making continuous progress (Figure 3 and Table 7). In 2004,
15% of fourth grade students performed at proficient or above. This number jumped
to 33% in 2005. A slight increase was seen in 2006 where 35% of fourth graders
scored proficient or above, and in 2007, this number increased to 42% performing at
proficient or above. The overall growth for fourth grade students scoring at proficient
or above from 2004 to 2007 was 27%.

Figure 3:  4th Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School
Performance Bands 2004 2005 2006 2007
% Advanced 2% 8% 13% 16%
% Proficient 13% 25% 22% 26%
% Basic 57% 32% 33% 28%
% Below Basic 21% 22% 20% 10%
% Far Below Basic 8% 13% 11% 20%
Source: CDE (2004-2007).

Table 7
Percentage of 4th Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups
Subgroups 2004 2005 2006 2007
Special Education * * 15% *
African American 8% 25% * *
Hispanic/Latino 15% 37% 45% 41%
ELLs 13% 11% 20% 7%
Low SES 15% 37% 40% 42%
Total 15% 33% 35% 42%



Of all grade levels, fifth grade students at Rolling Hills School made the least
progress (Figure 4 and Table 8). Their overall record from 2004 to 2007 shows a
shift of -4%. In 2004, 25% of fifth graders scored at proficient or above in ELA and
31% in 2005. These numbers declined to 22% of fifth grade students performing at
proficient or above in 2006 and to 21% scoring at proficient or above in 2007.

Figure 4:  5th Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School
Performance Bands 2004 2005 2006 2007
% Advanced 9% 4% 5% 7%
% Proficient 16% 27% 17% 14%
% Basic 32% 42% 37% 45%
% Below Basic 30% 15% 17% 13%
% Far Below Basic 14% 13% 24% 21%
Source: CDE (2004-2007).

Table 8
Percentage of 5th Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups
Subgroups 2004 2005 2006 2007
Special Education * * * 8%
African American * 42% * *
Hispanic/Latino 26% 27% 23% 18%
ELLs 16% 9% 0% 0%
Low SES 21% 31% 22% 21%
Total 25% 31% 22% 21%

Lastly, sixth grade CST ELA data demonstrate that 26% performed at
proficient or above in 2004, and this number slightly increased to 31% scoring at
proficient or above in 2005 (Figure 5 and Table 9). However, the number regressed
back to 26% performing at proficient or above in 2006, but increased back up to 31%
attaining proficient or above performance levels in 2007. The overall growth for
sixth grade from 2004 to 2007 was 5%.


Figure 5:  6th Grade Performance Levels on the ELA of the CST in the Last Four Years (2004 to 2007) at Rolling Hills Elementary School
Performance Bands 2004 2005 2006 2007
% Advanced 4% 10% 5% 2%
% Proficient 22% 21% 21% 29%
% Basic 44% 41% 42% 45%
% Below Basic 20% 21% 23% 19%
% Far Below Basic 9% 8% 9% 5%
Source: CDE (2004-2007).

Table 9
Percentage of 6th Grade Students Scoring At Proficient or Above on the ELA of the CST by Significant Subgroups
Subgroups 2004 2005 2006 2007
Special Education * * * *
African American * * * *
Hispanic/Latino 23% 32% 22% 40%
ELLs 6% 0% 0% *
Low SES 22% 24% 22% 31%
Total 26% 31% 26% 31%


Purpose, Design, and Utility
Purpose
In this study, I addressed the problem of poor performance in English
Language Arts (ELA) by students in the upper grades at Rolling Hills Elementary
School. The objective of this research was fourfold: first, to determine whether
the Language! intervention program currently taking place with the fourth, fifth, and
sixth grade identified at-risk students improved their achievement performance in
reading, particularly their performance on the California Standards Test (CST)
English Language Arts (ELA); second, to provide data to the Rolling Hills
Elementary School’s leadership data team on the effectiveness of the implementation
of this intervention program; third, to identify strengths and weaknesses in the
implementation of Language! and to assess if the program should be modified; and

lastly, to add to the research literature on the use of Language!, as this research is
currently limited.
Research Questions
The following research questions guided the quantitative component of this
study:  
(1) Does the implementation of the intervention program Language! have an
effect on the English Language Arts (ELA) achievement of the identified
at-risk students?
(2) Is there a statistically significant gain from pre- to post-intervention in the
experimental group?
(3) Is there a practical gain from pre- to post-intervention in the experimental
group?
The qualitative part of this research study was guided by the following question:
(4) What are the perceived facilitating and impeding factors regarding the
implementation of the intervention program Language! at Rolling Hills
Elementary School?
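Research questions 2 and 3 draw the standard distinction between statistical significance (a paired-samples t-test on the pre- and post-intervention scores) and practical significance (an effect size such as Cohen's d). As an illustrative sketch only, not part of the original study, the two quantities can be computed from matched raw scores as follows; the score values below are hypothetical.

```python
import math

def paired_pre_post(pre, post):
    """Paired-samples t statistic and Cohen's d for matched pre/post scores."""
    n = len(pre)
    diffs = [after - before for before, after in zip(pre, post)]
    mean_d = sum(diffs) / n
    # Sample standard deviation of the paired differences (n - 1 denominator)
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    t = mean_d / (sd_d / math.sqrt(n))  # compare to the t distribution, df = n - 1
    cohens_d = mean_d / sd_d            # effect size gauges practical significance
    return t, cohens_d

# Hypothetical raw scores (e.g., DRP points), NOT data from this study
pre = [18, 22, 15, 30, 25, 20, 17, 28]
post = [24, 25, 19, 33, 27, 26, 18, 31]
t, d = paired_pre_post(pre, post)
print(f"t = {t:.2f}, Cohen's d = {d:.2f}")
```

By Cohen's conventional benchmarks, a d near 0.2 is small, 0.5 medium, and 0.8 large, which is one common way to judge the practical gain asked about in question 3 independently of the p-value asked about in question 2.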
Design of the Study
This study utilized a pre-post quasi-experimental design. The data were
gathered via both quantitative and qualitative methodologies. Both formative and
summative evaluations were combined in examining the research questions
pertaining to this study. Two groups were used for comparison in evaluating the
reading intervention program, Language!, in this study: an experimental group and a
non-equivalent control group. The experimental group consisted of a sample of 45
students from Rolling Hills School identified at-risk in the fourth, fifth, and sixth
grades by the school’s leadership data team. These students were enrolled in the
2007-2008 school year, receiving the intervention program designed by Rolling
Hills’s leadership data team. A matching non-equivalent control group sample was
selected electronically from the Valley Unified School District (VUSD) database for
comparison from 2007-2008 to 2008-2009 school years. This control group sample
was matched to the following variables of the experimental group sample profiles:
(a) grade, (b) gender, (c) ethnicity, (d) CST ELA scores, (e) English learner status, (f) California English Language Development Test (CELDT) overall proficiency levels, and (g) special education identification.
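The matching of control-group students to experimental-group profiles described above can be sketched as a simple exact-match search. This is a hypothetical illustration only: the field names, record format, and first-match selection rule are assumptions, not the VUSD database procedure.

```python
# Hypothetical sketch: selecting a matched non-equivalent control group.
# Field names and the first-match rule are assumptions for illustration.
PROFILE_KEYS = ("grade", "gender", "ethnicity", "cst_ela_score",
                "el_status", "celdt_level", "sped")

def matches(treated, candidate):
    """A candidate matches when every profile variable agrees exactly."""
    return all(treated[k] == candidate[k] for k in PROFILE_KEYS)

def select_controls(treated_group, district_pool):
    """Pair each treated student with the first unused match in the pool."""
    controls, used = [], set()
    for student in treated_group:
        for i, candidate in enumerate(district_pool):
            if i not in used and matches(student, candidate):
                controls.append(candidate)
                used.add(i)
                break
    return controls
```

Because matching is done only on observed variables, the resulting comparison group remains non-equivalent, as the study acknowledges.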
The boundaries of this study included a time frame from the 2007-2008 to
2008-2009 school years, and a focus on the experimental group from Rolling Hills
Elementary and on the matched non-equivalent control group within the Valley
Unified School District, which included Grades 4 through 6.
The summative evaluation consisted of comparing the 2007 and 2008 CST
ELA data of the at-risk students (experimental group) at Rolling Hills School who
received the reading intervention program. Moreover, these students were administered three placement tests (part of the Language! program) in September 2007: (a) Degrees of Reading Power (DRP), (b) Test of Silent Word Reading Fluency (TOSWRF), and (c) Spelling Inventory (SI). These same tests were re-administered to the at-risk students in May 2008. However, only the raw scores of the DRP and TOSWRF were used for comparison, since the focus of this study was on reading. These at-risk students’ 2008 raw scores were compared to their 2007 scores to measure growth. This
study also included the 2007 and 2008 CST ELA data of the students assigned to the
non-equivalent control group.
Triangulation of data was facilitated by the use of formative evaluations. These included, but were not limited to, unobtrusive program observations; unobtrusive
interviews with the Language! teachers and members of the leadership data team
including the principal; and review of school documents at Rolling Hills Elementary
School.  Additionally, purposeful sampling was used to specifically collect data
about the implementation of the Language! program by the teachers. This
information adds to the quantitative data and is valuable for modifications of the
program.
Limitations and Delimitations of Study
Due to the nature of the methodological design utilized in this study, causation cannot be proved: in a pre-post quasi-experimental design, observed gains may be influenced by factors other than the intervention. In other words, the design lacks internal validity because of selection bias, as the experimental and control groups may not be equivalent to begin with (Creswell, 2003; King & Minium, 2003). Thus, the statistical findings of this study indicate (a) if the
implementation of the Language! program results in an observed student
performance growth in ELA that is statistically significant, and (b) if the
experimental group (the students receiving Language! instruction) has a significantly
greater performance than the non-equivalent control group (the students who did not
receive the reading intervention).
Another limitation is that the results of this program evaluation are delimited to the Rolling Hills Elementary School setting and may not generalize to other schools. The main focus of this
study is to evaluate the effectiveness of the Language! intervention program on
student achievement in ELA of the identified at-risk upper grade elementary students
at Rolling Hills School.
Significance of Study
As discussed earlier, the focal objective of this study is to provide information to the leadership data team at Rolling Hills Elementary School about whether the Language! intervention program has an impact on the English
Language Arts (ELA) achievement performance of the students participating in this
program, particularly on the CST ELA. The findings of this study will be utilized by
the school and district. The data will serve in determining whether or not the
Language! reading intervention program should be modified at the school and if a
proposal should be developed in implementing the program at other elementary
school sites throughout the district.
Lastly, the results of this study add to the body of research literature on
Language! as this literature is presently limited (Robinson, 2002), especially with
elementary school children. The unique contribution of this research is its use of a control group: the majority of the existing research on Language! has not involved one, making it impossible to attribute the observed gains in those studies to the use of Language! (Robinson, 2002).
With the high numbers of non-proficient readers in the upper grades (58% to
79%) at Rolling Hills Elementary School, the school’s leadership data team seeks
reading intervention programs that are research-based to address the achievement
gap in ELA. These upper-grade struggling readers will soon transition to middle school with limited basic reading skills.

CHAPTER TWO
LITERATURE REVIEW
“Every educator, parent, and child knows that reading is the most important
skill taught in elementary school.”
— Learning First Alliance

The urgency of addressing our students’ lack of reading proficiency cannot be overstated. Nationally, about 70% of eighth graders scored below proficient levels in reading according to the 2007 results of the National Assessment of Educational Progress (NAEP) (National Center for Education Statistics [NCES], 2007b). Sadly, this quandary has lingered nearly unchanged since the first NAEP in 1992 (NCES, 2007b). With the “wake-up call” (Foorman
& Nixon, 2006, p. 157) of the 1983 A Nation at Risk report and the passage of the 2001 No Child Left Behind (NCLB) Act, the reform movement towards a standards-based education and a corresponding accountability-driven system evolved. The goal: to
provide high-quality education to all students through scientifically-based research
programs.
As these policies were instituted, the research on reading development and
instruction increased (Fletcher & Lyon, 1998; Foorman & Nixon, 2006).
Concurrently, the National Reading Panel (NRP) was convened in 1997 by the
National Institute of Child Health and Human Development (NICHD) as directed by
Congress to help address the reading deficiency faced by the nation. The
goal: “to assess the status of research-based knowledge, including the effectiveness
of various approaches to teaching children to read” (NICHD, 2000, p. 1-1). By early
1999, the NRP presented their report to Congress in which it delineated five key
components in effective reading instruction: phonemic awareness, phonics, fluency,
vocabulary, and comprehension. Subsequently, reading programs encompassing
these five essentials were developed and are referred to as scientifically-based reading
research (SBRR) programs (Moats, 2007). Language! A Literacy Intervention
Curriculum (Second Edition) is one of them.
In this literature review, before the literature on the reading intervention program Language! is examined, the different level factors hindering students’ learning and achievement will be analyzed. Next, the five essential components of reading instruction, including explicit and systematic instruction, will be reviewed. Last of all, the Language! intervention reading program will be discussed.
Factors that Hinder Students’ Learning and Achievement
Marzano (2003) identified three critical factors that affect students’ learning
and performance: school-level factors, teacher-level factors, and student-level
factors. However, the most vital ingredient in school improvement, according to
Marzano, is effective leadership. He emphasizes that leadership affects factors across
all levels: the school, the teacher, and the student. Thus, for this study, the leadership
dynamics at Rolling Hills School are first examined. Following this, and by utilizing
Marzano’s three sources of influence, we can begin to analyze the problem that
Rolling Hills Elementary School is facing in terms of low performing fourth, fifth,
and sixth grade students on the California Standards Test (CST) English Language
Arts (ELA), and determine the probable causes contributing towards this
achievement gap.
Leadership as a Factor
Leadership is such a vital element in an organization that it is at the root of all effective change that may be executed (Marzano, 2003). For instance, the literature
suggests that there are sound correlations between leadership and students having an
opportunity to learn (Duke & Canady, 1991; Murphy & Hallinger, 1989); teachers’
attitudes (Oakes, 1989; Purkey & Smith, 1983); teachers’ classroom practices
(Brookover et al., 1978; Brookover & Lezotte, 1979; Miller & Sayre, 1986);
curriculum method and instruction (Eberts & Stone, 1988; Oakes, 1989); and the
climate found in schools and classrooms (Griffith, 2000; Villani, 1996).
In analyzing what some of the probable causes are in the achievement gap
being faced by Rolling Hills Elementary School, leadership is seen as the key, as
well as the “umbrella” factor under which many other causes and effects can be
grouped. The history at Rolling Hills School reveals that it has been under the
leadership of four different principals in a span of five years. As a result of these
frequent changes in leadership, Rolling Hills School has faced organizational
challenges such as a lack of consistent focus, difficulty maintaining a stable
teaching base, inconsistent teaching of the core curriculum, minimal follow through
or support for proposed interventions, and a sense of loss regarding the school
community and culture. These observed challenges (e.g., inconsistent focus, unstable
teaching base, inconsistent curriculum teaching, etc.) were reported by the current
principal and her leadership team.
This lack of consistent leadership had a trickle-down effect, and adverse manifestations were seen across the school-level, teacher-level, and student-level factors. In short, over a prolonged period, the school had felt like a ship without a captain, lacking the needed direction, supervision, support, and continuity for the school staff and its community. Recent research (Crawford & Torgesen, n.d.;
Datnow, Park, & Wohlstetter, 2007; Fermanich et al., 2006; Williams et al., 2005)
continues to show strong and effective leadership to be one of the “common threads”
in successful school systems. Other observed similarities were (a) achievement of all
students comes first, (b) use of assessment data to drive instruction, and (c) provision
of instructional intervention for struggling students identified through the data
analysis.
These similarities can be identified in the reform that Rolling Hills
Elementary School is going through today, specifically in the intervention for the
identified upper-grade at-risk students that is presently in place. Rolling Hills School
is under a second year leadership with the focus of improving student achievement.
To aid in this school reform, the current principal has developed a leadership data
team that consists of the school’s curriculum resource teacher, literacy coach, and
language development resource teacher. This approach follows one of Marzano’s
(2003) research-based principles of leadership for change, where “the principal and
other administrators [operate] as key players and [work] with a dedicated group of
classroom teachers” (p. 175). Marzano indicates that when leadership transforms
from an individual to a team approach, successful change is more readily achieved.
School Level Factors
Rolling Hills School was a Program Improvement (PI) school as identified
during the 2000-2001 school year. However, the school exited from PI in 2005-2006
with a Year 3 placement under NCLB, which was the last year that the school
applied PI activities. Rolling Hills School has been out of PI for two years, starting
with the 2006-2007 school year. Based on this history, the data indicate that the at-
risk students for this study most likely did not receive a “guaranteed and viable
curriculum” during the years cited prior to 2006. As per Marzano (2003), a
guaranteed and viable curriculum is one of the school-level factors that most
influences students’ learning.
The Valley Unified School District (VUSD) adopted and implemented the
state-approved curriculum for English Language Arts (ELA) Open Court Reading
Program (OCR) in the 2003-2004 school year for K-6. This means that the current
identified struggling students (if they were enrolled in the district at that time in
2003-2004) were in kindergarten, first grade, and second grade, just at the time that
the ELA curriculum was beginning to be implemented throughout the district. In
other words, these at-risk students missed their window of opportunity for learning to
read, as research indicates that the primary grades are a critical time for acquiring reading
skills (Francis, Shaywitz, Stuebing, Shaywitz, & Fletcher, 1996; Juel, 1988;
Torgesen, 2002; Torgesen & Burgess, 1998; Torgesen, Rashotte, & Alexander,
2001). In his 35 years of educational research, Marzano (2003) reports that having an
opportunity to learn is the strongest factor correlated with a student’s learning
proficiency.
Having the opportunity to learn, especially reading skills, is crucial. There is
substantial research that indicates that children who struggle with reading in their
primary grades, and are particularly poor readers by the end of their first grade year,
almost never achieve reading proficiency levels by the time they transition to middle
school (Francis et al., 1996; Juel, 1988; Torgesen & Burgess, 1998; Torgesen et al.,
2001). In other words, time is of the essence when it comes to learning how to read,
and it is imperative that students acquire the needed literacy skills for reading at the
appropriate time.
Another factor in the lack of reading proficiency among students was that the
school lacked consistent focus in terms of having established clear learning and
academic goals. This absence manifested itself not only at the school level but also extended to the teacher and student levels. For instance, it was reported that
there was minimal follow through or support for proposed interventions at Rolling
Hills School. Marzano (2003) emphasizes the importance of establishing challenging
objectives and a useful feedback system at the different levels of a school for
effective instruction. Research on the effects of feedback on student achievement is
remarkable, indicating a strong correlation between the use of effective feedback and
academic performance (Hattie, 1992; Kumar, 1991).

These challenging goals have to be clear and communicated to all those
involved (such as teachers and students). The feedback given, whether to students or
teachers, needs to be specific to the learned materials and be provided at various
times during the school year. For example, with regard to student feedback, this can be
accomplished through the use of formative assessments.  Formative assessment is
defined as a continuous process of teachers’ adapting their instruction by the
information (data) gathered periodically from their students through the means of
different teaching procedures, which also include students’ self-assessments (Black
& Wiliam, 1998a, 1998b). This collected feedback (the gathering of data) serves as a
marker to inform the teacher how the student is doing, and if the teacher needs to
modify her or his instruction to address students’ needs (Black & Wiliam, 1998a,
1998b). In their extensive review of the research on formative assessment, Black and Wiliam indicated that augmenting the application of these formative assessment practices raises students’ standards of academic
performance. Their work on classroom formative assessment seems to have been the
catalyst for the growing trend towards data driven instruction. Moreover, this type of
classroom assessment underscores the roles of student and teacher in the reciprocal
process of teaching and learning which leads to student achievement.
With these factors in place, the establishment of goals and the provision of feedback not only influence students’ academic achievement but also assist in developing collegiality among the school staff. Likewise, having well-defined goals
and expectations and effective feedback increases the motivation not only of the students but also of the school staff, consequently raising student achievement (Schunk, Pintrich,
& Meece, 2008).
The fact that Rolling Hills School had lost the sense of school community
and culture was also seen as a detrimental school level factor affecting student
achievement. The inconsistency of leadership supervision and support impinged on
the sense of community and involvement. This in turn cascaded down to a feeling of
disconnection between the school and the school staff, students, parents, and
community. Experiencing this sense of lack of connection and partnership impacted
not only the extent of parent and community involvement with the school but also
the degree of collegiality and professionalism amongst the school staff (Marzano,
2003). In addition, this lost sense of connection and the lack of leadership are
probable contributing factors to the high teacher turnover that Rolling Hills School experienced between 2003 and 2007.
Teacher Level Factors
Marzano (2003) identifies three teacher-level factors that affect student
achievement. These factors deal mainly with the decisions that teachers make in the
classrooms such as teaching strategies, classroom management, and curriculum
planning. Unlike the school-level factors discussed earlier, these teacher-level factors
do not have discrete impacts that are segregated from each other, but rather work in
tandem. Marzano (2003) indicates that “all researchers agree that the impact of
decisions made by individual teachers is far greater than the impact of decisions
made at the school level” (p. 71). For instance, the literature shows that the teacher
has a profound and critical impact on student achievement (Haycock, 1998; Sanders
& Horn, 1994; Wright, Horn, & Sanders, 1997).
As discussed earlier, one of the challenges that Rolling Hills School faced in
addition to an unstable leadership structure was the inconsistency of teaching to the
core curriculum. This deficiency appears to be due to the fact that the district did not
adopt a state approved English Language Arts (ELA) curriculum until 2003, and the
teachers were inexperienced with this new adopted framework. This problem is
addressed by two of Marzano’s (2003) teacher-level factors, namely instructional
strategies and classroom curriculum design.
First, with regard to instructional strategies, Open Court Reading (OCR) is a fixed, scripted, fast-paced, phonics-based program. Thus, the
design of OCR does not take into account the individual creativity or experience of
teachers, as the unit lessons are pre-designed. Marzano (2003) reports that an
effective way for teachers to apply instructional strategies is to design “an
instructional framework for units…[that] guides teachers to the most appropriate use
of research-based strategies but does not constrain them as to day-to-day lesson
design” (p. 83).
Second, in addressing the factor of classroom curriculum design, Marzano
(2003) recommends that the new material being introduced in the lessons should be
presented several times by various means of input channels (i.e., visual, auditory,
etc.). Moreover, in order to achieve automaticity or mastery of the new material,
teachers should provide various opportunities for practice and or repetition of this
new information (Marzano, 2003). Unfortunately, because OCR is structured as a fixed, fast-paced program, it does not provide sufficient time for practice and review, and seems to disregard the research-based principles listed by Marzano.
Student Level Factors
In education, there are two aspects of a student’s background that are
perceived as detrimental for appropriate levels of student achievement: low socio-
economic status (SES) and low parental level of education (Marzano, 2003). These
are aspects over which the school has no control. Based on his extensive research,
Marzano reasons that by using interventions that are school-based, the effects of a
negative student background could be compensated for. Consequently, Marzano
(2003) identifies three factors: “home environment, learned intelligence and
background knowledge, and motivation” (p. 125) that can be addressed by the
school.
As mentioned earlier, Rolling Hills School went through a period of unstable leadership. Unfortunately, one of the many effects of this was the loss of a sense of school community and involvement, which more than likely also affected the dynamics of the relationship between the school and the parents.
Marzano (2003) indicates the importance of learned intelligence and
background knowledge as pertaining to student achievement. Other researchers have
revealed that a strong correlation exists between background knowledge and student
achievement (Dochy, Segers, & Buehl, 1999; Tamir, 1996; Tobias, 1994), and
secondly, between learned intelligence and vocabulary (Chall, 1987; Nagy &
Herman, 1984). Based on these findings, Marzano infers that increasing vocabulary
will lead to increased learned intelligence which will lead to an increase of
background knowledge which in turn will result in an increase of student
achievement.
Taking this knowledge to a deeper level, there is additional research on the
relation between vocabulary development and socio-economic status (SES). Nagy
and Herman (1984) found that there is a substantial difference (about 4,700 words)
in vocabulary knowledge between students of low and high SES levels. Their
findings were also corroborated by Graves and Slater (1987), who found that first-grade students of high SES levels possess about double the vocabulary knowledge of their low-SES peers. Considering that Rolling Hills Elementary School is a Title I school with approximately 85% of its students on free and reduced-price lunch, a student population that is about 64% Latino and 21% African American, and about 26% identified as English Language Learners (ELLs), the student-level
factor of learned intelligence and background knowledge plays a role as a possible
cause that contributes to the student achievement gap. It is appropriate to assume that
many of these students come from a limited SES background and from a home where a language other than English is spoken. These are
contributing variables among the student population that need to be taken into
account by the school.
As mentioned previously, a student’s SES level is out of the school’s control.
However, the school has control over the implementation of instructional methods
and strategies that are research-based and can address a student’s vocabulary
knowledge through the use of school-based interventions. Some of the suggestions
include the provision of an array of life experiences (such as field trips to museums,
galleries), the development of a mentoring program, and the practice of direct
vocabulary instruction (Marzano, 2003). These recommendations as outlined by
Marzano can help promote the development of prior knowledge and learned
intelligence, with the hope of increasing student achievement in the long run.
As students begin experiencing academic success and achievement in their
learning, their motivation to learn will likely increase (Marzano, 2003; Schunk et al.,
2008). It is reasonable to deduce that the identified at-risk students in fourth, fifth,
and sixth grade at Rolling Hills School most likely are experiencing or will
experience low motivation to learn. Marzano (2003) and other researchers (Schunk et al., 2008) attest to the relationship between student motivation and academic
achievement. Marzano emphasizes that this aspect of student motivation can also be
addressed by the establishment of clear and individual goals and the provision of
feedback, discussed earlier under school-level factors, on their progress towards their
goal. To reiterate, research on feedback indicates that there is a strong correlation
between effective feedback and student achievement (Hattie, 1992; Kumar, 1991),
and conceptually, feedback functions as both a dynamic and a strategy with a comprehensive effect across all of Marzano’s level factors.

The Five Essentials of Reading Instruction
"To learn to read is to light a fire; every syllable that is spelled out is a
spark."  
— Victor Hugo, Les Miserables

As reported earlier, the National Reading Panel (NRP or the Panel) of the
National Institute of Child Health and Human Development (NICHD) was formed in
1997 at the request of Congress. The Panel was charged with evaluating the existing research on reading development and instruction and determining which approaches to reading instruction are effective. During that time, the National Research Council (NRC) Committee’s
report Preventing Reading Difficulties in Young Children by Snow, Burns, and
Griffin (1998) emerged. In this document, the NRC Committee isolated the research
on reading and its instruction  
… relevant to the critical skills, environments, and early developmental
interactions that are instrumental in the acquisition of beginning reading
skills. The NRC Committee did not specifically address ‘how’ critical
reading skills are most effectively taught and what instructional methods,
materials, and approaches are most beneficial for students of varying
abilities. (NICHD, 2000, p. 1-1).

The Panel, then, utilized the NRC Committee’s report and expanded on their
findings to conduct a comprehensive meta-analysis. Their initial query of the
available research on reading showed approximately 100,000 studies since 1966,
with about an additional 15,000 prior to that. Thus, the Panel developed a screening
method to focus on research studies relevant to the Panel’s intention based on
prioritized issues. A set of topics were selected after regional hearings were held
across the States to learn from the consumers (policymakers, educators, parents, and
students) for whom this information was intended. The chosen topics were: (a)
alphabetics, which included phonemic awareness instruction and phonics instruction;
(b) fluency; (c) comprehension – including vocabulary instruction, text
comprehension instruction, and teacher preparation and comprehension strategies
instruction; (d) teacher education and reading instruction; and (e) computer
technology and reading instruction.
By 1999, the Panel completed their extensive analysis and presented their
report to Congress. In their report, the Panel delineated five essential components for
effective reading instruction: phonemic awareness, phonics, fluency, vocabulary, and
comprehension (NICHD, 2000). In addition to these essential components, the Panel
repeatedly found that reading instruction that was explicit and systematic was
observed to be most effective. Next, the five essential reading components will be
discussed with a last section on explicit and systematic instruction.
Phonemic Awareness
Prior to learning phonemic awareness, the definition of phonemes needs to be
understood. Phonemes are speech sounds, the smallest parts of spoken language (e.g., “in” and “to” each contain two phonemes, or sounds). The English language has about 41 phonemes (NICHD, 2000); phonemes can be combined to form syllables or words (e.g., in), and a word can consist of a single phoneme (e.g., a, oh). Since there is a connection between print and speech (Fletcher & Lyon,
1998), the mention of graphemes is necessary. Graphemes are parts of written
language denoting the phonemes when spelling words. Similar to phonemes,
graphemes can contain one written letter (e.g., p) or can be a combination of letters
to represent one phoneme (e.g., ea-).
In understanding reading acquisition, Lyon (1997) explains that the vast body
of reading research has demonstrated that the beginning block in learning how to
read is two-fold. First, the student needs to be able to understand that spoken
language can be broken down into units of sounds (phonemic awareness). Second,
the student learns that these speech sound units (phonemes) are represented in
written forms. This awareness, as indicated by Lyon, is known as the alphabetic
principle and it is needed for the acquisition of reading decoding.
Phonemic awareness (PA) is, then, the ability to identify and manipulate
phonemes (sounds) in spoken words and syllables. Some PA skills are blending
(putting together) the sounds (e.g., take the sounds /m/-/a/-/p/ to say map) or
segmenting (breaking down) words (e.g., the first sound in map is the sound of /m/).
Phonemic awareness should not be confused with auditory discrimination, which is the ability to discern whether two spoken words are different or the same (NICHD, 2000), or with phonics (which will be discussed later). According to the Panel, the literature
identifies phonemic awareness and knowledge of letters as “the two best school-
entry predictors of how well children will learn to read during their first 2 years in
school” (NICHD, 2000, p. 2-1). Their findings resulted in causal claims that
phonemic awareness instruction improves not only phonemic awareness in students,
but also their reading and spelling.

The Panel’s findings reiterate the results of other research studies on reading.
Liberman, Shankweiler, and Liberman (1989) found that children’s reading and
writing proficiency is contingent on their phonological awareness. Fletcher and Lyon
(1998) found that poor readers’ difficulties in word recognition (or reading decoding) stemmed from their inability to chunk words and syllables into phonemes. All in all, many studies found that students’ reading decoding skills
increased with training in phonemic awareness and explicit instruction (Foorman,
Francis, Fletcher, Schatschneider, & Mehta, 1998; Torgesen, 1997; Vellutino et al.,
1996; Wise & Olson, 1992, 1995).
Phonics
Equally important to phonemic awareness is phonics. Phonics is a method of instruction that teaches that each letter or combination of letters corresponds to sounds (letter-sound correspondences and spelling patterns) and that mastering these correspondences and patterns is indispensable for reading and writing (NICHD, 2000). Findings of the Panel’s (2000) meta-analysis showed that the
(NICHD, 2000). Findings of the Panel’s (2000) meta-analysis showed that the
utilization of systematic phonics instruction significantly improved students’ ability
to read and spell, not only in grades K through 6 but also among students struggling to learn to read. Specifically, kindergartners demonstrated enhanced reading and spelling skills; first graders read and spelled better, and their comprehension considerably improved. Older students exhibited better decoding, spelling, and fluency, but their comprehension remained unchanged.

The Panel’s report (2000) emphasized that systematic phonemic awareness
training and phonics instruction were not to be thought of as a complete means to
reading instruction. This observation was also reiterated earlier on by Fletcher and
Lyon (1998), where they stated that skills such as phonemic awareness, phonics,
fluency, and comprehension were all “necessary but not sufficient components of a
complete approach to reading instruction” (p. 56). As students begin to apply phonics skills fluently in their reading and develop automaticity, they still need to be guided in fluency and comprehension to become competent readers in oral reading and reading comprehension (NICHD, 2000). The next three sections will discuss further the remaining essential reading components as delineated by the Panel.
Fluency
Reading with fluency means reading accurately, with appropriate speed and
expression. Without fluency, comprehension will most likely not take place.
The implication is, as stated by Moats (2007), that if students are able to read
accurately, know what the words mean, and read the passage with ample speed to
promote understanding, then students are more likely to comprehend passages. For
instance, in 2005, the National Center for Education Statistics reported strong
correlations among fourth-graders’ reading fluency, rate, accuracy, and
comprehension. As reading fluency increases, the likelihood of
reading longer and longer passages will increase too. If students do not develop
reading fluency, they will continue to read slowly and with great exertion, adversely
affecting their comprehension (NICHD, 2000).
A recognized procedure to develop reading fluency is to practice reading
(NICHD, 2000). In their literature analysis, the Panel (2000) found that the practice
of guided oral reading, whether it was from peers, teachers, or parents, had a
significant effect on students’ reading decoding, fluency, and comprehension. These
positive results were seen across grade levels and among both proficient and
struggling readers. Furthermore, the Panel
concluded that “classroom practices that encourage repeated oral reading with
feedback and guidance leads to meaningful improvements in reading expertise for
students” (p. 3-3). Fluency practices to increase accuracy and speed include
repeated readings of the same passages, partner reading, and timed readings (Moats,
2007).
Vocabulary
The importance of vocabulary knowledge to reading has been
acknowledged since as early as the mid-1920s (NICHD, 2000). As eloquently explained
by the Panel (2000),
As a learner begins to read, reading vocabulary encountered in texts is
mapped onto the oral vocabulary the learner brings to the task. The reader
learns to translate the (relatively) unfamiliar words in print into speech, with
the expectation that the speech forms will be easier to comprehend. Benefits
in understanding text by applying letter-sound correspondences to printed
material come about only if the target word is in the learner’s oral
vocabulary. When the word is not in the learner’s oral vocabulary, it will not
be understood when it occurs in print. Vocabulary occupies an important
middle ground in learning to read. Oral vocabulary is a key to learning to
make the transition from oral to written forms. Reading vocabulary is crucial
to the comprehension processes of a skilled reader. (p. 4-3)
The research studies analyzed by the Panel (2000) demonstrated that
vocabulary instruction improved reading comprehension, as long as the methods
used were developmentally appropriate to the students’ age and ability. From the
Panel’s findings, the following implications were derived for maximizing vocabulary
instruction: (a) teach vocabulary both directly and indirectly; (b) provide multiple
and repeated exposures to new words; (c) use computer technology, incidental
learning, and rich contexts; (d) restructure tasks as needed; and (e) keep students
actively and continuously engaged.
As discussed earlier, the practice of direct vocabulary instruction, as
suggested by Marzano (2003), helps foster background knowledge and learned
intelligence with the ultimate goal of improving student achievement. His
conclusions, along with the findings of many other scholars (Chall, 1987; Dochy et
al., 1999; Nagy & Herman, 1984; Tamir, 1996; Tobias, 1994), underscore the
importance of the role that vocabulary plays, especially since many of the students
our schools serve come from a low socio-economic status (SES) background
with a poor vocabulary knowledge base.
Comprehension
The last essential reading component to be discussed is comprehension. In
analyzing the vast body of reading research, the Panel (2000) first
reviewed the progression of the definition of reading comprehension. The area of
reading comprehension expanded primarily in the 1970s. Reading comprehension
was explained as an “intentional, problem-solving thinking process” (NICHD,
2000, p. 4-39) in which readers obtain meaning from reading text. They explained
further that this text comprehension increased when the readers actively linked what
they read to their prior knowledge and made mental representations. Although
readers develop some cognitive strategies on their own to aid in their comprehension,
the Panel found that text comprehension can be considerably increased through
explicit instruction of a combination of comprehension strategies.
The Panel (2000) identified seven types of instruction that research showed to
have a positive impact on comprehension of good readers: (a) comprehension
monitoring, (b) cooperative learning, (c) use of graphic and semantic organizers and
story maps, (d) question answering, (e) question generation, (f) story structure, and
(g) summarization. They concluded that comprehension instruction should employ a
combination of these strategies to improve reading comprehension. Utilizing these
comprehension strategies correctly will help with retelling, recalling, answering
questions, generating questions, summarizing, and increasing performance on
standardized achievement tests (NICHD, 2000).
Explicit and Systematic Instruction
It is important to note that, through its extensive review of the literature
on reading development and reading instruction, the Panel (2000) repeatedly
came across a recurrent theme: instruction that was explicit and systematic was
found to be most effective. Lyon (1997) indicated that research conducted by
the National Institute of Child Health and Human Development (NICHD) called
for explicit phonics instruction in order for students to become good decoders
and spellers. Henry (1997) specified that reading decoding and spelling need to
be taught both explicitly and systematically. Fletcher and Lyon (1998) pointed
out that research had underestimated the effects of reading instruction, and that
more current research was beginning to reveal that explicit teaching of reading
was best for many students.
Torgesen (1997) concluded that reading instruction needs to be more
explicit and inclusive. He studied upper-grade students
(grades 3 through 5) who were identified as reading disabled. Two intervention
groups were formed. The first group received an instructional program consisting
of explicit phonemic awareness training through articulatory mouth positions and
synthetic phonics. The second group received an explicit phonics curriculum
emphasizing not only reading but also writing connected to the text. The students
received a total of eighty hours of intervention over a period of eight weeks.
Torgesen found a significant effect on the reading abilities of both groups.
However, the group with the more explicit intervention program showed the greater
improvement in reading pseudowords (i.e., nonsense words).
In concluding this section, the findings and recommendations reported by the
National Reading Panel (NRP) had compelling ramifications and influenced
educational policy thereafter. In 2002, the Reading First program was underway,
authorized under No Child Left Behind (NCLB) Title I, Part B, Subpart 1. Its
purpose: “to ensure that all children in America learn to read well by the end of third
grade” (US Department of Education [USDE], 2002a, p. 1). The Reading First
program provides funding to states and districts to institute scientifically-based
reading programs for the primary grades K-3. The five essential components of
reading instruction need to be embedded in the adopted reading programs, along
with explicit instructional methods, in daily blocks lasting more than ninety
minutes. The Reading
First funds can assist not only with teachers’ professional development to effectively
utilize these research-based reading programs, but also with the screening and
classroom-based instructional assessment tools to inform and monitor students’
reading progress. This provision can be found in the US Code as 20 USC 6368.
With an understanding of what the research says constitutes a
scientifically-based reading program, the reading intervention program used at
Rolling Hills Elementary School will be discussed next.
Language! An Intervention Reading Program
“Learning to read is one of the most important things children accomplish in
elementary school because it is the foundation for most of their future
academic endeavors. From the middle elementary years through the rest of
their lives as students, children spend much of their time reading and
learning information presented in text. The activity of reading to learn
requires students to comprehend and recall the main ideas or themes
presented in . . . text.”

— Stevens, Slavin, & Farnish, 1991, p. 8

The Language! A Literacy Intervention Curriculum (Second Edition) is one
of the eight reading intervention programs adopted by the California State Board of
Education for grades 4 to 8. This program is recommended by the State Board for
“students who are two or more grade levels below grade” (California Department of
Education [CDE], 2007, August). Language! is a comprehensive English Language
Arts (ELA) curriculum devised by Jane F. Greene in the mid-1990s specifically
for struggling readers in grades 3 to 12. It is a three-year ELA curriculum
with the objective that students master the language skills necessary to become
competent readers and writers (Moats, 2004).
The field research on Language! indicates that this intervention program is
research-based and aligns with No Child Left Behind (NCLB) (Sopris West
Educational Services, n.d.). The program provides explicit instruction in the five
main components of reading instruction: (a) phonemic awareness, (b) phonics, (c)
fluency, (d) vocabulary, and (e) comprehension, as delineated by the National
Reading Panel (2000). The program also allows Reading First schools to obtain
funding under NCLB (CDE, 2007, April).
The Language! intervention program is highly structured and departs from
the typical practice of whole-group instruction in favor of small-group teaching,
with many built-in opportunities for repetition and review of new or already
learned concepts. The curriculum fosters teacher-directed and scaffolded
instruction, teaching approaches that have been supported by
research (Biancarosa & Snow, 2004; Swanson, 1999; Torgesen, Alexander, Wagner,
Rashotte, Voeller, & Conway, 2001). These built-in opportunities address the
instructional strategies, classroom curriculum design, and student motivation factors
at the teacher and student levels as recommended by Marzano (2003). In addition,
the highly structured curriculum that Language! provides facilitates the teacher to
deliver instruction in a systematic, sequenced, cumulative, and explicit manner, as
prescribed by the National Reading Panel’s (2000) extensive study and that of other
scholars (Fletcher & Lyon, 1998; Henry, 1997; Lyon, 1997; Torgesen, 1997).
The Language! program consists of three levels, with each level consisting of
three books. Each book comprises six units. The approximate time to complete
one level is one year (Greene, 1996). The program is designed to be
delivered daily in 90 minutes through a six-step lesson plan. The six steps are
(a) phonemic awareness and phonics; (b) word recognition and spelling; (c)
vocabulary and morphology; (d) grammar and usage; (e) listening and reading
comprehension; and (f) speaking and writing. The program also has built-in
“flexibility instructional pacing” (Greene, n.d.) to allow instruction to be
individualized to the students’ level of performance. Differentiated instruction and
flexible grouping, integral features of the program, are components supported by
research-based evidence (Juel & Minden-Cupp, 2000; Lou et al., 1996). The Language!
curriculum contains 17 strands, and these include:
1. Phonemic/phonological concepts
2. Phonemic awareness
3. Phoneme-grapheme correspondences
4. Syllabication
5. Word recognition
6. Vocabulary development
7. Text reading
8. Comprehension
9. Spelling
10. Orthography
11. Mechanics
12. Composition
13. Grammar and usage
14. Syntax and sentence structure
15. Semantic relationships
16. Figurative language
17. Morphology

Furthermore, the Language! program also includes placement and assessment
elements (both formative and summative) that keep teachers informed about their
students’ progress (Black &amp; Wiliam, 1998a, 1998b). The collected assessment data
allow teachers to not only engage in flexible grouping, but also to adapt their pacing
and engage in differentiated instruction. This program feature not only meets the
NCLB provision, but also corresponds to Marzano’s (2003) framework in addressing
(a) challenging goals and effective monitoring at the school level; (b) instructional
strategies and classroom curriculum design at the teacher level; and (c) student
motivation at the student level.
The assessment system is critical to the success of the Language! program.
It involves three levels: (a) placement, (b) ongoing, and (c) summative. The
first level, the placement assessment, is comprised of three group-administered
tests: (a) Degrees of Reading Power (DRP), (b) Test of Silent Word Reading
Fluency (TOSWRF), and (c) Spelling Inventory (SI). These tests are administered
prior to the start of the program, and the results determine at which of the two
entry points in the curriculum a student should be placed. In addition, the results
establish a baseline for monitoring students’ progress throughout the curriculum,
as ongoing testing is built into the program.
The curriculum was introduced by Greene (1996) during a study with 45
middle and high school youth offenders who had delays in reading, spelling, and
writing. This treatment group received the Language! curriculum for a period of one
year. A comparison group was also formed that consisted of 51 juvenile offenders.
This comparison group was randomly selected from a list of students whose pre-
testing results showed delays in reading, spelling, and writing. The comparison
group participated in a rigorous intervention program that was described as eclectic
and involving whole-group instruction. However, Greene noted that the results of the
pre-tests for the treatment group showed greater delays in reading, spelling, and
writing than the results for the comparison group. In spite of this fact, results
indicated that the treatment group had made significant improvement in all areas
(Greene, 1996).
The available literature on this program, though neither extensive nor
independent, indicates that it is appropriate for use with students who are not
proficient readers, are identified as English language learners (ELL), or are in special
education. This is relevant because 53% of the identified at-risk students at
Rolling Hills Elementary School are ELLs and 27% are in special education. One of
the two peer-reviewed articles found was a recent program evaluation study
conducted by Moats (2004) on the effectiveness of the Language! program with 552
middle and high school students identified as poor readers. Many of the study
participants were at-risk ELL students (n = 198). This was the first time that the
participating schools had implemented the program and that the teachers had used
this ELA curriculum. The program was implemented for about nine months. Moats
(2004) found significant growth among the students who received Language!
instruction in the areas of “basic word recognition, word attack, and passage
comprehension” (p. 156). This research study provides supporting evidence to
Rolling Hills Elementary for the implementation of Language! as a reading
intervention for their identified at-risk students in the upper grades. Although the
research for this intervention program is
not vast and has been conducted by scholars affiliated with the creation and
marketing of the program, the existing literature on the effectiveness of this
literacy intervention program is reassuring (Moats, 2004; Robinson, 2002), in
addition to the extensive research, presented earlier, validating the different
components that make up the Language! program.
Lastly, implementing Language!, a scientifically based research
intervention program, qualifies schools under NCLB to receive grants for their
teachers’ professional training in using the program (Sopris West Educational
Services, n.d.). Both initial coaching and ongoing training are offered by Sopris
West Educational Services (www.sopriswest.com).
It also needs to be noted that there is a new and revised version of Language!,
called Language! The Comprehensive Literacy Curriculum (3rd Edition),
which is an extension of the earlier edition. The publishing company indicates that
it is in the process of collecting data on the effectiveness of this revised
edition.
Thus, based on the literature review on Language!, the prognosis is hopeful
that a statistically significant difference will be seen in the at-risk Rolling Hills
students’ academic achievement in reading at the end of the 9-month intervention.
CHAPTER THREE
METHODOLOGY
The aim of my study is to examine the effectiveness of the Language!
intervention program implemented in the 2007-2008 school year with the
identified at-risk fourth, fifth, and sixth grade students at Rolling Hills Elementary
School. Although both summative and formative evaluations were used to study the
reading intervention, the focus of this research was primarily summative.
Specifically, a concurrent mixed methods approach was applied to better understand
the effectiveness of the intervention program. This was attained by triangulating
quantitative student achievement data with qualitative program information. The
use of triangulation in this study was based on the postulation that various
sources of data not only strengthen an inquiry (Patton, 2002), but that a mixed
methods approach also helps us to better understand the research problem (Creswell,
2003).
In this case, three achievement tests were used within the experimental
group to compare the at-risk students’ English Language Arts (ELA), or reading,
achievement prior to the intervention (pre-tests) with their reading
achievement at the end of the intervention (post-tests). Moreover, one of the three
achievement tests (CST ELA) was used to compare the reading achievement of the
experimental group to that of the students in the control group, who did not
receive the intervention program. Simultaneously, the strengths and
weaknesses of the implementation of the intervention program, Language!, were
examined utilizing observations of the intervention program, review of documents
and materials, and informal interviews with the two teachers implementing the
intervention program and the leadership data team members.
The following sections of this chapter introduce the elements of the research
design: the summative and formative evaluations, the types of group designs,
the parametric and nonparametric statistics, and the types of qualitative data.
Then, an explanation of the intervention
program Language! used in this research study is presented, with a summary of the
setting and participants in this study to follow.  Additionally, the instrumentation,
materials, and procedures that were utilized in this investigation are described. These
consist of the different tests scores used to measure a student’s reading achievement,
and the different types and means of collecting the qualitative data. Lastly, this
chapter concludes with an account of the different data analyses that were applied in
determining both the statistical and practical significance of the quantitative data and
in formulating the meaning of the qualitative data.
Research Design
Summative Evaluation Design
The research questions steering this summative inquiry were: Did the
Language! intervention program have an effect on the English Language Arts
(ELA) achievement of the identified at-risk students? Was there a statistically
significant gain from pre- to post-intervention in the experimental group? Was there
a practical gain from pre- to post-intervention in the experimental group?
To measure ELA achievement, summative data were collected across three
dependent variables, with units of analysis ranging from raw scores to scaled
scores: (a) California Standards Test (CST) ELA scaled scores; (b) Degrees of
Reading Power (DRP) raw scores; and (c) Test of Silent Word
Reading Fluency (TOSWRF) raw scores (see Table 10). These data were gathered
from the 42 identified at-risk students in the fourth, fifth, and sixth grades at Rolling
Hills Elementary School. The data were collected from the 2007-2008 school year
to the 2008-2009 school year.
As mentioned earlier, a concurrent mixed methods approach was utilized in
analyzing the collected data in-depth in determining the effectiveness of this
intervention program. This was accomplished by employing two separate designs,
Design A and Design B, which are discussed next.
Design A. Pre-post dependent groups design. As the name of the design
suggests, this design measures the same participants twice, before and after a given
intervention. By assessing the same participants under both conditions (pre and post),
the possibility of random error is minimized, making the effects of the treatment
more apparent. In this study, the change in the reading achievement of the study
participants was measured, from their achievement performance prior to the
intervention (pre) to their achievement performance after the intervention (post). The
annotated scientific notation for this design is:
O1 X O2
For this study, the Design A notation is described as follows:

At-Risk Students Grades 4 to 6:   Pre (2007)   X   Post (2008)
                                  DRP              DRP
                                  TOSWRF           TOSWRF

In this design, two of the three dependent variables were utilized to measure
change in ELA achievement from the pre-intervention 2007 to the post-intervention
2008 phase and they are depicted in Table 10. The objective of this investigation was
to evaluate the change in ELA achievement of the at-risk students after one year of
the implementation of the Language! intervention program.

Table 10

Dependent Variables Measuring Reading Achievement

Pre-Intervention 2007 Group       Post-Intervention 2008 Group
CST ELA 2007 Scaled Scores        CST ELA 2008 Scaled Scores
DRP 2007 Raw Scores               DRP 2008 Raw Scores
TOSWRF 2007 Raw Scores            TOSWRF 2008 Raw Scores

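The gain-score logic behind Design A can be sketched in a few lines. This is purely illustrative: the eight DRP scores below are invented (the study's real data are the 42 students' 2007 and 2008 scores), and a paired t statistic is only one common way to test a pre-post gain; the data analyses actually applied are described at the end of this chapter.

```python
import math
import statistics

# Invented pre/post DRP raw scores for eight hypothetical students.
pre  = [28, 31, 25, 34, 30, 27, 33, 29]   # 2007 (pre-intervention)
post = [33, 35, 29, 38, 36, 30, 37, 34]   # 2008 (post-intervention)

# A dependent-groups design analyzes each student's own gain rather than
# two separate groups, which removes between-student variability.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = statistics.mean(diffs)           # average gain
sd_d = statistics.stdev(diffs)            # sample SD of the gains
t = mean_d / (sd_d / math.sqrt(n))        # paired t statistic, df = n - 1
print(f"mean gain = {mean_d:.2f}, t({n - 1}) = {t:.2f}")
```

The same computation applies to the 42 study participants; the resulting statistic is judged against a t distribution with n - 1 degrees of freedom.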
It needs to be noted that a pre-post design is weak in internal
validity (Creswell, 2003; King &amp; Minium, 2003): the observed
change may be due to factors other than the intervention. One way to
address this deficit is to include a randomized control group. However, in the field of
education, it is difficult, almost impossible, to randomly assign participants to groups
(King & Minium, 2003; National Research Council, 2002). This leads us to the next
design, Design B – Nonequivalent Control Group Design, which was also employed
in this summative evaluation.
Design B. Nonequivalent control group design. This design, also known as
a quasi-experimental design, is one of the most common designs used to
evaluate the effectiveness of an intervention, as randomization is difficult to
achieve in the education arena (King &amp; Minium, 2003). As its name indicates, the
nonequivalent control group design lacks random assignment, hence the name
nonequivalent. In this study, two independent groups were used: an experimental
group receiving the intervention and a control group not receiving it.
The participants in these groups were not randomly assigned; therefore, the use of a
nonequivalent control group design is appropriate.
The experimental group consisted of 42 identified students in the upper
grades at Rolling Hills Elementary receiving the intervention program in the 2007-
2008 school year. The control group, which did not receive the intervention, was a
matched control group (n = 42) to the experimental group, which was composed of
students within the Valley Unified School District (VUSD) having similar profiles as
the students in the experimental group. The control group was matched to the
experimental group by their grade, ethnicity, gender, English language status and
English level of proficiency using the CELDT 2007 Overall proficiency levels,
special education status, and CST-ELA 2007 scores. The data for the control group
was selected electronically from the district office with non-identifiers. Student
identifiers such as names, birthdates, etc. were not drawn for the control group. This
type of matching helped control for confounding variables that might affect
the observed outcome. The participants in the matched control group within the
district received the state-approved curriculum for ELA Open Court Reading
Program (OCR), which is implemented district-wide. However, the experimental
group did not receive the OCR curriculum.
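The one-to-one matching procedure described above can be sketched as follows, under stated assumptions: the dictionary field names and the nearest-CST-score tie-breaking rule are hypothetical, while the matching variables themselves (grade, ethnicity, gender, English language status, special education status, and CST-ELA 2007 score) come from the description above.

```python
# Hypothetical record format: each student is a dict holding the profile
# fields the study matched on, plus a CST-ELA 2007 score.
def match_controls(treated, pool):
    """Pair each treated student with an unused pool student who shares
    the full categorical profile and has the closest CST-ELA 2007 score."""
    keys = ("grade", "ethnicity", "gender", "el_status", "sped")
    used, pairs = set(), []
    for t in treated:
        # Keep only unused pool students with an exact categorical match.
        candidates = [
            c for c in pool
            if id(c) not in used and all(c[k] == t[k] for k in keys)
        ]
        if not candidates:
            continue  # no exact categorical match left in the pool
        # Break ties on the continuous variable: nearest CST-ELA 2007 score.
        best = min(candidates,
                   key=lambda c: abs(c["cst_2007"] - t["cst_2007"]))
        used.add(id(best))
        pairs.append((t, best))
    return pairs
```

Exact matching on the categorical profile with a nearest-score rule is one simple approach; more elaborate schemes (e.g., propensity scores) exist but go beyond what the study describes.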
In this study, the nonequivalent control group design was utilized to measure
the effectiveness of the intervention program, using matching, as mentioned
previously. The ELA achievement of the experimental and control groups was
compared, utilizing a third dependent variable, the CST ELA. The annotated
scientific notation for this design is:
E   O1   X   O2
C   O1       O2

For this study, the Design B notation is described as follows:

Experimental Group   At-Risk 4th-6th        Pre (2007)   X   Post (2008)
                     Rolling Hills Elem.    CST ELA          CST ELA

Control Group        Matched 4th-6th        Pre (2007)       Post (2008)
                     VUSD                   CST ELA          CST ELA

The pre-intervention collected data were the CST 2007 ELA scores. The
treatment, indicated by X in the notation above, is the Language!
intervention program implemented in the 2007-2008 school year with the at-risk
students (the experimental group). The post-intervention data consist of one
dependent variable, the CST 2008 ELA scores, measuring reading achievement of
the at-risk students and the students in the control group. These post-intervention
data indicate reading achievement after one school year of the intervention program.
Lastly, this nonequivalent control group design compares the observed
2008-2009 ELA performance of the experimental group at
Rolling Hills School and the matched control group within VUSD by significant
subgroups, such as ethnicity, grade, and English language status. The objective of
this analysis was to determine whether the difference in ELA performance
among these subgroups was statistically significant.
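As a hedged illustration of the between-group comparison Design B supports: the CST-ELA gain scores below are invented, and Welch's t is just one plausible independent-samples statistic (the data analyses actually applied are described at the end of this chapter).

```python
import math
import statistics

# Invented CST-ELA gains (2008 minus 2007) for illustration only.
exp_gains  = [18, 25, 12, 30, 22, 15, 27, 20]   # experimental (Language!)
ctrl_gains = [10, 14,  8, 19, 12,  9, 16, 11]   # matched controls (OCR)

def welch_t(a, b):
    """Welch's t statistic for two independent samples; unlike the
    pooled-variance t, it does not assume equal group variances."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

print(f"Welch t = {welch_t(exp_gains, ctrl_gains):.2f}")
```

Because the groups were matched rather than randomized, a significant statistic here would still warrant the internal validity caveats discussed below the design descriptions.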
A delimitation of this study is that the findings of this program evaluation
are delimited to the Rolling Hills Elementary site. Moreover, the study is limited
by its methodologies: neither the pre-post dependent groups design nor the
nonequivalent control group design supports assertions of causation. For one, in
a pre-post design, the observed difference between the two measurements (pre-
and post-intervention) can be due to factors other than the intervention itself;
thus, it carries an internal validity threat. Similarly, a nonequivalent control
group design also lacks internal validity. The experimental and control groups in
this design were not randomly assigned and therefore may not be truly
equivalent. Thus, the observed difference between the two groups cannot be
attributed solely to the intervention.
Formative Evaluation Design
To strengthen and add depth to the findings of the summative quantitative
evaluation, a formative qualitative evaluation was conducted simultaneously. The
data obtained from both of these evaluation designs were triangulated as
recommended by Patton (2002) and Creswell (2003). The research question guiding
the formative evaluation was: What were the perceived facilitating and impeding
factors regarding the implementation of Language! at Rolling Hills Elementary?
The qualitative data were gathered through unobtrusive observations of
the intervention program, informal interviews with the two teachers who
implemented the intervention program and with the leadership data team members,
and review of documents and materials from the intervention program. The
accumulated data helped isolate the strengths and weaknesses of the school's
implementation of the intervention program. The qualitative data provided the
leadership data team at Rolling Hills Elementary with information about the
factors that facilitated, as well as those that impeded, the carrying out of the
intervention program.
Thus, this information may well serve as a foundation from which revisions can be
planned for organizational improvement. The triangulation of the various sources of
information (e.g., interviews, observations, etc.) provided an increase in the validity
of the collected data as it employed different strategies to attain accuracy (Creswell,
2003; Patton, 2002).
The information gathered from the formative evaluation helps reveal
the areas of the intervention program that need to be improved by the
school. The collected data are valuable to the Rolling Hills leadership data team in
helping them modify the intervention program, especially if the intervention is to be
continued the following year.
Intervention
The 2006-2007 CST ELA summary for Rolling Hills Elementary revealed
that 68.4% of its students in grades 2 through 6 scored below the proficient level,
performing at the “far below basic,” “below basic,” or “basic” level.
Disaggregating the school's data by grade level, 58% of the second and fourth
graders, 69% of the sixth graders, and 79% of the third and fifth graders scored
below proficiency on the ELA portion of the CST at Rolling Hills Elementary.
Even when the data were disaggregated by grade level, the numbers of students at
Rolling Hills performing at the “far below basic,” “below basic,” and “basic”
levels on the CST ELA in 2006-2007 were still extreme. These levels indicated a
significant achievement gap for the school's students. Because 79% of its third
and fifth grade students scored below proficiency on the CST ELA in 2006-2007,
the leadership data team at Rolling Hills Elementary was concerned that these
students would soon be moving on to middle school without being proficient readers.
Since 42 at-risk students were identified in the upper grades at Rolling
Hills Elementary, two intervention classrooms were formed: one fourth
grade classroom with 17 students and one fifth-sixth grade combination classroom
with 25 students (15 fifth and 10 sixth graders). Two teachers were mutually selected
to teach these two intervention classrooms. Prior to beginning the Language!
intervention program at Rolling Hills School in 2007-2008, the two teachers attended
a one-week training on Language! in the summer of 2007, given by the publisher of
the program, Sopris West Educational Services, on how to implement the Language!
curriculum.
At the beginning of September 2007, the identified at-risk students were
administered the Language! placement tests. Depending on their patterns of reading
performance on these tests, the students were placed in either Book A or Book C, the
two entry points for Language!. Each book consists of six units, and students are
evaluated at the end of each unit. Performance on these ongoing assessments shows the
teachers which areas need further attention; depending on the results, the teachers
review, reteach, or move on to the next unit. The program provides prescriptive
teaching boxes throughout the units in the Teacher Edition books to assist with
modifications of the lessons. After completing the last unit in a book, the students
take two built-in summative assessments (Summative Tests and Progress Indicators) to
monitor their progress. Based on these results, the teachers decide which units or
steps need to be reviewed or retaught.
Setting and Participants
Setting
The setting for this study was Rolling Hills Elementary, a kindergarten
through sixth grade school located in Los Angeles County, about 15 minutes from
downtown Los Angeles. The school lies within the Valley Unified School District
(VUSD) in the San Gabriel Valley; adjoining cities include Eagle Rock, La Cañada,
South Pasadena, San Marino, and Arcadia. Rolling Hills Elementary enrolls 402
students and follows a traditional calendar school year. The school's student
population was composed of Latino (63.7%), African American (21.1%), White (10.2%),
Asian (1%), Filipino (0.5%), American Indian/Alaska Native (0.2%), and Multiple/No
Response (3.2%) students.
Of the 402 students, 104 (25.9%) were identified as English Language
Learners (ELLs). Furthermore, 18 students (13.8%) of the ELL population were
redesignated as Fluent English Proficient (RFEP) as per 2006-2007 school data,
meaning that these students no longer needed English language development support
and could participate in the regular English language program. In addition, 85% of
the student population at Rolling Hills Elementary partakes in the free or
reduced-price lunch program; thus, the number of students considered
socio-economically disadvantaged at Rolling Hills Elementary was considerable.
Participants
The participants in this study were 42 students in the upper grades at
Rolling Hills School: 17 fourth graders, 15 fifth graders, and 10 sixth graders (see
Table 11). These students were identified as at-risk by the leadership data team at
the school at the beginning of the 2007-2008 school year. The students were selected
based on their 2007 scores in English Language Arts (ELA) on the California Standards
Tests (CST), their final grades in 2006-2007, and their previous-year teachers'
comments and recommendations.

Table 11
Experimental Group by Grade Levels
Grade n
4 17
5 15
6 10
Total 42


The breakdown of the students in the experimental group by demographics
was as follows: (a) by gender, 60% females and 40% males; (b) by ethnicity, 71%
Latinos, 19% African Americans, and 10% Whites; (c) by English Language (EL) status,
55% English Language Learners (ELL) and 45% English Only; and (d) by Special
Education (SPED) status, 26% SPED and 74% non-SPED (see Tables 12, 13, and 14).

Table 12
Experimental Group by Gender
Females  Males
Grade n %  n %
4 10 59  7 41
5 9 60  6 40
6 6 60  4 40
Total 25 60  17 40

Table 13
Experimental Group by Ethnicity

Latinos  African Americans  Whites
Grade n %  n %  n %
4 10 59  5 29  2 12
5 11 73  3 20  1 7
6 9 90  0 0  1 10
Total 30 71  8 19  4 10

Table 14
Experimental Group by EL and SPED Statuses
ELL  EO  SPED  Non-SPED
Grade n %  n %  n %  n %
4 8 47  9 53  3 18  14 82
5 9 60  6 40  4 27  11 73
6 6 60  4 40  4 40  6 60
Total 23 55  19 45  11 26  31 74
Note. ELL = English language learners; EO = English only; SPED = special education; Non-SPED =
not special education.

The effectiveness of the Language! intervention program was evaluated with
these at-risk students at Rolling Hills Elementary by using a range of ELA
achievement scores. The scores used to measure the students’ achievement in ELA
were gathered within a timeframe of the 2007-2008 to 2008-2009 school years.
These collected data were limited to the intervention program school site and to the
data of the matched control group within the school district.
Aside from the participants in the experimental group, 42 students were
selected (not randomly) to form a non-equivalent matched control group. The control
group was composed of students within the district whose profiles were similar to
those of the at-risk students in the experimental group. These students were matched
by grade, ethnicity, gender, English language status and English proficiency level,
special education status, and CST-ELA 2007 scores. The levels of English proficiency
were matched using the CELDT 2007 Overall proficiency levels when applicable. The
data for the control group were provided by the district office without identifiers;
student identifiers such as names, birthdates, and addresses were not drawn for the
control group.
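The matching procedure just described, pairing each experimental student with a district student of the same categorical profile and a comparable CST-ELA 2007 score, can be sketched as follows. This is an illustrative reconstruction, not the district's actual procedure: the field names, the subset of matching variables shown, and the nearest-score rule are all assumptions.

```python
# Illustrative sketch of the profile-matching step (hypothetical field names).
# For each experimental student, find an unused district student with the same
# categorical profile and the closest CST-ELA 2007 scaled score.

MATCH_KEYS = ("grade", "ethnicity", "gender", "el_status", "sped")

def match_controls(experimental, district_pool):
    pool = list(district_pool)
    matches = {}
    for student in experimental:
        candidates = [c for c in pool
                      if all(c[k] == student[k] for k in MATCH_KEYS)]
        if not candidates:
            continue  # no eligible match for this profile
        best = min(candidates,
                   key=lambda c: abs(c["cst_ela_2007"] - student["cst_ela_2007"]))
        matches[student["id"]] = best["id"]
        pool.remove(best)  # each control student is used at most once
    return matches
```

In this sketch a matched control is removed from the pool once chosen, so the resulting control group has at most one match per experimental student.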
A third group of participants in this study included six certificated staff
members at Rolling Hills School. They were the two teachers teaching the
intervention program, and the members of the leadership data team: the curriculum
resource teacher, the language development resource teacher, the literacy coach, and
the principal. These participants were involved to provide information about the
intervention program implementation and outcome. These certificated staff members
were asked to supplement the quantitative findings of this study by sharing their thoughts
about those factors that either facilitated or hindered the implementation of this
intervention program and their suggestions in regards to how to improve the
program.
Instrumentation and Materials
Student English Language Arts (ELA) Achievement
California Standards Tests (CST) in English Language Arts (ELA). The
scaled scores of the CST ELA were drawn from the 2006-2007 and 2007-2008 test
administrations. The quantitative data for the experimental group were gathered from
Rolling Hills Elementary without student identifiers, and the data for the matched
control group were provided by the district office, also without student identifiers.
As per the California Department of Education (2007c), “the CSTs are
standards-based tests that measure the achievement of state content standards in
English-language arts, mathematics, science, and history-social science” (p. 3). The
CST is administered to all students in grades 2 through 11 in the spring semester. The
CST in English Language Arts (ELA) is based on the content standards for one specific
grade. On the Grade 4 CST in ELA, for instance, 24% of the items test Word Analysis,
Fluency, and Systematic Vocabulary Development; 20% Reading Comprehension; 12%
Literary Response and Analysis; 24% Written and Oral English Language Conventions;
and 20% Writing Strategies (CDE, 2008). The CST in ELA for grades 4 through 11
contains 75 multiple-choice questions and an additional six field-test questions.
One way that students’ CST results are reported is by performance levels
granted by the State Board of Education. These performance levels are: (a) advanced
(A), which exceeds the content standards; (b) proficient (P), which meets the content
standards; (c) basic (B), which is approaching the content standards; (d) below basic
(BB), which is below the content standards; and (e) far below basic (FBB), which is
well below the content standards. Proficient or above performance levels are the set
targets for all students.
In addition to performance levels, CST results are also reported as scale
scores. Scale scores adjust for the year-to-year variation in the criterion for
meeting proficiency, making the tests comparable from one year to the next.
Degrees of Reading Power (DRP). The DRP is a criterion-referenced
measure of reading comprehension that assesses students' ability to construct
meaning (Greene, n.d.). Students read short passages with missing words and choose,
from a set of multiple-choice options, the word that makes the most sense. Raw
scores are then converted to DRP unit scores and grade equivalents. The DRP is one
of the three placement tests of the intervention program, Language!, and can be
administered either individually or in a group. Students' performance on this test
contributes data for program placement.

Test of Silent Word Reading Fluency (TOSWRF). The TOSWRF is one of the
three placement tests of the Language! program and can be administered either
individually or in a group. It is a timed reading fluency test that measures
students' reading speed and precision of word recognition (Greene, n.d.).
Students are given a list containing rows of words ordered by reading
difficulty, with no spacing between the words:
oakbuildemptyfullsentdeepablenut
Within three minutes, the students read silently and draw lines between the words
they identify:
oak/build/empty/full/sent/deep/able/nut
Two forms are administered, and each student's raw scores on the two forms are
averaged. Unlike the DRP, the TOSWRF's raw scores are converted only to grade
equivalents.
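As a simplified illustration of the scoring just described, a marked row can be split at the student's lines, each segment checked against the answer key, and the two forms' raw scores averaged. This is a minimal sketch, not the published TOSWRF scoring rules, and the function names are invented for this example.

```python
# Simplified sketch of TOSWRF-style scoring (illustrative only; the published
# test has its own scoring rules). A marked row such as "oak/build/empty" is
# split at the student's lines and each segment is checked against the key.

def raw_score(marked_row, answer_key):
    """Count marked segments that are real words from the answer key."""
    segments = marked_row.strip("/").split("/")
    key = set(answer_key)
    return sum(1 for word in segments if word in key)

def averaged_score(form_a_score, form_b_score):
    """The raw scores of the two forms are averaged for each student."""
    return (form_a_score + form_b_score) / 2
```

For example, a student who correctly marks all eight words in the sample row above earns a raw score of 8 on that row.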
Interviews
In addition to the quantitative data, qualitative data were collected as another
source of information. This was accomplished by conducting interviews as
recommended by Patton (2002). The foremost reason for interviewing is that
"we cannot observe everything" (Patton, 2002, p. 341), including thoughts, feelings,
intentions, and previous behaviors, to name a few. As Patton (2002) stated, "we
interview to find out what is in and on someone else's mind, to gather their stories"
(p. 341).

Three types of interviews are discussed by Patton (2002): the informal
conversational interview, the general interview guide, and the standardized open-
ended interview. The first, the informal conversational interview, is a natural and
unprompted approach to asking questions during the course of an interaction or an
observation; as its name suggests, it is like having an informal, casual
conversation. The second approach, the general interview guide, includes a framework
of issues to be covered (e.g., a checklist) that guides the interview. The last, the
standardized open-ended interview, differs from the other two: it involves an array
of questions that are asked in the same order and with the same exact words. Hence
the name "standardized" (Patton, 2002), which helps minimize variance as respondents
are interviewed.
For the purposes of this study, the general interview guide approach was
applied as suggested by Patton (2002). This approach was selected in particular
because, as Patton explained:
The interview guide is prepared to ensure that the same basic lines of inquiry
are pursued with each person interviewed…within which the interviewer is
free to explore, probe, and ask questions…with the focus on a particular
subject that has been predetermined. (p. 343)

No matter which type of interviewing one chooses, Patton provides the reader with a
"matrix of question options" (p. 352). This matrix presents the researcher with a
range of 18 types of queries to use as a guide in developing questions for research
interviewing. The following questions were developed for the general interview guide
approach used in this study, utilizing Patton's "matrix of question options"
(p. 352) with assistance from Professor Dennis Hocevar and colleague Marvin Horner:
(1) How is the Language! intervention program implemented at Rolling Hills
Elementary School?
(2) How do you monitor and support the implementation of the Language!
program at Rolling Hills School?
(3) What factors do you see as facilitating the implementation of Language!?
(4) What factors do you see as hindering the implementation of Language!?
(5) To what extent does Language! address the English Language Arts
curriculum that is not addressed by the Open Court Reading Program?
(6) How well do you think Language! is working with the identified at-risk
students at Rolling Hills Elementary School?
(7) How would you improve the Language! program?

Direct Observations
Another source of qualitative data collection for this research was direct
observations. The purpose of observations is to portray the setting that is being
observed. This includes the people, the activities, the interactions, and the
significance of these to others. Patton (2002) discussed in detail six advantages of
conducting direct observations as a means of gathering data:
(1) The inquirer is better able to understand and apprehend the context
within which people interact.
(2) The inquirer is able to be more open and more discovery oriented with
firsthand experience and less reliant on prior conceptualizations.
(3) The inquirer has the chance to observe things that may normally escape
awareness among the people in the setting.
(4) The inquirer has the opportunity to discover things that people would be
reluctant to talk about in an interview.
(5) The inquirer has the chance to move beyond the selective perceptions of
others.  
(6) The inquirer has the ability to make use of personal knowledge during the
formal interpretation stage of analysis. (pp. 262- 264)
The direct observations focused on the consistency and fidelity of the
delivery of Language! instruction to the students at Rolling Hills Elementary
School. The observations included the teacher-student interactions during the
implementation of Language!. In general, the information collected from the
observations of the intervention program was used with a holistic approach.
Procedures
Student English Language Arts (ELA) Achievement
The method utilized in this study to evaluate students' achievement in ELA
entailed the collection of various quantitative data: California Standards Test
(CST) scores, Degrees of Reading Power (DRP) scores, and Test of Silent Word
Reading Fluency (TOSWRF) scores. These measures were gathered from the
experimental group at Rolling Hills Elementary within a time frame from the
2006-2007 to the 2008-2009 school year. However, only one of the measures, the CST
scores, was used for the matched control group within the district. The CST scores
for the matched control group were collected from the test administrations in the
2007-2008 and 2008-2009 school years.
The objective of the quantitative investigation was two-fold. One was to
evaluate the change in ELA achievement of the at-risk students at Rolling Hills
School (experimental group) after one year of the implementation of the Language!
intervention program. This was carried out by comparing the means of the pre- and
post-test data in the ELA achievement of the experimental group utilizing the DRP
and TOSWRF scores. The second objective was to measure change in ELA
achievement between the experimental and control groups after one school year of
intervention was given to the experimental group. This was achieved by comparing
the means of the pre- and post-test data in ELA achievement between the
experimental and control groups after the means for the pre-test data (CST 2007
ELA scores) had been adjusted.
Interviews
In this study, some of the certificated staff at Rolling Hills Elementary School
were interviewed to obtain their thoughts and feelings about the intervention
program, Language!. The interviews were conducted at the school during the school
day, within the last two months of the 2007-2008 school year, and lasted from
fifteen to thirty minutes. The respondents were six certificated staff members: the
two teachers implementing the Language! intervention program, the curriculum
resource teacher, the language development resource teacher, the literacy coach, and
the principal. The interview information was documented during and after the
interviews; these data were recorded by hand in a note-taking manner to keep the
interview approach unobtrusive.
Observations
Direct observations of the implementation of the intervention program
Language! were conducted at Rolling Hills Elementary School; that is, the two
intervention classrooms were observed. The observations were conducted during the
school day, on random dates throughout the last four months of the 2007-2008 school
year, and lasted from twenty to thirty minutes. Throughout the observations, the two
teachers implementing the Language! intervention program and the identified at-risk
students were studied, and the classroom environments were also examined. The
collected data were documented both during and after the observations; these data
were recorded by hand in a note-taking manner to keep the observation approach
unobtrusive. Lastly, the inquirer of this study retained a non-participant role
during the collection of these overt observations.

Data Analyses
Qualitative Analysis
In examining the data collected from the formative evaluation, the six
standard steps discussed by Creswell (2003) were applied as follows:  
(1) Organize and prepare the data for analysis, which includes transcribing
interviews, observations, etc., and sorting the data in a specific order or
by certain criteria.
(2) Read through all the data, with the intention of acquiring a general sense
of what the data is trying to convey, such as the nature of ideas and its
depth.
(3) Begin detailed analysis with a coding process, meaning organizing the
data into categories.
(4) Use the analysis from Step 3 to describe the setting and the people, and to
yield themes and their interconnectedness.
(5) Expand how the depiction and themes from Step 4 will be exemplified in
the qualitative narrative.
(6) Construct an interpretation or meaning of the analyzed data. (pp. 191-195)
In sum, this qualitative analysis entails “making sense out of text and image
data, preparing the data for analysis, conducting different analyses, moving deeper
and deeper into understanding the data, representing the data, and making an
interpretation of the larger meaning of the data” (Creswell, 2003, p. 190).

Quantitative Analysis
In evaluating an intervention program, the objective is to determine
statistical significance by employing quantitative analysis. This form of data
analysis was used to investigate the quantitative data collected for the summative
evaluation of this study. Various English Language Arts (ELA) achievement scores
were collected from Rolling Hills Elementary School and the Valley Unified School
District (VUSD) district office. These data were analyzed in both a pre-post
dependent groups design and a nonequivalent control group design.

In the pre-post dependent groups design, the change in the ELA achievement
of the experimental group from the pre-intervention (2007) to the post-intervention
(2008) was analyzed. This design utilized both parametric and nonparametric
statistics for the analyses of two dependent variables, the Degrees of Reading
Power (DRP) and the Test of Silent Word Reading Fluency (TOSWRF). The test
statistics used included: (a) a dependent groups t-test, to determine whether the
observed change is statistically significant (p < .05); (b) Cohen's d (a measure of
effect size), to assess practical significance using a criterion of d > .30; and
(c) a percentage change, to evaluate practical significance using a criterion of
10% improvement as stipulated by No Child Left Behind (NCLB).
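The three test statistics just listed can be sketched using only the Python standard library. This is an illustrative computation under the stated criteria (d > .30, 10% improvement), not the software used in the study; the function name is invented, and the p-value for the resulting t statistic would still be looked up against the t distribution with n - 1 degrees of freedom.

```python
# Sketch of the pre-post dependent groups statistics: dependent groups t-test,
# Cohen's d (mean change divided by the pre-test SD), and percentage change.
import statistics as st
from math import sqrt

def design_a_statistics(pre, post):
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = st.mean(diffs) / (st.stdev(diffs) / sqrt(n))    # dependent groups t
    d = (st.mean(post) - st.mean(pre)) / st.stdev(pre)  # Cohen's d vs. pre SD
    pct = (st.mean(post) - st.mean(pre)) / st.mean(pre) # percentage change
    return {"t": t, "df": n - 1, "d": d, "pct_change": pct,
            "practically_significant": d > 0.30 and pct > 0.10}
```

The practical-significance flag applies both criteria at once; statistical significance would additionally require comparing the t statistic against its critical value.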
Similarly, the nonequivalent control group design facilitated the analysis of
the data of the experimental group (the 42 identified students in Grades 4 to 6 at
Rolling Hills Elementary School) and of one control group (n = 42, matched to
the experimental group), which was not randomly chosen. The ELA achievement on the
CST of the participants in both groups was compared on the pre- and post-test data,
after the means of the pre-test (CST 2007 ELA) were adjusted. Because CST 2007 ELA
is an observable predictor of CST 2008 ELA, one needs to control for the effect of
the CST 2007 scores. Both parametric and nonparametric statistics were applied to
the dependent variable (CST 2008 ELA scores) and covariate (CST 2007 ELA scores).
The test statistics utilized were: (a) an analysis of covariance (ANCOVA), to
determine whether the observed difference between the experimental and matched
control groups is statistically significant (p < .05); and (b) Cohen's d, to
measure practical significance using a criterion of d > .30.
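The core of the ANCOVA comparison, adjusting each group's post-test mean for its pre-test mean, can be illustrated with the classic adjusted-means formula based on the pooled within-group regression slope. This sketch computes only the covariate-adjusted means; the F tests reported later require the full ANCOVA model, and the function name and data layout here are assumptions for illustration.

```python
# Illustrative ANCOVA adjusted means: each group's post-test mean is adjusted
# for its pre-test mean using the pooled within-group slope b:
#   adjusted_mean_g = mean(post_g) - b * (mean(pre_g) - grand_pre_mean)
import statistics as st

def adjusted_means(groups):
    """groups maps a group name to a (pre_scores, post_scores) pair."""
    all_pre = [x for pre, _ in groups.values() for x in pre]
    grand_pre = st.mean(all_pre)
    sxy = sxx = 0.0
    for pre, post in groups.values():
        mx, my = st.mean(pre), st.mean(post)
        sxy += sum((x - mx) * (y - my) for x, y in zip(pre, post))
        sxx += sum((x - mx) ** 2 for x in pre)
    b = sxy / sxx  # pooled within-group regression slope
    return {name: st.mean(post) - b * (st.mean(pre) - grand_pre)
            for name, (pre, post) in groups.items()}
```

When the two groups start with identical pre-test means, the adjustment vanishes and the adjusted means equal the raw post-test means; when the groups differ at pre-test, the group that started higher is adjusted downward.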

CHAPTER FOUR
RESULTS
The evaluation of the effectiveness of a reading intervention program,
Language!, was the main focus of this study. This reading intervention program was
implemented at Rolling Hills Elementary School during the 2007-2008 school year
with 42 identified at-risk students in Grades 4 through 6. The research questions
guiding this study were whether the intervention program, Language!, had an effect
on the English Language Arts (ELA) achievement of the students in the intervention
group, and whether there was statistical and/or practical significance from pre- to
post-intervention in the experimental group. Tables 15 and 16 illustrate the
participants involved in this study and the configurations of the groups.

Table 15
Total Participants in Experimental and Control Groups
Experimental Group  Matched Control Group
Grade n  n
4 17  17
5 12a  14
6 10  9
Total 39  40
Note. a The experimental group began with 15 students; however, 3 students took the California
Modified Assessment (CMA) test in 2007-2008 instead of the CST.

Table 16
Between-Subjects Factors
Factors  n
Group Experimental 39
Matched Control 40
Grade 4 34
5 26
6 19
Gender Female 46
Male 33
Ethnicity African Americans 15
Hispanics 57
Whites 7
EL Status ELL 41
English Only 38
SPED Status Not Special Ed 65
Special Ed 14


The summative evaluation approach of this study incorporated three dependent
variables (DVs) and two separate designs: a pre-post dependent groups design
(Design A) and a nonequivalent control group design (Design B). The three
dependent variables were: (a) California Standards Test (CST) English Language
Arts (ELA) scaled scores; (b) Degrees of Reading Power (DRP) raw scores; and
(c) Test of Silent Word Reading Fluency (TOSWRF) raw scores. These dependent
variables were utilized to measure students' ELA achievement.
Group Designs
Design A - Pre-post Dependent Groups Design
This design was applied to evaluate the change in the ELA achievement of
the experimental group from the pre-intervention (2007) to the post-intervention
(2008), utilizing the DRP and TOSWRF scores. The subsequent parametric statistics
were employed for the data analyses of two of the three dependent variables: (a) a
dependent groups t-test, to assess the statistical significance of the observed change
(p < .05); (b) Cohen’s d (a measure of effect size), to evaluate the practical
significance (d > .30); and (c) percentage change, to measure practical significance
(criterion of 10% improvement).
Design B – Nonequivalent Control Group Design
This type of design was employed to assess the ELA achievement on the CST
of the experimental group and comparison control group that was not randomly
chosen. The experimental group was the only group that received the intervention.
To reiterate, the intervention was Language!, a state-adopted reading intervention
program for grades 4 to 8. Only one of the three dependent variables was utilized
with this design, and this was the CST ELA. The statistical analysis employed in this
comparison design of the experimental and control groups was an analysis of
covariance (ANCOVA), to measure the statistical significance of the treatment
effects (observed differences) and the effects of various factors (p < .05).
Additionally, Cohen's d was employed to evaluate the practical significance (d >
.30).
Results
Design A - Pre-post Dependent Groups Design
Statistical significance. Following are Tables 17 through 19 illustrating the
pre-post significance test findings of the dependent t-test for the experimental group,
utilizing two of the three dependent variables: the Degrees of Reading Power
Reading Test (DRP) raw scores and the Test of Silent-Word Reading Fluency
(TOSWRF) raw scores.


Table 17
Pre- versus Post-Intervention English Language Arts (ELA) Achievement
Paired Samples Statistics
 n Mean Std. Deviation Std. Error Mean
Pair 1 DRP 2008 42 17.55 6.932 1.070
DRP 2007 42 11.50 5.302 .818
Pair 2 TOSWRF 2008 40 98.98 25.627 4.052
TOSWRF 2007 40 63.15 19.055 3.013

Table 18
Pre- versus Post-Intervention English Language Arts (ELA) Achievement
Paired Samples Correlations
n Correlation Sig.
Pair 1.  DRP 2008 & DRP 2007 42 .635 .000*
Pair 2.  TOSWRF 2008 & TOSWRF 2007 40 .551 .000*
Note. * p < .05

Table 19
Pre- versus Post-Intervention English Language Arts (ELA) Achievement
Paired Differences Samples Test
Paired Differences  
Mean Std. Deviation t df Sig. (2-tailed)
Pair 1. DRP 2008 – DRP 2007  6.048 5.432 7.215 41 .000*
Pair 2. TOSWRF 2008 – TOSWRF 2007  35.825 21.952 10.321 39 .000*
Note. * p < .05

Findings of the dependent t-test for the experimental group indicate that for
Pair 1, the increase in students' ELA achievement performance on the DRP from
11.50 to 17.55 was statistically significant, t(41) = 7.21, p < .001 (see Tables 17
and 19). Similar results were found for Pair 2, in which students' ELA achievement
performance on the TOSWRF increased from 63.15 to 98.98. This gain was also
statistically significant, t(39) = 10.32, p < .001 (see Tables 17 and 19).

Furthermore, because regression to the mean is an internal validity threat to
this design, the amount of regression to the mean was examined using the
correlations depicted in Table 18. The correlations reported for the pre- and
post-test scores on the DRP and TOSWRF are .635 and .551, respectively. Because
these correlations are well below 1.0, regression to the mean is present, and the
observed change could be inflated by it. However, as Table 19 shows, the observed
change is very large for both measures, more than one standard deviation,
suggesting that regression to the mean is of little concern here.
Practical significance. The statistical tests reported above tell us whether
the observed change in the means is statistically significant, but not whether it
is practically significant (Hocevar, 2008). Thus, two statistical approaches were
utilized in this analysis: effect size (Cohen's d) and percentage change. The
effect size was calculated as the ratio of the pre- (2007) to post-test (2008)
change to the pre-test (2007) standard deviation (SD). The percentage change was
computed as the ratio of the pre- to post-test change to the pre-test (2007) mean.
Results are illustrated in Table 20.

Table 20
Pre- versus Post-Intervention English Language Arts (ELA) Achievement
Practical Significance
Pre Mean Pre SD Pre-Post Change Effect Size % Change
DRP 11.50 5.302 6.048 1.14* .52*
TOSWRF 63.15 19.055 35.825 1.88* .57*
Note. * Effect size (d > .30) and percentage change (> .10).

Effect size. The students' ELA achievement performance on both tests (DRP
and TOSWRF) surpassed the accepted level of .30 for practical significance. The
effect sizes were large, 1.14 and 1.88 respectively (see Table 20).
Percentage change. Similarly, the students' ELA achievement performance on
both tests exceeded the preset level of 10% improvement. The increases were large:
52% on the DRP and 57% on the TOSWRF (see Table 20).
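The Table 20 values can be reproduced directly from the Table 17 summary statistics, a quick arithmetic check using the reported means and standard deviations:

```python
# Reproducing Table 20 from the Table 17 summary statistics:
# effect size = (pre-to-post change) / pre-test SD;
# percentage change = (pre-to-post change) / pre-test mean.
drp_d = 6.048 / 5.302      # DRP effect size, approx. 1.14
drp_pct = 6.048 / 11.50    # DRP change, approx. .526 (reported as .52)
tos_d = 35.825 / 19.055    # TOSWRF effect size, approx. 1.88
tos_pct = 35.825 / 63.15   # TOSWRF change, approx. .57
```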
Design B - Nonequivalent Control Group Design
Statistical significance. For this next design, the following Tables 21 through
23 illustrate the findings of the main treatment effect and the effects of the various
factors through the use of ANCOVA, where the dependent variable was the CST
ELA 2008 and the covariate was the CST ELA 2007. The different between-subjects
factors utilized were: type of group (experimental or control), students’ grade,
gender, ethnicity, English language status (ELL), and special education status
(SPED). All interactions were suppressed due to the problem of small cell
frequencies.
Results of the ANCOVA test (see Table 21) indicate three main findings.
First, the covariate, CST ELA 2007, is a statistically significant predictor of the
dependent variable, CST ELA 2008, F(1, 69) = 18.04, p < .001. Second, there is a
statistically significant Grade effect on the CST ELA 2008 after controlling for the
effect of the CST ELA 2007, F(2, 69) = 6.50, p = .003. The estimated mean for
Grade 4 (306) was higher than the estimated means for Grades 5 and 6 (285 and 284,
respectively). Lastly, there is also a statistically significant Group effect on the
CST ELA 2008 after controlling for the effect of the CST ELA 2007, F(1, 69) = 6.15,
p = .016. However, this result was the opposite of what was expected: the
intervention group scored significantly lower than the non-intervention group on the
CST ELA 2008 after controlling for the effect of the CST ELA 2007 (see Table 22).

Table 21
ANCOVA – Tests of Between-Subjects Effects
Dependent Variable: CST ELA 2008 with Covariate: CST ELA 2007
Source of Variation  Type III Sum of Squares  df  Mean Square  F  Sig.
Corrected Model 26342.549a 9 2926.950 4.755 .000b
Intercept 11257.814 1 11257.814 18.289 .000b
Group 3788.075 1 3788.075 6.154 .016b
Grade 8003.432 2 4001.716 6.501 .003b
Gender 101.118 1 101.118 .164 .687
Ethnicity 472.789 2 236.395 .384 .683
ELL 39.776 1 39.776 .065 .800
SPED 960.856 1 960.856 1.561 .216
CST ELA 07 11101.924 1 11101.924 18.036 .000b
Error 42473.527 69 615.558
Total 7114761.000 79
Corrected Total 68816.076 78
Note. a R Squared = .383 (Adjusted R Squared = .302). b p < .05.

Table 22
Group Effect on the CST ELA 2008
Estimated Marginal Means
Group  Mean  Std. Error  95% CI Lower Bound  95% CI Upper Bound
Experimental Group 284.4a 5.749 272.899 295.839
Control Group 298.3a 5.942 286.470 310.179
Note. a Covariates appearing in the model are evaluated at the following values: CST ELA 2007 = 279.99.

Table 23
Grade Effect on the CST ELA 2008
Estimated Marginal Means
Grades  Mean  Std. Error  95% CI Lower Bound  95% CI Upper Bound
4 305.9a 5.834 294.233 317.512
5 284.6a 6.865 270.886 298.276
6 283.6a 7.306 269.011 298.163
Note. a Covariates appearing in the model are evaluated at the following values: CST ELA 2007 = 279.99.

Furthermore, post-hoc t-test analyses were conducted to determine the extent
of the differences among Grades 4 through 6. These results are depicted in Table
24.

Table 24
Post hoc t-tests of Grade Differences and their Effect Sizes

Differences         Mean   t       df   Sig. (2-tailed)   SD       Effect Size
Grade 4 – Grade 5   21.3   1.776   58   .081              29.703   .717*
Grade 4 – Grade 6   22.3   2.738   51   .008*             29.703   .751*
Grade 5 – Grade 6    1.0    .821   43   .416              29.703   .034

Note. * t-test (p < .05) and effect size (d > .30).

Results of the post-hoc t-tests indicated that Grade 4 significantly
outperformed Grade 6, t(51) = 2.738, p = .008. No significant differences were
found between Grades 4 and 5 or between Grades 5 and 6. However, because the
CST tests are not vertically equated, and it is a given that the CST test for an
earlier grade level is easier than the test for a later grade level, these
grade-level effects and differences carry little importance.
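The effect sizes in Table 24 follow the standard Cohen's d formula: the mean difference divided by a pooled standard deviation (the study applies a common SD of 29.703 to all comparisons). A minimal sketch with made-up scores, not the study's data:

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1) +
                  (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

# Hypothetical scaled scores for two grade groups (illustration only).
grade4 = np.array([310.0, 320.0, 295.0, 305.0, 300.0])
grade6 = np.array([285.0, 280.0, 290.0, 275.0, 288.0])
print("Cohen's d:", cohens_d(grade4, grade6))
```

The sign of d follows the order of the arguments; a positive value here means the first group's mean is higher.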
Additionally, differential regression to the mean was evaluated by analyzing
the correlations for the experimental and control groups in this design. The
correlations on the CST ELA measure were comparable for the two groups
(r = -.546 for the experimental group and r = -.571 for the control group),
implying that differential regression to the mean was not a problem.
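One common way to gauge regression to the mean is the correlation between initial scores and subsequent gains; if that (typically negative) correlation differs sharply between groups, the regression is differential. A toy illustration with invented numbers, not the study's data:

```python
import numpy as np

# Invented pre-test scores and pre-to-post gains (illustration only).
pre = np.array([250.0, 260.0, 270.0, 300.0, 310.0])
gain = np.array([30.0, 18.0, 20.0, -5.0, -8.0])

# A negative correlation between initial score and gain is the classic
# signature of regression to the mean; comparing r across the experimental
# and control groups tests whether the regression is differential.
r = np.corrcoef(pre, gain)[0, 1]
print("r(pre, gain):", r)
```

Similar r values in both groups, as in this study, suggest the effect operates comparably on each group.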
Practical significance. The practical significance of the observed differences
in means was determined by calculating the effect size of the differences among
the grades using Cohen's d. These results are reported in Tables 24 and 25. The
accepted level for practical significance was set at d > .30.

Table 25
Experimental versus Control Groups Practical Significance

                         Mean Difference CST 2008 ELA   SD       Effect Size
Experimental – Control   -13.9                          29.703   .468*

Note. * Effect size (d > .30).

Overall, the comparison between Grades 4 and 5 yielded Cohen's d = .717, a
medium effect size according to King and Minium's (2003) conventional
guidelines (see Table 24). Similarly, the comparison between Grades 4 and 6
also showed a medium effect size, Cohen's d = .751 (see Table 24). Lastly, the
comparison of the observed change in the means of the experimental and control
groups showed a small effect size, Cohen's d = .468 (see Table 25).

CHAPTER FIVE
SUMMARY, DISCUSSION, AND RECOMMENDATIONS
The previous chapters of this study laid out the foundation and purpose of this
research: to examine the effectiveness of the Language! intervention program on
the achievement of identified at-risk students in the fourth, fifth, and sixth
grades at Rolling Hills Elementary School. This final chapter brings the
evaluation study to a close, combining the qualitative findings with the
quantitative results and offering constructive, practical implications and
recommendations for the educational arena.
Research Summary
The main focus of this research was to examine the effectiveness of the
Language! intervention program on student achievement in English Language Arts
(ELA) of the identified at-risk upper grade students at Rolling Hills Elementary
School. The observed change in the ELA achievement of the experimental group
from pre-intervention (2007) to post-intervention (2008) was evaluated not only
for statistical significance; the effect size and percentage change were also
considered for practical significance. While both summative and formative
assessments were employed to evaluate the effectiveness of this intervention
program on student ELA achievement, the primary focus of this study was a
summative approach. Student ELA achievement was measured through summative
data across three dependent variables: (a) California Standards Test (CST) ELA
scaled scores; (b) Degrees of Reading Power Reading Test (DRP) raw scores; and

(c) Test of Silent-Word Reading Fluency (TOSWRF) raw scores. Moreover, a
concurrent mixed-methods approach was applied to better understand the
effectiveness of the intervention program.
The participants in this evaluation included 42 students in the upper grades at
Rolling Hills Elementary School: 17 fourth graders, 15 fifth graders, and 10 sixth
graders. These participants were identified as at-risk students by the school’s
leadership data team at the beginning of the 2007-2008 school year, and they
participated in the reading intervention program at the school. This group of at-risk
students constituted the experimental group. Additionally, 42 students were
selected (not randomly, and with identifying information removed) from within
the district to form a non-equivalent matched control group. These students
were matched by grade, ethnicity, gender, English language status, special
education status, CST ELA 2007
scores, and CELDT 2007 Overall proficiency levels when applicable.
Lastly, a third group of participants in this study included six certificated staff
members at Rolling Hills School: two classroom teachers teaching the reading
intervention program, the curriculum resource teacher, the language development
resource teacher, the literacy coach, and the principal. These certificated
staff members were included in this evaluation to provide information, through
interviews, about the implementation of the intervention program and its
outcomes. Observations were also
completed at Rolling Hills Elementary School to substantiate this study. Results of
the collected qualitative data are conveyed and linked within the summary of these
study findings, which is discussed next.

Summary of Research Findings
The research questions directing this study were as follows:  
(1) Did the reading intervention program, Language!, have an effect on the
English Language Arts (ELA) achievement of the experimental group?
(2) Was there a statistically significant gain from pre- to post-intervention
in the experimental group?
(3) Was there a practical gain from pre- to post-intervention in the
experimental group? and
(4) What were the perceived facilitating and impeding factors regarding the
implementation of the intervention program, Language!, at Rolling Hills
Elementary School?
In evaluating the findings from this study, the results indicate that the
effects of the Language! intervention on two of the three ELA achievement
categories of the experimental group were remarkable overall. Not only was the
observed change in student achievement from pre-intervention (2007) to
post-intervention (2008) on the DRP and TOSWRF statistically significant
(p < .001), but the effect size and percentage change were also large enough
for practical significance, surpassing the preset levels of acceptance (see
Table 20). This significant observed change can be attributed to various
factors, as revealed through the interviews and observations conducted as part
of this study.
One aspect of the findings discussed above was the consistent implementation of
the explicit, systematic instruction embedded in Language! in daily 90-minute
lessons. This particular finding not only reflects the main research-based
reading instruction components delineated by the National Reading Panel
(NICHHD, 2000), but it also fulfills Marzano's (2003) school-level factor, a
"guaranteed and viable curriculum" (p. 10). The qualitative data
gathered through the interviews indicated that Language! provided these teachers
with an intervention tool that incorporated direct instruction to help students develop
basic reading skills at their individualized level, ongoing assessment to monitor
students’ progress and provide feedback, built-in opportunities for repetition and
reviewing to enhance students’ learning, and opportunities for students to experience
success, increasing their self-confidence and motivation. All of these things tap into
Marzano’s (2003) recommendations at the school-level, teacher-level, and student-
level.
Both intervention teachers reported that the beginning of the school year, when
this reading intervention was first implemented, was challenging. The teachers
felt that these students were struggling to "buy in" to the program because
they already had a history of being struggling readers; consequently, these
students displayed behavioral difficulties, low self-esteem, lack of
self-confidence, and poor motivation. (It should be noted that the Language!
program is not designed to address these social-emotional issues.) Because this
was their first time implementing the program, both teachers felt that they did
not have all of the necessary pieces in place at the beginning of the
implementation. One of the intervention teachers also expressed that she did
not have adequate support, citing a lack of access to instructional materials,
insufficient time to prepare lessons and sustain the fidelity of the program's
ongoing assessment component, and not enough training. Although the
administrative staff at the school and the district conducted observations in
her classroom, she felt that the feedback she received was limited and
unconstructive.
In contrast, the other intervention teacher felt that she received ongoing
training and support during her first year of implementing the program. Her
training consisted of approximately seven full days spread throughout the
school year, in addition to one week of training at the end of the summer,
right before the start of the school year. This teacher felt that she received
constructive feedback from the support staff at the school and from the
district. She also visited the high schools in the district to observe
experienced teachers implementing the program. Additionally, support staff such
as the curriculum resource teacher and the literacy coach reported that they
supported the intervention teachers by conducting regular walk-throughs of
these classrooms, making sure the teachers had the necessary program materials,
assessing new students enrolled in these classrooms, helping teachers level
their classroom library readers based on DRP scores, and providing small-group
instruction a few times a week to address listening skills, authors'
descriptive language, academic language and vocabulary, and comprehension.
These support staff members also indicated some weaknesses in the
implementation of Language! at the school. Both expressed concern regarding the
fidelity of the program's implementation. They also felt that the teachers
implementing the program needed additional time for planning and for reviewing
the data from the ongoing assessments. They considered these important pieces
to be lacking in the implementation, because the teachers did not have enough
time to examine the data and develop goals to address their students' needs.
Another point these support staff members made was that they themselves needed
more training on the program in order to support the teachers implementing it.
Another factor brought up by teachers and members of the leadership data team
was that Language! teaches students at their reading level. Teachers commented
that the intervention curriculum teaches basic reading skills such as phonics,
provides a variety of reading strategies, includes more scaffolding of content,
and has adequate pacing, so that students can keep up and stay on task. This
outlook was confirmed in classroom observations, where students were on task
and engaged in the instruction and activities, with no behavior problems or
classroom disruptions observed.
Regarding students' performance on the CST ELA, the third dependent variable,
the statistical analysis indicated three significant findings. First, students'
performance on the CST ELA 2007 was highly correlated with their performance on
the CST ELA 2008. Second, students in Grade 4 scored significantly higher than
students in Grade 6. This finding needs further explanation: one possible
reason for the discrepancy in scores is that the CST ELA for Grade 6 is more
difficult than the CST ELA for Grades 4 and 5; another is that the curriculum
becomes more rigorous as one moves up in grade levels (the tests are not
vertically equated). Third, students in the control group scored higher than
students in the experimental group. One possible explanation for this finding
is that the experimental group received only a single reading intervention
program, Language!, as their core ELA curriculum, whereas students in the
control group received the ELA core curriculum, the Open Court Reading Program
(OCR). This distinction is important because OCR "matches state-measured
reading skills" (McGraw-Hill Education, n.d., p. 13) and "meets the
expectations of the California English Language Arts framework and state
standards" (McRae, 2002, p. 6). In other words, the CST ELA tests the content
covered by the ELA core curriculum program; therefore, students who did not
receive the core ELA curriculum would not be expected to perform well on the
CST ELA because they were not exposed to the material covered by the test.
The latter point was also discerned by the intervention teachers. They
commented that the Language! curriculum is a reading intervention program.
Although Language! addresses the California ELA framework content standards, it
targets standards at a much lower grade level and moves up through the grades
gradually. This implies that these at-risk students are not accessing their
grade-level ELA core curriculum, which is the specific content tested at the
end of the school year by the CST ELA. The CST ELA test is based on the
California English language arts content standards for one specific grade (CDE,
2007c). Thus, it is reasonable to deduce that students who did not receive an
ELA core curriculum aligned to their corresponding grade's ELA content
standards during a particular school year are not likely to show significant
gains on the CST ELA given in that year; the results of this study support that
theory.
Implications
The implications of this study indicate that a promising practice for
addressing the lack of reading proficiency at the elementary level is to adopt
a reading intervention program for struggling readers, such as Language! or a
similar scientifically research-based program. In this study, two of the three
measures employed to assess reading achievement showed significant outcomes, at
both the statistical and practical levels. These results corroborate earlier
studies (Greene, 1996; Moats, 2004) of this reading intervention program. With
the provision of a reading intervention early in the primary grades, struggling
readers would acquire the basic reading skills needed to become better readers,
which in turn would help them improve their overall academic performance before
moving on to middle school. The benefits of improving students' reading
proficiency would be numerous.
Nonetheless, the CST ELA does not appear to be the best or fairest measure for
detecting growth in struggling readers receiving a reading intervention. Not
only is the CST not vertically equated, but the test is also not comparable
from one year to another. This is an important fact to recognize, as schools in
California are judged on their students' CST performance as a result of NCLB.
Recommendations
In general, the Language! reading intervention program provided not only the
upper-grade struggling readers with a resource to support their needs, but also
the school with a tool to address those needs. This valuable resource was made
possible by a new principal at Rolling Hills Elementary. Although the school
staff found Language! to be beneficial, they still expressed concerns regarding
the need for additional professional development and for time to review
formative assessment data and plan. To address the staff's concerns about the
implementation of Language! at Rolling Hills School, and with the objective of
improving that implementation, the model of a professional learning community
is discussed next.
Based on the findings of this study, it would be fair to say that through the
implementation of the Language! program, Rolling Hills School began the process
of creating a professional learning community with their new leadership; over time,
they can further explore and develop what has started. For instance, one repeated
theme that arose from the interviews with the intervention teachers and support staff
was the need for more training and support, and additional time for planning and
implementing all of the features of the intervention program. According to Hord
(1997), a school that has developed into a professional learning community would be
instituting the following principles: "supportive and shared leadership,
collective creativity, shared values and vision, supportive conditions, and
shared personal experience" (Attributes of Professional Learning Communities
section, ¶ 1). This
concept is aligned with Marzano's (2003) research-based principles of
leadership for change: when leadership transforms from an individual to a team
approach, successful change is more readily achieved. Moreover, Fermanich et
al. (2006) stated, "the key elements of improving teaching and learning do not
happen without support from leadership and a culture of professional community"
(p. 33). The fact that the new principal at Rolling Hills Elementary has been
steadily building a strong team approach is one of the key ingredients in
creating a professional learning community.
One recommendation for Rolling Hills School, then, is to further explore the
notion of a professional learning community, as this concept appears to be in
the beginning stages of development under the new leadership. Analysis of the
problems facing Rolling Hills indicated that the current principal has brought
to the school not only her vision and leadership, but has also promoted that
leadership as shared and team-based. When management and supervision are
transformed in this way, so that everyone is a stakeholder in students'
achievement, successful change, or reculturing, can be more readily achieved
(Goldberg & Morrison, 2003; Marzano, 2003).
The notion of a school “reculturing,” where schools become communities for
learners (or professional learning communities) is one approach that has evolved in
recent years (Goldberg & Morrison, 2003). According to these scholars, schools
turn out to be better when "local culture" is created and nurtured in the
school structure.
Goldberg and Morrison (2003) stated, “the focus is on social relationships that
adhere among the various people that play a vital role in the everyday life of the
school – teachers, administrators, students, parents, and other community members”
(p. 58). Thus, when a school is recultured into a professional learning
community, all members of that community (in this study, Rolling Hills
Elementary School), including the students themselves, have a key role in
students' learning, and therefore everyone is a stakeholder in students'
achievement. In other words, all members of the school collaborate as a team to
frequently review students' progress and plan interventions accordingly to
improve students' learning. Community members support the school by developing
collaborative relationships and volunteering their time or services toward
increasing students' achievement.
Moreover, the North Central Regional Educational Laboratory (n.d.) depicts
a professional learning community as “a collegial group of administrators and school
staff who are united in their commitment to student learning” (¶ 1). Hord (1997)
further explains that a professional community is an influential and effective
approach to staff and school improvement. The attributes of this concept
encompass continuous professional development, motivation of all learning
community members, and a “sense of community [and support] in the school” (Hord,
1997, People Capacities section, ¶ 2). Most vital is that all of these factors (shared
vision, a feedback system, sense of community, professional development,
motivation, and support) address Marzano's (2003) school-, teacher-, and
student-level factors affecting student achievement as discussed in the problem
analysis
section.
If Rolling Hills School is to embark on reculturing itself into a professional
learning community, Goldberg and Morrison (2003) offer several suggestions: (a)
scheduling time in teachers' daily schedules for planning and reflection; (b)
creating teams to mentor a group of students over a few years, developing
teacher-student relationships; (c) organizing "research lessons," in which
teachers observe experienced colleagues' instruction and then confer about
their observations; and (d) making resources available to teachers (e.g.,
professional journals, professional resources on the Internet). Other
recommendations to
address the need for more support would be for teachers and/or support staff to write
grants to fund additional resources (such as materials and personnel) that could
facilitate and enhance students’ learning. Also, the leadership data team could review
and reallocate resources that the school already has to maximize support for teachers
in meeting all students’ learning needs (Crawford & Torgesen, n.d.; Fermanich et al.,
2006; Williams et al., 2005).
A second recommendation is for Rolling Hills School to become a
performance-driven school. By creating, promoting, and reculturing toward this
type of system, the teachers at Rolling Hills can improve the quality of their
instruction by analyzing formative data to identify students' performance
trends and gaps and to begin isolating the "holes in instruction" for
underperforming students in order to plan their

intervention (Crawford & Torgesen, n.d.; Datnow, Park, & Wohlstetter, 2007;
Fermanich et al., 2006). Augmenting the use of formative assessment can
increase students' academic performance (Black & Wiliam, 1998a, 1998b), as in
the built-in feature of the Language! intervention program in which students'
achievement data drive instruction. The importance of students' data,
therefore, cannot be overemphasized. Datnow, Park, and Wohlstetter (2007)
encapsulate the following message from the work of Supovitz and Taylor (2003)
and Togneri and Anderson (2003): "quite simply, high-performing districts
[schools] make decisions based on data, not on instinct" (p. 11).
The basis of this notion of a performance-driven school system is embedded in
one of the underlying concepts behind Language!, in which ongoing formative
assessments inform the instructor of what students have and have not learned so
that lessons can be planned based on students' performance (Greene, n.d.). In
other words, the instructor makes decisions based on students' data. Bringing
this model of data-informed decision making to the school is a way of applying
an innovative principle and bringing about change, that is, reculturing the
school.
Additionally, Datnow, Park, and Wohlstetter (2007) have stated that “the
gathering and examining of data is merely a starting point to developing a culture
and system of continuous improvement that places student learning at the heart of its
efforts” (p. 5). This principle aligns with the notion of “reculturing” the school
(Goldberg & Morrison, 2003), which Rolling Hills School is in the process of doing,

and it addresses a recommendation made by one of the support staff members at
Rolling Hills Elementary School. This staff member expressed the need for more
of a team approach to analyzing students' achievement data, for a procedure
allowing students to test in and out of the intervention program more
frequently based on their data, and for the intervention to be offered in
addition to the ELA core curriculum so that students retain access to the core
curriculum.
Recent research (Crawford & Torgesen, n.d.; Datnow, Park, & Wohlstetter,
2007; Fermanich et al., 2006; Williams et al., 2005) that has analyzed schools
achieving success in improving student achievement found that the “common
threads” in these school systems were not only strong leadership, but also the use of
assessment data to drive instruction and the provision of instructional intervention
for struggling students identified through the data analysis. For instance, Fermanich
et al. (2006) discussed the importance of school systems using formative assessments
to drive teachers’ differentiated instruction, which is actually one of the built-in
components of Language!. Furthermore, many of the aspects of formative
assessment include planning, presenting the information, observing and checking for
understanding, and proceeding accordingly (Black & Wiliam, 1998a, 1998b; Dembo
& Eaton, 2000; Tucker, 1996).
Lastly, another key aspect of establishing a performance-driven school
system is the development of collaboration and collegiality among staff members
and the allocation of time for these to occur (Datnow, Park, & Wohlstetter, 2007).
This feature ties in with the concept of a professional learning community, where all

members of the school collaborate as a team to frequently review students’ progress
and plan interventions accordingly to improve students’ learning. Datnow and her
colleagues found that the teachers in these high-performing school systems “relied
heavily on one another for support, new instructional strategies, and discussion about
data” to improve student achievement (p. 46). This important element would also
address the shared concern by the intervention teachers and support staff that not
enough time was provided for planning and for data analysis.
Third, in light of the findings of this study, another recommendation is that
the Language! reading intervention program be provided to at-risk readers in
addition to the ELA core curriculum, rather than as a replacement for it. This
recommendation is substantiated by the shared concern of the intervention
teachers and support staff at Rolling Hills that the Language! program covers
lower-grade-level ELA content standards and addresses more basic reading
skills. It is also important that students receiving the reading intervention
receive their appropriate grade-level ELA core curriculum, for three reasons:
first, each grade level's ELA core curriculum is aligned to the corresponding
grade-level state ELA content standards; second, the CST ELA for each grade
level (starting at Grade 2) is aligned to the corresponding grade-level ELA
content standards; and third, schools are evaluated on how their students
perform on these annual CST tests, as driven by the No Child Left Behind Act
(NCLB).

Last but not least, it would be of interest for future research to analyze the
effects of the Language! reading intervention program after two and three years
of intervention, using the CST ELA as a measure of students' achievement. The
publishers of Language! describe it as a three-year ELA curriculum devised to
provide students with the skills necessary to become proficient readers and
writers (Moats, 2004).
Limitations
Although the overall results of this study are encouraging, one must proceed
with caution in attempting to generalize them. Following are the various
validity threats to keep in mind when making a causal inference between the
treatment and the outcome (internal validity) and when asking whether this
causal relationship will generalize beyond the context of this study (external
validity) (McEwan & McEwan, 2003).
Design A – Pre-post Dependent Groups Design
Internal Validity. One threat to the internal validity of this design is
regression toward the mean, where the observed change between the pre- and post-
test scores could have been due to chance and not to the intervention treatment,
Language! (McEwan & McEwan, 2003; Trochim, 2006). In addition, maturation of
the participants is another threat, where the observed change could have been due to
expected maturation or growth of the participants and not to the intervention itself
(McEwan & McEwan, 2003; Trochim, 2006).

A last plausible threat to the internal validity of this pre-post design is
testing, where the observed change could have been due to pre-test "priming"
(the act of taking the pre-test "primes" the students, ultimately affecting
their performance on the post-test) rather than to the intervention program
(McEwan & McEwan, 2003; Trochim, 2006).
External Validity. One limiting factor of this study is that the results of this
intervention program evaluation are delimited to the Rolling Hills Elementary
School setting. Therefore, it cannot be inferred with certainty that the same findings
obtained at Rolling Hills School can be attained in other settings (McEwan &
McEwan, 2003; Trochim, 2006). Along the same lines, the participants can be a
plausible threat to the external validity of this design, due to the fact that the
participants in this study may not resemble other students in other school settings
(McEwan & McEwan, 2003; Trochim, 2006).
Last but not least, there are the threats of treatment implementation and
measurement. Treatment implementation is a threat because the intervention
program could have been implemented in a variety of ways (McEwan & McEwan,
2003; Trochim, 2006). The measurement threat arises because different tests
measure different skills, so one might obtain different outcomes depending on
the test used. This threat raises the question: does the observed change on
this test imply that a change would be observed if a different test were used?
(McEwan & McEwan, 2003; Trochim, 2006).

Design B – Nonequivalent Control Group Design
Internal Validity. In this design, a limiting factor is the methodology design
employed in this study, a pre-post quasi-experimental design, which lacks
assignment randomization and poses an internal validity threat (Creswell, 2003; King
& Minium, 2003), as the experimental and control groups may not have been equal
to begin with. The control group used in this study was a non-equivalent control
group sample from the district, matched on various characteristics of the
students in the experimental group. Because the control group was drawn from
across the district, its students came from various schools; thus, it is
uncertain whether these students received any type of intervention at their own
school sites.
It is also possible that extraneous factors, aside from the Language!
intervention, influenced the treatment effects. For instance, selection bias
poses a validity threat because the students receiving the intervention were
not randomly selected, making it difficult to say with certainty that the
observed differences were due to the intervention itself rather than to
unobserved differences among the students prior to treatment.
External Validity. As with the external validity threats of the pre-post
dependent groups design, the findings of this study cannot confidently be
generalized to other settings or other students, because the results of this
program evaluation are delimited to the Rolling Hills Elementary School setting
and to the participants in this study.
Likewise, the threats of treatment implementation and measurement apply to this
design as well: the intervention program could have been implemented in a
variety of ways, yielding different observed differences, and different
results might have been seen had a different measure been used (McEwan &
McEwan, 2003; Trochim, 2006).
Conclusions
The decision by the Rolling Hills Elementary School leadership team to adopt
Language! as its reading intervention program produced overall positive
outcomes. The at-risk students who received this reading intervention
demonstrated significant gains (p < .001) in reading achievement on two of the
three measures, the DRP and the TOSWRF, between the pre-test (2007) and
post-test (2008). However, when the reading performance of these students was
compared, using the CST ELA test, to that of students with similar profiles
who did not receive the intervention, the findings were reversed: the students
who did not receive the intervention made greater gains on the 2008
(post-intervention) CST ELA than the students in the intervention group. One
needs to keep in mind, however, that (a) the intervention group did not
receive the corresponding ELA core curriculum, and (b) the intervention
program, Language!, is a three-year ELA curriculum.
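The pre/post comparison summarized above rests on a dependent-samples test. As a minimal sketch, with invented scores rather than the study's data, a paired t statistic for one group's pre- and post-test scores can be computed as follows; the p-value would then be read from the t distribution with n − 1 degrees of freedom.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic for one group measured at two time points."""
    diffs = [b - a for a, b in zip(pre, post)]   # post minus pre, per student
    n = len(diffs)
    se = stdev(diffs) / math.sqrt(n)             # standard error of the mean difference
    return mean(diffs) / se, n - 1               # t statistic and degrees of freedom

# Hypothetical unit-score data for six students (not the study's data).
pre  = [38, 41, 35, 44, 40, 37]
post = [45, 47, 42, 49, 44, 43]
t, df = paired_t(pre, post)                      # large positive t => gains
```

This is the logic behind a pre-post dependent group comparison; it tests only within-group change and, as the validity discussion above notes, cannot by itself rule out maturation or other rival explanations.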

The results of this research study illuminate the overall effects of the
reading intervention program, Language!, on the reading achievement of the
at-risk students at Rolling Hills School. Given the findings, the overall
intervention effect of Language! was positive. Although the findings did not
support the inference that this intervention program improves CST ELA
achievement, the program did provide these at-risk students with stronger
basic reading skills. Moreover, it has given Rolling Hills School an added
resource for targeting at-risk students and bringing them closer to
proficiency. With stable, solid leadership, staff commitment, and a
research-based reading intervention program in place, Rolling Hills Elementary
School is a step closer to reculturing itself into not only a professional
learning community but also a performance-driven school. Finally, the findings
of this study add to the body of research literature on this reading
intervention program, Language!

“Coming together is a beginning; keeping together is progress;
working together is success.”
- Henry Ford

REFERENCES
Bennett, C. (2001). Genres of research in multicultural education. Review of
Educational Research, 71(2), 171-217.

Bennett, C. I. (2002). Enhancing ethnic diversity at a big ten university through
project TEAM: A case study in teacher education. Educational Researcher,
31(2), 21-29.

Biancarosa, G., & Snow, C. E. (2004). Reading next: A vision for action and
research in middle and high school literacy: A report from Carnegie
Corporation of New York. Washington, DC: Alliance for Excellent
Education. Retrieved February 1, 2008, from
http://www.kyreading.org/documents/ReadingNext.pdf  

Black, P., & Wiliam, D. (1998a). Assessment and classroom learning. Assessment in
Education, 5(1), 7-74.

Black, P., & Wiliam, D. (1998b). Inside the black box: Raising standards through
classroom assessment. Retrieved January 8, 2007, from Phi Delta Kappa
International Web site http://pdkintl.org/kappan/kbla9810.htm

Booher-Jennings, J. (2006). Rationing education in an era of accountability. Phi
Delta Kappan, 87(10), 756-761.

Brookover, W. B., & Lezotte, L. W. (1979). Changes in school characteristics
coincident with changes in student achievement (Occasional Paper No. 17).
East Lansing, MI: Michigan State University Institute for Research on
Teaching.

Brookover, W. B., Schweitzer, J. H., Schneider, J. M., Beady, C. H., Flood, P. K., &
Wisenbaker, J. M. (1978). Elementary school social climate and school
achievement. American Educational Research Journal, 15(2), 301-318.

California Department of Education (2007a). California standardized testing and
reporting (STAR): State of California – All students. Retrieved February 1,
2008, from http://star.cde.ca.gov/star2007/viewresults.asp

California Department of Education (2007b). DataQuest. Retrieved January 19,
2008, from http://dq.cde.ca.gov/dataquest


California Department of Education (2007c). Standardized testing and reporting
(STAR) program: Explaining 2007 STAR internet reports to the public.
Retrieved March 6, 2008, from http://www.cde.ca.gov/ta/tg/sr/resources.asp

California Department of Education (2007, April). California’s reading first plan.
Retrieved February 17, 2008, from
http://www.cde.ca.gov/nclb/sr/rf/index.asp

California Department of Education (2007, August). State board adopted
instructional materials. Retrieved February 3, 2008, from
http://www.cde.ca.gov/ci/rl/im/rlaadoptedlist.asp

California Department of Education (2008, February). California Standards Tests
Grade 4 English-Language Arts. Retrieved July 1, 2008, from
http://www.cde.ca.gov/ta/tg/sr/blueprints.asp

Chall, J. S. (1987). Two vocabularies for reading: Recognition and meaning. In M.
G. McKeown & M. E. Curtis (Eds.), The nature of vocabulary acquisition.
(pp. 7-17). Hillsdale, NJ: Erlbaum.

Crawford, E., & Torgesen, J. (n.d.). Teaching all students to read: Practices from
reading first schools with strong intervention outcomes: Summary document.
Retrieved March 28, 2007, from Florida Center for Reading Research Web
site http://www.fcrr.org/Interventions/pdf/teachingAllStudentsToReadSummary.pdf

Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed
methods approaches (2nd ed.). Thousand Oaks, CA: Sage Publications.

Cummins, J. (1986). Empowering minority students: A framework for intervention.
Harvard Educational Review, 56(1), 18-36.

Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-
performing school systems use data to improve instruction for elementary
students. Los Angeles, CA: Center on Educational Governance, Rossier
School of Education, University of Southern California.

Dembo, M. H., & Eaton, M. J. (2000). Self-regulation of academic learning in
middle-level schools. Elementary School Journal, 100, 473-490.

Dochy, F., Segers, M., & Buehl, M. M. (1999). The relationship between assessment
practices and outcomes of studies: The case of research on prior knowledge.
Review of Educational Research, 69(2), 145-186.

Duke, D., & Canady, L. (1991). School policy. New York: McGraw Hill.

Eberts, R. W., & Stone, J. A. (1988). Student achievement in public schools: Do
principals make a difference? Economics of Education Review, 7(3), 291-
299.

Fermanich, M., Mangan, M. T., Odden, A., Picus, L. O., Gross, B., & Rudo, Z.
(2006). Washington learns: Successful district study – Final report. Retrieved
February 20, 2007, from
http://washingtonlearns.wa.gov/materials/SuccessfulDistReport9-11-6Final_000.pdf

Fletcher, J., & Lyon, R. (1998). Reading: A research-based approach. In W. Evers
(Ed.), What’s gone wrong in America’s classrooms? (pp. 49-90) Stanford,
CA: Hoover Institution Press.

Foorman, B. R., Francis, D. J., Fletcher, J. M., Schatschneider, C., & Mehta, P.
(1998). The role of instruction in learning to read: Preventing reading failure
in at-risk children. Journal of Educational Psychology, 90, 37-58.

Foorman, B. R., & Nixon, S. M. (2006). The influence of public policy on reading
research and practice. Topics in Language Disorders, 26(2), 157-171.

Francis, D. J., Shaywitz, S. E., Stuebing, K. K., Shaywitz, B. A., & Fletcher, J. M.
(1996). Developmental lag versus deficit models of reading disability: A
longitudinal, individual growth curves analysis. Journal of Educational
Psychology, 88(1), 3–17.

Gallimore, R., & Goldenberg, C. (2001). Analyzing cultural models and settings to
connect minority achievement and school improvement research.
Educational Psychologist, 36, 45-56.  

Garcia, G. E. (2002). Introduction. In Student cultural diversity: Understanding and
meeting the challenge (3rd ed., pp. 3-39). Boston: Houghton Mifflin.

Goldberg, B., & Morrison, D. M. (2003). Co-Nect: Purpose, accountability, and
school leadership. In J. Murphy & A. Datnow (Eds.), Leadership lessons
from comprehensive school reforms. Thousand Oaks, CA: Corwin Press.

Graves, M. F., & Slater, W. H. (1987, April). Development of reading vocabularies
on rural disadvantaged students, intercity disadvantaged students and middle
class suburban students. Paper presented at the AERA conference,
Washington, DC.

Greene, J. F. (1996). Language! Effects of an individualized structured language
curriculum for middle and high school students. Annals of Dyslexia, 46, 97-
121.

Greene, J. F. (n.d.). Language! assessment overview: Including placement, content
mastery and fluency tasks, summative tests, progress indicators, online
assessment system. Longmont, CO: Sopris West. Retrieved November 18,
2007, from
http://store.cambiumlearning.com/Resources/Assessment/pdf/sw_Assessment_Language3_01.pdf

Griffith, J. (2000). School climate on group evaluation and group consensus: Student
and parent perceptions of the elementary school environment. The
Elementary School Journal, 101(1), 35-61.

Hattie, J. A. (1992). Measuring the effects of schooling. Australian Journal of
Education, 36(1), 5-13.

Haycock, K. (1998). Good teaching matters…a lot. Thinking K-16, 3(2), 1-14.

Henry, M. K. (1997). The decoding/spelling curriculum: Integrated decoding and
spelling instruction from pre-school to early secondary school. Dyslexia, 3,
178-189.

Hocevar, D. (2008). Module 4: Effect size and power, Education 536 Inquiry II. Los
Angeles, CA: Rossier School of Education, University of Southern
California.

Hord, S. M. (1997). Professional learning communities: What are they and why are
they important? Retrieved December 8, 2006 from
http://www.sedl.org/change/issues/issues61.html

Juel, C. (1988). Learning to read and write: A longitudinal study of 54 children from
first through fourth grades. Journal of Educational Psychology, 80, 437–447.

Juel, C., & Minden-Cupp, C. (2000). Learning to read words: Linguistic units and
instructional strategies. Reading Research Quarterly, 35(4), 458-492.

King, B. M., & Minium, E. W. (2003). Statistical reasoning in education and
psychology (4th ed.). New York: Wiley Higher Education.


Kumar, D. D. (1991). A meta-analysis of the relationship between science instruction
and student engagement. Education Review, 43(1), 49-66.

Liberman, I. Y., Shankweiler, D. P., & Liberman, A. M. (1989). The alphabetic
principle and learning to read. In D. P. Shankweiler & I. Y. Liberman (Eds.),
Phonology and reading disability: Solving the reading puzzle. IARLD
Monograph series. Ann Arbor: University of Michigan Press.

Lou, Y., Abrami, P. C., Spence, J. C., Poulsen, C., Chambers, B., & d’Apollonia, S.
(1996). Within-class grouping: A meta-analysis. Review of Educational
Research, 66(4), 423-458.

Lyon, G. R. (1997). Report on learning disabilities research. Retrieved April 20,
2008, from http://www.ldonline.org/article/6339

Marzano, R. J. (2003). What works in schools: Translating research into action.
Alexandria, VA: Association for Supervision and Curriculum Development.

McEwan, E. K., & McEwan, P. J. (2003). Making sense of research: What’s good,
what’s not, and how to tell the difference. Thousand Oaks, CA: Corwin Press,
Inc.

McGraw-Hill Education (n.d.). Results with open court reading. Association for
Supervision and Curriculum Development and Council of Chief State School
Officers.

McIntyre, E., Jones, D., Powers, S., Newsome, F., Petrosko, J., Powell, R., et al.
(2005). Supplemental instruction in early reading: Does it matter for
struggling readers? The Journal of Educational Research, 99(2), 99-107.

McRae, D. J. (2002). Test score gains for open court schools in California: Results
from three cohorts of schools. SRA McGraw-Hill. Retrieved January 17,
2009, from
http://www.sraonline.com/download/OCR/Research/testscoresgain.pdf

Miller, S., & Sayre, K. (1986, April). Case studies of affluent effective schools. Paper
presented at the annual meeting of the American Educational Research
Association, San Francisco.

Moats, L. C. (2004). Efficacy of a structured, systematic language curriculum for
adolescent poor readers. Reading and Writing Quarterly, 20(2), 145-159.


Moats, L. C. (2007). Whole-language high jinks: How to tell when scientifically-
based reading instruction isn’t. Retrieved November 18, 2007, from
http://www.edexcellence.net/doc/Moats2007.pdf

Murphy, J., & Hallinger, P. (1989). Equity as access to learning: Curricular and
instructional differences. Journal of Curriculum Studies, 21, 129-149.

Nagy, W. E., & Herman, P. A. (1984). Limitations of vocabulary instruction (Tech.
Rep. No. 326). Urbana, IL: University of Illinois, Center for the Study of
Reading. (ERIC Document Reproduction Service No. ED248498).  

National Commission on Excellence in Education (1983). A nation at risk. Retrieved
January 18, 2008, from http://www.ed.gov/pubs/NatAtRisk/risk.html

National Center for Education Statistics (2007a). Achievement levels for reading,
California. Retrieved March 16, 2008, from
http://nces.ed.gov/nationsreportcard/states/achievement.asp

National Center for Education Statistics (2007b). Percentages of students at each
achievement level for reading, grade 8, all students [total]: By jurisdiction,
1992, 1994, 1998, 2002, 2003, 2005 and 2007. Retrieved January 18, 2008,
from http://nces.ed.gov/nationsreportcard/nde/viewresults.asp

National Institute of Child Health and Human Development (2000). Report of the
national reading panel: Teaching children to read: An evidence-based
assessment of the scientific research literature on reading and its
implications for reading instruction: Reports of the subgroups. (00-4754).
Washington, DC: U. S. Government Printing Office. Retrieved April 19,
2008, from http://www.nichd.nih.gov/publications

National Research Council (2002). Scientific research in education. Washington,
DC: National Academy Press.

North Central Regional Education Laboratory. (n.d.). Professional learning
community. Retrieved December 8, 2006 from
http://www.ncrel.org/sdrs/areas/issues/content/currclum/cu3lk22.htm

Oakes, J. (1989). Detracking schools: Early lessons from the field. Phi Delta Kappan,
73, 448-454.

Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.).
Thousand Oaks, CA: Sage Publications.


Purkey, S. C., & Smith, M. S. (1983). Effective schools: A review. The Elementary
School Journal, 83(4), 427-452.

Robinson, C. (2002). Florida Center for Reading Research: Language! Retrieved
December 22, 2007, from
http://store.cambiumlearning.com/Resources/Research/pdf/sw_Research_Language

Sanders, W. L., & Horn, S. P. (1994). The Tennessee value-added assessment system
(TVAAS): Mixed-model methodology in educational assessment. Journal of
Personnel Evaluation in Education, 8, 299-311.

Schunk, D. H., Pintrich, P. R., & Meece, J. L. (2008). Motivation in education:
Theory, research, and applications (3rd ed.). Upper Saddle River, NJ:
Pearson Education.

Snow, C. E., Burns, S. M., & Griffin, P. (Eds.). (1998). Preventing reading
difficulties in young children. Washington, DC: National Academy Press.

Sopris West Educational Services (n.d.). Language! and No child left behind.
Retrieved December 22, 2007, from
http://store.cambiumlearning.com/Resources/Research/pdf/sw_Research_Language3_RB03.pdf

Stevens, R. J., Slavin, R. E., & Farnish, A. M. (1991). Effects of cooperative learning
and direct instruction in reading comprehension strategies on main idea
identification. Journal of Educational Psychology, 83, 8-16.

Supovitz, J., & Taylor, B. S. (2003). The impact of standards-based reform in Duval
County, Florida, 1999-2002. Philadelphia, PA: Consortium for Policy
Research in Education.

Swanson, H. L. (1999). Reading research for students with LD: A meta-analysis of
intervention outcomes. Journal of Learning Disabilities, 32(6), 504-532.

Tamir, P. (1996). Science assessment. In M. Birenbaum & F. J. R. C. Dochy (Eds.),
Alternatives in assessment of achievements, learning processes, and prior
knowledge (pp. 93-129). Boston: Kluwer.

Tobias, S. (1994). Interest, prior knowledge and learning. Review of Educational
Research, 64(1), 37-54.


Togneri,W., & Anderson, S. (2003). Beyond islands of excellence: What districts can
do to improve instruction and achievement in all schools. Washington, DC:
Learning First Alliance.

Torgesen, J. K. (1997). The prevention and remediation of reading disabilities:
Evaluating what we know from research. Journal of Academic Language
Therapy, 1, 11-47.

Torgesen, J. K. (2002). The prevention of reading difficulties. Journal of School
Psychology, 40(1), 7-26.

Torgesen, J. K., & Burgess, S. R. (1998). Consistency of reading-related
phonological processes throughout early childhood: Evidence from
longitudinal-correlational and instructional studies. In J. Metsala & L. Ehri
(Eds.), Word recognition in beginning reading (pp. 161–188). Hillsdale, NJ:
Erlbaum.

Torgesen, J. K., Alexander, A. W., Wagner, R. K., Rashotte, C. A., Voeller, K. K. S.,
& Conway, T. (2001). Intensive remedial instruction for children with severe
reading disabilities: Immediate and long-term outcomes from two
instructional approaches. Journal of Learning Disabilities, 34(1), 33-58, 78.

Torgesen, J. K., Rashotte, C. A., & Alexander, A. (2001). Principles of fluency
instruction in reading: Relationships with established empirical outcomes. In
M. Wolf (Ed.), Dyslexia, fluency, and the brain (pp. 333–355). Parkton, MD:
York Press.

Trochim, W. M. (2006). The research methods knowledge base (2nd Edition).
Retrieved May 4, 2009, from
http://www.socialresearchmethods.net/kb/intsing.php

Tucker, S. (1996). Benchmarking: A guide for educators. Thousand Oaks, CA: Sage
Publications.

United States Department of Education (2002). No child left behind act of 2001.
Retrieved February 1, 2008, from
http://www.ed.gov/policy/elsec/leg/esea02/107-110.pdf

Vellutino, F. R., Scanlon, D. M., Sipay, E., Small, S., Pratt, A., Chen, R., & Denckla,
M. (1996). Cognitive profiles of difficult-to-remediate and readily remediated
poor readers: Early intervention as a vehicle for distinguishing between
cognitive and experiential deficits as basic causes of specific reading
disability. Journal of Educational Psychology, 88, 601-638.

Villani, C. J. (1996). The interaction of leadership and climate in four suburban
schools: Limits and possibilities. Doctoral dissertation, Fordham University,
New York, NY. (UMI No. 9729612)

Williams, T., Kirst, M., Haertel, E., et al. (2005). Similar students, different results:
Why do some schools do better? A large-scale survey of California
elementary schools serving low-income students. Mountain View, CA:
EdSource. Retrieved January 10, 2007, from
http://www.edsource.org/pdf/SimStu05.pdf

Wise, B. W., & Olson, R. K. (1992). Spelling exploration with a talking computer
improves phonological coding. Reading and Writing, 4, 145-156.

Wise, B. W., & Olson, R. K. (1995). Computer-based phonological awareness and
reading instruction. Annals of Dyslexia, 45, 99-122.

Wright, S. P., Horn, S. P., & Sanders, W. L. (1997). Teacher and classroom context
effects on student achievement. Implications for teacher evaluation. Journal
of Personnel Evaluation in Education, 11, 57-67. 