ADEQUATE FUNDING FOR EDUCATIONAL TECHNOLOGY
by
Jason B. Angle
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
August 2010
Copyright 2010 Jason B. Angle
DEDICATION
This journey which was taken to finish my education and earn a doctorate was not taken
alone, nor can I alone take all of the credit for the accomplishment. First, I would like to
recognize my wife, Tina, for all of her loving support, patience, and encouragement
throughout this long process. Second, I must recognize the sacrifices made by my
children for missed evenings, Saturdays, and vacation time while I attended classes and
wrote. Third, I want to honor the memory of my high school English teacher, Rosemary
Taylor, who taught me how to write and saw the potential in me. Fourth, I would like to
thank family, friends, and co-workers for their ongoing encouragement, support, prayers,
and kick in the pants to finish. Finally, I would like to give all praise to my Lord and
Saviour, Jesus Christ, through whom all things are possible.
To Him be the glory. Amen!
ACKNOWLEDGEMENTS
This dissertation was made possible through the long-suffering support of my dissertation
chair, Dr. Lawrence Picus, who stuck with me for many years and saw me through to the
finish. I would also like to acknowledge Dr. Guilbert C. Hentschke and Dr. John Nelson,
who worked with my compressed time constraints and helped me to cross the finish line.
Thank you very much, gentlemen.
TABLE OF CONTENTS
DEDICATION
ACKNOWLEDGEMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT
CHAPTER ONE: OVERVIEW OF THE STUDY
Introduction
Statement of the Problem
Purpose of the Study
Importance of the Study
Summary of Methodology
Limitations
Delimitations
Assumptions
Definitions
CHAPTER TWO: REVIEW OF THE LITERATURE
Introduction
History of School Finance
School Finance Adequacy
Technology’s Place in Adequacy Studies
Educational Technology in the 21st Century
Educational Technology’s Impact on Student Achievement
Current Trends in Educational Technology
Data Driven Decision Making
Funding for Educational Technology
CHAPTER THREE: METHODOLOGY
Introduction
Purpose and Research Questions for the Study
Population and Sample
Instrumentation
Data Collection
Data Analysis
Summary
CHAPTER FOUR: ANALYSIS OF THE DATA AND INTERPRETATION OF THE FINDINGS
Introduction
Findings for Research Question Number One: Overall, does the utilization of educational technology resources support school level efforts to raise student achievement and exit from program improvement?
Summary of Findings for Research Question Number One: Overall, does the utilization of educational technology resources support school level efforts to raise student achievement and exit from program improvement?
Findings for Research Question Number Two: What types, amounts, configurations and uses of educational technology resources support school level efforts to raise student achievement?
Summary of Findings for Research Question Number Two: What types, amounts, configurations and uses of educational technology resources support school level efforts to raise student achievement?
Findings for Research Question Number Three: What additional support, funding, and budgets should be provided to support school level efforts to raise student achievement?
Summary of Findings for Research Question Number Three: What additional support, funding, and budgets should be provided to support school level efforts to raise student achievement?
Findings for Research Question Number Four: Do state technology surveys and state achievement test data indicate a positive correlation between spending money on technology and high student achievement?
Summary of Findings for Research Question Number Four: Do state technology surveys and state achievement test data indicate a positive correlation between spending money on technology and high student achievement?
Conclusions
CHAPTER FIVE: SUMMARY, CONCLUSIONS, AND IMPLICATIONS OF THE FINDINGS
Introduction
Overview of the Problem
Purpose of the Study
Review of Methodology
Findings
Conclusions
Implications
Recommendations
Suggestions for Further Research
BIBLIOGRAPHY
APPENDICES
Appendix A: Teacher Survey
Appendix B: Principal Survey
Appendix C: District Notification Letter
Appendix D: Principal Cover Letter
LIST OF TABLES
Table 1: School Site Student to Computer Ratios
Table 2: School Site Computers Connected to Internet
Table 3: School Site Computer Locations
Table 4: Age of School Site Computers
Table 5: Expected Change in School Site Computer Availability
Table 6: Average School Site Hardware Fix Time
Table 7: School Site Level Technical Support Staffing (FTE per 1,000 students)
Table 8: School Site Level Technology Curriculum Support Staff (FTE per 1,000 students)
LIST OF FIGURES
Figure 1: Program Improvement Background Information – Principal Survey Results
Figure 2: Instructional Leadership in Technology – Principal Survey Results
Figure 3: Exiting Program Improvement – Principal Survey Results
Figure 4: Technology Hardware – Principal Survey Results
Figure 5: Technology Hardware – Primary Teacher Survey Results
Figure 6: Technology Hardware – Intermediate Teacher Survey Results
Figure 7: Technology Software Programs – Principal Survey Results
Figure 8: Technology Software/Programs – Primary Teacher Survey Results
Figure 9: Technology Software/Programs – Intermediate Teacher Survey Results
Figure 10: Data Analysis and Data Driven Decision Making – Principal Survey Results
Figure 11: Technology Support – Primary Teacher Survey Results
Figure 12: Technology Support – Intermediate Teacher Survey Results
Figure 13: Technology Support – Principal Survey Results
Figure 14: Hardware Replacement – Principal Survey Results
Figure 15: Educational Technology Costs, Funding, and Budgets – Principal Survey Results
ABSTRACT
Public schools are currently operating in a pressure-cooker of accountability
systems in which they must teach students to high standards and meet ever increasing
targets for student proficiency, or face increasingly severe sanctions. Into this mix is
thrown educational technology and the funding for that technology. The literature
espouses the benefits of technology for engaging digital natives and preparing them for
the 21st Century workplace. But what about the more important and pressing issue of
increasing student achievement?
While technology, in and of itself, is unlikely to raise student test scores or play
more than a supporting role to key educational components such as class size and quality
of teachers, expenditures on technology do play a role in achieving an overall adequate
educational system. The purpose of this study was to use a successful schools model to
survey site principals and site teachers to seek their professional opinions and answer the
following research questions: (1) Overall, does the utilization of educational technology
resources support school level efforts to raise student achievement and exit from program
improvement? (2) What types, amounts, configurations and uses of educational
technology resources support school level efforts to raise student achievement? (3) What
additional support, funding, and budgets should be provided to support school level
efforts to raise student achievement? (4) Do state technology surveys and state
achievement test data indicate a positive correlation between spending money on
technology and high student achievement?
The successful schools population selected for this study consisted of the 25
California public elementary schools which exited from Year 4 or Year 5 of Program
Improvement under the federal No Child Left Behind (NCLB) Act in 2006 and 2008. To
reach these levels of Program Improvement, these schools did not make Adequate Yearly
Progress (AYP) for at least 5 years as measured on the annual California Standards Test
(CST). After writing an Alternate Governance Plan to document how they were going to
reorganize how the school was run and the new instructional practices which were going
to be implemented to increase student achievement, these 25 successful schools
significantly increased student achievement and met federal mandates for Adequate
Yearly Progress on the California Standards Test for 2 successive years in order to exit
from Program Improvement. The sample for this study consisted of the 11 out of 25
schools that agreed to participate in the study and complete the surveys.
The overall findings as related to the four research questions were: (1) The
utilization of educational technology resources is indeed supportive of school level
efforts to raise student achievement and exit Program Improvement. (2) LCD projectors
and document cameras represented the most effective technology hardware resources for
increasing student achievement, followed closely by high speed internet connectivity and
wireless internet connectivity. In terms of technology software resources, technology
proved invaluable in analyzing student achievement data and in supporting data driven
decision making (DDDM), and many respondents believed in the efficacy of leveled
reading programs and computer based learning programs. (3) Regarding technology
support, teachers indicated that staff development and hands-on training in technology
was the most important factor in helping to raise student achievement. Less than half of
the surveyed principals felt the $250 per student proposed by Odden and Picus
was adequate. (4) Results from the California School Technology surveys were mixed,
but positive correlations between technology and student achievement were found.
CHAPTER ONE
OVERVIEW OF THE STUDY
Introduction
Under the Public Schools Accountability Act (PSAA) of 1999 in California,
teaching students to high standards has meant rewarding and sanctioning schools based
upon meeting targets in their Academic Performance Index (API) which is based upon
yearly standardized test scores (Public Policy Institute of California, 2003). Adequately
funding schools has become particularly important, because goals have to be met not only
school-wide, but also within significant subgroups such as English Language Learners
and Economically Disadvantaged. This means the amount of resources and how they are
allocated needs to be varied to effectively address the differing needs of individual
schools. A state budget crisis, in the 2003-2004 school year, led to the elimination of
programs such as teacher merit pay under the PSAA (EdSource, 2003). The current state
budget crisis led to the elimination of funding for School Assistance and Intervention
Teams (SAIT) under the Immediate Intervention/Underperforming Schools Program
(II/USP) for the 2009-2010 school year. This means California’s program of
accountability has lost much of its bite.
While state efforts at accountability appear to have been weakening, components
such as growth on the Academic Performance Index have become integral components of
federal efforts under the No Child Left Behind Act (NCLB) which have been
strengthening (EdSource, 2003). If schools and districts do not meet goals for Adequate
Yearly Progress (AYP), they face increasingly severe sanctions. In their summary of the
Williams v. State of California case, Oakes, Blasi, and Rogers (2003) describe how
current accountability systems such as these are flawed and inequitable for low-income
and minority students who often don’t have adequate conditions, resources, and
opportunities to learn and master the high achievement goals set before them. This
means that considerations of adequacy must identify funding that is sufficient and
of an appropriate nature to ensure all students meet high academic standards.
Odden and Picus (2008) concede in their findings that research is mixed in
supporting the existence of a strong correlation between money and student outcomes.
Likewise, while varying amounts of money are currently being spent on technology as a
panacea to cure what ails public schools, there may be even less research clearly defining
the amounts and types of funding for technology necessary to support student academic
success. Odden and Picus (2008) do indicate that thinking about how money is spent and
focusing on student achievement does matter and this admonition should apply to the
topic of funding for school technology as much as any other.
The key concept that Odden and Picus (2008) encourage states to consider in
creating school finance policy is “adequacy,” which they define as “the provision of a set
of strategies, programs, curriculum, and instruction, with appropriate adjustments for
special-needs students, districts, and schools, and their full financing, that is sufficient to
teach students to high standards.” More specifically for this study, the focus will be on
the provision of technological resources including hardware, software, networking,
training, school-wide technology-based academic programs, technology supported data
analysis, and other educational technologies, with appropriate adjustments for special-
needs students, districts, and schools, and their full financing, that is sufficient to support
teaching students to high standards, as part of a larger adequacy model.
In trying to determine “adequacy”, states approach the issue from either the input
side or the output side (Odden & Picus, 2008). When looking at the input side, states
identify programs and services that they feel are necessary to adequately educate students
and then simply cost them out. When defining adequacy from the output side, states
focus on the costs necessary to get all students to meet high academic standards, taking
into account adjustments for special needs students. With the advent of legislation such
as the Public Schools Accountability Act and No Child Left Behind holding educational
institutions accountable for academic outcomes, there has been a necessary shift in focus
from the former to the latter.
Odden and Picus (2008) and Baker, Taylor, and Vedlitz (2003) have summarized
many approaches taken by different states in conducting educational adequacy studies.
The state of Washington conducted an average expenditure study focused on inputs by
identifying statewide average staffing costs and then used this figure as a base spending
level for all districts. Illinois and Alaska proposed taking such a base spending limit and
adding costs for effective programs and special needs, but never implemented the
proposals. Wyoming, Kansas, Maryland, and South Carolina used the professional
consensus/judgment approach by convening focus groups of educators and policymakers
to determine sufficient educational inputs. Illinois, Maryland, and Ohio conducted
successful schools studies focused on the output side of the equation by identifying
districts that were scoring well on state achievement tests and averaging their spending
levels to determine adequate funding statewide. Odden and Picus (2008) point out that
these figures often matched the median spending for all districts statewide, which again
raises the question about the relationship between increased spending and increased
student results. Statistical modeling studies in New York, Wisconsin, Texas, and
Illinois have used the cost function approach to estimate the cost of achieving a
designated set of outcomes while adjusting adequacy levels for student characteristics,
such as a high concentration of impoverished students, but the results were not utilized.
Finally, the Arkansas Joint Committee on Educational Adequacy recently submitted a
report in which they used the research-based state-of-the-art approach and found that
effective schools focused on dramatically improving instruction through professional
development and performance pay (Odden, Picus, & Fermanich, 2003). The
recommended cost of implementing the study’s findings is one third more than current
expenditures, even after carefully reallocating resources (Odden & Archibald, 2001) as
much as possible to spend them on research based strategies such as Success For All and
class size reduction.
A review of the literature by the researcher found that through 2008, adequacy
studies had been conducted in at least 39 of the 50 states, including two studies
completed in California in 2007 utilizing the professional judgment model. These studies
could prove instrumental as the state responds to the Robles-Wong v. California lawsuit
filed in 2010.
In 2003, two school finance adequacy studies were completed in Kentucky, one
utilizing the state-of-the-art or evidence-based approach (Odden, Fermanich, & Picus,
2003) and the other using a professional judgment model (Picus, Odden, & Fermanich,
2003). A third report prepared for the state of Arkansas utilized an evidence-based
approach with a great deal of input from professional judgment panels (Odden, Picus, &
Fermanich, 2003). Regardless of the approach used or the state being examined,
adequacy studies such as these, and those generated by others, culminate in a list of
recommended educational changes and the total amount of finances required to
adequately fund and implement those changes. Several key recommendations are
common to many of these studies because there is a great deal of empirical evidence to
support them, such as Allan Odden’s (2000) own research, which has found performance
pay to be a strong motivator in getting teachers to adopt educational reform efforts which
lead to higher student achievement. Thus, to better ensure accountability for results in
Arkansas, Odden, Picus, and Fermanich (2003) also included performance pay for
teachers as a key component in their Arkansas adequacy study.
Statement of the Problem
While the vast majority of the recommendations in these adequacy studies are
research based and have been adjusted for the special needs of individual states with the
input of professional judgment panels, there are key areas of expenditures such as central
office administration and site administration for which there is very little empirical data
for determining an adequate level of funding. Average historical expenditures are often
used to determine funding levels in these areas, regardless of their efficacy.
Technology is one of these areas on which there is a great deal of money spent,
but very little research on what an adequate amount would be to support student
achievement or what those expenditures should look like. For example, in the Arkansas
adequacy study (Odden, Picus, & Fermanich, 2003), a nominal amount of $250 per student
was given regardless of whether the student was in elementary, middle, or high school.
Relative to higher profile core instructional components of the expenditure matrices, such
as teacher compensation, there were fewer explanations on how the technology figure
was generated, how it was to be expended, or how it was to be adjusted for students,
schools, or districts with special needs. Rather, expenditures for non-core instructional
areas such as technology, maintenance, and central office administration in the
prototypical schools are often given much less empirical support, and at times, are
determined by simply averaging the expenditures spent in each area up to the time of the
adequacy study.
Purpose of the Study
While technology, in and of itself, is unlikely to raise student test scores or play
more than a supporting role to key educational components such as class size and quality
of teachers, expenditures on technology do play a role in achieving an overall adequate
educational system. The purpose of this study will be to use a successful schools model
to more accurately determine an adequate amount of funding in the area of technology
needed to help raise student achievement for prototypical elementary schools in future
adequacy studies. In particular, this study will attempt to determine an adequate level of
funding for technology for public elementary schools in the state of California.
Research questions will address the relationship between funding for technology
and student achievement, as well as focus on specific expenditures on technology,
including the following:
1. Overall, does the utilization of educational technology resources support
school level efforts to raise student achievement and exit from program
improvement?
2. What types, amounts, configurations and uses of educational technology
resources support school level efforts to raise student achievement?
3. What additional support, funding, and budgets should be provided to support
school level efforts to raise student achievement?
4. Do state technology surveys and state achievement test data indicate a positive
correlation between spending money on technology and high student
achievement?
Importance of the Study
This study should help to answer important questions related to whether or not
technology positively impacts student achievement and just how financial resources
should be allocated to support these positive outcomes. Answering such questions is
important to state legislators who, if they are going to take such findings into account,
must make sure enough general fund money is budgeted to purchase adequate
technological resources after major expenses such as personnel are covered. California
state legislators may even eventually choose to place a bond measure on the ballot to
make sure funds go directly to technological resources, thus circumventing teacher
unions’ access to these funds for salaries. The results of such a study may help to sway
voters who want to give their children every possible educational advantage, and may be
fed up with the status quo and what may be perceived as ineffective and inefficient
spending practices.
The results of this study may also be useful for policymakers at various
policymaking levels who are considering the creation of technology standards, simply to
better prepare students for the technology-laden workplace of the 21st century.
Additionally, policymakers shape educational policy and outcomes through textbook
adoptions. The results of this study could help guide them in determining what types of,
and to what degree, technology should be embedded in new adoptions to make them
more accessible to students and better support students in learning to high academic
standards.
The results may also be used by the California Commission on Teacher
Credentialing, and similar credentialing institutions in other states, to require coursework,
or even formal exams, to better ensure teachers can properly utilize hardware and
software as important arrows in their pedagogical quivers. Of course for all of the above
mentioned reasons, the results of this study could be very useful to researchers and
policymakers.
In addition to providing guidance to policymakers, the results of this study could
prove invaluable to practitioners as they try to eke out ever increasing levels of student
achievement as required under No Child Left Behind mandates. District and site
administrators have little discretionary money to spend after covering personnel costs.
Therefore, they must stretch dollars and make every one count. While schools have
historically spent a great deal of their discretionary budgets on technology, they have not
necessarily done it in the wisest manner with empirical evidence to support their
expenditures. This study could help administrators spend money on the right types of
hardware and software, in the optimal configurations, with appropriate support (such as
staff development) to maximize the positive impact on student achievement. Rather than
simply asking students to open their textbooks to read passages and work problems,
teachers could use the results of this study to help them make concepts and curriculum
come alive for students, giving them multiple avenues for properly addressing diverse
learning styles. The results of this study may also reinforce the notion that technology
can be an invaluable tool for differentiating instructional opportunities to provide
remediation and interventions for students struggling to meet grade level standards, while
at the same time providing enrichment opportunities for students who need to work above
grade level standards if they are to reach their full potential.
Of course the importance of any study in the field of education must be measured
against its impact on the academic achievement of students. It is at this point that the
hyperbole must end and realistic expectations be established. A great deal of taxpayer
dollars, especially at the state level, is spent on educating students to prepare them for
future productive lives. Not only taxpayers, but parents and students themselves,
rightfully expect this money to be spent wisely and effectively. This study should help
taxpayers, parents, and students to critically examine whether or not expenditures on
technology are benefiting them in the long run.
Summary of Methodology
School level experts will be sought to begin establishing a baseline for adequate
levels of technology and funding. Successful schools will be identified and staffs
surveyed to investigate the types and amounts of technology being utilized. Successful
schools will be identified as those elementary schools in California that significantly
raised student achievement for two consecutive years in order to exit Program
Improvement status from Year 4 or Year 5 under the federal No Child Left Behind act in
2006 or 2008. Site administrators at these successful schools, as determined by 2006 or
2008 STAR test results, will be surveyed to assess the types and amounts of technology
being used and how much funding was expended annually in this area. Samples of
teachers at these same schools will also be surveyed to further ascertain the impact of
technology on student achievement.
Expenditure patterns on technology by these successful schools in California will
also be checked for correlations with test result data. The state requires all public schools
to complete and submit annual surveys on the types of technology available to students
and staff for instructional purposes. Annual assessment results from the California
Standards Tests (CST) are also published annually and are a key component of the state’s
accountability system. Cross-referencing the two to identify positive correlations, thereby
helping to establish an empirical basis for adequately funding technology in California,
will be a secondary focus of this study.
Limitations
Research literature and adequacy studies will be reviewed in an attempt to
establish an evidence-based foundation for determining a proper amount of funding for
technology in schools as part of an adequate funding formula. Much of this literature and
these adequacy studies, however, rely on professional judgment and may not provide the
more objective empirical evidence basis hoped for when embarking on such a study.
While staffs at successful schools will be surveyed for their professional opinions and
state data on technology in schools and student test result data will be reviewed seeking
correlations, causal relationships between the presence and use of technology in schools
and increased student achievement may be difficult to isolate given the multitude of
factors (such as teacher salaries, class size, etc.) present in an adequate educational
system affecting student outcomes. Even if a relationship can be established, determining
an adequate level of funding for technology will be hampered by districts not reporting
school level expenditures on technology as Odden, Fermanich, and Picus (2003)
discovered in their Kentucky adequacy study.
Delimitations
The presence and use of technology in schools may have many benefits for
students including preparing them for full participation in careers heavily dependent upon
integrated technology. However, because adequacy for this study is focused on providing
the educational resources necessary to teach students to high standards, adequate levels of
financing for technology will be focused on expenditures which raise student
achievement as determined by state level testing. Technology expenditures and test
results will also be limited to data available from public schools and districts in the state
of California. Data will also be collected only from successful schools, defined as
California elementary schools exiting from Year 4 or 5 of Program Improvement in 2006
or 2008. These successful schools are identified by increasing student achievement on
the California Standards Tests and meeting or exceeding the benchmarks established
under No Child Left Behind for 2 consecutive years.
Assumptions
This leads to a key assumption that the vast majority of these schools in Year 4 or
5 of Program Improvement are serving students from low socio-economic status, often
with limited English proficiency skills. Thus, the effect of co-variables such as the socio-
economic status of students (often the largest factor in predicting student achievement)
was fairly equalized, and the success of these schools could be better attributed to other
factors and components of the adequacy model such as the level of expenditures on
technology. A second key assumption is that a significant shift (and improvement) in
educational practices must have occurred over a two year period to exit Program
Improvement, after four or five years of not making Adequate Yearly Progress.
Definitions
ACADEMIC PERFORMANCE INDEX (API)
School accountability tool used in California consisting primarily of student scores on the
criterion referenced California Standards Test (CST) and to a lesser extent on the norm
referenced California Achievement Test 6th Edition (CAT6). All schools are expected to
make sufficient growth towards a target score of 800 out of a possible 1000.
ADEQUACY
Provision of sufficient educational resources, and their full financing, to allow all
students to achieve at high academic standards (Odden & Picus, 2008).
ADEQUATE YEARLY PROGRESS (AYP)
Requirement under the No Child Left Behind Act (NCLB) for schools, districts, and
states to quantifiably demonstrate that students are making sufficient progress towards
mastery of high academic standards. In California this progress is demonstrated by
meeting Annual Measurable Objectives (AMOs) and by making at least one point of
growth on the Academic Performance Index (API).
AMERICAN RECOVERY AND REINVESTMENT ACT (ARRA)
Federal act signed by President Obama in 2009 designed to jumpstart the economy and save
jobs including those in education such as teachers and principals. As of May 14, 2010,
approximately $84 billion in Recovery Act funds had been awarded through programs
such as the State Stabilization Fund, Race to the Top, Title I, and Individuals with
Disabilities Education Act (U.S. Dept. of Education, 2010a).
ANNUAL MEASURABLE OBJECTIVES (AMOs)
Requirement for an increasing percentage of students to score proficient or above on
achievement tests measuring student mastery of rigorous standards in subjects such as
English Language Arts and Math. Under the No Child Left Behind (NCLB) act, 100% of
all students are expected to score proficient or above by the year 2014.
NO CHILD LEFT BEHIND (NCLB)
Federal act passed in 2001 designed to hold states, districts, and schools accountable for
ensuring all students achieve at high academic standards. Tied to federal Title I funding,
schools and districts that fail to meet targets face increasingly severe program
improvement sanctions ranging from reallocation of Title I funds to the implementation
of alternative school governance.
ROBLES-WONG V. CALIFORNIA
Lawsuit filed in 2010 attempting to establish an adequate school funding system in
California by seeking a declaration that the current funding system is unconstitutional and
an order requiring state lawmakers to design and implement a school finance system that
provides all students equal access to the required standards-based educational program.
RACE TO THE TOP (RTTT)
Federal program created by President Obama in 2009 providing competitive awards to
states “leading the way with ambitious achievable plans for implementing coherent,
compelling, and comprehensive education reform” (U.S. Department of Education,
2010b). Reforms must address the four areas of 1) adopting rigorous standards and
assessments, 2) building data systems that measure student growth, 3) recruiting and
retaining effective teachers, and 4) turning around lowest-achieving schools.
CHAPTER TWO
REVIEW OF THE LITERATURE
Introduction
This chapter provides a summary of the literature regarding school finance. As
one of the most current and pertinent developments in the evolution of school finance, the
adequacy movement will be explored in greater depth. Finally, this chapter will examine
the role of educational technology in the 21st century, the role of educational technology
in increasing student achievement, and funding for technology.
“School finance concerns the distribution and use of money for the purpose of
providing educational services and producing student achievement,” (Odden and Picus,
2008, p.1). More than merely dollars and cents on a budget spreadsheet, school finance
often reflects conflicting values, struggles for power, and the future quality of life for
millions of young children. Tracing the history of school finance through the 20th and
21st centuries in terms of equity, productivity, and adequacy will often parallel the history
of school governance. For as George Giokaris (2001), at the time Deputy Superintendent
of Business Services (currently the Superintendent) of the Fullerton Union High School
District and adjunct professor at U.S.C., liked to define the Golden Rule, “He who
controls the gold makes the rules.” This association between school governance and
school finance has been borne out repeatedly, from the 19th-century common schools run
by boards of wealthy local landowners and financed completely by local property taxes
to the current No Child Left Behind (NCLB) act and Race to the Top (RTTT) program in
which the federal government tries to raise the achievement of all students nationwide
through federal monies such as Title I. Politics has also always been central to American
public education (Giokaris, 2001), in that politics is all about the allocation of scarce
resources. In terms of school governance, Giokaris (2001) goes on to state that these
resource allocation decisions revolve around: “1) Making the decisions as to who gets
what; 2) Determining whose values are taught/supported; and 3) Determining who gets
what services.” This makes examining how the concepts of equity, productivity, and
adequacy have impacted the history of school finance crucial.
Equity most commonly refers to equal access to educational opportunities and
resources. “For most of the twentieth century, school finance policy has focused on
equity - issues related to widely varying education expenditures per pupil across districts
within a state and the uneven distribution of the property tax base used to raise local
education dollars,” (Odden and Picus, 2008, p.1). While making sure all students in any
given state receive equivalent educational dollars regardless of their school district of
attendance would seem to address the issue of equity, Augenblick, Myers, and Anderson
(1997) assert that there is still no single agreed-upon definition of equity (or adequacy for
that matter), despite the many court battles waged to remedy it. Berne and Stiefel (1984,
cited in Odden and Picus, 2008) point out that there are several factors and considerations
which need to be clarified and addressed when defining and striving for equity in school
finance. For example, while equity usually focuses on resources being supplied to
students, equity for the taxpayers who must supply those resources must also be
addressed. Equity most often reflects fiscal inputs such as dollars spent per student, but
the equitable distribution of non-fiscal inputs such as curriculum and instruction, highly
qualified teachers, and school facilities must also be addressed (thus the Williams case).
More than just inputs though, equitable outputs such as student achievement and lifetime
incomes are now being addressed (as reflected in the shift towards adequacy).
Equity itself in all of these areas can also be looked at horizontally and vertically.
Horizontal equity assumes all educational stakeholders (students, taxpayers, etc.) are the
same. Therefore, the equal distribution of resources is fair, equitable, and just. Vertical
equity recognizes that children are different and have various needs (e.g. gifted and
talented, special education, English language learners), so that students with greater needs
receive greater educational resources and services. This is equitable if, from an output
perspective, the goal of school finance is to provide sufficient resources for all students to
achieve at their fullest potential. The final factor to be considered when looking at equity
is fiscal neutrality, which holds that the educational resources available to students should
not vary with local sources of funding such as property tax wealth.
In general, productivity simply refers to the level of outputs achieved from a
given set of inputs. Ideally, there is a positive correlation between these two factors in
that an increase in inputs should lead to a commensurate increase in outputs.
Productivity in school finance is sought through achieving educational goals while using
a minimal amount of resources. However, by the 1990s, many researchers in school
finance felt that after years of perseverating on equity issues, educational policymakers
should be more concerned with the low levels of productivity in the American public
school system (Odden and Picus, 2008). According to the U.S. Department of
Education, National Center for Education Statistics (2010), approximately $562.3 billion
was spent on K-12 public schools in 2006-2007; yet relative student achievement remains
flat or is declining, especially compared to results achieved by students in the public
school systems of other developed nations. This apparently dismal state of affairs has led
researchers such as Hanushek to repeatedly argue that “spending has absolutely no
correlation to academic achievement” and that “no matter how much money is pumped
into the educational system, it won’t make a difference unless schools fundamentally
change how they operate,” (Park, 2004, p. 3). Even if school outputs were to remain
constant, Hanushek (1996) argues that school inputs such as teachers’ salaries, books,
and transportation have increased so dramatically that the American educational system
has experienced a “productivity collapse”, even relative to the slowest sectors of the
economy.
Reflective of Hanushek’s admonition for schools to fundamentally change how
they operate in order to raise productivity, Anne Lockwood (1996) interviewed four key
researchers in the field of school finance and educational productivity to get their views
on questions such as what strategies enhance productivity and whether money matters. Allan
Odden’s responses emphasized scrutinizing how all resources are used and evaluating the
value they add to the system, moving towards collaborative leadership, reallocating
resources to programs that work, implementing research-based programs such as Success
For All, establishing high performance standards for students, and holding teachers
accountable for results through rewards and sanctions (reflecting Hanushek’s 1994 call to
increase productivity through incentives tied to student performance). Lawrence O.
Picus’ responses included giving teachers greater control over staff development as an
incentive for improvement, providing greater market flexibility to better match teachers
with the different leadership styles of principals, and utilizing assessment systems that are
well explained to constituents such as parents and are not being constantly revised. Fred
M. Newmann recommended doing more with existing resources by simplifying what
schools are trying to accomplish and limiting goals to reading, writing, mathematics,
science, and treating people fairly. Roland S. Barth suggested engaging students in
learning experiences that will have a lasting impact such as community service projects,
allowing students to take greater control over their learning by pursuing their own lines of
questioning, and giving teachers greater choice over how educational dollars are spent.
The concept of adequacy in school finance represents the latest evolution of the
concepts of equity and productivity. Instead of focusing on horizontal equity through
equal inputs, vertical equity is sought through providing sufficient fiscal resources for all
students to achieve high minimum outcomes (Rose, 2001). According to Odden (2003,
p.2) “the equity part of the goal requires dramatically diminishing the ‘achievement gap’
between low income, minority children and all other students,” and “the excellence
[productivity] part of the goal requires at least a doubling of education performance over
the next decade or so.”
This philosophy is currently reflected in California, where State Superintendent of
Public Instruction Jack O’Connell is recognizing schools that are closing the achievement
gap in 2009 and 2010. To qualify as Distinguished Schools, students from all significant
subgroups, including English Learners and Hispanics, must make significant growth on
state tests; to qualify for Title I Academic Achievement Awards, students from subgroups
such as the Socio-Economically Disadvantaged must double their growth targets.
Such accomplishments require the purposeful allocation of both
fiscal and human resources to ensure vertical equity.
As schools strive to increase student achievement for all students, addressing
vertical equity, expenditures are targeted to make standards-based curriculum and
instruction more accessible to students while also providing differentiation to better meet
the needs of individual students. A significant portion of these expenditures at many
schools has focused on instructional technology, including hardware, software, and staff
development. Classroom-based devices such as laptop computers, LCD projectors, and
Smart Boards provide greatly enhanced visuals and interaction, which increase student
engagement in learning for all students and make instruction more comprehensible for
EL students. Technology-based programs such as Study Island, Accelerated Math, and
Read 180 provide specific standards-based and skills-based instruction to individual
students. Training teachers in how to fully and effectively utilize progress monitoring
programs such as Online Assessment Reporting System (OARS), Data Director, and
EduSoft facilitates the analysis of student assessment data and supports data-driven
decisions in planning instruction. In addition to initial upfront expenditures on these technological
resources, yearly licensing fees, ongoing maintenance, and other considerations must be
included in the total cost of ownership.
History of School Finance
In the PBS documentary “The Story of American Education” (2001), decision
making as to who gets what was portrayed as less than democratic and equitable during
the common school era, 1770-1890. Schools were neither free nor public and only the
most privileged had the means to go on to college and university. Thomas Jefferson felt
that the survival of the democracy depended on the education of all citizens, yet he excluded
slaves and women. Allocating the resource of a free and public education to all citizens
has been an ongoing political battle since the era of the common school, punctuated by
significant battles such as the U.S. Supreme Court’s 1954 decision to desegregate in
Brown v. Board of Education of Topeka (Olson, 1999) and revived again in the recently
settled Williams case (Oakes, Blasi, and Rogers, 2003).
Before the 20th century, American education was a decidedly local affair growing
up from the community outward (Olson, 1999). This meant that local citizens, through local
politics, elected local trustees who in turn decided the who, what, where, and how of what
would be taught in the local schools. Since wealthy white Protestants held political
power, it was their values and beliefs that were reflected in the local school curriculum.
Horace Mann attempted to standardize all aspects of the public school so that common
schools reflected a common body of knowledge. These small one room common schools
were usually established in small rural towns and were supported through limited local
taxes (Odden and Picus, 2008). Larger and wealthier school systems were also being
established in big cities, laying the foundations for inequities in school financing based
on local ability to pay and violating the concept of fiscal neutrality. Growing numbers of
immigrants, mostly Catholic and dissatisfied with this status quo, began to exert greater
parochial interests through the plethora of local school trustees. Through the efforts of
educational leaders such as Ellwood P. Cubberley, school governance adopted “scientific”
modern standards that eventually led to the centralization of school control, greatly
reducing the number and power of local school trustees. This factory model of education
did not depoliticize school governance; it just concentrated power in the hands of, and
reflected the values of, the professional elite and their business and professional allies in
school reform (Tyack, 1999).
Politics has played a major role in determining who gets what. In Brown v. Board
of Education the U.S. Supreme Court ruled that “separate but equal” was inherently
unequal and ordered the desegregation of schools to provide for a more equitable
allocation of educational resources to African American students. Segregated schools for
Black children were in relatively poor areas compared to those for White students, thus
violating fiscal neutrality. Because these segregated Black schools were so under-
funded, achieving horizontal equity through desegregation was considered a major
victory.
Following the successful launch by the Soviets of Sputnik I in 1957, the National
Defense Education Act promoted excellence in the subjects of mathematics, science, and
foreign language (Robelen, 1999) at the expense of equity in allocation of resources to
other subjects. This act was extremely important in the history of school finance in that it
was one of the first times the federal government provided a large amount of funding for
education, which, despite the landmark Brown decision just a few years before, remained
largely a local affair and for which responsibility remained constitutionally with the
states. Although limited to subjects which were deemed critical to winning the space and
arms races, this act foreshadowed the adequacy in school finance movement which came
more than thirty years later, in that it strove to provide adequate funds to teach students to
high standards.
Through Title I of the Elementary and Secondary Education Act passed April 9,
1965, the federal government targeted categorical funding (resources) at concentrations
of disadvantaged children (Robelen, 1999). Moving beyond the horizontal equity
achieved just a decade earlier in the Brown decision towards vertical equity, “The
Elementary and Secondary Education Act was developed under the principle of redress,
which established that children from low income homes required more educational
services than children from affluent homes,” (Schugurensky, 2002, p.1). According to
Spring (2001) the Elementary and Secondary Education Act had a major impact on
school finance. First, federal funding was provided as categorical aid to address specific
federal concerns such as poverty, rather than being provided as unrestricted general aid.
Second, it provided aid to programs serving targeted students, rather than to schools, thus
side stepping the separation of church and state issue, and allowing students attending
private and parochial schools to also receive federal aid. Finally, funds were channeled
through state departments of education which greatly increased their involvement in local
decision making.
Evidence of competing values and the tough trade-offs they engender is
prevalent throughout the history of American education. Public Law 94-142 was passed
in the early 1970s to provide greater equity of educational opportunities and outcomes for
students with special needs at the expense of efficiency in minimizing per pupil
expenditures. Public Law 94-142 guaranteed that parents of students with special needs
receive due process rights in the allocation of resources to meet their educational needs.
While seemingly not a specific school finance topic, this initially unfunded mandate has
historically represented a major encroachment on general funds with expenditures on
special education students far exceeding any reimbursements local school districts may
receive from state or federal sources. This attempt at vertical equity has major funding
implications for adequacy models attempting to teach all students to high standards as
required by the No Child Left Behind Act.
Likewise, as a result of the U.S. Supreme Court ruling in Lau v. Nichols and
the subsequent Lau Remedies, non-English speaking students were to receive instruction
in their primary language, which translated into a significant shift in the allocation of
resources. This entitlement represented further recognition of the fact that students are
different and have differing needs requiring different levels of educational resources if
vertical equity was to be achieved. The horizontal equity of providing all students with
the same education in English was no longer considered equitable.
The major conflict of interest that has the greatest impact on public education,
especially the equitable allocation of resources, is the notion that for most taxpayers, the
ideal situation is to have the maximum amount spent on the education of their children
while paying a minimum amount of taxes (Spring, 2001). Most events in the history of
school finance pale in comparison to the equity issues which needed to be resolved
through landmark litigation. While states began to implement “minimum foundation
programs” through a combination of local and state sources in the 1920s to “equalize”
differences in local fiscal capacity, these programs proved to be inadequate in addressing
disparities in local tax bases, and therefore fiscal neutrality, by the late 1960s (Odden and
Picus, 2008).
The first wave of school finance litigation from approximately 1960 to 1972
challenged the legitimacy of these disparate school finance systems in federal courts on
the grounds that they violated the equal protection clause of the U.S. Constitution which
states, “No State shall… deny to any person within its jurisdiction the equal protection of
the laws,” (VanSlyke, Tan, and Orland, 1994, p.1). Attempts to address fiscal neutrality
through litigation in federal courts came to an abrupt halt in 1973, though, when the U.S.
Supreme Court in the San Antonio Independent School District v. Rodriguez case ruled
that education was not a fundamental right because it was not explicitly protected by the
U.S. Constitution.
In the second wave of school finance litigation from approximately 1972 to
1988, plaintiffs challenged disparate school finance systems on the grounds that they
violated equal protection and education clauses in state constitutions. These issues and
tactics came to a head in the state of California in the early 1970s where cases such as
Serrano v. Priest in 1971 (and the subsequent Serrano II of 1977 which basically upheld
the original Serrano decision under the California Constitution following the Rodriguez
decision) blazed the trail for litigants in other states to follow. In 1970, California
provided aid to districts through its foundation program, giving greater aid in the form
of block grants to districts with lower assessed property values (Sonstelie, 2001) in an
attempt to establish horizontal equity. While the state limited the rate at which local
school districts could levy general purpose property taxes, local districts could override
this limit through local referendums, and most districts in the state did. While local tax
rates varied widely across the state, there was usually a positive correlation between the
tax rate and the level of revenues collected. However, as Sonstelie (2001) goes on to
explain some districts such as San Francisco and Los Angeles levied the same tax rate but
collected very different levels of revenue for students due to the much higher property
values in San Francisco. These differences in assessed property values and the
subsequent violation of fiscal neutrality became the focus of the Serrano v. Priest case.
Joel Spring (2001) goes on to explain that in Serrano v. Priest the California Supreme
Court found that local property taxes are often regressive forms of taxation. This means
low-income families pay a higher percentage of property taxes, but reap less revenue
(resources) to be spent on education because their property is worth so much less than
that of high-income families. The Serrano decision attempted to rectify this by shifting
the revenue source for the majority of local district funding to the state. Local school
districts were required to abide by state established revenue limits, which through
increased funding from the state had to be “equalized” to within $100 per pupil within six
years (Sonstelie, 2001).
While the Serrano case went a long way in equalizing per pupil funding, local
property taxes still play a major role in funding to build schools. In September of 2001,
voters in the Colton Unified School District passed a $102 million bond issue to build
five new elementary schools, a middle school, and a high school. This will cost property
owners $60 a year for 38 years on property valued at $96,700 (Sachs, 2001). This would
be a drop in the bucket for a community such as Beverly Hills, but Colton is a working
class community and this represents a major sacrifice affecting the standard of living for
lower income families in an economy on the verge of recession. If the bond had not
passed, invaluable educational programs such as class size reduction would have been in
jeopardy due to limited classroom space. Thus, the conflict of interest between paying
minimum taxes and spending a maximum amount on the education of children continues
to this day with significant consequences either way the pendulum swings.
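As a rough check on these figures, the effective levy rate and the undiscounted lifetime cost per homeowner can be computed directly from the numbers cited from Sachs (2001); this sketch ignores reassessment, inflation, and discounting.

```python
# Back-of-the-envelope cost of the Colton bond levy (figures from Sachs, 2001).
annual_levy = 60          # dollars per year on the cited property
years = 38                # duration of the levy
assessed_value = 96_700   # cited assessed property value in dollars

effective_rate = annual_levy / assessed_value   # fraction of assessed value per year
total_cost = annual_levy * years                # undiscounted cost over the full term

print(round(effective_rate * 100, 3))  # about 0.062 (percent of assessed value per year)
print(total_cost)                      # 2280 dollars over 38 years
```

A levy of roughly six hundredths of a percent of assessed value per year is modest in absolute terms, which underscores the point above that the burden falls hardest because of the community's income level, not because of the rate itself.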
While using varying arguments, the second wave of school finance litigation
continued on successfully in other states. In New Jersey, litigants successfully argued
that the education clause of the state constitution could be applied to school
finance in both the Robinson v. Cahill case of 1973 and subsequent Abbott v. Burke case
of 1989 (Odden and Picus, 2008). In Robinson the court addressed fiscal neutrality and
tried to create greater horizontal equity by ordering “school financing reform after
determining that the state’s constitutional mandate for ‘thorough and efficient’ education
required an equal opportunity for all children, a mandate that the court felt was not being
met because of funding disparities present in the existing system,” (VanSlyke, Tan, and
Orland, 1994, p.5). In Abbott the New Jersey court tried to move towards vertical equity
by recognizing that the state needs to spend more money on economically disadvantaged
children if they are to be able to compete with their more advantaged peers. Other state
school finance cases under this second wave of litigation were important in that they
established the states’ responsibility in funding education due either to past practice such
as in Horton (Connecticut, 1977), or to implied guarantees in the equal protection clauses
of state constitutions such as in Pauley (West Virginia, 1979) and Washakie (Wyoming,
1980).
In the state of California, “Proposition 13 was designed to reform the property tax
rather than the school finance system, but its indirect effect on that system has been
profound,” (Picus, 2001, p.13). By limiting property taxes to 1% of assessed value at the
time of its passage in 1978, Prop 13 provided horizontal equity to taxpayers who had
previously been paying property taxes at widely varying rates across the state. It also had
the unintended effect of equalizing revenues available to school districts and achieving
greater fiscal neutrality as called for in Serrano. However, it did so by reducing the amount
of local revenue available to wealthier districts rather than by raising the amount available
to poorer districts, thus reducing the total funds available for schools across the state.
This untenable situation forced the state to respond by making up the shortfall and
eventually taking control of K-12 funding (EdSource, 2003b; Picus, 2001; Sonstelie,
2001). In time, however, Prop 13 has led to greater inequities among taxpayers because
assessed property values have been able to rise by only 2% per year since 1978, and property can only
be reassessed at market value when it is sold (Odden and Picus, 2008). This creates fiscal
neutrality inequities based not on geography but on timing, because while all property in
a given locality may be taxed at the same rate, sharp differences in assessed values for
similar properties, based on when the properties were purchased, mean there will be sharp
differences in the actual taxes paid. Odden and Picus (2008) point out that this has also
created inequities between residential and commercial property owners. Residential
property owners end up shouldering a disproportionate amount of the tax burden because
they purchase and sell their properties more often than commercial property owners, who
still benefit from a highly educated citizenry and workforce.
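This timing inequity can be sketched with a small, purely hypothetical calculation; the two homes, their purchase prices, and the holding period below are invented for illustration, while the 1% tax rate and 2% assessment cap come from the discussion above.

```python
# Hypothetical illustration of Prop 13's "timing" inequity (invented prices).
PROP13_RATE = 0.01   # property tax capped at 1% of assessed value
ANNUAL_CAP = 0.02    # assessed value may rise at most 2% per year

def assessed_value(base_price, years_held):
    """Assessed value grows at the 2% annual cap from the purchase-year base."""
    return base_price * (1 + ANNUAL_CAP) ** years_held

# Home A: purchased in 1978 for $100,000 and held for 30 years.
tax_a = PROP13_RATE * assessed_value(100_000, 30)
# Home B: an identical neighboring house, reassessed at its
# hypothetical 2008 sale price of $500,000.
tax_b = PROP13_RATE * assessed_value(500_000, 0)

print(round(tax_a, 2))  # about 1811.36
print(round(tax_b, 2))  # 5000.0
```

Under these invented numbers the recent buyer pays roughly 2.8 times the tax of the long-time owner on an identical property, which is precisely the fiscal neutrality problem the text describes.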
The “Classroom Instructional Improvement and Accountability Act,” or
Proposition 98, was passed as an amendment to the California state constitution in 1988
(EdSource, 1996). This act provides stability to school finance in California by
guaranteeing a minimum amount of funding year to year and making adjustments based
on total state revenues, as well as other factors such as per capita personal income and
changes in student enrollment. However, the current recession, which has decimated
school funding for the 2009-2010 school year and will do so again for 2010-2011, has
revealed the limitations of Proposition 98 to provide stability to school funding. This is
due to the state’s overreliance on income tax revenue, which plummeted dramatically with
the crash of the stock market and the relative softening of the dot-com industry in the
Silicon Valley. Even though the housing market and the associated property values have
also declined recently, it is unlikely that public school funding would have fluctuated so
wildly over the past few years if school funding were still based primarily on property tax
revenue as it was before the passage of Proposition 13 (and as it still is in a few basic aid
districts in the state).
A perennial debate in school finance revolves around the issue of using public
school dollars to send students to private schools through the use of vouchers.
Proponents of school vouchers argue that they will give parents choice when it comes to
educational opportunities for their students while providing horizontal equity to low-
income families in gaining access to private schools. Not only would school vouchers
create another inefficient government bureaucracy and shift private school tuition
expenditures to taxpayers, thus setting the public school system back in terms of
productivity, but the attainment of adequacy would also be harmed, since private schools do
not have to adhere to the same standards, accreditation, and credentialing requirements as
public schools.
There has been much press lately over President Obama’s move to more closely
tie federal funding for schools to student achievement. Race to the Top (RTTT) clearly
makes funding contingent upon the implementation of innovative practices and possibly
foreshadows the reauthorization of the Elementary and Secondary Education Act (ESEA)
instituting the same requirements for districts to receive Title I money.
However, over a decade ago California started moving in the same direction when
it passed the Public Schools Accountability Act (PSAA) in 1999. Under this law schools
have been rewarded and sanctioned based upon teaching students to high state standards
as measured by the Academic Performance Index (API) which is based upon yearly
standardized test scores (Public Policy Institute of California, 2003). Schools began
differentiating expenditures to achieve vertical equity because API targets had to be met
each year school-wide, and for all significant subgroups such as English Learners and
Socio-Economically Disadvantaged. Just a few years after the passage of the law, though,
a state budget crisis led to the reduction and eventual elimination of programs such as
teacher merit pay under the PSAA and funding for such things as instructional materials
and professional development under the Immediate Intervention/Underperforming
Schools (II/USP) Program (EdSource, 2003). Schools that participated in the II/USP grant
program also faced consequences if they did not make adequate year-to-year growth on
the API, but when the state stopped funding the School Assistance and Intervention Team
(SAIT) Program in 2009, California lost a major tool in holding schools accountable for
increasing student achievement. With the elimination of these programs the state is
moving in the wrong direction to achieve educational adequacy. While the requirements
to meet high state standards are still in place, there are insufficient resources being
provided to achieve them.
While state efforts at accountability were weakening, federal efforts under the No
Child Left Behind act (NCLB) were strengthening (EdSource, 2003c). Signed into law
by President Bush in January 2002, the NCLB act represented a bipartisan reauthorization
of the Elementary and Secondary Education Act of 1965 (EdSource, 2003a). This act
reflects a move towards adequacy in that it provides billions of dollars to states in
exchange for holding them accountable to teach all students to high standards. If schools and
districts do not meet goals for Adequate Yearly Progress (AYP), they face increasingly
severe sanctions. However, no matter how severe the sanctions dealt out in the name of
accountability or how large the funding designed to address vertical equity, the stated
goal of having 100% of all students score at proficient or above by the year 2014 appears
extremely lofty.
While the focus of school finance for the past decade has been on addressing the concepts of productivity and adequacy, the Williams et al. v. State of California et al. case filed by the ACLU demonstrated that issues of horizontal equity and fiscal neutrality have yet to be resolved, “charging that the state has not met its obligation to provide all students with ‘basic educational necessities’” and “equal access to school services” (EdSource, 2003b, p. 2). The suit charged that over a million students in California,
mostly low income, urban and minority, were denied equal access to fundamental
educational resources such as highly qualified teachers, sufficient textbooks, and
appropriate school facilities (EdSource, 2003b; Oakes, Blasi, and Rogers, 2003). In their
summary of the Williams v. State of California case, Oakes, Blasi, and Rogers (2003)
also describe how current accountability systems such as those previously described are
flawed and inequitable for low-income and minority students who often do not have
adequate conditions, resources, and opportunities to learn and master the high
achievement goals set before them. This means that any future considerations of
adequacy in California school finance must identify funding that is sufficient and
of an appropriate nature to ensure all students meet high academic standards. In August
2004 a settlement was reached in the case and included a one-time allocation of $138.7
million for instructional materials for low performing schools as measured on the
Academic Performance Index and $50 million for a facilities-needs assessment for low
performing schools (California School Boards Association, 2004). What did not appear
to be addressed in the settlement was any increase in fiscal resources to address the
highly qualified teachers issue. Schrag (2004) points out that the settlement contained no
major incentives to attract and retain qualified teachers in impacted schools or provisions
for more preparation time or smaller class sizes.
School Finance Adequacy
Over the course of the past decade or so, school finance has shifted from a focus
on equity issues to adequacy models in which resources are being linked to outcomes to
better ensure all students learn at their fullest potential (Clune, 1994). In trying to
implement true adequacy models, Clune (1994) argues that not only do states need to
adopt high minimum achievement standards for students on the output side and provide
the necessary educational resources on the input side; they also need to develop long
range plans for spending these resources wisely on effective instructional programs. This
shift to creating adequate funding models has become the focus for the third wave of
school finance litigation targeting states across the country beginning with Kentucky in
1989 and carrying on to the present. As courts settle these cases “an equal share of too
little is becoming unacceptable” so that states must move beyond providing a minimal
education to providing a high quality education (Hadderman, 1999, p.1).
Allan Odden (2003) identifies two key factors as being responsible for the focus
of school finance shifting from equity to adequacy. The first factor involved answering
questions about productivity: in other words, whether spending ever-increasing amounts of money on public schools leads to substantial increases in student achievement. As
states became increasingly responsible for funding education, following landmark cases
such as Serrano, state policymakers and their tax-paying constituents simply wanted to
know if they were getting their money’s worth. The second factor precipitating this shift
was the standards-based education reform movement which has been building for the past
twenty years beginning with the “A Nation at Risk” report published by the federal
government in 1983. If students are to achieve at these high standards and the
“achievement gap” between low income, minority students and all other students is to be
closed, then states must provide sufficient resources to make it happen.
In DeRolph v. State (1997) plaintiffs won their case when the Ohio Supreme
Court found that the state’s school finance system violated the state constitution (Rose,
2001). After being ordered to develop a “thorough and efficient” system, state legislators
responded by identifying successful school districts that achieved high outcomes, then
determined what a sufficient level of resources would be to enable all schools to achieve
at these levels, making adjustments for districts with special needs. In Campbell County v.
The State of Wyoming, the state Supreme Court declared the existing school finance
system to be illegal (Rose, 2001). The court then ordered the state legislature to identify
the high standards (or “basket of goods and services”) students were to achieve, and told
them to determine the costs of achieving these standards, regardless of the state’s existing
ability to fund them. In both of these cases, and many others, the state courts have sent a
clear message that they want to see student results in learning and they want state
governments to pony up the dollars and educational resources to make it happen.
In trying to determine “adequacy”, states approach the issue from either the
input/resource oriented side or the output/performance oriented side and usually use one
of four methodologies (Odden & Picus, 2008; Baker, Taylor, and Vedlitz, 2005). When
looking at the input/resource oriented side, states identify programs and services that they
feel are necessary to adequately educate students and then simply cost them out. The
professional judgment approach and the evidence-based approach fall on this end of the
spectrum. When defining adequacy from the output/performance oriented side, states
focus on the costs necessary to get all students to meet high academic standards, taking
into account adjustments for special needs students. The successful school or district
approach and the economic cost function approach fall on this end of the spectrum
(Conley and Picus, 2003).
The Professional Judgment Approach (also known as the Consensus Approach)
focuses on the inputs which lead to student mastery of rigorous standards then specifies
the adequate funding levels needed to achieve these results. It does this by forming
panels of experts in the field of education, and fields which impact education, who then
pool their knowledge and use their professional judgment (thus the name) to determine
what an effective educational system looks like and what resources are needed to
implement such an ideal system (Odden and Picus, 2001). Of course all figures must be
adjusted based upon the size, type, and location of the educational system being studied
as well as the students being served. For example, the resources needed to teach students to high standards likely vary greatly between schools situated in large urban settings on either coast and smaller rural schools situated in the Midwest or South (Picus, Odden,
and Fermanich, 2003).
The benefit of the professional judgment approach may lie in the fact that these panels are highly visible, and their perceived collective knowledge and expert testimony may prove invaluable in actually moving state legislators to act upon the
recommendation of an adequacy study. On the other hand, a weakness of this approach is that practitioners working in the schools may prefer a successful school model over the expert testimony of people who do not work directly in the schools (Odden & Picus, 2008). Adherents to the evidence-based approach may also cite a lack of empirical data and statistical precision as a weakness of the professional judgment approach
(WestEd, 2000). Just the same, over the course of the past decade, more states appear to
have utilized the professional judgment approach for adequacy studies than any other
approach. These states include California, Colorado, Illinois, Indiana, Kansas, Kentucky,
Maine, Maryland, Missouri, Montana, Nebraska, New York, North Dakota, Oregon,
South Carolina, Washington, Wisconsin, and Wyoming.
The Evidence-Based Approach (also known as the State-of-the-Art Approach)
focuses on the strategies needed to teach students to high standards then specifies the
adequate funding levels needed to achieve these results. Rather than relying on
professional judgment though, this approach relies on research and data to determine the
types and amounts of each strategy in an educational system needed to raise student
achievement. Once these strategies are identified the total costs associated with
implementing them in the educational system being studied, down to the district and
school level, are calculated (Odden & Picus, 2008). The factors included in such a
system may encompass student to teacher ratios, central office administration, staff
development, technological support, etc. Once all of the types and amounts of factors for
such an ideal system are determined, they are individually priced out then aggregated for
a prototypical school. Such a prototypical school may be envisioned to serve a round number of students (for example, 500), each of whom represents an average student in a given school system.
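As a rough illustration of this pricing-out step, the following sketch aggregates individually priced factors into a per-pupil figure for a 500-student prototype; every resource, quantity, and unit cost shown is hypothetical rather than drawn from an actual adequacy study:

```python
# Pricing out a hypothetical evidence-based prototype school.
# All resources, quantities, and unit costs below are illustrative only.
PROTOTYPE_ENROLLMENT = 500

resources = {
    # resource: (quantity, unit cost in dollars)
    "classroom teachers (25:1 ratio)": (20, 60_000),
    "instructional coaches":          (2, 65_000),
    "professional development":       (PROTOTYPE_ENROLLMENT, 100),  # per pupil
    "technology":                     (PROTOTYPE_ENROLLMENT, 250),  # per pupil
}

# Cost each factor, then aggregate for the whole prototype school.
total_cost = sum(quantity * cost for quantity, cost in resources.values())
per_pupil = total_cost / PROTOTYPE_ENROLLMENT

print(total_cost, per_pupil)  # 1505000 3010.0
```

A real study would include many more line items and adjust each price for regional labor and supply costs, but the aggregation logic is the same.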
The strength of this approach lies in the focus on identifying very distinct,
objective, and concrete strategies that produce desired student achievement results and
can be used to guide school site administrators in expending school site funds effectively
(Odden & Picus, 2008). The weakness of this approach centers on the reality that the evidence supporting the efficacy of various educational strategies is often inconsistent from study to study and open to subjective interpretation. Additionally, evidence-based strategies derived from studies across the country may not generalize equally to all educational systems (WestEd, 2000). This approach has been employed in
several states including New Jersey (1998), Kentucky (2003), and Arkansas (2003).
The Economic Cost-Function Approach is more complex and relies on statistical
analyses to determine the inputs needed to teach all students in an educational system to
high standards. Reschovsky and Imazeki (2001, p.385) state, “a cost function provides
an estimate of the minimum amount of money necessary to achieve various educational
performance goals, given the characteristics of a school district and its student body, and
the prices it must pay for inputs used to provide education.” These cost functions are
used to determine per pupil funding amounts for an educational system by quantifying
the relationships between 1) the level of student performance being sought, 2) the
characteristics of the students being served, and 3) the economic, educational, and social
characteristics of the school districts in the educational system. With these elements in
mind, this approach would dictate higher funding levels for states such as California with
1) more rigorous academic standards, 2) a higher percentage of English Learner students,
and 3) higher teacher salaries.
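In general form, such a cost function can be written as follows; the notation here is a generic sketch rather than that of any particular study:

```latex
% Estimated minimum per-pupil cost E_i for district i:
%   P   = target level of student performance sought
%   S_i = characteristics of the students served (e.g., percent English Learners)
%   W_i = input prices the district faces (e.g., teacher salaries)
%   D_i = economic, educational, and social characteristics of the district
%   u_i = unobserved district-level factors
E_i = f(P, S_i, W_i, D_i) + u_i
% f is estimated statistically across districts; the adequate funding level
% for district i is the predicted E_i at the state's performance target P.
```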
The strength of this approach lies in the ability to provide highly differentiated
funding levels for specific educational situations, rather than minor adjustments to an
average funding figure. The use of statistical analysis is also preferable to blind funding increases made for political purposes, such as the American Recovery and Reinvestment Act (ARRA), which funneled billions of dollars into Title I and IDEA (Individuals with Disabilities Education Act) funding that many districts are still struggling to spend wisely on low-income and special education students while laying off teachers due to shrinking general fund budgets. The weakness of this approach lies in the complex
statistical analysis it relies on, which is difficult for stakeholders (including site
administrators and state legislators) to understand and act upon. Cost function research
has been conducted in Wisconsin (1997/2001), Texas (1999/2001), and New York
(2000/2003), but none of these states have employed this approach as part of a formal
adequacy study (Odden & Picus, 2008).
The Successful School/District Approach starts with the output side of the
adequacy equation by identifying schools and districts already teaching to high standards
then backwards mapping the strategies and funding used to achieve this success. First
researchers must specify a desired level of performance, such as mastery of California
standards. Next, researchers use scores on state tests to identify schools and districts that
are providing the desired performance levels, such as schools making Adequate Yearly
Progress (AYP) on the California Standards Test (CST). At this point, researchers will
often eliminate atypical districts, such as those at either end of the spectrum on wealth or
size. From this remaining pool of districts with average characteristics, average spending
per pupil is calculated for the adequacy study (Odden & Picus, 2008).
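The filtering-and-averaging logic described above can be sketched in a few lines of code; all district figures, field names, and cut-offs below are hypothetical:

```python
# Sketch of the successful school/district approach: keep districts that met
# the performance target, drop atypical districts at the extremes of wealth
# and size, then average per-pupil spending over the remaining pool.
# All districts and thresholds below are hypothetical.
districts = [
    # (name, met_target, enrollment, assessed_value_per_pupil, spending_per_pupil)
    ("A", True, 4_800, 310_000, 8_900),
    ("B", True, 52_000, 95_000, 9_400),    # atypically large
    ("C", True, 6_100, 280_000, 8_600),
    ("D", False, 5_500, 300_000, 8_200),   # did not meet target
    ("E", True, 300, 1_200_000, 14_500),   # atypically small and wealthy
    ("F", True, 7_200, 260_000, 9_100),
]

def adequate_spending(districts, min_size=1_000, max_size=25_000, max_wealth=500_000):
    """Average per-pupil spending across successful, typical districts."""
    pool = [
        d for d in districts
        if d[1]                           # met the performance target (e.g., AYP)
        and min_size <= d[2] <= max_size  # exclude size outliers
        and d[3] <= max_wealth            # exclude wealth outliers
    ]
    return sum(d[4] for d in pool) / len(pool)

print(round(adequate_spending(districts), 2))  # 8866.67
```

Here districts B, D, and E are filtered out, and the adequacy figure is simply the mean spending of the remaining typical, successful districts.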
The strength of this approach is that the efficacy of the successful schools and
districts is already proven and does not rely upon the speculation of experts, the
generalization of research, or the analysis of statistics. A potential weakness is that narrowing the sample of successful schools down to those with average characteristics often results in funding figures matching the existing median spending in the state, which means the adequacy study provides no new actionable data (Odden & Picus, 2000). The successful school/district approach has
proven to be very popular and has been utilized in Colorado, Illinois, Kansas, Louisiana,
Maryland, Mississippi, Missouri, New Hampshire, and Ohio (Baker, Taylor, and Vedlitz,
2005).
While Allan Odden and Lawrence Picus (2004) have described the four main
approaches used to conduct adequacy studies over the past decade, the adequacy studies
that they have conducted themselves in states such as Arkansas, Kentucky, and New
Jersey have relied most heavily on the evidence-based (or state-of-the-art) approach. In
doing so, Odden and Picus have developed a matrix of core instructional practices which
they believe research shows will increase student achievement. Funding figures for those
core instructional practices are adjusted based upon the state in which the adequacy study
is being conducted. Research concerning non-core expenses such as technology, district
administration, and maintenance and operation has been less prevalent though. Thus,
these non-core expenses have often been treated as fixed costs in the matrix with less
empirical support and justification for their role in increasing student achievement as part
of a larger educational system.
Therefore, there is a need to conduct research into these non-core instructional
practices to better ascertain resource allocations and funding levels in adequacy models. While
results of such research can be fed into the evidence-based model used by Odden and
Picus, a successful schools approach may be the most expedient and manageable way to
gather data on non-core instructional factors such as technology. Before conducting a
study in an attempt to add to the research base on adequacy, a brief foundation in this
particular topic of technology needs to be laid. Exploring the effectiveness and role of
technology in today’s public schools is a natural antecedent to determining appropriate
funding levels for technology as part of a larger adequacy model.
Technology’s Place in Adequacy Studies
In two separate school finance adequacy studies conducted in the states of
Kentucky and Arkansas, Odden, Picus, and Fermanich (2003) echo Dede's (1998) call for integrating technology, stating word for word in both studies that over time schools need to embed technology in their instructional program and school management strategies. Odden and Picus (2008) acknowledge in their findings that research is mixed on whether a strong correlation exists between money and student outcomes. Odden, Picus, and Fermanich (2003) do, however, point to recent
research which shows integrating technology into instruction can positively impact
student test scores nearly as much as reducing class size in primary grades. Even Dede
(1998) himself, though, warns that investing in technology may not immediately raise test scores, but will contribute to overall improved educational experiences for
students and will impact student achievement more directly as teachers learn how to
effectively utilize it in their instruction. While varying amounts of money are currently
being spent on technology as a panacea to cure what ails public schools, there is still little
research clearly defining the amounts and types of funding necessary to support student
academic success. Odden and Picus (2008) do indicate that thinking about how money is
spent and focusing on student achievement does matter and this admonition should apply
to the topic of funding for school technology as much as any other.
The key concept that Odden and Picus (2008, pp. 71-72) encourage states to
consider in creating school finance policy is “adequacy”, which they define as “the
provision of a set of strategies, programs, curriculum, and instruction, with appropriate
adjustments for special-needs students, districts, and schools, and their full financing, that
is sufficient to teach students to high standards.” More specifically for this study, the
focus will be on the provision of technological resources including hardware, software,
and technology support that are sufficient to support teaching students to high standards,
as part of a larger adequacy model.
Technology is one area on which a great deal of money is spent but for which there is very little research on what an adequate amount would be to support student achievement or what those expenditures should look like. For example, in the Arkansas
adequacy study (Odden, Picus & Fermanich, 2003) a seemingly generic amount of $250
per student was given regardless of whether the student was in elementary, middle, or
high school. However, the authors use this figure as a starting point in many of their
adequacy studies based upon Odden’s 1997 study of technology heavy comprehensive
school designs. Technology costs are assumed to be ongoing; therefore, this is meant to be an annual figure used over ten years to continue purchasing, updating, and maintaining
computer hardware and software. In Kentucky (Odden, Fermanich, and Picus, 2003) the
state had already invested heavily in statewide technology infrastructure, as well as
computers in classrooms, therefore the figure was adjusted down to $214 per student to
cover expenditures in operations and maintenance, incremental replacement of various
technology components, and new technologies. An additional $50 per student was
recommended to decrease the student to computer ratio from five-to-one down to three-
to-one.
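The scale of that recommendation can be checked with simple arithmetic for a prototypical 500-student school; the translation of dollars into machines below is our illustration, not a figure from the Kentucky study:

```python
# Moving a hypothetical 500-student school from a 5:1 to a 3:1
# student-to-computer ratio, funded at $50 per student per year.
students = 500

machines_at_5_to_1 = students // 5   # 100 machines at 5:1
machines_at_3_to_1 = students // 3   # 166 machines at 3:1 (rounded down)
additional_machines = machines_at_3_to_1 - machines_at_5_to_1  # 66 more machines

annual_allocation = 50 * students    # $25,000 per year at $50 per student
budget_per_added_machine = annual_allocation / additional_machines

print(additional_machines, round(budget_per_added_machine, 2))  # 66 378.79
```

Roughly $379 per added machine per year would then be available for purchase, replacement, and support; whether that is sufficient depends on assumed hardware prices and replacement cycles.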
While technology, in and of itself, is unlikely to raise student test scores or play
more than a supporting role to key educational components such as class size and quality
of teachers, expenditures on technology do play a role in achieving an overall adequate
educational system. The purpose of this study will be to use a successful schools model
to more accurately determine an adequate amount of funding in the area of technology
needed to help raise student achievement for prototypical elementary schools in future
adequacy studies. In particular, this study will attempt to determine an adequate level of
funding for technology in public elementary schools in the state of California.
Educational Technology in the 21st Century
According to Harvard Law School professors John Palfrey and Urs Gasser (2008)
who have written a recent book on understanding the first generation of digital natives,
the students that public schools currently serve were born into a digitized online world
surrounded by technology. These students think and learn in very different ways from
previous generations, including the school staffs that teach them, who, as digital immigrants, had
to learn technology as a second language. However, Palfrey and Gasser (2008) feel the
schools that figure out how technology informs pedagogy are going to have the most
success.
For the generation of digital natives, technology, especially the internet, plays a
major role in their social and educational lives. Currently, 87 percent of middle and high school students use the internet on a regular basis, and 78 percent of
them report using the internet at school (Hitlin & Rainie, 2005). This same study found
that most teenagers believe that the internet helps them to do better in school, including
conducting research for projects.
Most students, though, feel that the school systems in which they are forced to spend their days are disconnected from the real (technology-embedded) world in which they live,
according to Microsoft’s head of marketing and business development for educational
products, Martin Bean (Tantakoff, 2008). While most students believe education is
important, few think it meets their needs because schools are largely not adapting to their
students’ digital lifestyles. Bean feels that if educators are going to better meet the
instructional needs of today’s students, they must embrace technology and fully integrate
it into their instructional practices, rather than looking at it as a separate stand-alone
supplement. He cites a history of resistance to new innovations in classrooms from the
use of slates in 1703, to the possible ruin of education through the use of ball point pens
in 1950.
Due to the disconnect between an educational system designed to serve students
from a previous generation, as opposed to the current digital native generation, a
widening gap has formed between the knowledge and skills students are currently
learning and what they need to learn in a 21st century workplace which is fully integrated with technological tools and resources (Partnership for 21st Century Skills, 2005). In
order to bridge the gap, education futurist and current professor of Learning Technologies
at the Harvard Graduate School of Education, Chris Dede (1995, and cited in Tsantis and
Keefe, 1996) advocates a shift to an alternative instructional paradigm of distributed
learning or learning through doing, which involves participating in an individualized
sequence of ‘constructivist’ experiences that are delivered on demand in a real-world
problem-solving context. What this means is that students take a more hands-on active
approach to their education utilizing the technological tools of knowledge webs, virtual
communities, and synthetic environments. Online knowledge webs give students instant
“access to experts, archival resources, authentic environments, and shared investigations”
(Dede, 1995, p.2). Virtual communities provide a social context offering encouragement and motivation while students try to make sense of complex data and
constructing shared knowledge. Finally, shared synthetic environments “extend students’
experiences beyond what they can encounter in the real world” (Dede, 1995, p.3). Dede
is suggesting that teachers shift from being the primary source of student knowledge to
guiding students to construct their own knowledge through these technological mediums.
Beyond teaching standards to read, write, and do math, schools now need to equip
students with “21st century skills” so that they can solve real world problems by identifying the information they need to solve these problems and using digital tools to find that information, according to Ken Kay, president of the Partnership for 21st Century Skills (Khadaroo, 2009). Tsantis and Keefe (1996) point out that in society we no longer
suffer from a lack of information and that running out of it is not a problem, but drowning
in it is. Chris Dede (1995, p.1) feels that “to successfully prepare students as workers and
citizens” they must receive “technology-intensive learning experiences” because “the
core skill for American workplaces will… be… filtering and synthesizing a plethora of
incoming information.” Dede (1995, p.7) goes on to give an example that “the clerical
role must shift from keyboarding to utilizing database, desktop publishing, and
groupware applications. The job now demands higher-order cognitive skills to extract
and tailor knowledge from the enormous information capacity of the tool.” Dede (1995,
p. 1) concludes, “experiences of interacting with information will be central in schools
preparing K-12 students for full participation in 21st century society.”
Educational Technology’s Impact on Student Achievement
When President Bush and Congress passed the No Child Left Behind Act (NCLB)
in 2001, it included a lesser known mandate that states receiving federal Title I money
must develop technology standards that show, “every student is technologically literate
by the time the student finished the eighth grade, regardless of the student’s race,
ethnicity, gender, family income, geographical location, or disability” (U.S. Department
of Education, 2001, p. 1672). While this line appeased technology advocates seeking
“a 21st century curriculum that harnesses PCs and the Internet to equip kids with skills
needed in the modern workplace, like critical thinking, analysis, and communication”
(Ricadela, 2008, p.3), it was really a footnote to the crux of the message of NCLB which
was for public schools to increase student achievement or else. To this end, practitioners
in the field of education want to know if educational technology can help to increase
student achievement in core academic areas which are tested such as reading and math.
There are many variables in an educational system which can affect student
achievement outcomes. Narrowing the results of a research study down to effects of
educational technology or standards based academic outcomes can be difficult.
Therefore, while some studies try to answer the large question of technology’s impact on
tested reading and math skills, many studies measure technology’s impact on antecedent
skills and building blocks which contribute to student achievement.
In 1994, James Kulik conducted a meta-analysis of over 500 different research
studies that had been conducted up to this point looking at the effect of computer-based
instruction on increasing student achievement. The computer-based instruction focused
on addressing the learning needs of individual students to increase interest and better differentiate instruction. The results across the studies were generally positive, showing gains in student achievement and improved attitudes towards learning. Likewise, in 1998, Jay Sivin-Kachala reviewed over
200 research studies conducted up to that point investigating the effect of technology on
increasing student achievement. As this study was published by the Software Publisher’s
Association, the results should be considered with great caution. Nevertheless, regular
and special education students, preschool through higher education, in technology rich
environments showed positive growth in student achievement in all subject areas and
improved attitudes towards learning. Finally, another large scale meta-analysis of
research conducted between 1993 and 2000 (Murphy, Penuel, Means, Korbak, and
Whaley, 2001) found that educational software positively impacts student achievement in
reading and mathematics, especially in grades Pre-K to 3 and 6-8.
Several studies more focused on specific technology applications and discrete
academic skills have also shown positive results. In 2002, Boster, Meyer, Roberts, and
Inge found that integrating standards-based video clips into lessons developed by
classroom teachers increased student achievement more for students that were exposed to
the clips than for students who were not. This study did not examine the efficacy of showing an hour-long Jiminy Cricket film on a Friday afternoon, but rather focused on video clips lasting only a few minutes used to build background knowledge on standards-based
topics that students such as English Learners or students from low-income backgrounds
might not have otherwise. In 2005, O’Dwyer, Russell, Bebell, and Tucker-Seeley
conducted a study in which they controlled for variables such as socio-economic status
and previous achievement and found that students scored higher on language arts tests
and writing tests when they had edited papers throughout the school year using
technology.
As in other studies, Harold Wenglinsky (1998) controlled for variables such as
socio-economic status, class size, and teacher characteristics to find that technology had a
positive impact on student achievement in mathematics for fourth and eighth graders.
What made this study significant though is that it highlighted two critical aspects in
utilizing technology effectively to increase student achievement. First of all, teachers
needed professional development on how to use the technology. Secondly, and more
important, students needed to use the technology to support higher order thinking skills
such as problem solving and conducting simulations to see academic growth. In fact,
when students used computers for lower order thinking skills, such as drill and practice,
student achievement actually went down. Roschelle, Pea, Hoadley, Gordin, and Means
(2000) echoed these findings citing the need for teacher training and the usefulness of
technology in developing higher order skills such as critical thinking, analysis, and
scientific inquiry. They further clarified that technology is most effective when students
are actively engaged in authentic tasks connected to real world contexts in a collaborative
group setting with frequent interaction and feedback.
Christensen, Horn, and Johnson (2008) argue that technology offers students
today the opportunity to learn in ways that match their intelligence types in places they
prefer and at a pace they prefer. And yet, they argue, after two decades of spending over
$60 billion to put computers into schools, we have not seen technology’s promise
fully realized. They assert that what it comes down to is schools are not motivating
children and are simply trying to fit educational technology into the existing monolithic
and standardized system. For technology to succeed in schools it cannot be simply
attached to the existing paradigm. Rather it must be implemented in a disruptive manner
which transforms the system.
Many current writers on technology argue that it has failed to transform the
educational system because teachers are badly lacking formal education in how to create,
develop, and use these tools (Fischer, 2007). While schools are excited about
technology’s promise, they are short of trained faculty to extract many of its benefits
(Ricadela, 2008). Jaime McKenzie (2001) argues that the training programs offered in
the past may not generate the behaviors and daily practices needed for teachers to learn
and value the effective use of new technologies. He identifies the use of adult learning
theory, curriculum development projects, and informed support structures such as
coaches to be more effective at getting teachers to deeply integrate technology into their
curriculum and instruction to enhance student learning.
Current Trends in Educational Technology
Educational systems across the country have been investing in technology for a
quarter century now trying to increase student achievement and keep students up to date
with current skills needed in the workplace. While the days of the Apple IIe, green
monitor, and dot matrix printer are far behind us, many people don’t think of educational
technology beyond desktop computers and tutorial software. Yet, the technological
resources available to schools these days are very diverse. Furthermore, the states
exploring the ever expanding uses of these technological resources are not limited to
either coast.
An increasing number of states, such as Nebraska (Saunders, 2008), are
instituting one-laptop-per-student programs. While the laptops are often restricted to
school use in the elementary grades, middle school students and high school students are
often allowed to take the laptops home to complete assignments and conduct research. In
other states such as Virginia and New Jersey (Abramson, 2009), districts are moving
towards purchasing smaller and simpler laptops which are much less expensive and
lighter for students to carry around.
As early as 2004, Cathleen Norris and Elliott Soloway envisioned a Star Trek-like
future in which schools move beyond providing a computer for every student, to a
handheld-centric classroom which could support collaborative project-based learning.
This vision has been realized in Keller, Texas where fifth graders are now being equipped
with smart-phones featuring slide-out keyboards to provide basic computing functions
and internet access for as little as $100 per student per year (Ricadela, 2008).
In 2009, California named the first 10 digital textbooks that meet academic
standards for high school math and science (Chea, 2009). While other states such as
Texas and West Virginia are also exploring digital textbooks, California is the first to
formally adopt. California’s governor, Arnold Schwarzenegger, intends for these free
digital textbooks to save hundreds of millions of dollars a year (Frier, 2009). Most new
textbooks come with CDs or online versions, but this is the first time there is no
corresponding print textbook. These digital textbooks spare students from carrying
textbooks, often weighing five pounds each, from class to class and from school to home.
The limiting factor is that many schools do not have the infrastructure in terms of
computer hardware and internet connectivity both in schools and in student homes to take
advantage of these free digital textbooks. Some schools may resort to printing chapters
or providing iPods and Kindles. A huge amount of general fund money is tied up in
purchasing textbooks. If textbooks could be accessed online through site licenses, then
a great deal of the technological infrastructure could be funded from this source.
This would provide the additional benefits of textbooks not being lost or damaged, and of
textbooks being revised and updated on a regular basis, possibly eliminating textbook
adoption cycles and the fear of putting all of one’s eggs in one basket. Schools could
change textbook publishers from year to year without the logistical nightmares this
now entails and the big hits to the instructional budget. While the state is digitizing the
input side of the equation, some schools are leading the way in digitizing the output side
also. A few high tech charter high schools have developed “digital portfolios” for
students to publicly post their work and projects for authentic audiences (Ricadela, 2008).
Someone who has not been in a classroom lately may be surprised to
find that at tech-savvy schools, overhead transparency projectors are getting increasingly
difficult to find in classrooms. Increasing numbers of these one-time mainstays of
educational technology are collecting dust in storage closets, waiting to
be surplused. Over the past decade, Liquid Crystal Display (LCD) projectors have come
down in price and size to make them viable replacements for the old overhead
transparencies. These projectors can be hooked up to a computer to display digital
textbooks, PowerPoint presentations, or videos streamed from the internet. This makes
instructional lessons much more visually appealing and interactive for the digital native
students of the 21st century. Document cameras can also be attached to these LCD
projectors to display student papers or hardbound textbooks. Having such visuals
available can help distractible students to focus on a passage being read or a math
problem being solved. They also help English Learner students to better comprehend
what is being verbally communicated in class by providing a visual scaffold on which to
hang ideas and concepts. In the state of Washington, Lake Washington School District
under the leadership of Assistant Superintendent for Technology, Chip Kimball,
committed in 2004 to installing ceiling-mounted LCD projectors and 70-inch screens in
every classroom in the district.
While LCD projectors have represented a big step forward in the visual
representation of instruction, the next step has involved making those visual
representations more interactive for students. Several vendors, such as Promethean,
market these interactive whiteboards, but the most ubiquitous by far are those developed
by SMART Technologies. Staff members of a K-5 technology academy in Arizona
recently described how these interactive white boards, which can cost upwards of $5,000
each to purchase and install, combine the use of overheads, scanners, projectors, and the
internet into one system (Gordon, 2008). In this article the teachers describe how the
computer image is projected onto a screen for students to go up to touch and manipulate
like an iPhone. SMART Technologies has also developed SMART Tables, which allow
students to manipulate images and data projected onto a horizontal surface, rather than a
vertical whiteboard. Pupils in England are already trying out desks with interactive
screens that can recognize more than one person’s finger presses (BBC News, 2008).
Several companies also currently make wireless pads which teachers can write on to
manipulate images projected onto a screen while walking around a classroom to more
closely monitor student progress. Likewise, science students can now use probeware,
essentially electronic sensors that send scientific data directly to a computer
to be displayed and analyzed (Lake Washington School District, 2004).
Just as educational hardware has made great strides over the past decade, so has
educational software. Renaissance Learning has created Accelerated Math and
Accelerated Reader, which are used by many students across the country. Accelerated
Math can be used to differentiate math instruction to the benefit of both advanced and
struggling students by using computers to test students on standards-based concepts, then
providing worksheets for the specific individualized concepts students are struggling
with. Similarly, in Accelerated Reader, students take a STAR reading test on a computer
to determine each student’s Zone of Proximal Development (ZPD) or grade equivalent
reading level. This program builds reading stamina by having students read texts at their
instructional level followed by 10-question reading comprehension tests on the computer.
In the state of Kansas, Renaissance Learning has taken this one step further by
developing the Kansas Book Connect web site, where parents can log on to enter their
child’s reading level and interest area and have a book list generated (Sullinger, 2008).
An even more powerful technology based reading program which is being used
by schools and districts across the country was developed by Scholastic. This Read 180
program targets students in 4th through 12th grade reading at least two years below grade
level. In
addition to reading high interest texts for 20 minutes at their instructional level, and
receiving 20 minutes of small group instruction from a credentialed teacher, Read 180
students receive 20 minutes of computer based instruction at their instructional level in
which they complete tasks like reading words into a microphone which the computer then
checks for accuracy. While there are many personal testimonies as to the effectiveness of
Read 180, peer-reviewed journals also support its effectiveness in teaching students how
to read (Slavin, Cheung, Groff, & Lake, 2008). Other studies conducted in Florida
public schools (Lang, Torgeson, Vogel, Chanter, Lefsky, & Petscher, 2009) and in
California public schools (Papalewis, 2004) also showed that Read 180 students made
more than a year’s growth on the reading portions of state assessments. These are
significant findings in the age of No Child Left Behind.
In 2009, the U.S. Department of Education released an analysis of controlled
studies comparing online and face-to-face instruction. While most of these studies
examined instruction blending the two, among studies in which only one type was
provided, online instruction proved more effective in raising student achievement than
instruction received face-to-face. This report does not portend the demise of human
teachers, but U.S. Secretary of Education Arne Duncan does state that the results do
strongly suggest that school officials should spend one-time federal stimulus
money from the American Recovery and Reinvestment Act (ARRA) on technology to
enhance classroom instruction. Duncan goes on to share that online learning can be
beneficial in rural communities and inner-city urban settings where there may be a
shortage of highly qualified teachers. Less than a month after Duncan suggested this, the
state of Texas approved three companies to provide electronic courses to home school
students in rural Waco (Gragg, 2009). Several states, including Virginia (Lizama, 2009)
and Nebraska (Reutter, 2010), are taking the next logical step of utilizing online tests to
measure student mastery of state standards as required under No Child Left Behind.
Districts in both states are working through logistical issues to make sure all schools have
the computer capacity to administer tests in this manner. It is also unknown whether
students score any better or worse taking these high stakes tests online rather than in the
standardized paper and pencil format.
Data Driven Decision Making
Mike Schmoker (1999) has written a highly influential book in which he
admonishes educators to use the results of frequently administered assessments to
measure how much and how well students are learning the concepts being taught. Basically, if
students aren’t learning what is being tested, then teachers are not teaching it well
enough. Instead of just blindly teaching and presenting curriculum in a certain way
because that is the way it has always been done, teachers and administrators should
analyze assessment data to determine which instructional practices should be replicated
or expanded and which instructional strategies should be adjusted or eliminated. In
addition to guiding instructional decisions in the classroom, assessment results can be used
to identify students needing interventions. This Data Driven Decision Making (DDDM)
model is most effective when it not only looks at global trends across a school or grade
level, but when it drills down to the results of individual students, identifying specific
standards they are struggling with and specific instructional strategies and intervention
they need. The high stakes environment of No Child Left Behind has served as a catalyst
to put DDDM into practice to get results. President Obama’s stimulus plan also requires
districts to provide data showing progress in student achievement in order to qualify for
funding (Hechinger, 2009).
While Data Driven Decision Making shows great promise for increasing student
achievement, gathering, clustering, and interpreting the data can be daunting. It has not
been uncommon for groups of teachers to gather around stacks of paper and pencil tests
trying to figure out which questions most students missed, why they missed them, and
what standards those questions are aligned with. Technology can play an invaluable role
in facilitating this process by expediting the gathering, clustering, and interpretation of
assessment data which frees teachers up to focus on planning for instruction and
intervention.
On the input side, student assessment data can be recorded electronically for
future manipulation. This can be done directly by students through online testing, or
facilitated by teachers by recording student answers onto scan sheets which can be
scanned into computers, or directly by teachers using handheld devices. At the
throughput stage, once the data has been entered, software programs such as Online
Assessment and Reporting System (OARS), EduSoft, DataDirector, and Edline can
identify which standard each item was assessing, identify distracters for incorrect
answers, and cluster the results. Some software programs such as Voyager and Read 180
even address the output side by suggesting groupings of students to receive specific
instruction on specific deficit skills and concepts as identified by the assessment.
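The input, throughput, and output stages described above can be illustrated with a short sketch. The data layout, the standard codes, the student names, and the mastery threshold below are all hypothetical; this is not drawn from OARS, EduSoft, DataDirector, or any other product named in this chapter.

```python
from collections import defaultdict

# Hypothetical input: item-level results as (student, item, correct),
# plus a map of each test item to the academic standard it assesses.
item_to_standard = {1: "NS.1.2", 2: "NS.1.2", 3: "AF.2.1"}
responses = [
    ("Ana", 1, True), ("Ana", 2, False), ("Ana", 3, False),
    ("Ben", 1, True), ("Ben", 2, True), ("Ben", 3, True),
]

def mastery_by_student(responses, item_to_standard):
    """Throughput: cluster each student's results by standard."""
    totals = defaultdict(lambda: [0, 0])  # (student, standard) -> [correct, attempted]
    for student, item, correct in responses:
        key = (student, item_to_standard[item])
        totals[key][0] += int(correct)
        totals[key][1] += 1
    return {key: c / n for key, (c, n) in totals.items()}

def intervention_groups(mastery, threshold=0.7):
    """Output: suggest which students need reteaching on which standard."""
    groups = defaultdict(list)
    for (student, standard), score in mastery.items():
        if score < threshold:
            groups[standard].append(student)
    return dict(groups)

mastery = mastery_by_student(responses, item_to_standard)
print(intervention_groups(mastery))  # Ana flagged on both standards
```

The value of the throughput stage is that teachers see per-standard mastery rather than raw item counts, so the output stage can group students directly by deficit skill.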
All of these technological resources can greatly enhance the Data Driven Decision
Making process. Of course, these resources will be ineffective in increasing student
achievement if the available technology is outdated, the user interface is too complicated
for digital immigrant teachers to understand, or teachers have not received sufficient
professional development in how to fully utilize these technological tools (COSN, 2010).
In order to continue increasing student achievement through the effective use of DDDM,
districts must address these factors by continually investing in data management
technology resources and providing the professional development needed to fully utilize
these resources (Datnow, Park, & Wohlstetter, 2007).
Funding for Educational Technology
If instructional delivery is to be truly transformed by the full integration of
technology to increase student achievement, then significant funding must be secured to
provide the necessary hardware, programs, networking, and staff development. In
discussing the economic and technological barriers to change, Dede (1995, p.7) states
that “by shifting how current resources are allocated, education institutions can deploy
and utilize powerful technologies.” Looking beyond the normal general fund and
categorical fund sources, Palo Alto Unified School District in California is spending
millions of dollars of local bond funds generated by increased taxes to revamp its
educational technology resources (Samuels, 2010). While communities such as Palo Alto
are fairly affluent, Dede (1994, p.18) recognizes that constantly upgrading software,
hardware, and networking capabilities can be extremely difficult for “schools with
limited financial resources located in communities with little opportunity to supply added
dollars beyond state and federal funding. Thus, this situation widens the gaps in equity.”
This situation leads many districts to rely too heavily on categorical funds, which may be
unstable, or on one-time grants such as the Enhancing Education Through Technology
(EETT) program, which was authorized under the original No Child Left Behind law to
incorporate technology in the classroom and recently received a boost from President
Obama as part of his stimulus package (Chaker, 2009). Of course, as Picus (2000, p.6)
states, “Simply relying on one-time grants and gifts is no substitute for the hard work of
re-ordering spending priorities or finding new sources of long-term revenue to integrate
technology into both the curriculum and the budget process.”
According to Gartner (2003), technology is an increasingly essential K-12
resource, but as budgets tighten district leaders must be able to account for all costs
associated with technology purchases and the benefits each technology resource is
expected to produce. Total Cost of Ownership (TCO) is a comprehensive set of models,
methodologies, and tools designed to answer these difficult questions. In calculating
TCO for school business officials, Kaestner (2006) identifies three categories of expenses
which must be calculated and accounted for. First, and most obvious, are the annualized
technology costs for the technology resources themselves such as computers, software,
networks, servers, and printers. Second, are the district labor costs of those who have
responsibility for ongoing technology support such as installing and maintaining the
technology resources. Third, and least obvious, are the indirect labor costs of end users
such as teachers who spend paid time learning how to use the technology resources.
While technology has been shown to benefit student achievement, there are many costs
associated with implementing technology which must be weighed against the benefits
received.
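Kaestner’s three expense categories can be combined into a simple annualized total. The sketch below is illustrative only; the dollar figures, amortization period, and training hours are invented and do not come from Kaestner (2006) or Gartner (2003).

```python
def annualized_tco(hardware_cost, useful_life_years, annual_support_labor,
                   teacher_hours_learning, teacher_hourly_rate):
    """Total Cost of Ownership per year: amortized technology cost (category 1),
    direct district support labor (category 2), and indirect end-user labor
    such as paid teacher learning time (category 3)."""
    amortized_hardware = hardware_cost / useful_life_years
    indirect_labor = teacher_hours_learning * teacher_hourly_rate
    return amortized_hardware + annual_support_labor + indirect_labor

# Hypothetical classroom: a $5,000 interactive whiteboard amortized over
# 5 years, $300/year of technician time, and 10 hours of teacher training
# at $40/hour.
print(annualized_tco(5000, 5, 300, 10, 40))  # 1700.0
```

Note that in this invented example the least obvious category, teacher learning time, exceeds the direct support labor, which is exactly why Kaestner warns that it must not be omitted from TCO calculations.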
CHAPTER THREE
METHODOLOGY
Introduction
This chapter provides an overview of the methodology used to conduct this study.
After acknowledging how this study relates to others, this chapter reviews the purposes
and research questions for this study. It also provides detailed descriptions of the
population and sample, the instrumentation, the data collection procedures, and the
methods of data analysis.
In order to help determine the appropriate levels of expenditures on educational
technology in elementary schools, the different methods delineated by Odden and Picus
(2008) for conducting adequacy studies were considered. Given the researcher’s
experience as a practitioner working in elementary schools and working with schools
greatly increasing student achievement, a successful schools study was conducted to
examine how schools exiting from Year 4 or 5 of Program Improvement under No Child
Left Behind were using educational technology to help increase student achievement.
To facilitate this, first, a survey instrument in the form of a single two-page
questionnaire, consisting of forced-choice responses, was designed and implemented with
at least one primary teacher and one intermediate teacher at each participating school.
Second, a greatly expanded survey instrument with more open-ended and educational
leadership perspective items was administered to the principal or designee at each
participating school. Finally, the results of a School Technology Survey (STS), more
commonly referred to as the CTAP (California Technology Assistance Project) survey
which is administered by the California Department of Education (CDE) to all schools in
California each year, were reviewed to determine the types and amounts of technology
present in the schools being studied, as well as how that technology was being supported
and utilized.
Based upon a review of the literature pertinent to educational technology, and the
researcher’s own experience in increasing student achievement through the use of
educational technology, the survey items provided sufficient breadth, depth, organization,
and thoughtfulness, to make coherent determinations as to how expenditures on
educational technology helped to increase student achievement at the schools studied.
The School Technology Survey has been used and refined for years with California
schools.
In 2004, seven doctoral students formed a thematic dissertation group under the
direction of Dr. Lawrence O. Picus at the Rossier School of Education at the University
of Southern California. All seven researched different aspects of the matrix used by Dr.
Picus to specify adequate expenditures at prototypical schools to teach students to high
standards (Odden and Picus, 2008). Three of these students examined expenditures on
educational technology, one each focusing on elementary schools (this study), middle
schools, and high schools.
In 2006, the first ten elementary schools in the state of California exited from
Year 4 of Program Improvement. After years of being classified as Program
Improvement for not meeting state standards, and therefore unsuccessful, these 10
schools significantly increased student achievement on standards based state tests for 2
consecutive years and were now considered to be successful by the state of California.
This represented a fairly objective means for identifying successful schools. Several
doctoral studies identified these schools as their sample, examining topics such as
principal leadership style and school culture. The researcher identified these same 10
schools as the sample for this study using similar teacher survey and principal interview
instruments, but looking at the topic of educational technology. In 2008, the 15
California elementary schools exiting from Year 4 or 5 of Program Improvement that
year were added to this study to increase the sample size.
In 2007, the researcher studying expenditures on educational technology in middle
schools was invited to shift his focus to a parallel study of the 25 elementary schools in
California exiting from Year 4 or 5 of Program Improvement that year, using the same
methodology and instruments of this study. The two researchers used the same original
teacher survey instrument and collaborated in jointly developing a principal survey
instrument. Thus, the results should be comparable and useful for validation and
generalization. This parallel study of 2007 schools was successfully defended in 2008.
Purpose and Research Questions for the Study
Recognizing there are many factors, including the provision of highly qualified
well-compensated teachers and standards-based instructional materials, which affect
student achievement, the purpose of this study is to determine the types and amounts of
educational technology needed in elementary schools to support teaching students to high
standards. Using a successful schools model to study how public elementary schools in
California used educational technology to exit from Year 4 or 5 of Program Improvement
will provide guidance in adequately funding technology to increase student achievement
in California public elementary schools. Such information has become particularly
timely as schools and districts attempt to wisely spend additional federal stimulus, Title I,
and IDEA (Individuals with Disabilities Education Act) funds for the 2009-2010 and
2010-2011 school years. This information could also prove invaluable to educational
institutions competing for the Race to the Top funds proposed by the U.S. Secretary of
Education Arne Duncan. Finally, the results of this study should help to clarify the
educational technology expenditures for prototypical schools on the matrix utilized by
Picus in adequacy studies.
This successful school study was conducted to answer the following research
questions which address the relationship between funding for technology and student
achievement:
1. Overall, does the utilization of educational technology resources support
school level efforts to raise student achievement and exit from program
improvement?
2. What types, amounts, configurations and uses of educational technology
resources support school level efforts to raise student achievement?
3. What additional support, funding, and budgets should be provided to support
school level efforts to raise student achievement?
4. Do state technology surveys and state achievement test data indicate a positive
correlation between spending money on technology and high student
achievement?
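Research question 4 asks for a correlation between technology spending and achievement, which amounts to computing a Pearson coefficient. The sketch below shows the calculation on invented sample data; the spending figures and test scores are hypothetical and are not taken from the schools in this study.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-pupil technology spending ($) and state test scale scores
# for five schools.
spending = [120, 250, 310, 400, 520]
scores = [680, 700, 715, 730, 760]
print(round(pearson_r(spending, scores), 3))
```

A coefficient near +1 would indicate that higher-spending schools also posted higher scores, though, as with any correlation, it would say nothing by itself about causation.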
Population and Sample
Most of the California public schools in Years 4 and 5 of Program
Improvement (PI) in 2006 (401) and 2008 (1009) were elementary schools, reflecting that
most of the schools receiving Title I funds under No Child Left Behind were elementary
schools (C.D.E., 2007 and 2009). To progress to this level of PI, these schools had to
fail to make Adequate Yearly Progress (AYP) for two successive years (placing them at
Year 1 of PI), then continue to fail to make AYP each year thereafter.
these schools failed to exit Program Improvement, they progressed deeper into the
process with increasingly severe sanctions. When these schools reached Year 4 of
Program Improvement they had to write an Alternative Governance Plan to significantly
restructure their school and put new practices in place to increase student achievement.
The population selected for this study consisted of the 25 California elementary
schools in Year 4 and 5 of PI that met all AYP targets for two consecutive years and
exited from Program Improvement status in 2006 and 2008. This population included 10
schools that exited from Year 4 of PI in 2006. No schools exited from Year 5 of PI in
2006. This population also included 5 schools that exited from Year 5 of PI in 2008 and
10 schools that exited from Year 4 of PI in 2008. These schools were identified using
lists published by the California Department of Education (2006 and 2008) and
represented a total population size of 25 elementary schools from across the state of
California. This population of elementary schools was purposefully identified (Potter,
2002) because it was hypothesized that after years of documented failure and low student
achievement in meeting state standards as measured by the criterion-referenced
California Standards Tests, a significant shift in instructional practices took place over a
focused 2 year period which raised student achievement enough to exit Program
Improvement. This focused time frame allowed the researcher to better isolate the effects
of a variable such as educational technology in increasing student achievement and
represented a means of identifying successful schools for this adequacy study based upon
the efficacy of educational practices rather than (and often in spite of) demographics.
This population was intended to be representative of all California public elementary
schools in Program Improvement, and more specifically the more than 1,400 California
public elementary schools in Years 4 and 5 of Program Improvement in 2006 and 2008.
All twenty-five California elementary schools that exited from Year 4 or 5 of PI in
2006 and 2008 were invited to participate in this successful schools study. Of the ten
schools identified from the 2006 population, three were willing to participate, and did so.
Of the fifteen schools identified for the 2008 population, eight schools were willing to
participate and actually returned data. Therefore, of the twenty-five identified schools in
the population, data was actually collected from eleven. As these eleven
schools were spread across various regions of California, the researcher asserts that this
was a sufficient sample size to represent the California elementary schools exiting from
Year 4 and 5 of Program Improvement and, to a lesser extent, all California public
elementary schools in Program Improvement.
The sampling design also included a multistage clustering procedure to stratify
the sample (Creswell, 2003), by seeking the perspectives of administrators versus
teachers, and primary teachers versus intermediate teachers within the participating
sample schools. Once the schools were identified, data for the study was sought and
collected from principals and from teachers at both primary grades (K-3) and
intermediate grades (4-6) to better ensure the study results reflected the perspectives and
uses of educational technology from the various stakeholders. As there are a greater
number of primary teachers in elementary schools due to the greater grade span and the
staffing ratios required to implement Class Size Reduction (CSR), a greater proportion of
data was collected from primary teachers to reflect the characteristics in the population
(Creswell, 2003).
Instrumentation
To facilitate the collection of data from the sample schools and participants in this
study, a Teacher Survey questionnaire (Appendix A) and a Principal Survey
questionnaire (Appendix B) were utilized. The Teacher Survey questionnaire was
created by the researcher and designed for this research. A nearly identical version of
this Teacher Survey questionnaire was used in the parallel study of elementary schools
exiting from Year 4 and 5 of Program Improvement in 2007. A Principal Survey
questionnaire was co-developed by the researcher and the researcher from the parallel
study of 2007 schools. However, for this study, the Principal Survey questionnaire was
modified to be more user-friendly, while capturing the same basic data. As the researcher
was co-creator of the original instrument, permission to use the modified instrument was
implicit.
research instruments, including the survey, cover letters, and district notification letters
were pilot tested and validated by teachers, site administrators, and district administrators
in the researcher’s district. At least two representatives from each of these groups
reviewed the instruments and their feedback was incorporated into the final version.
The Teacher Survey questionnaire instrument was two pages long consisting of
24 forced-choice items clustered into the major content sections of Technology
Hardware, Technology Software/Programs, and Technology Support. Three additional
open-ended items were included to allow respondents to add any additional hardware,
software, or technology support data that was not captured in the forced-choice items.
All items asked respondents to evaluate how much they felt each educational technology
component helped to raise student achievement. The forced-choice items measured
responses on a continuous scale with corresponding point values: “Strongly Agree” (4),
“Agree” (3), “Disagree” (2), “Strongly Disagree” (1), and “Don’t Know” (0). The scale
was purposely set up to force these successful school respondents to indicate either
agreement or disagreement with the efficacy of each technology component they were
familiar with for helping to raise student achievement. The same exact Teacher Survey
questionnaire was used for both primary teachers and intermediate teachers. This
Teacher Survey was designed to take approximately 10 minutes to complete.
The Principal Survey questionnaire instrument was originally co-developed by the
researcher of this study and the researcher of the parallel 2007 study. This original
Principal Survey was 14 pages long and consisted of 45 open-ended items and 27 forced-
choice items for a total of 72 items. Pilot testing revealed that this original Principal
Survey questionnaire instrument was too unwieldy and would decrease the likelihood of
busy practitioners participating in the study. Utilizing this feedback, the Principal Survey
was heavily revised to capture essentially the same data, in the same order, but in a much
more manageable 5 pages, instead of 14 pages. This was accomplished by converting
open-ended items to forced choice items when possible. The final composition of the
Principal Survey for this study consisted of 30 open-ended items and 60 forced-choice
items for a total of 90 items. This revised Principal Survey was designed to take
approximately 30 minutes to complete. The forced-choice items measured responses on
the same 5-level continuous scale as the Teacher Survey, with the same purpose of
forcing relative agreement or relative disagreement, though the fifth option was revised
from “Don’t Know” to “Not Applicable/Don’t Know.” In addition to the three major
content areas of Technology Hardware, Technology Software/Programs, and Technology
Support, the Principal Survey questionnaire added the five major content areas of
Program Improvement Background Information; Instructional Leadership in Technology;
Data Analysis and Data Driven Decision Making (DDDM); Educational Technology
Costs, Funding, and Budgets; and Exiting Program Improvement.
Finally, additional data on educational technology being used at these successful
schools was gathered by reviewing results of the annual California School Technology
Survey. This survey instrument was “designed and administered by the California
Department of Education (CDE) and the California Technology Assistance Project
(CTAP) to assess the availability and distribution of educational resources in California
K-12 public schools” (CDE, 2008). Special access to survey results for all schools
included in the study, but not generally available to the public, was granted to the
researcher by the CTAP. Survey results were gathered for 2008, the most recent year for
which data were posted, regardless of what year the schools exited from Program
Improvement. Results within the survey are broken out by topic, including
student/computer ratios; computer age; equipment location; internet connectivity; average
hardware fix times; and technical support staffing. The quantitative data gathered from
these School Technology Surveys was used to triangulate the quantitative and qualitative
data gathered for the sample successful school sites using the Teacher Survey
questionnaire instruments and the Principal Survey questionnaire instruments.
Data Collection
Successful schools to be included in this adequacy study were identified using
lists published by the California Department of Education (2006 and 2008) indicating
California public schools exiting from Year 4 and 5 of Program Improvement. The
researcher then looked up contact information for these schools and the districts including
names of superintendents, names of principals, addresses, phone numbers, fax numbers,
and electronic mail addresses (e-mail). The researcher’s superintendent gave him
permission to personally call each superintendent and principal and introduce himself as a
representative of his own district. This was done to increase the likelihood that districts
and schools contacted would be willing to participate in the study and with the
understanding that data gathered would benefit not only the study, but also the
researcher’s district in attempting to spend federal stimulus money wisely.
The researcher initiated contact by calling district superintendents or their
designees to seek permission to include the identified schools in the study. Districts
willing to participate in the study were e-mailed a District Notification Letter (Appendix
C) seeking written permission, on district letterhead, to include the selected school in the
study. This letter informed the district that the study was investigating the effect of
educational technology on increasing student achievement to determine the amount of
fiscal resources to be dedicated to educational technology. The letter also assured the
district that the confidentiality of the district, school, and individual participants would be
protected and the study would comply with all Institutional Review Board (IRB)
requirements.
Once permission was given by each district, school site principals were called by
the researcher to explain the study and seek participation. Principals that agreed to
participate were e-mailed the Teacher Survey questionnaire instrument, the Principal
Survey questionnaire instrument, and a Principal Survey cover letter (Appendix D). The
Principal Survey cover letter explained how the school had been identified and the
purpose of the study. Given that it had been 1-3 years since each of these schools had
exited from Program Improvement, principals were asked to not only consider how
technology may have contributed to exiting PI, but also how technology has been used to
continue increasing student achievement since exiting PI. This was done because
educational technology and its uses may have changed significantly in the intervening
years, and because all schools in the sample had stayed out of Program Improvement and
could still be considered "successful," with professional judgments worthy of consideration.
Principals were also asked to identify at least one primary and one intermediate teacher to
complete the Teacher Survey questionnaire instrument.
Though the Principal Survey was designed to take only 30 minutes for
completion, and the Teacher Survey 10 minutes, most schools took several follow-up
phone calls and e-mail reminders before completing the surveys. Most completed
surveys were returned to the researcher as e-mail attachments, though a few schools
chose to fax their surveys or return hard copies through the U.S. mail. Upon receipt of
the completed surveys, the researcher mailed thank you cards and Starbucks gift cards to
each participant at the school address.
After collecting surveys, the researcher began downloading and printing results of
the state School Technology Survey (STS) for each participating school site from the
California Department of Education (CDE) online database. However, with reductions in
state funding and relaxed reporting requirements, data was available to the public only
through 2007. After contacting state and regional representatives of the California
Technology Assistance Project (CTAP), special access was granted to the researcher for
all schools and districts included in the study through 2008, including an expanded
database of educational technology resources being used at each school.
Data Analysis
Once all surveys had been collected, a mixed methods approach was used by the
researcher to analyze the data (Creswell, 2003). Following a review of the literature,
both forced choice and open-ended items were designed for use in the Teacher Survey
questionnaire instrument and Principal Survey questionnaire instrument. The forced
choice items were scored on a continuous scale and reflected a quantitative aspect which
provided numerical data and facilitated deductive, statistical data analysis of concepts
revealed in the literature review and the researcher’s own experience or “naturalistic
generalizations” (Creswell, 2003). The open-ended items reflected a qualitative aspect in
which new ideas and unique perspectives of the survey respondents were organized and
coded to facilitate an inductive data analysis. The quantitative data from the School
Technology Surveys was used primarily for deductive data analysis.
The five-page Principal Survey was designed as an expansion and more in-depth
examination of the themes and subject areas from the two-page Teacher Survey.
facilitated comparison of the survey results which helped to corroborate results and
provide cross validation between the two instruments. This also raised questions about
the validity of results when responses from teachers and principals from the same schools
on similar items varied. When considering the results from both sets of surveys, equal
weight and priority was given to both. Likewise, data from qualitative items were given
equal priority with quantitative items, especially when there were similar open-ended
responses from several different schools. While data for the state School Technology
Survey was used primarily for triangulation of results from the other two surveys, the
data aligned well with only a few items on the Teacher Survey and the Principal Survey.
Therefore, the state School Technology Surveys were considered a secondary source of
information and were not given equal priority. Integration of these various sources and
types of data occurred during the data analysis and interpretation stages in order to
answer the research questions posed by the study.
The findings from the Principal Survey, Teacher Survey, and School Technology
Survey were further interpreted by comparing them to the results of a national survey
titled “Educational Technology in U.S. Public Schools: Fall 2008” which was sponsored
by the National Center for Education Statistics (Gray, Thomas, and Lewis, 2010). This
report provided national data on the availability and use of educational technology in
public elementary (and secondary) schools during the fall of 2008.
With the above theoretical framework in place, data analysis consisted of the
following steps:
1. Completed Teacher Surveys, Principal Surveys, and School Technology
Surveys were grouped by school, then organized and coded according to the
year (2006 or 2008) the schools exited Program Improvement and the Year (4
or 5) from which the schools exited Program Improvement.
2. Responses to the forced-choice items (24 on the Teacher Survey and 60 on the
Principal Survey) were reviewed to calculate the number and percentage of
respondents who responded "strongly agree," "agree," "disagree," or "strongly
disagree" to each item.
3. Results from these calculations of forced-choice items were entered into a
Microsoft Excel spreadsheet to form frequency charts, facilitate statistical
analysis, and create graphs that clearly illustrated the significant trends in the
quantitative data.
4. A similar process was followed in analyzing and documenting the trends in the
quantitative data from the state School Technology Surveys.
5. Responses to the open-ended items (3 on the Teacher Survey and 30 on the
Principal Survey) were coded to begin an inductive analysis of this qualitative
data.
6. Results from these coded responses were scrutinized to identify significant
themes and patterns and to inductively determine educational technology
factors which contributed to increasing student achievement.
7. Results from the Principal Survey, Teacher Survey, and state School
Technology Survey were compared to results from a national technology
survey.
8. A final review of all the survey results, including quantitative and qualitative
items, was conducted to triangulate and summarize the findings. These
summarized findings were then used to draw conclusions for the study.
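The tabulation of forced-choice responses described above amounts to simple frequency counting. A minimal sketch in Python (illustrative only; the study performed these calculations in Microsoft Excel, and the function name here is hypothetical):

```python
from collections import Counter

# Response options on the forced-choice survey items
SCALE = ["Strongly Agree", "Agree", "Disagree",
         "Strongly Disagree", "N.A. or Don't Know"]

def tally_item(responses):
    """For one survey item, count each response option and compute
    its percentage of all respondents (rounded to a whole number)."""
    counts = Counter(responses)
    n = len(responses)
    return {option: (counts[option], round(100 * counts[option] / n))
            for option in SCALE}

# Eleven hypothetical principal responses to a single item
item = (["Agree"] * 6 + ["Strongly Agree"] * 2 +
        ["Disagree"] * 2 + ["N.A. or Don't Know"])
for option, (count, pct) in tally_item(item).items():
    print(f"{option}: {count} ({pct}%)")
```

Feeding counts of this kind into a spreadsheet then yields the frequency charts and graphs described above.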
Summary
This chapter presented a detailed description of the methodology used in this
successful schools adequacy study, including descriptions of the sampling,
instrumentation, data collection, and data analysis used to answer the research questions
posed by the study.
CHAPTER FOUR
ANALYSIS OF THE DATA AND INTERPRETATION OF THE FINDINGS
Introduction
This chapter presents the data collected over the course of this study, then
analyzes and interprets that data. The purpose of this study was to investigate what role,
if any, educational technology played in raising student achievement so that California
elementary schools could make Adequate Yearly Progress (AYP) and exit from Year 4
or Year 5 of Program Improvement in 2006 or 2008. These schools were identified for
Program Improvement under the federal No Child Left Behind Act because not enough
students passed the California Standards Test (CST) which measures student mastery of
state standards. Data was collected from three elementary schools that exited from Year
4 of Program Improvement in 2006. Data was also collected from eight elementary
schools that exited from Year 4 or Year 5 of Program Improvement in 2008. The study
was conducted to answer the following four research questions:
1. Overall, does the utilization of educational technology resources support
school level efforts to raise student achievement and exit from program
improvement?
2. What types, amounts, configurations and uses of educational technology
resources support school level efforts to raise student achievement?
3. What additional support, funding, and budgets should be provided to support
school level efforts to raise student achievement?
4. Do state technology surveys and state achievement test data indicate a positive
correlation between spending money on technology and high student
achievement?
In order to answer these four research questions a mixed methods approach was
utilized, gathering both quantitative and qualitative data from each school included in the
sample. The instruments used to gather this data included: 1) a teacher survey
questionnaire consisting of mostly quantitative, forced-choice items, completed by at
least one primary teacher and one intermediate teacher, 2) a principal survey
questionnaire consisting of similar quantitative forced-choice items with additional open-
ended qualitative items added to increase the number of topics covered and the depth of
coverage for each topic, and 3) a state technology survey consisting of quantitative data
for each school site retrieved from the California Department of Education web site.
Data from the quantitative items on the teacher survey questionnaire and the
principal survey questionnaire were entered into Excel spreadsheets to tabulate results
and to generate figures clearly displaying the trends evident for each item. Qualitative
data from open-ended items on both surveys were transcribed and organized in the same
order they appeared on the survey instruments. Results from quantitative and qualitative
items on similar topics were interspersed with one another for comparability. In
reporting the data, randomly selected letter and number codes were assigned to each
respondent to protect their anonymity as promised and to provide item-to-item
comparability for individual respondents, which might reveal overarching beliefs
regarding the efficacy of educational technology in increasing student achievement. The
results from all schools, whether they exited in 2006 or 2008, from Year 4 or Year 5 of
Program Improvement, were aggregated for each item to further protect the anonymity of
respondents and provide a sufficient sample size. Data from the CDE School Technology
Survey were also entered into Excel spreadsheets to tabulate results for comparison to
statewide averages.
Equal weight was given to the quantitative and qualitative data which was
integrated throughout the collection, analysis, and interpretation stages. Collating data
from teacher surveys, principal surveys, and state technology surveys triangulated the
data to more clearly identify trends and validate the results. Data from these three
sources was analyzed by comparing results from the three instruments to each other. The
findings from all three were further interpreted by comparing them to the results of a
national survey titled “Educational Technology in U.S. Public Schools: Fall 2008” which
was sponsored by the National Center for Education Statistics (Gray, Thomas, and Lewis,
2010). This report provided national data on the availability and use of educational
technology in public elementary (and secondary) schools during the fall of 2008. This
mixed methods study was carefully designed to find the most accurate answers to the four
research questions.
Findings for Research Question Number One:
Overall, does the utilization of educational technology resources support school level
efforts to raise student achievement and exit from program improvement?
In order to answer the research question as to whether or not educational
technology resources in general support increasing student achievement and exiting from
program improvement, a combination of quantitative and qualitative data was gathered from the
eleven school sites using the principal survey questionnaire instrument. Questions for
each school were clustered in three focus areas of 1) Program Improvement Background
Information, 2) Instructional Leadership in Technology, and 3) Exiting Program
Improvement.
The data from 19 quantitative items was entered into Excel spreadsheets to tally
results and create figures to visually represent the results. In completing this survey,
principals responded on a four point Likert scale of “Strongly Agree,” “Agree,”
“Disagree,” or “Strongly Disagree” to form relative levels of agreement or disagreement.
Principals were given a fifth response option of “N.A. or Don’t Know” if they could not
provide data on a particular item. Unanswered items by various respondents were coded
by the researcher as “N.A. or Don’t Know.” The frequency of responses to each item
was graphically displayed in a horizontal manner to facilitate comparison of responses
within each topic area and more easily identify data trends.
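The horizontal frequency display described above can be sketched as a plain-text rendering (illustrative only; the study's actual figures were bar charts built in Excel, and the function name is hypothetical):

```python
# Response options on the forced-choice survey items
SCALE = ["Strongly Agree", "Agree", "Disagree",
         "Strongly Disagree", "N.A. or Don't Know"]

def horizontal_chart(counts):
    """Return one text line per response option, with a horizontal bar
    of '#' marks proportional to the count (one mark per respondent)."""
    width = max(len(option) for option in SCALE)
    return [f"{option:<{width}} | {'#' * counts.get(option, 0)} {counts.get(option, 0)}"
            for option in SCALE]

# Hypothetical frequencies for one survey item (N=11)
for line in horizontal_chart({"Strongly Agree": 3, "Agree": 6,
                              "Disagree": 1, "N.A. or Don't Know": 1}):
    print(line)
```

Laying every item's bars on a common axis in this way is what facilitates comparing responses within a topic area at a glance.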
The qualitative data from 7 open-ended items was also collected from the eleven
principals that completed the survey. The responses and descriptive comments from each
principal were transcribed and compiled to provide a cluster of responses for each
qualitative item.
In presenting and interpreting the data for the first research question, responses
from quantitative and qualitative items have been interspersed within each topic area.
This was done to generally reflect the order in which the items were presented to the
principals and to juxtapose responses so that numbers could be put to subjective
statements and clarification could be given to forced responses.
Program Improvement Background Information
In order to set the stage and better understand the background in which schools in
this study were trying to increase student achievement, principals were asked the
following open-ended question, “In which subgroups, and subject areas, was your school
not making Adequate Yearly Progress (AYP) causing your school to progress to Year 4
or Year 5 of Program Improvement (PI)?” The following represents the range of
responses gathered:
“The subgroups that were not making AYP included the Hispanic
subgroup, the EL subgroup, the SED Subgroup and the Special ED
subgroup.” (principal A1)
“Language Arts & Math” (principal B2)
“ELL’s, Hispanic, and SED in ELA.” (principal C3)
“ELA, EL, Hispanics, SDY.” (principal D4)
“Our Latino, English Learners, and student with needs were not meeting
AYP target goals. We were not making growth or meeting our target
growth in Language Arts.” (principal E5)
“Hispanic, SED, ELL” (principal F6)
“EL, SED” (principal G7)
“English Language Learners” (principal H8)
“English Learners” (principal I9)
“English Language Arts; English Learner (EL) Subgroup” (principal J10)
While responses varied, English-Language Arts was identified most often (by five
schools) as the subject in which schools were not making Adequate Yearly Progress. In terms
of specific subgroups, English Learners was named most often (nine out of ten
respondents, 90%) as the subgroup not making AYP.
To add further to the schools' backgrounds and identify co-variables to the
increase in student achievement, principals were asked the following open-ended
question, “What were some of the key factors/programs (not necessarily technology
related) that helped your school to raise student achievement and exit P.I.?” The
following represents the range of responses gathered:
“The key factor, in my opinion, was the focus on standards and the benchmark
assessments that allowed more focused instruction for specific students.”
(principal A1)
“Focus on doing a few things well and strong.” (principal B2)
“Targeting bubble students for intervention. Making enough progress to qualify
for Safe Harbor.” (principal C3)
“School wide implementation of effective teaching practices, Direct Instruction,
Kagan Cooperative Learning, Thinking Map, etc. Technology was not a factor in
our P.I. Exit, however we have since purchased SmartBoards for all classrooms
and they are wonderful!” (principal D4)
“We started focusing on specific essential standards. Each grade level identified
key standards and those students were assessed frequently.” (principal E5)
“Fidelity to district-adopted curriculum, teaching to state standards, data analysis
which informed our instruction, maintain rigor of instruction.” (principal F6)
“Best First Instruction, Functioning as a PLC, Continuous monitoring of multiple
measures of student achievement, strong and shared leadership” (principal G7)
“Teacher collaboration and focus on data and standards” (principal H8)
“Change of school culture to one of excellence for all students, focus on direct
instruction, consistent curriculum, interventions for students in need (during and
after school), focus on reading and language skills.” (principal I9)
“Training in the use of Open Court Reading 2002; Faithful implementation of
Open Court Reading 2002; Frontloading for academic success of EL students;
Common block for English Language Development (ELD); Use of sentence
frames to support language use of EL students; Collaboration between and among
grade levels; Data Analysis (using Online Assessment Reporting System (OARS);
Use of visuals-projected via document camera or LCD projector; Use of
Discovery (United) Streaming clips to help build background for EL student.”
(principal J10)
“New text adoptions (math)” (principal K11)
A common theme running through the responses was a focus on teaching to high
state standards using district-adopted curriculum and effective instructional practices. A
series of four forced-choice items were then posed to the principals to quantify if they
believed educational technology was 1) integrated into or supportive of these
factors/programs, 2) an important element in raising student achievement, 3) supported
by "detailed curriculum design work," and 4) helpful in making AYP with the schools'
struggling subgroups. Figure 1 illustrates the responses of the eleven principals to these
questions.
Figure 1: Program Improvement Background Information – Principal Survey Results (N=11)
Eight out of eleven principals (73%) agreed or strongly agreed that technology was
integrated into, or supportive of, the factors and programs that helped to raise student
achievement and exit program improvement. Seven out of the eleven principals (64%)
credited the integration of technology into the core curriculum as an important element in
raising student achievement.
When asked if anyone did “detailed curriculum design work” to support successful
integration, only 3 out of the 11 principals (27%) agreed. A follow up open-ended
question for those that agreed asked who did the work, to which the three principals
responded:
“Teachers began integrating technology into the curriculum. Teachers began
pulling information from the internet on the spot to support curriculum in the
upper grades.” (principal E5)
“Teacher, TSA, Principal, Asst. Principal” (principal G7)
“Grade level leaders with the support of Reading Coach” (principal J10)
A second follow up question for those that agreed that "detailed curriculum design
work" had been done asked about the "ways release time was required," to which those
three principals responded:
“Most of the planning was done during grade level or their own time.” (principal
E5)
“Minimal” (principal G7)
“Some” (principal J10)
The final follow up question for those that agreed that “detailed curriculum design
work” had been done asked, “How was it funded,” to which the three principals
responded:
“It was not funded.” (principal E5)
“Title I” (principal G7)
“Title I; SBCP (State and Federal Categoricals)” (principal J10)
When asked, “Did the use and integration of technology help you to make AYP with
your particular struggling subgroup,” 6 out of the 11 principals (55%) agreed or strongly
agreed. A follow up question asked, “If you agree or strongly agree, please briefly
explain how.” The following represents the range of responses gathered:
“The use of document cameras and LCDs assisted us in making instruction most
accessible to our subgroups. Also, the use of benchmark assessment data allowed
teachers immediate feedback on the progress of students and the strengths and
challenges of individual students. These assessments are scanned into a program
that allows each individual teacher access to their data.” (principal A1)
“Math – “ (principal B2)
“Becoming a PLC school made teachers responsible for their own data. The year
that we exited PI, more teachers were analyzing and aware of their own data.”
(principal E5)
“Monitoring systems, student engagement, communication-staff, students, Dist.
Office” (principal G7)
“We used Accelerated Readers with all of its parts. Students had access to
computers throughout the school and open access to the library. Read 180 was a
structured intervention that was very effective. Teachers used computerized
reading and language programs to motivate and accelerate students. Rosetta
Stone was helpful for English Learners (and their parents) in learning English.”
(principal I9)
“It forced people to think differently. The planning that went along with the
creation or identification of tools helped teachers present lessons more
thoroughly. The identification of learning strengths and needs using reports
generated through OARS allowed for more targeted interventions to be provided
for students and also encouraged teachers to dig deeper within the curriculum to
evaluate how particular standards had been taught and sparked discussions as to
how to improve instruction for EL students.” (principal J10)
The responses were varied, but a few clustered around technology making curriculum
more accessible and interesting to students. A few more responses clustered around
using technology to analyze data and use that data to direct instruction and interventions.
Instructional Leadership in Technology
In trying to answer the research question of whether or not educational technology
helped to raise student achievement and exit from program improvement, a second
cluster of quantitative questions focused around the topic of “Instructional Leadership in
Technology" was asked of principals. The 5 forced-choice items themselves, as well as
the responses from the 11 principals, are displayed in Figure 2.
Figure 2: Instructional Leadership in Technology – Principal Survey Results (N=11)
The opening question asked principals, “As the de facto instructional leader of
your site, have you deliberately expanded the use of technology at your site?” To which
10 out of the 11 principals (91%) indicated agreement, 6 of whom (55%) strongly agreed.
As a follow up, principals were asked if the use of technology was expanded with the
expectation that it “in and of itself” would raise student achievement. Only 3 principals
(27%) agreed with this. However, in response to the next follow up question, 10 out of
11 principals (91%) agreed that "the expansion of [their] site's use of technology [had]
been undertaken with the expectation that it would 'help' to raise student achievement."
Nine out of eleven principals (82%) also agreed that the expansion of educational
technology was done to facilitate "student's mastery of '21st century skills,'" and was
done with the “expectation that it would require dedicated professional development time
and dollars.”
An additional open-ended qualitative item asked principals to “list any additional
technology support or factors that helped to raise student achievement.” The number of
responses was limited and there were no identifiable trends. The following represents the
range of responses gathered from those that responded:
“Support from the district level was essential in implementing our plan.”
(principal A1)
“None that I am aware of since I was not the principal during these years.”
(principal C3)
“Online resources, CST release questions” (principal E5)
“Monitor, monitor, monitor – then always apply to instruction!” (principal G7)
Exiting Program Improvement
A third and final cluster of 10 quantitative questions focused on “Exiting Program
Improvement" in trying to answer the first research question. Similar items were posed
using slightly different terms such as "essential," "important," and "significant," in trying
to pinpoint the principals' evaluation of the efficacy of educational technology in helping
their schools to exit program improvement. To increase the validity of the data, similar
items were also worded negatively, using terms such as "not important," "only a
minor/supporting role," and "little or no role," to see if principals responded the opposite
of how they had responded to the positively worded items. Figure 3 illustrates the
responses of the 11 principals to these forced-choice questions.
Figure 3: Exiting Program Improvement – Principal Survey Results (N=11)
The statement with the highest degree of agreement, to which 8 out of 11
principals (73%) agreed or strongly agreed was, “Educational Technology played a
significant role in my site’s success in raising student achievement and made a significant
contribution in helping our school exit from program improvement." The negatively
worded items, "Educational technology was not an important aspect of my overall school
site program improvement plan or Alternative Governance Plan," and "Educational
technology played a minor/supporting role in my site's success in raising student
achievement and made a small contribution in helping our school exit from program
improvement," each had only 2 principals (18%) agreeing, with none strongly agreeing.
An additional open-ended qualitative item asked principals, “Do you have any
other comments concerning educational technology, or is there anything else you would
like to add?” The following represents the range of responses gathered from the
principals:
“Technology alone is not the reason for exiting PI status but it is an essential tool
to assist educators in making the best instructional decisions for individual
students. Technology allowed us to focus in on each individual student’s needs
and assisted us in making instructional choices that supported student growth.”
(principal A1)
“Unfortunately, I cannot answer the questions fully due to the fact that I was not
the principal during the PI years. I can share that I am concerned that the
technology we do have has not been utilized as fully as it should be. I am a
proponent of the need to utilize educational technology to the fullest and there
isn’t significant evident that this occurs at the school site or that it had a
significant impact on exiting PI. I feel that utilizing categorical funds on
technology and PD to ensure the consistent and effective use increases student
achievement. I just can’t be sure that is what assisted this school in exiting PI.”
(principal C3)
“I am sorry that these are probably not the answers you were hoping for. I cannot
attribute our success (exit of PI) to technology as we have not had much at our
school. We do not have a lab or any computers specifically for students to use.
Just this year, we have purchased SmartBoards for out school and we have just
ordered two portable laptop carts with 30 laptops so perhaps they will help us
continue to succeed.” (principal D4)
“Although educational technology played a key role in student achievement, the
most effective source was the quality of good teaching and focusing on learning
and tracking student progress through current data.” (principal E5)
Summary of the Findings for Research Question Number One:
Overall, does the utilization of educational technology resources support school level
efforts to raise student achievement and exit from program improvement?
To answer this first research question, as to whether the utilization of
educational technology supports school level efforts to raise student achievement and
exit program improvement, a series of quantitative and qualitative questions was posed
to principals in a survey questionnaire instrument. These questions were clustered into
three topic areas: 1) Program Improvement Background Information, 2) Instructional
Leadership in Technology, and 3) Exiting Program Improvement.
The results from these principal surveys appear to be very supportive of the
premise that the utilization of educational technology resources is indeed supportive of
school level efforts to raise student achievement and exit program improvement. This
was evident within each cluster of questions and across all the clusters. While the NCES
national survey (Gray, Thomas, and Lewis, 2010) provided “data on the availability and
use of educational technology in public elementary schools”, it did not address the
efficacy of educational technology for increasing student achievement and therefore did
not provide points of comparison for this research question.
Ten of the nineteen quantitative items in these three cluster areas were specifically
designed to answer this research question regarding the efficacy of educational
technology to raise student achievement and exit program improvement. On these
specific 10 items, an average of 6.8 out of 11 principals, or 62%, strongly agreed or
agreed with these statements. Conversely, on these same specific 10 items, an average of
only 2.5 out of 11 principals, or 23%, disagreed or strongly disagreed with these
statements. An average of 1.7 out of 11 principals, or 15%, did not know how to respond
to these 10 items or felt they were not applicable to their school.
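The averaging described above is straightforward; as a rough sketch in code (the per-item counts below are hypothetical, chosen only to reproduce the reported average of 6.8 agreeing principals per item):

```python
# Illustrative per-item counts of principals who strongly agreed or agreed with
# each of the 10 efficacy items (hypothetical values, not the study's raw data).
agree_counts = [8, 7, 6, 7, 5, 8, 6, 7, 7, 7]

n_principals = 11
avg_agree = sum(agree_counts) / len(agree_counts)   # mean number agreeing per item
pct_agree = 100 * avg_agree / n_principals          # as a share of the 11 respondents
print(f"{avg_agree:.1f} of {n_principals} principals ({pct_agree:.0f}%)")
```

With these counts the script prints "6.8 of 11 principals (62%)", matching the summary figures reported for the 10 items.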
As previously summarized, the statement with the smallest number (3 out of 11,
or 27%) of principals strongly agreeing or agreeing was, “has the expansion of your site’s
use of technology been undertaken with the expectation that it would ‘in and of itself’
raise student achievement?” Of the 10 quantitative items regarding the efficacy of
educational technology, the statement with the greatest number (10 out of 11, or 91%) of
principals strongly agreeing or agreeing was, “has the expansion of your site’s use of
technology been undertaken with the expectation that it would “help” to raise student
achievement?”
While there were many different qualitative items utilized to collect data on
various aspects of using educational technology to increase student achievement and exit
program improvement, the following statement from a principal best summarizes the
findings for this research question: “Technology alone is not the reason for exiting PI
status but it is an essential tool to assist educators in making the best instructional
decisions for individual students. Technology allowed us to focus in on each individual
student’s needs and assisted us in making instructional choices that supported student
growth.”
Findings for Research Question Number Two:
What types, amounts, configurations and uses of educational technology resources
support school level efforts to raise student achievement?
Having established the efficacy of educational technology in general to increase
student achievement and exit program improvement, the second research question was
designed to clarify and specify, “What types, amounts, configurations and uses of
educational technology resources support school level efforts to raise student
achievement.” Quantitative and qualitative data was gathered from the eleven school
sites using the principal survey instrument and the teacher survey instrument. Questions
for each school on both the principal surveys and the teacher surveys were clustered into
the two focus areas of 1) Technology Hardware, and 2) Technology Software/Programs.
A third cluster of questions on the principal survey instrument focused on Data Analysis
and Data Driven Decision Making (DDDM).
The data from 30 quantitative items on the principal survey and 20 items on the
teacher survey was entered into Excel spreadsheets to tally results and create figures to
visually represent the results. In completing these surveys, principals and teachers
responded on a four point Likert scale of “Strongly Agree,” “Agree,” “Disagree,” or
“Strongly Disagree” to form relative levels of agreement or disagreement. Principals and
teachers were given a fifth response option of “N.A. or Don’t Know” if they could not
provide data on a particular item. Unanswered items by various respondents were coded
by the researcher as “N.A. or Don’t Know.” The frequency of responses to each item
was graphically displayed in a horizontal manner to facilitate comparison of responses
within each topic area and to more easily identify data trends. Results from the teacher
surveys were disaggregated by grade span into primary teachers (grades K-3) and
intermediate teachers (grades 4-6) to see if technology was being used differently at
each of these grade spans.
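The tallying and disaggregation described above were done in Excel; as a minimal sketch of the same procedure in code (the function name and the sample responses here are hypothetical):

```python
from collections import Counter

# Likert options used on the surveys; unanswered items are coded "N.A. or Don't Know"
OPTIONS = ["Strongly Agree", "Agree", "Disagree", "Strongly Disagree", "N.A. or Don't Know"]

def tally(responses):
    """Count the frequency of each Likert option, coding blanks as 'N.A. or Don't Know'."""
    counts = Counter(r if r in OPTIONS else "N.A. or Don't Know" for r in responses)
    return [counts.get(opt, 0) for opt in OPTIONS]

# Hypothetical teacher responses to one survey item, tagged by grade span
teacher_data = [
    ("primary", "Strongly Agree"), ("primary", "Agree"), ("primary", None),
    ("intermediate", "Agree"), ("intermediate", "Disagree"),
]

# Disaggregate primary (grades K-3) from intermediate (grades 4-6) teachers
primary = tally(r for span, r in teacher_data if span == "primary")
intermediate = tally(r for span, r in teacher_data if span == "intermediate")
print(primary)        # [1, 1, 0, 0, 1]
print(intermediate)   # [0, 1, 1, 0, 0]
```

Each resulting list gives the frequency of the five response options for one item, which is the form the figures below summarize.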
The qualitative data from 9 open-ended items was also collected from the eleven
principals who completed the survey. The responses and descriptive comments from each
principal were transcribed and compiled to provide a cluster of responses for each
qualitative item.
In presenting and interpreting the data for the second research question, responses
from principals and teachers, as well as from quantitative and qualitative items, have
been interspersed within each topic area. This was done to generally reflect the order in
which the items were presented to the principals and teachers for comparison of
responses between principals, primary teachers, and intermediate teachers. Responses to
quantitative and qualitative items from principals were also juxtaposed so that numbers
could be put to subjective statements and clarification could be given to forced-choice
responses.
Technology Hardware
The first cluster of questions for both principals and teachers focused on the topic
of Technology Hardware. Principals were asked the following open-ended question:
“What types and brands of technological hardware have been utilized at your school
during the past three years, particularly as it relates to raising student achievement?” The
following represents the range of responses gathered from the 11 principals:
“Document cameras and LCDs have been provided to each classroom. Computer
pods were provided to the primary grades.” (principal A1)
“PC’s, Quizdom, LCD displays, doc. Cams, laptops.” (principal B2)
“Laptops for instruction, document cameras and LCD projectors to enhance
instruction for all learners.” (principal C3)
“We purchased SmartBoards, laptops and document cameras for all classrooms
last year.” (principal D4)
“District wide we were using Renaissance Learning. District also purchased Data
Director as a source to store and to analyze student assessments.” (principal E5)
“MAC computers” (principal F6)
“Desktops, laptops, LCD Projectors, Document cameras, internet, promethean
boards (minimal with goal of expansion), United Streaming” (principal G7)
“Promethean boards, Apple computers” (principal H8)
“Our district is a PC district so all computers were Dell. In the past 3 years,
teachers now have document cameras, LCD projectors, sound amplification
systems, and laptops to support instruction. Smart boards are now installed with
the student response system as well.” (principal I9)
“Laptop computers (Dell); LCD projectors (Toshiba); Document cameras (?? –
Don’t remember the brand)” (principal J10)
“Accelerated Reader/STAR Math” (principal K11)
The hardware utilized in the schools, as reported by the principals, was fairly similar
and included: document cameras (7), LCD projectors (6), laptop computers (6), desktop
computers (5), and interactive whiteboards (4). A few principals added rationales for
these choices such as supporting and enhancing instruction. A series of 12 forced-choice
items were then posed to the principals to quantify how much they agreed 12 different
types of technology hardware helped to raise student achievement. Figure 4 illustrates
the responses of the eleven principals to these questions.
Figure 4: Technology Hardware – Principal Survey Results (N=11)
Each item asked whether the hardware “helped to raise student achievement.”
SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree, NA = N.A. or Don’t Know

Item                                                             SA   A   D  SD  NA
Simpler technology devices (e.g. LeapPads, Franklin Spellers)     1   1   2   1   6
Wireless internet connectivity                                    3   4   2   0   2
High speed internet connectivity                                  5   3   2   0   1
Video equipment (e.g. camcorders, digital cameras)                0   2   2   2   5
Interactive computer display equipment/programs
  (e.g. Smartboards, Learning Pads)                               3   2   1   0   5
Document cameras for LCD projectors                               4   6   0   0   1
LCD projectors in classrooms                                      6   5   0   0   0
Laptop teacher computers                                          3   5   1   0   2
Desktop teacher computers                                         2   6   2   0   1
Laptop student computers in a portable lab                        0   2   0   2   7
Desktop student computers in classrooms                           3   6   0   1   1
Desktop student computers in a computer lab                       2   6   1   1   1

Incredibly, 11 out of 11 principals (100%) strongly agreed or agreed that, “LCD
projectors in classrooms helped to raise student achievement.” Likewise, 10 out of 11
principals (91%) strongly agreed or agreed that, “document cameras for LCD projectors
helped to raise student achievement.” Out of the eleven principals surveyed, a significant
number strongly agreed or agreed that several other technology hardware resources also
raised student achievement including: “desktop student computers in classrooms” (82%),
“desktop student computers in a computer lab” (73%), “desktop teacher computers”
(73%), “laptop teacher computers” (73%), “high speed internet connectivity” (73%), and
“wireless internet connectivity” (64%). Conversely, only 2 out of 11 principals (18%)
agreed that, “laptop student computers in a portable lab”, “video equipment”, or “simpler
technology devices” helped to raise student achievement. Interestingly, only 5 out of 11
principals (45%) strongly agreed or agreed that “interactive computer display
equipment/programs” such as SmartBoards helped to raise student achievement.
However, 5 out of 11 principals (45%) also responded “N.A. or Don’t Know” on this
item, which would seem to indicate that the use of these devices is not as widespread,
but that those who do use them believe they help to raise student achievement.
A final follow-up question asked principals to “Please list any additional types,
configurations, or brands of hardware that helped to raise student achievement.” The
number of responses was limited and there were no identifiable trends. The following
represents the range of responses gathered from those who responded:
“NWEA/MAP testing done twice a year. EADMS-our data analysis system.”
(principal A1)
“None that I am aware of since I was not the principal during these years.”
(principal C3)
“Online resources” (principal E5)
“On site server” (principal G7)
In order to help triangulate and validate the data gathered from principals
regarding the efficacy of various technology hardware resources to help raise student
achievement, the exact same 12 quantitative items were posed to teachers at the same
school sites using a teacher survey instrument. Seeking data from at least one
primary teacher (grades k-3) and one intermediate teacher (grades 4-6) at each school,
a total of 18 primary teachers and 14 intermediate teachers completed the teacher
survey instrument. Figure 5 and Figure 6 illustrate the responses of 18 primary
teachers and 14 intermediate teachers for these forced-choice questions.
Figure 5: Technology Hardware – Primary Teacher Survey Results (N=18)
Each item stated that the hardware “helped raise student achievement.”
SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree, DK = Don’t Know

Item                                                             SA   A   D  SD  DK
Simpler technology devices (e.g. LeapPads, Franklin Spellers)     1   3   2   2  10
Wireless internet connectivity                                    8   3   2   2   3
High speed internet connectivity                                 10   6   1   0   1
Video equipment (e.g. camcorders, digital cameras)                0   3   5   1   9
Interactive computer display equipment/programs
  (e.g. Smartboards, Learning Pads)                               2   2   0   3  11
Document cameras for LCD projectors                              12   3   0   2   1
LCD projectors in classrooms                                     12   3   0   2   1
Laptop teacher computers                                         12   4   0   1   1
Desktop teacher computers                                         6   8   2   0   2
Laptop student computers in a portable lab                        1   1   0   3  13
Desktop student computers in classrooms                           4   7   1   1   5
Desktop student computers in a computer lab                       4   7   2   0   5
Figure 6: Technology Hardware – Intermediate Teacher Survey Results (N=14)
Each item stated that the hardware “helped raise student achievement.”
SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree, DK = Don’t Know

Item                                                             SA   A   D  SD  DK
Simpler technology devices (e.g. LeapPads, Franklin Spellers)     0   2   2   0  10
Wireless internet connectivity                                    4   8   1   0   1
High speed internet connectivity                                  7   5   0   0   2
Video equipment (e.g. camcorders, digital cameras)                2   1   1   0  10
Interactive computer display equipment/programs
  (e.g. Smartboards, Learning Pads)                               4   2   1   0   7
Document cameras for LCD projectors                              10   2   0   0   2
LCD projectors in classrooms                                     11   2   0   0   1
Laptop teacher computers                                          8   4   0   1   1
Desktop teacher computers                                         2   6   2   0   4
Laptop student computers in a portable lab                        1   5   2   0   6
Desktop student computers in classrooms                           4   6   1   0   3
Desktop student computers in a computer lab                       1   9   1   0   3
The results from both the primary teachers and the intermediate teachers were
very similar to the results gathered from principals in terms of what technology hardware
resources they strongly agreed or agreed helped to raise student achievement. LCD
projectors and document cameras were at the top of all 3 surveys with agreement rates
exceeding 80%. However, over 80% of both primary and intermediate teachers felt just
as strongly that “Laptop teacher computers” and “High speed internet connectivity helped
raise student achievement.” Additionally, 86% of intermediate teachers agreed that,
“Wireless internet connectivity helped raise student achievement.”
Technology Software/Programs
A second cluster of questions for both principals and teachers focused on the topic
of Technology Software/Programs. Principals were asked the following open-ended
question: “What types and brands of technological software have been utilized for the
past three years, particularly as it relates to raising student achievement?” The following
represents the range of responses gathered from the 11 principals:
“We have implemented Lexia Phonic and Read 180 as interventions.” (principal
A1)
“ST math, Accelerated Reader, Read Naturally, Rosetta Stone.” (principal B2)
“United Streaming, Kid Biz (Achieve 3000), Fast Forward.” (principal C3)
“We just got Read 180 in September. Some use Lexia Phonics.” (principal D4)
“For the most part teachers used educational links as resource to support their
instruction. No software was purchased.” (principal E5)
“Renaissance Learning – Accelerated Reader web site” (principal F6)
“Read 180” (principal G7)
“Earrobics phonetic software, Pearson Successmaker” (principal H8)
“Mind Institute Math, Read 180, Accelerated Reader, Accelerated Math”
(principal I9)
“Accelerated Reader” (principal J10)
“Accelerated Reader/STAR Math” (principal K11)
The software and programs utilized in the schools, as reported by the principals, were
fairly divergent, with multiple principals citing the use of Accelerated Reader (5), Read
180 (4), and Lexia Phonics (2). A series of 9 forced-choice items were then posed to the
principals to quantify how much they agreed each of 9 different types of technology
software/programs helped to raise student achievement. Figure 7 illustrates the responses
of the eleven principals to these questions.
Figure 7: Technology Software Programs – Principal Survey Results (N=11)
Each item asked whether the software/programs “helped to raise student achievement.”
SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree, NA = N.A. or Don’t Know

Item                                                             SA   A   D  SD  NA
Data analysis and Data Driven Decision Making programs
  (e.g. EduSoft, OARS, etc.)                                      8   3   0   0   0
Multimedia programs (e.g. United Streaming,
  Discovery Education)                                            1   3   2   0   5
Electronic textbooks and supplements to core curriculum           0   4   0   1   6
ELD programs (e.g. Rosetta Stone)                                 0   2   1   1   7
Leveled math assessment and tutorial programs
  (e.g. Accelerated Math, Learning.com)                           3   1   0   1   6
Leveled reading programs (e.g. Accelerated Reader, Read 180)      4   4   1   1   1
Skills reinforcement software programs
  (e.g. Math Facts, Math Blaster)                                 1   6   0   1   3
California standards based assessment and tutorial
  programs (e.g. Study Island)                                    3   4   0   1   3
Computer assisted learning (aka computer assisted
  instruction or computer augmented instruction)                  2   5   1   1   2

Responses to each item varied greatly, which would seem to indicate that the
principals were considering their responses carefully and lent credibility to the responses
which had a degree of consensus. Incredibly, 11 out of 11 principals (100%) strongly
agreed or agreed that, “data analysis and Data Driven Decision Making programs (e.g.
EduSoft, OARS, etc.) helped to raise student achievement.” A separate cluster of
questions addressing this specific topic was posed to principals later in the survey.
Not surprisingly, given the open-ended responses, principals ranked “leveled reading
programs (e.g. Accelerated Reader, Read 180)” the second highest, with 8 out of 11
(73%) strongly agreeing or agreeing that these programs helped to raise student
achievement. Seven out of eleven principals (64%) also agreed that “computer assisted
learning”, “California standards based assessment and tutorial programs (e.g. Study
Island)”, and “skills reinforcement software programs (e.g. Math Facts, Math Blaster)”
helped to raise student achievement. At the opposite end of the spectrum, 4 out of 11
principals (36%) or less agreed that “leveled math assessment and tutorial programs (e.g.
Accelerated Math, Learning.com)”, “ELD programs (e.g. Rosetta Stone)”, “electronic
textbooks and supplements to core curriculum”, and “multimedia programs (e.g. United
Streaming, Discovery Education)” helped to raise student achievement.
After asking principals about the use of computer assisted learning to raise student
achievement, to which 64% agreed, a series of 3 follow-up questions were asked for
clarification. The first question asked, “How widespread (e.g. what percentage of
classrooms) has this use of technology been?” The following represents the range of
responses gathered from the 11 principals:
“100%” (principal A1)
“1-6 heavily, K some.” (principal B2)
“Varies depending on teacher and grade level.” (principal C3)
“This year we got SmartBoards so it is now 100%. Last year, perhaps 15%.”
(principal D4)
“Every classroom at our school has Internet access. Every classroom has a
minimum of three student computers and a teacher computer.” (principal E5)
“40%” (principal G7)
“2nd and 3rd grades have consistently used the software” (principal H8)
“65%” (principal I9)
“100%” (principal K11)
The second follow-up question asked, “How frequently has this use of technology
[computer assisted learning] been employed in classrooms?” The following represents
the range of responses gathered from the 11 principals:
“Daily” (principal A1)
“Daily to weekly” (principal B2)
“Varies depending on teacher and grade level.” (principal C3)
“This year SmartBoards are used all day.” (principal D4)
“Technology has been a key component especially with the use of Accelerated
Reader and accelerated Math.” (principal E5)
“30%” (principal G7)
“Daily” (principal H8)
“50%” (principal I9)
“Weekly” (principal K11)
The third follow up question in this series asked, “How has this use of technology
[computer assisted learning] employed in the classroom varied by grade level?” The
following represents the range of responses gathered from the 11 principals:
“Each teacher uses the technology according to student needs.” (principal A1)
“Some less, such as Read Naturally in primary grades.” (principal B2)
“3rd and 4th grades utilize technology to enhance instructional program more than
other grade levels.” (principal C3)
“All grade levels use it.” (principal D4)
“Each grade level has their own criteria as far as AR points/goals for the year.”
(principal E5)
“Varied greatly. Also varied by teacher proficiency and willingness.” (principal
G7)
“The programs adjust based on the individual student’s performance.” (principal
H8)
“Yes” (principal I9)
There were no clearly identifiable trends in the qualitative data gathered from these 3
follow-up questions on computer assisted learning. The percentage of classrooms using
this technology varied from 15% to 100%. The frequency of use varied from daily to
weekly. The best summary statement may have been that it, “varied by teacher
proficiency and willingness.”
A final follow-up question asked principals to “Please list any additional
software/program types or brands that helped to raise student achievement.” The number
of responses was extremely limited and there were no identifiable trends. The following
represents the range of responses from those who responded:
“None that I am aware of since I was not the principal during these years.”
(principal C3)
“Online resources” (principal E5)
“Read 180 as intervention for all struggling readers – during and/or after
school. Rosetta Stone for parents to learn English.” (principal G7)
In order to help triangulate and validate the data gathered from principals
regarding the efficacy of various technology software/program resources to help raise
student achievement, 8 similar quantitative items were posed to teachers at the same
school sites using a teacher survey instrument. Seeking data from at least one primary
teacher (grades k-3) and one intermediate teacher (grades 4-6) at each school, a total of
18 primary teachers and 14 intermediate teachers completed the teacher survey
instrument. Figure 8 and Figure 9 illustrate the responses of 18 primary teachers and 14
intermediate teachers for these forced-choice questions.
Figure 8: Technology Software/Programs – Primary Teacher Survey Results (N=18)
Each item stated that the software/programs “helped raise student achievement.”
SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree, DK = Don’t Know

Item                                                             SA   A   D  SD  DK
Data analysis and Data Driven Decision Making programs
  (e.g. EduSoft, OARS)                                           10   5   1   1   1
Multimedia programs (e.g. United Streaming)                       8   4   1   1   4
Electronic textbooks and supplements to core curriculum           2   8   3   1   4
ELD programs (e.g. Rosetta Stone)                                 1   4   3   2   8
Leveled math assessment and tutorial programs
  (e.g. Accelerated Math)                                         5   5   2   1   5
Leveled reading programs (e.g. Accelerated Reader, Read 180)      9   6   0   0   3
Computer assisted learning and skills reinforcement
  software programs (e.g. Math Facts, Math Blaster)               6   7   1   0   4
California standards based assessment and tutorial
  programs (e.g. Study Island)                                    5   4   2   1   6
Figure 9: Technology Software/Programs – Intermediate Teacher Survey Results (N=14)
Each item stated that the software/programs “helped raise student achievement.”
SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree, DK = Don’t Know

Item                                                             SA   A   D  SD  DK
Data analysis and Data Driven Decision Making programs
  (e.g. EduSoft, OARS)                                            8   4   0   0   2
Multimedia programs (e.g. United Streaming)                       5   6   1   0   2
Electronic textbooks and supplements to core curriculum           1   7   2   0   4
ELD programs (e.g. Rosetta Stone)                                 1   4   0   0   9
Leveled math assessment and tutorial programs
  (e.g. Accelerated Math)                                         2   4   0   0   8
Leveled reading programs (e.g. Accelerated Reader, Read 180)      5   6   0   0   3
Computer assisted learning and skills reinforcement
  software programs (e.g. Math Facts, Math Blaster)               3   6   1   0   4
California standards based assessment and tutorial
  programs (e.g. Study Island)                                    1   5   1   0   7

The results from both the primary teachers and the intermediate teachers were
similar to the results gathered from principals in terms of what technology
software/program resources they strongly agreed or agreed helped to raise student
achievement. “Data analysis and Data Driven Decision Making programs” was at the top
of all 3 surveys with agreement rates exceeding 80%. However, with agreement rates of
67%-83%, both primary and intermediate teachers felt nearly as strongly that “Leveled
reading programs” and “Multimedia programs (e.g. United Streaming) helped raise
student achievement.” While 73% of principals also agreed that “leveled reading
programs” helped to raise student achievement, it was interesting that only 36% of
principals agreed that “multimedia programs” helped to raise student achievement.
Data Analysis and Data Driven Decision Making (DDDM)
A third and final cluster of 9 quantitative questions focused on “Data Analysis
and Data Driven Decision Making (DDDM)” in trying to answer the second research
question of, “what types, amounts, configurations and uses of educational technology
resources support school level efforts to raise student achievement?” These items were
posed just to principals and were designed to expand on the data gathered in an earlier
quantitative item regarding DDDM, to which 100% of the principals signaled agreement.
Figure 10 illustrates the responses of the 11 principals to these forced-choice questions.
Figure 10: Data Analysis and Data Driven Decision Making (DDDM) – Principal Survey
Results (N=11)
SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree, NA = N.A. or Don’t Know

Item                                                             SA   A   D  SD  NA
Overall, has the technology which was employed and utilized
  within your system of DDDM been a significant factor in
  raising student achievement?                                    7   3   0   0   1
If so, has this led to increased student achievement?             0   6   0   0   5
Has the technology employed within your system of DDDM
  allowed students to monitor their own academic progress?        0   7   3   0   1
If so, has this led to increased student achievement?             7   3   0   0   1
Has the technology employed within your DDDM system led to
  increased, or more effective, collaboration between grade
  level teachers?                                                 7   3   0   0   1
If so, has this led to increased student achievement?             8   1   0   0   2
Has the technology which was employed within your DDDM
  system allowed your teaching staff to more closely monitor
  their students’ progress?                                       8   2   0   0   1
Has technology been employed in facilitating these efforts?       8   2   1   0   0
Has data analysis or DDDM been employed in an effort to
  raise student achievement?                                      9   1   1   0   0
Again, the results regarding the efficacy of Data Analysis and Data Driven
Decision Making (DDDM) to raise student achievement came back very strong with 91%
of principals strongly agreeing or agreeing to 6 of the 9 items, including “Overall, has the
technology which was employed and utilized within your system of DDDM been a
significant factor in raising student achievement?” The two items related to students
being allowed to use technology to monitor their own progress as part of a DDDM
system to increase student achievement only received 55%-64% agreement.
After asking principals “Has technology been employed in facilitating these
[DDDM] efforts”, to which 91% agreed, a series of 4 follow-up questions were asked for
clarification. The first question asked, “If technology has been employed, please briefly
describe how and the specific programs employed.” The following represents the range
of responses gathered from the 11 principals:
“All teachers have web access to their students’ assessment information. The
assessments are taken on scan forms with immediate feedback given.” (principal
A1)
“Our own in house SMART system. Amazing!” (principal B2)
“OARS, Data Directed used to support data analysis.” (principal C3)
“Data Director” (principal D4)
“Through the use of Data analysis, teachers have been able to track the student
performance.” (principal E5)
“Data teams complete data analysis form every two weeks, and share with
principal and Sp Ed teachers through email. Data from EduSoft is used for every
district benchmark assessment and all HM Theme Skills test for both testing and
analysis. “ (principal F6)
“Data director – District, school, teacher and student performance. Web – state,
district, school” (principal G7)
“EduSoft is used extensively in data and students performance analysis.”
(principal H8)
“There are several new data systems to support the use and analysis, but since I
am no longer in the schools, I do not know the names of all of the programs.”
(principal I9)
“Online Assessment and Reporting System (OARS) – Teachers trained in
analyzing data by test, cluster, and item. Action plans developed based on the
findings.” (principal J10)
Technology was used in different ways in each of the schools to support data analysis,
with specific programs such as Online Assessment and Reporting System (OARS), Data
Director, and EduSoft each mentioned twice. The second follow-up question
asked, “Has your use of data analysis or DDDM included computer-based, scored, or
tracked student assessments, and if so, how frequently were these tests administered and
data analysis or DDDM utilized?” The following represents the range of responses
gathered from the 11 principals:
“We have a computer-based scored assessment and tracking of students that is
used at a minimum 4 times a year.” (principal A1)
“Yes; every six weeks.” (principal C3)
“Yes, every 6-8 weeks.” (principal D4)
“We have had trimester benchmarks but with Data Director, teachers are able to
look at student scores at anytime.” (principal E5)
“See above regarding EduSoft – DBAs – 4x/year. Through 08/09, NWEA given
3x/year.” (principal F6)
“Three times yearly and of course, the CST” (principal G7)
“The assessments were used quarterly.”(principal H8)
“Yes” (principal I9)
“Language Arts – every 6-8 weeks; Math-once every 12 weeks.” (principal J10)
All schools indicated that their DDDM did utilize some form of computer-based,
scored, or tracked student assessments, but the frequency at which these tests were
administered and DDDM utilized varied greatly. The third follow up question in this
series asked, “If your use of data analysis or DDDM has included computer-based
assessments, what subject matter did these tests and DDDM include?” The following
represents the range of responses gathered from the 11 principals:
“These computer-based assessments include Language Arts and Math.” (principal
A1)
“Math, Language Arts” (principal B2)
“ELA and Math.” (principal C3)
“ELA, Math, Science” (principal D4)
“Language Arts and mathematics. Recently we have incorporated Science too.”
(principal E5)
“NWEA – ELA, Math. DBA – ELA, Math, Lang. Conventions.” (principal F6)
“RLA, Mathematics” (principal G7)
“Mathematics and Language Arts” (principal H8)
“Language Arts (reading), math, writing” (principal I9)
“Per teacher basis – Online Assessment was an option, but very few utilized”
(principal J10)
Nearly all schools administered computer-based assessments leading to DDDM in the
subjects of language arts and mathematics with a few adding science or writing. These
results were not surprising given that these are the subjects assessed on the California
Standards Tests and for which the schools were held accountable to exit from program
improvement. A fourth and final follow up question asked, “Has data analysis or DDDM
employed in an effort to raise student achievement varied by grade level? If so, how?”
The following represents the range of responses gathered from the 11 principals:
“We don’t use the same data analysis program with the Kindergarten students.”
(principal A1)
“Some have CST data, others interim data.” (principal B2)
“Yes. Grades 2-5 use it more frequently because they are CST testing grades.”
(principal C3)
“No.” (principal D4 and principal F6)
“Yes, district wide, there is a trend. Mainly 4th and 5th graders score lower
compare to 2nd and 3rd.” (principal E5)
“Used by all grade levels” (principal G7)
“Yes” (principal I9)
“Different types of data are collected at different grade levels; frequency of
collection sometimes varies” (principal J10)
Summary of the Findings for Research Question Number Two:
What types, amounts, configurations and uses of educational technology resources
support school level efforts to raise student achievement?
In an effort to answer this second research question as to what types, amounts,
configurations and uses of educational technology resources support school level efforts
to raise student achievement, a series of quantitative and qualitative questions were posed
to principals and teachers in survey questionnaire instruments. Questions for principals
and teachers were clustered into 2 topic areas of 1) Technology hardware, and 2)
Technology Software/Programs with a third cluster of questions on the principal survey
instrument focused on Data Analysis and Data Driven Decision Making (DDDM).
As noted in the findings from the first research question, the results from the
principal surveys appear to be very supportive of the premise that the general utilization
of educational technology resources supports school level efforts to raise student
achievement and exit program improvement. When surveyed about specific technology
resources being used to increase student achievement to answer the second research
question, the results were differentiated with the level of agreement from principals
ranging from a low of 18% to a high of 100%. The results from teachers when surveyed
about the same specific technology resources being used to increase student achievement
ranged from a low of 11% to a high of 92%.
Technology Hardware
When surveyed about specific technology hardware, 11 out of 11 principals
(100%) strongly agreed or agreed that, “LCD projectors in classrooms helped to raise
student achievement.” Likewise, 10 out of 11 principals (91%) strongly agreed or agreed
that, “document cameras for LCD projectors helped to raise student achievement.” A
significant number of principals (73% or more) strongly agreed or agreed that several
other technology hardware resources also raised student achievement including: “desktop
student computers in classrooms”, “desktop student computers in a computer lab”,
“desktop teacher computers”, “laptop teacher computers”, and “high speed internet
connectivity”. Conversely, only 2 out of 11 principals (18%) agreed that, “laptop student
computers in a portable lab”, “video equipment”, or “simpler technology devices” helped
to raise student achievement.
Quantitative data on technology hardware was also gathered from both primary
teachers and intermediate teachers to triangulate the results gathered from principals and
the results were very similar in terms of what technology hardware resources they
strongly agreed or agreed helped to raise student achievement. LCD projectors and
document cameras were at the top of all 3 surveys with agreement rates exceeding 80%.
However, over 80% of both primary and intermediate teachers felt just as strongly that
“Laptop teacher computers” and “High speed internet connectivity” helped raise student
achievement. Additionally, 86% of intermediate teachers agreed that, “Wireless internet
connectivity helped raise student achievement.”
The NCES national survey of educational technology in public schools (Gray,
Thomas, and Lewis, 2010) looked at many of the same technology hardware resources as
those examined in the principal and teacher surveys used to gather data in eleven public
elementary schools in California for this study. While the national survey did not directly
measure the efficacy of the various types of educational technology resources, the
researcher asserts that the differing rates of presence of each of the resources in
classrooms across the country may indicate varying degrees of belief in the capacity of
each of these technology resources to increase student achievement.
Therefore, the findings that a large percentage of both principals and teachers in
this study agreed that LCD projectors helped to raise student achievement would appear
to be validated by the finding from the national study which found LCD projectors were
present in 97% of elementary schools across the country. Conversely, while a large
percentage of both principals and teachers in this study also agreed that document
cameras attached to LCD projectors helped to raise student achievement, the national
study revealed that document cameras are present in only 49% of elementary schools
across the country. Responses regarding wireless networks were very similar with 61-
86% of principals and teachers agreeing that wireless internet connectivity helped to raise
student achievement and the national survey revealing that 69% of elementary schools in
the country had wireless network access for all or part of the school. In this study of
eleven successful California elementary schools, only 11-43% of principals and teachers
agreed that laptop computers in portable labs helped to increase student achievement, and
22-45% agreed that interactive whiteboards helped to increase student achievement. The
researcher speculated that these low agreement rates may simply be due to these
technologies not being present in the eleven schools for evaluation, given the large
numbers of respondents marking “Don’t Know or N/A”. This contrasts with the NCES
(Gray, Thomas, and Lewis, 2010) national survey results which showed that 58% of
elementary schools in the country had laptops on carts and 71% had interactive
whiteboards.
Technology Software/Programs
When surveyed about specific technology software/programs, 11 out of 11
principals (100%) strongly agreed or agreed that, “data analysis and Data Driven
Decision Making programs (e.g. EduSoft, OARS, etc.) helped to raise student
achievement.” Principals also ranked “leveled reading programs (e.g. Accelerated
Readers, Read 180)” the second highest with 8 out of 11 (73%) strongly agreeing or
agreeing that these programs helped to raise student achievement. Seven out of eleven
principals (64%) also agreed that “computer assisted learning”, “California standards
based assessment and tutorial programs (e.g. Study Island)”, and “skills reinforcement
software programs (e.g. Math Facts, Math Blaster)” helped to raise student achievement.
However, follow up open-ended items regarding computer assisted learning revealed that
the use of this specific technology resource is widely varied and inconsistent. At the
opposite end of the spectrum, 4 out of 11 principals (36%) or less agreed that “leveled
math assessment and tutorial programs (e.g. Accelerated Math, Learning.com)”, “ELD
programs (e.g. Rosetta Stone)”, “electronic textbooks and supplements to core
curriculum”, and “multimedia programs (e.g. United Streaming, Discovery Education)”
helped to raise student achievement.
Again, quantitative data on technology software/programs was also gathered from
both primary teachers and intermediate teachers to triangulate the results gathered from
principals and the results were very similar in terms of what technology
software/program resources they strongly agreed or agreed helped to raise student
achievement. “Data analysis and Data Driven Decision Making programs” was at the top
of all 3 surveys with agreement rates exceeding 80%. However, just as there were
differences between principals and teachers regarding technology hardware, the same
held true with technology software/programs. Between 67% and 83% of both primary
and intermediate teachers felt nearly as strongly that “Leveled reading programs” and
“Multimedia programs (e.g. United Streaming)” helped raise student achievement. While
73% of principals also agreed that “leveled reading programs” helped to raise student
achievement, it was interesting that only 36% of principals agreed that “multimedia
programs” helped to raise student achievement.
Data Analysis and Data Driven Decision Making (DDDM)
A third and final cluster of 9 quantitative questions was posed just to principals
and focused on “Data Analysis and Data Driven Decision Making (DDDM)” in trying to
answer the second research question of, “what types, amounts, configurations and uses of
educational technology resources support school level efforts to raise student
achievement?” Given that 100% of principals agreed with the efficacy of DDDM when
asked about its use in the technology software/programs cluster of questions, it was not
surprising that the results regarding the efficacy of DDDM to raise student achievement
came back very strong in this cluster of questions with 91% of principals strongly
agreeing or agreeing to 6 of the 9 items, including “Overall, has the technology which
was employed and utilized within your system of DDDM been a significant factor in
raising student achievement?” Follow up questions revealed that technology was critical
to the successful implementation of DDDM, including specific programs such as Online
Assessment and Reporting System (OARS), Data Director, and EduSoft. This strongly
reflected the findings of the 2010 NCES national study of educational technology in U.S.
public schools which showed that 88% of elementary schools “used their district network
or the internet to provide standardized assessment results and data for teachers to
individualize” and plan instruction (Gray, Thomas, and Lewis, 2010). Rather than
relying simply on hand-graded paper-and-pencil assessments for data analysis, all eleven
schools indicated that their DDDM did utilize some form of computer-based, scored, or
tracked student assessments in language arts or math, but the frequency at which these
tests were administered and DDDM utilized varied greatly.
While many different qualitative items were utilized to collect data on specific
technology resources to increase student achievement and exit program improvement, the
following statements from principals best summarize the findings for this research
question: “Technology has been a key component” and “Each teacher uses the
technology according to student needs.”
Findings for Research Question Number Three:
What additional support, funding, and budgets should be provided to support school level
efforts to raise student achievement?
Having established the efficacy of educational technology in general to increase
student achievement and exit program improvement, and the specific technologies used,
the third research question was designed to clarify and specify, “What additional support,
funding, and budgets should be provided to support school level efforts to raise student
achievement?” Quantitative and qualitative data was gathered from the eleven school
sites using the principal survey instrument and the teacher survey instrument. Questions
for each school on both the principal surveys and the teacher surveys were clustered into
the focus area of Technology Support. A second cluster of questions on the principal
survey instrument focused on Educational Technology Costs, Funding, and Budgets.
The data from 11 quantitative items on the principal survey and 4 items on the
teacher survey was entered into Excel spreadsheets to tally results and create figures to
visually represent the results. In completing these surveys, principals and teachers
responded on a four point Likert scale of “Strongly Agree,” “Agree,” “Disagree,” or
“Strongly Disagree” to form relative levels of agreement or disagreement. Principals and
teachers were given a fifth response option of “N.A. or Don’t Know” if they could not
provide data on a particular item. Unanswered items by various respondents were coded
by the researcher as “N.A. or Don’t Know.” The frequency of responses to each item
was graphically displayed in a horizontal manner to facilitate comparison of responses
within each topic area and more easily identify data trends. Results from the teacher
surveys were disaggregated between primary teachers (grades K-3) and intermediate teachers
(grades 4-6) to see if technology was being used differently at each of these grade spans.
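The tallying and disaggregation steps described above can be sketched as follows. The study itself used Excel spreadsheets, and the response records below are hypothetical, so this is purely illustrative of the coding rules (four-point scale, blanks coded as “N.A. or Don't Know,” results split by grade span):

```python
from collections import Counter

# Four-point Likert scale plus the fifth "N.A. or Don't Know" option;
# unanswered (blank) items are coded as "N.A. or Don't Know".
SCALE = ["Strongly Agree", "Agree", "Disagree", "Strongly Disagree",
         "N.A. or Don't Know"]

def tally(responses):
    """Count responses per scale point, coding anything off-scale as N.A./Don't Know."""
    coded = [r if r in SCALE[:4] else "N.A. or Don't Know" for r in responses]
    counts = Counter(coded)
    return [counts.get(point, 0) for point in SCALE]

# Hypothetical teacher records for one survey item: (grade span, response)
records = [
    ("primary", "Agree"), ("primary", "Strongly Agree"), ("primary", ""),
    ("intermediate", "Disagree"), ("intermediate", "Agree"),
]

# Disaggregate by grade span (primary = grades K-3, intermediate = grades 4-6)
primary = tally([r for span, r in records if span == "primary"])
intermediate = tally([r for span, r in records if span == "intermediate"])
```

Each resulting list gives the frequency of responses in scale order, which is the form displayed horizontally in the figures that follow.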
The qualitative data from 11 open-ended items was also collected from the eleven
principals that completed the survey. The responses and descriptive comments from each
principal were transcribed and compiled to provide a cluster of responses for each
qualitative item.
In presenting and interpreting the data for the third research question, responses
from principals and teachers, as well as from quantitative and qualitative items, have
been interspersed within each topic area. This was done to generally reflect the order in
which the items were presented to the principals and teachers for comparison of
responses between principals, primary teachers, and intermediate teachers. Responses to
quantitative and qualitative items from principals were also juxtaposed so that numbers
could be put to subjective statements and clarification could be given to forced-choice
responses.
Technology Support
The first cluster of questions for both principals and teachers focused on
the topic of Technology Support. Having gathered data regarding the efficacy of
educational technology to increase student achievement and the specific technology tools
to be used, teachers were questioned regarding the support needed to properly implement
educational technology to help raise student achievement. Four quantitative items were
posed to teachers at the same school sites using a teacher survey instrument. Seeking
data from at least one primary teacher (grades K-3) and one intermediate teacher (grades
4-6) at each school, a total of 18 primary teachers and 14 intermediate teachers completed
the teacher survey instrument. Figure 11 and Figure 12 illustrate the responses of 18
primary teachers and 14 intermediate teachers for these forced-choice questions.
Figure 11: Technology Support – Primary Teacher Survey Results (N=18)

Item                                                          SA   A   D  SD  DK
District tech support helped raise student achievement.        3   8   2   1   4
A classified computer aid helped raise student achievement.    2   3   2   1  10
A certificated tech coach helped raise student achievement.    5   2   4   1   6
Staff development and hands-on training in technology
  helped raise student achievement.                            7   7   2   1   1

(SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree,
DK = Don't Know)
Figure 12: Technology Support – Intermediate Teacher Survey Results (N=14)

Item                                                          SA   A   D  SD  DK
District tech support helped raise student achievement.        1   8   3   0   2
A classified computer aid helped raise student achievement.    1   3   3   0   7
A certificated tech coach helped raise student achievement.    1   4   2   0   7
Staff development and hands-on training in technology
  helped raise student achievement.                            6   3   2   1   2

(SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree,
DK = Don't Know)

The highest level of agreement among both groups of teachers came in response to
the statement that, “Staff development and hands-on training in technology helped raise
student achievement”, with 78% of primary teachers and 64% of intermediate teachers
strongly agreeing or agreeing. There was also relative agreement, 61% of primary
teachers and 64% of intermediate teachers, to the statement that, “District tech support
helped raise student achievement.” Conversely, less than 39% of both primary teachers
and intermediate teachers agreed that either “a certificated tech coach” or “a classified
computer aid” helped to raise student achievement. This could be a significant finding,
given the personnel costs associated with such positions. However, given the high
number of responses marked as “Don't Know” to these two items, these findings may
reflect the absence of these positions at these school sites, rather than a lack of
efficacy.
Before answering any questions regarding Technology Support, principals were
frontloaded with the following statement which was embedded in the principal’s survey
instrument at the beginning of the cluster of questions focused on this topic:
Insufficient or delinquent tech support – which frequently cannot keep pace with
the needs of many of the nation’s school technology infrastructure – tends to be a
widespread, common, and contentious issue in many school districts. Indeed, this
educational technology issue is frequently cited as a hindrance, or impediment, to
the effective implementation of educational technology within our public
educational system. With these thoughts in mind, please answer the following
tech support related questions.
The first question posed to principals on the topic of Technology Support was an
open-ended item which asked principals to, “Please briefly describe the entity you have
depended upon for tech support of your hardware, software, and networks.” The
following represents the range of responses gathered from the 11 principals:
“Our district contracts with an outside source for our data/benchmark assessment
program. The district offers us support with hardware and software and we have
a technology person on staff.” (principal A1)
“Our own TLS Department” (principal B2)
“Support once a week from district personnel and when needed by request.”
(principal C3)
“District Support” (principal D4)
“Our IT department has been a key factor in providing our technological needs.”
(principal E5)
“20% on-site tech coordinator, ITT, district tech maintenance” (principal F6)
“District personnel, site personnel” (principal G7)
“The school district itself provides training in specific applications and programs.
Our school site also has a technology coach that provides support.” (principal H8)
“District and site support” (principal I9)
“District level tech support” (principal J10)
“Middle/High School Teacher/Tech coord.” (principal K11)
Most principals cited a combination of both district level and site level support. A
series of 6 forced-choice items was then posed to the principals, seeking quantitative data
on who provided technology support to the schools and whether or not that support was
adequate to help raise student achievement. Figure 13 illustrates the responses of the 11
principals to these forced-choice items.
Figure 13: Technology Support – Principal Survey Results (N=11)

Item                                                          SA   A   D  SD  DK
Insufficient or delinquent tech support a concern and an
  impediment to raising student achievement?                   1   3   4   2   1
Tech support for hardware, software, and networks on site
  inadequate?                                                  0   3   5   2   1
Tech support for hardware, software, and networks on site
  adequate?                                                    3   4   4   0   0
Private firm provided all, or some, tech support and
  maintenance?                                                 1   3   4   3   0
District technology office provided some, or all, tech
  support and maintenance?                                     2   8   0   0   1
On-site tech coordinator/coach provided some, or all, tech
  support and maintenance?                                     2   6   0   2   1

(SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree,
DK = N.A. or Don't Know)

As was reflected in the open-ended items, principals reported receiving tech
support and maintenance from a variety of sources including “district technology office”
(91%), “on-site tech coordinator/coach” (73%), and “private firm” (36%). Seven out of
eleven principals (64%) strongly agreed or agreed that “tech support for [the] school's
hardware, software, and networks on site [had] been adequate.” However, 36% of the
principals disagreed that tech support had been adequate, with no neutral responses. This
was one of the few quantitative items in all of the surveys to which there were no neutral
responses, which would seem to indicate that principals felt very strongly about tech
support at their sites. It is likely that these same 4 out of 11 principals (36%) also
strongly agreed or agreed that, “insufficient or delinquent tech support for [their] school
[had] been a concern and an impediment to the successful implementation of instructional
strategies designed to raise student achievement.”
A follow up open-ended item asked principals to, “Please briefly list any
additional types of technological support (I.T., technology coaches, lab aides, etc.) that
may have been utilized at your site for the past three years.” The following represents the
range of responses gathered from the 11 principals:
“Our IT department is our main support with a technology informed teacher as a
backup.” (principal A1)
“3 hour on site. On call experts. Weekly shared TA. Great dept.” (principal B2)
“Tech coach funded by district until this year due to budget cuts.” (principal C3)
“None” (principal D4)
“We have a district tech person that comes once a week to assist us with trouble
shooting issues.” (principal E5)
“Lab aide, husband (rarely), daughter (rarely)” (principal G7)
“Site Tech Coach, Site Tech Representative, LAB Aide, private company sent by
district, district tech personnel” (principal I9)
“N/A – Training was provided on site and by teachers on site.” (principal J10)
As with the previous open-ended item and the forced-choice items, tech support for
each of the schools was provided mainly by district personnel with some additional
support coming from site staff and private companies.
A subset of the Technology Support cluster of items included a series of questions on
hardware replacement cycles, beginning with 3 forced-choice quantitative questions.
Figure 14 illustrates the responses of the 11 principals:
Figure 14: Hardware Replacement – Principal Survey Results (N=11)

Item                                                          SA   A   D  SD  DK
Staggered purchases of computers and peripherals to spread
  replacement costs?                                           1   8   1   0   1
Funding in place, or expected, to meet projected hardware
  replacement cycle expenditures?                              0   3   3   1   4
Replacement cycle set for computers and peripherals?           0   5   5   0   1

(SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree,
DK = N.A. or Don't Know)

Only 45% of the principals agreed that their site had set a replacement cycle for
computers and peripherals. Even worse, only 27% of the principals expected that funding
would be in place to meet the future expenditures associated with projected hardware
replacement cycles. Even so, 82% of the principals agreed that they had staggered
purchases of computers and peripherals in an effort to spread the replacement costs
associated with hardware replacement cycles. A follow up open-ended question asked
principals, “What is the approximate length of time your site has set for this hardware
replacement?” The following represents the range of responses gathered from the 11
principals:
“We have about a five year replacement plan. Having had many MACs on site
we have been replacing them with PCs over the past 4 years.” (principal A1)
“3-5 years” (principal B2)
“Three years” (principal C3)
“Ideally, this would be done in a five year cycle. However, with limited available
funds, this isn’t always possible.” (principal J10)
“3-5 yr.” (principal K11)
All responses from principals fell into a range of 3-5 years for replacement cycles. A
second open-ended follow up question asked principals to, “Please briefly describe where
your site’s computers and peripherals are in their replacement cycle (e.g. 20% of
computers one year old, half of printers over 5 years old, all LCD projectors purchased 3
years ago, etc.).” The following represents the range of responses gathered from the 11
principals:
“Approximately 25% of our computers were purchased 2 years ago. 25% will be
purchased this year. 50% of the LCDs and Document cameras were purchased 2
years ago with the other 50% purchased one year ago. All printers are being
replaced this year.” (principal A1)
“Laptops 3 years. PCs based upon need and operating system age.” (principal
B2)
“Majority of computers between three and five years old; some even older in
classrooms. Laptops for teachers were purchased two years ago. LCD projectors
and Document Cameras purchased two years ago.” (principal C3)
“All desktops are 5 years old. All laptops are 1 year old.” (principal D4)
“75% of our computers are between 4-5 years old. 20% are about 2 years old.
5% are new or a year old.” (principal E5)
“All classroom MACs are OS9 and will no longer be able to support Renaissance
Learning in Jan. 2010. We are currently replacing approximately 70 computers –
through lease. ELMOs and LCDs for all classrooms purchased in 08/09.”
(principal F6)
“Don’t know” (principal I9)
“Teacher laptops, LCD projectors, and document cameras were all provided to
staff at the time of our exit from PI. All other computers and peripherals were
quite dated. I have since left this site, and I cannot speak to the technology
replacement cycle.” (principal J10)
While the responses varied, most principals reported desktop computers to be older in
their replacement cycle (3-5 years), and laptop computers and LCD projectors to be
newer in their replacement cycles (2 years or less). These findings would appear to
reflect the perceived efficacy of these technology resources that principals and teachers
reported in earlier portions of the principal and teacher surveys. In other words,
technology resources that were perceived to have a greater positive impact on increasing
student achievement were a higher budget priority.
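The cost-spreading logic behind the staggered purchasing that most principals reported can be sketched with a simple calculation. The inventory size, unit cost, and cycle length below are hypothetical examples chosen for illustration, not figures reported by the surveyed schools:

```python
def annual_replacement_cost(num_devices, unit_cost, cycle_years):
    """Replacing 1/cycle_years of the inventory each year spreads the cost evenly."""
    replaced_per_year = num_devices / cycle_years
    return replaced_per_year * unit_cost

# Hypothetical site: 90 desktops at $800 each on a 5-year replacement cycle,
# i.e. 18 machines replaced per year
cost = annual_replacement_cost(90, 800, 5)
```

Against replacing the full inventory at once ($72,000 in this hypothetical), staggering holds the annual outlay to a level $14,400, which illustrates why 82% of principals reported staggering purchases even when no replacement cycle had been formally set.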
Educational Technology Costs, Funding, and Budgets
Before answering any questions in the next major focus area of Educational
Technology Costs, Funding, and Budgets, principals were frontloaded with the following
statement which was embedded in the beginning of the cluster of questions specifically
addressing this topic in the principal’s survey instrument:
The purpose of this study is to develop a research based estimate of the resources
needed for a school level technology program that will support instructional
programs aimed at dramatically improving student performance. The study uses a
school finance adequacy model developed by Lawrence O. Picus and Allen
Odden. This model relies on research on what works in schools to identify
resources schools need to ensure all children have a chance to perform at high
levels. The goal is to take the Picus technology “costing-out” model and compare
the estimated resources with real world site level budgets and funding. The
Picus/Odden technology “costing-out” model identifies direct technology costs as
the direct costs of purchasing, upgrading, and maintaining computer technology
hardware and software. Staffing and staff development costs are captured
elsewhere in the model. Under this narrowly defined assessment of technology
costs, the Picus/Odden model finds annual costs per student are approximately
$250 for the purchase, update, and maintenance of hardware and software. The
Picus/Odden model thus asserts that the $250 figure is sufficient to purchase,
upgrade, and maintain computers, servers, operating systems and productivity
software, network equipment, and student administrative system and financial
systems software, as well as other equipment such as copiers. This figure is said
to be sufficient to cover medium priced student administrative and financial
systems software packages.
A series of 5 open-ended questions was posed to principals to gather data on this
topic of school level Educational Technology Costs, Funding, and Budgets. The first
question asked, “On average, looking at your site budgets, what are your approximate
annual expenditures on technology including hardware, software, programs, network
connectivity, etc. (please exclude staffing, staff development, and tech support)? How
many students are enrolled at your school?” The following represents the range of
responses gathered from the principals:
“On average, I would say we spend about $45,000 on technology hardware and
software per year. The district takes care of connectivity. We have 558
students.” (principal A1)
“$20,000. 820” (principal D4)
“The average cost after taking in consideration all of the expenses is about
$5,000. Our average student enrollment is about 700 students.” (principal E5)
“$10,000 - $12,000. 445” (principal F6)
“$150,000” (principal G7)
“Varies from year to year depending on the availability of categorical monies.
Approximately 13000 this includes the on site technology support personnel”
(principal H8)
The range of annual site expenditures on educational technology was extremely wide,
stretching from a minimum of $5,000 to a maximum of $150,000, with an average of
$40,667. Student enrollment at each of these elementary schools, as reported by 4 of the
principals, ranged from a minimum of 445 to a maximum of 820, with an average of 631.
This resulted in an average annual per pupil expenditure on educational technology of
$64.
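The averages reported above can be reproduced with a short sketch (a minimal illustration, not part of the study itself; treating the "$10,000 - $12,000" response as its $11,000 midpoint is an assumption, and the variable names are the editor's own):

```python
# Reproduces the averages reported in the text from the principals'
# quoted responses. The $11,000 value is an assumed midpoint of the
# "$10,000 - $12,000" response.
expenditures = [45_000, 20_000, 5_000, 11_000, 150_000, 13_000]
enrollments = [558, 820, 700, 445]  # only 4 principals reported enrollment

avg_spend = sum(expenditures) / len(expenditures)  # about $40,667
avg_enroll = sum(enrollments) / len(enrollments)   # about 631 students
per_pupil = avg_spend / avg_enroll                 # about $64 per pupil

print(round(avg_spend), round(avg_enroll), round(per_pupil))
```

Note that the per-pupil figure is the ratio of the two averages, since only four of the six spending responses came with an enrollment figure.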
The second open-ended question in the series on this topic asked, “What technology
related staff positions exist at your site? What is their full time equivalency (FTE)?” The
following represents the range of responses gathered from the 11 principals:
“0. We happen to have a teacher who is proficient with technology that helps as
he can.” (principal A1)
“37.5% [.375 FTE]” (principal B2)
“None” (principal D4)
“None – only a district IT person who comes once a week for trouble shooting
purposes and maintenance.” (principal E5)
“20% technology coordinator. ITT – stipend” (principal F6)
“.3 fte” (principal G7)
“.5” (principal H8)
“Tech coord. 2 hr a day” (principal K11)
The range of site technology support positions varied from a minimum of none (3
sites) to a maximum of half-time, with an average of 0.21 FTE. The third and closely
related open-ended question in this series asked, “How are your technology related staff
positions funded, and what is their approximate annual expense?” The following
represents the range of responses gathered from the 11 principals:
“Categorical $20K” (principal B2)
“Tech coach was stipend position held by an on-site teacher.” (principal C3)
“None” (principal D4)
“Tech coordinator – categorical - $11,000. ITT – district stipend - $1200.”
(principal F6)
“Title I, $33,000” (principal G7)
“Categorical title I full time cost approximately 50,000” (principal H8)
Two-thirds of the school site technology-related staff positions were categorically funded, with half of those being paid with federal Title I funds.
The fourth open-ended question in this series asked, “What are your approximate
annual expenses for staff development related to technology? How is this funded?” The
following represents the range of responses gathered from the 11 principals:
“I don’t know because staff development is taken care of by the district. We
incorporate staff development through our own knowledgeable teachers during
collaboration time, 4 times a year.” (principal A1)
“Release time, varies” (principal B2)
“District funded” (principal D4)
“About $500 out of School Improvement.” (principal E5)
“$0” (principal F6)
“$15,000” (principal G7)
“Minimal – On site or district personnel are provided” (principal H8)
Most principals had very little to say about staff development related to educational
technology. Most felt that this was the responsibility of the district and if they spent any
site funds, it was very minimal. These findings stand in stark contrast with the data
gathered from teachers who ranked “staff development and hands-on training in
technology” as the most important technology support factor in helping to raise student
achievement.
The fifth and final open-ended question in this series asked, “What is your district’s
(or site’s) projected annual budget, per student, for technology support? How is it
funded?” The responses were limited and there were no identifiable trends. The
following represents the range of responses gathered from the 11 principals:
“No district tech budget.” (principal D4)
“About $50 per student.” (principal E5)
“Not stated” (principal G7)
Two quantitative items were included in this cluster of questions on the topic of
Educational Technology Costs, Funding, and Budgets. One asked principals about the
adequacy of the proposed $250 per student funding level for educational technology. The
other asked about drafting a technology plan or budget as part of the Alternative
Governance Plan to exit the school from program improvement. Figure 15 illustrates the
responses from the 11 principals.
Figure 15: Educational Technology Costs, Funding, and Budgets – Principal Survey Results (N=11). [Stacked-bar chart of principal responses (Strongly Agree, Agree, Disagree, Strongly Disagree, N.A. or Don’t Know) to two items: whether the Picus/Odden $250 per student figure for funding educational technology at the elementary school level is adequate, and whether the site drafted an annual technology plan or budget, or made technology a formal part of its Alternative Governance Plan, during Program Improvement.]
Only 5 of the 11 principals (45%) agreed that $250 per student was adequate to
fund educational technology at the elementary school level. This was interesting, given
that these schools were only spending $64 per student out of their own site funds on
educational technology. Perhaps this reflected tough budget priority decisions being
made and a desire for more general fund support from the districts.
On the second quantitative item, only 45% of principals agreed that technology
was a formal part of their Alternative Governance Plan. A follow up qualitative question
asked, “Looking back, what was the total approximate annual dollar figure attached to the
technology aspects of your Program Improvement efforts?” The following represents the
range of response gathered from the 11 principals:
“We budgeted approximately $50,000 per year for computer based programs and
hardware.” (principal A1)
“Over $100K in 5 years.” (principal B2)
“Minimal $10,000” (principal D4)
“With site license cost and staff development it was about $5,000.” (principal E5)
“$200” (principal F6)
“$75,000 per year” (principal G7)
“Don’t remember but it was a large part of the original PI budget. Maybe 1/3 of
it.” (principal I9)
Reflecting the results from the quantitative item, five of the eleven principals cited large technology budgets ranging from $10,000 to $100,000, with one of these principals stating, “It was a large part of the original PI budget. Maybe 1/3 of it.” A
second open-ended follow up question asked, “Did your school have the benefit of a local
bond issue or grant designed to assist in funding your implementation and use of
technology?” The responses were limited, which would seem to indicate that grants are not a reliable source of funding for educational technology. The following represents the
range of responses from the 11 principals:
“A grant supported the purchase of computers in the lab four to five years ago.”
(principal C3)
“No.” (principal D4, principal E5, principal I9)
“Yes” (principal G7)
Summary of the Findings for Research Question Number Three:
What additional support, funding, and budgets should be provided to support school level
efforts to raise student achievement?
The findings from the first two research questions established that educational
technology in general can be used to increase student achievement and exit program
improvement, and the specific technologies that were most effective in doing so. The
third research question was designed to clarify and specify, “What additional support,
funding, and budgets should be provided to support school level efforts to raise student
achievement?” Quantitative and qualitative data was gathered from the eleven school
sites using the principal survey instrument and the teacher survey instrument. Questions
for each school on both the principal surveys and the teacher surveys were clustered into
the focus area of Technology Support. A second cluster of questions on only the
principal survey instrument focused on Educational Technology Costs, Funding, and
Budgets.
Technology Support
When surveyed about Technology Support, the highest level of agreement among
both primary and intermediate teachers was with the statement that “Staff development and hands-on training in technology helped raise student achievement”, with 78% of
primary teachers and 64% of intermediate teachers strongly agreeing or agreeing. There
was also relative agreement, 61% of primary teachers and 64% of intermediate teachers,
to the statement that, “District tech support helped raise student achievement.”
Conversely, less than 39% of both primary teachers and intermediate teachers agreed that
either “a certificated tech coach” or “a classified computer aid” helped to raise student
achievement. This could be a significant finding, given the personnel costs associated with such site positions. However, given the high number of responses marked as “Don’t Know” on these two items, these findings may reflect the absence of these positions at these school sites, rather than a lack of efficacy. This matched the results
from the NCES national educational technology survey (Gray, Thomas, and Lewis, 2010) which showed that only 27% of elementary schools in the country reported having
full time staff responsible for technology support and/or integration.
Both qualitative and quantitative items showed that principals reported receiving
tech support and maintenance from a variety of sources including “district technology
office” (91%), “on-site tech coordinator/coach” (73%), and “private firm” (36%). These
results very closely matched the NCES national study of educational technology in U.S.
public schools (Gray, Thomas, and Lewis, 2010) in which 85% of respondents strongly
agreed or agreed that “District level technology staff provided technical support” and
70% of respondents strongly agreed or agreed that “School level technology staff
provided technical support”.
Nationally, the percentage of schools agreeing that “technical support for
educational technology is adequate” ranged from 60% in high poverty schools to 74% in
low poverty schools (Gray, Thomas, and Lewis, 2010). The results from the eleven
California principals surveyed for this study split the difference with 64% strongly
agreeing or agreeing that “tech support for [the] school’s hardware, software, and
networks on site [had] been adequate.” However, 36% of the principals disagreed that
tech support had been adequate, with no neutral responses. This was one of the few
quantitative items in all of the surveys to which there were no neutral responses, which
would seem to indicate that principals felt very strongly about tech support at their sites.
It is likely that the same 4 out of 11 principals (36%) strongly agreed or agreed that
“insufficient or delinquent tech support for [their] school [had] been a concern and an
impediment to the successful implementation of instructional strategies designed to raise
student achievement.”
Potentially compounding any weaknesses in tech support to set up and maintain
school technology resources was the finding that only 45% of the principals agreed that
their site had set a replacement cycle for computers and peripherals. This could prove to
be problematic as older machines typically require greater maintenance to keep running,
and upgrades to run newer programs. Even worse, only 27% of the principals expected funding to be in place to cover the future expenditures associated with projected hardware replacement cycles. Even so, 82% of the principals agreed that they had staggered purchases of computers and peripherals in an effort to spread out the costs of those replacement cycles. Most principals reported
desktop computers to be older in their replacement cycle (3-5 years), and laptop
computers and LCD projectors to be newer in their replacement cycles (2 years or less).
These findings would appear to reflect the perceived values placed on these different
technology resources that principals and teachers reported in the principal and teacher
surveys. In other words, technology resources that were perceived to have a greater
positive impact on increasing student achievement were a higher budget priority.
Educational Technology Costs, Funding, and Budgets
The second topic investigated in trying to answer the research question was
Educational Technology Costs, Funding, and Budgets. Principals reported an extremely
wide range of annual site expenditures on educational technology, stretching from a
minimum of $5,000 to a maximum of $150,000, with an average of $40,667. Student
enrollment at each of these elementary schools ranged from a minimum of 445 to a
maximum of 820, with an average of 631. This resulted in an average annual per pupil
expenditure on educational technology of $64. This was interesting, given that only 5 of the 11 principals (45%) agreed that the $250 per student proposed by Picus and Odden was adequate to fund educational technology at the elementary school level. This closely matches the sentiments of principals nationally (Gray, Thomas, and Lewis, 2010): when asked whether “Funding for educational technology is adequate”, 40% agreed and 60% disagreed. Likewise, only 45% of the California principals agreed that technology
was a formal part of their Alternative Governance Plan. Perhaps all of this reflected
tough budget priority decisions being made and a desire for more general fund support
from the districts.
The range of site technology support positions varied from a minimum of none (3
sites) to a maximum of half-time, with an average of 0.21 FTE, most being categorically
funded. Most principals had little to say about staff development in support of effectively
implementing educational technology. Most felt this was the responsibility of the district
and if they spent any site funds, it was very minimal. These findings are significant in
that they could represent a source of untapped potential for increasing student
achievement and potential conflict with teachers who ranked “staff development and
hands-on training in technology” as the most important technology support factor in
helping to raise student achievement.
Findings for Research Question Number Four:
Do state technology surveys and state achievement test data indicate a positive correlation
between spending money on technology and high student achievement?
In order to further triangulate the findings in the first three research questions
developed from data gathered through the principal and teacher survey instruments, state
technology surveys consisting of quantitative data for each school site were retrieved
from the California Department of Education (CDE) web site. Data from the CDE
School Technology Surveys were entered into Excel spreadsheets to tabulate results for
comparison to statewide averages. Tables 1-8 present the data gathered from these
School Technology Surveys.
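In each table, the “Difference” row is simply the sample figure minus the statewide figure, signed so that a positive value means the sample exceeds the state. A minimal sketch of that tabulation, using the Table 1 averages (the helper name is hypothetical, not from the study):

```python
def difference(sample_avg: float, statewide_avg: float) -> str:
    """Format the sample-minus-statewide gap with an explicit sign,
    matching the "Difference" rows in Tables 1-8."""
    diff = sample_avg - statewide_avg
    return f"{diff:+.2f}"

# Table 1 averages: students per computer, and students per
# internet-connected computer (sample vs. statewide).
print(difference(4.91, 4.11))   # +0.80
print(difference(10.82, 4.59))  # +6.23
```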
Table 1: School Site Student to Computer Ratios
School    Students per Computer    Students per Internet-Connected Computer
A-1 2.64 3.35
B-2 3.03 3.46
C-3 6.89 21.97
D-4 14.71 18.91
E-5 5.59 25.81
F-6 2.76 21.40
G-7 3.59 3.75
H-8 2.50 2.50
I-9 3.91 9.46
J-10 5.39 5.39
K-11 2.98 2.98
Median 3.59 5.39
Average 4.91 10.82
Statewide 4.11 4.59
Difference +0.80 +6.23
Table 2: School Site Computers Connected to Internet
School    % Computers Connected to Internet    % Computers Not Connected to Internet
A-1 78.82% 21.18%
B-2 87.76% 12.24%
C-3 31.37% 68.63%
D-4 77.78% 22.22%
E-5 21.68% 78.32%
F-6 12.90% 87.10%
G-7 95.77% 4.23%
H-8 100.0% 0.00%
I-9 41.32% 58.68%
J-10 100.0% 0.00%
K-11 100.0% 0.00%
Median 78.82% 21.18%
Average 67.95% 32.05%
Statewide 89.63% 10.37%
Difference -21.68% 21.68%
Table 3: School Site Computer Locations
School Classrooms Lab Library Other
A-1 75.26% 20.53% 4.21% 0.00%
B-2 80.23% 15.12% 1.55% 3.10%
C-3 56.77% 19.35% 1.29% 22.58%
D-4 100.0% 0.00% 0.00% 0.00%
E-5 82.98% 12.06% 2.84% 2.13%
F-6 48.81% 0.00% 9.52% 41.67%
G-7 82.11% 16.06% 1.83% 0.00%
H-8 83.88% 14.88% 1.24% 0.00%
I-9 100.0% 0.00% 0.00% 0.00%
J-10 37.35% 21.69% 3.61% 37.35%
K-11 60.56% 33.80% 5.63% 0.00%
Median 80.23% 15.12% 1.83% 0.00%
Average 73.45% 13.95% 2.88% 9.71%
Statewide 65.33% 25.55% 5.65% 3.48%
Difference +8.12% -11.60% -2.77% +6.23%
Table 4: Age of School Site Computers
School < 1 Year 1-2 Years 2-3 Years 3-4 Years > 4 Years
A-1 17.37% 0.00% 16.84% 31.58% 34.21%
B-2 5.81% 2.71% 0.00% 16.28% 75.19%
C-3 29.25% 31.97% 20.41% 8.16% 10.20%
D-4 0.00% 0.00% 0.00% 86.05% 13.95%
E-5 14.89% 18.44% 24.11% 8.51% 34.04%
F-6 33.33% 12.50% 0.00% 0.00% 54.17%
G-7 6.88% 6.42% 11.01% 5.50% 70.18%
H-8 9.92% 5.37% 9.92% 5.37% 69.42%
I-9 0.00% 0.00% 0.00% 6.61% 93.39%
J-10 4.81% 42.31% 19.23% 8.65% 25.00%
K-11 9.86% 0.00% 0.00% 28.17% 61.97%
Median 9.92% 5.37% 9.92% 8.51% 54.17%
Average 12.01% 10.88% 9.23% 18.63% 49.25%
Statewide 12.47% 12.88% 13.58% 14.17% 46.90%
Difference -0.46% -2.00% -4.35% +4.46% +2.35%
Table 5: Expected Change in School Site Computer Availability
School    % Computers Scheduled to be Retired    % Computers Expected to be Added    % Net Gain or Loss
A-1 22.11% 0.00% -22.11%
B-2 3.10% 0.00% -3.10%
C-3 0.00% 0.00% 0.00%
D-4 0.00% 0.00% 0.00%
E-5 0.00% 0.00% 0.00%
F-6 0.00% 0.00% 0.00%
G-7 0.00% 4.59% +4.59%
H-8 3.31% 0.00% -3.31%
I-9 0.00% 0.00% 0.00%
J-10 1.92% 12.05% +10.13%
K-11 16.90% 16.90% 0.00%
Median 0.00% 0.00% 0.00%
Average 4.30% 3.05% -1.25%
Statewide 6.80% 8.46% +1.66%
Difference -2.50% -5.41% -2.91%
Table 6: Average School Site Hardware Fix Time
(1 = Two hours; 2 = One day; 3 = Two to Five days; 4 = One week; 5 = One month +)
School Hardware Fix Time
A-1 3.00
B-2 3.00
C-3 1.00
D-4 2.00
E-5 3.00
F-6 3.00
G-7 3.00
H-8 3.00
I-9 5.00
J-10 4.00
K-11 2.00
Median 3.00
Average 2.91
Statewide 2.74
Difference +0.17
Table 7: School Site Level Technical Support Staffing (FTE per 1,000 students)
School    On-Site Certificated Support    On-Site Classified Support
A-1 0.00 0.00
B-2 0.00 0.56
C-3 0.10 0.00
D-4 0.20 0.00
E-5 0.00 1.00
F-6 0.00 0.20
G-7 0.00 0.24
H-8 0.00 0.00
I-9 0.00 0.00
J-10 0.50 0.00
K-11 0.00 0.00
Median 0.00 0.00
Average 0.07 0.18
Statewide 0.35 0.79
Difference -0.28 -0.61
Table 8: School Site Level Technology Curriculum Support Staffing (FTE per 1,000 students)
School    On-Site Certificated Technology Curriculum Support    On-Site Classified Technology Curriculum Support
A-1 0.00 0.00
B-2 0.00 0.10
C-3 0.20 0.00
D-4 0.00 0.00
E-5 0.00 1.00
F-6 1.00 0.00
G-7 0.00 0.00
H-8 0.30 0.00
I-9 0.00 0.00
J-10 0.50 0.00
K-11 0.00 0.00
Median 0.00 0.00
Average 0.18 0.10
Statewide 0.38 0.28
Difference -0.20 -0.18
Summary of Findings for Research Question Number Four:
Do state technology surveys and state achievement test data indicate a positive correlation
between spending money on technology and high student achievement?
Data from the California School Technology Surveys for each of the eleven
schools was gathered in an attempt to triangulate and corroborate the data collected from
the principal survey instrument and the teacher survey. It was also collected to see how
the use of 8 different educational technology resources in these schools compared to the
use of these same resources in schools across California. There were also many points of
comparison between the California School Technology Surveys and the national study of
educational technology in U.S. public schools conducted by the National Center for
Education Statistics (Gray, Thomas, and Lewis, 2010). Higher relative use of these 8
key educational technology resources, as determined by the California Department of
Education, in the eleven schools in this study would support a positive correlation
between spending money on educational technology and high student achievement.
At first, the data gathered from these 11 elementary schools that had exited program improvement did not appear to support such a positive correlation. Across the board on all 8 measures, the calculated averages pointed to a negative correlation. However, the data from many of the individual schools still pointed to a positive correlation between technology and student achievement. The researcher speculated that a few outliers were skewing the data for this relatively small sample of 11 schools compared to the number of all schools statewide. The researcher therefore also calculated the median for each measure of educational technology, which showed how sensitive the data was to the small sample size. Both figures were presented in the tables.
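The outlier sensitivity is easy to see in the Table 1 students-per-computer ratios. A minimal sketch (the list below is transcribed from Table 1, schools A-1 through K-11):

```python
from statistics import mean, median

# Students-per-computer ratios for schools A-1 through K-11 (Table 1).
ratios = [2.64, 3.03, 6.89, 14.71, 5.59, 2.76, 3.59, 2.50, 3.91, 5.39, 2.98]

# The mean is pulled above the statewide figure (4.11) by the 14.71
# outlier, while the median better reflects the typical sample school.
print(round(mean(ratios), 2))  # 4.91
print(median(ratios))          # 3.59
```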
Table 1 revealed that the average ratio of students to computers for the 11 schools
(4.91) exceeded the average for the state (4.11). However, 7 of the 11 schools had a
lower ratio than the state, resulting in a lower median ratio of 3.59. This, the single most important piece of data in all of the tables, demonstrated a positive correlation between spending money on educational technology and high student achievement. There were,
however, more students per internet connected computer at the 11 schools
(average=10.82, median=5.39) compared to the statewide average (4.59) using either
method. The average for these sample schools and the statewide average both exceeded
the national ratio of students to computers with internet access, which is 3.2 to 1 (Gray, Thomas, and Lewis, 2010). Table 2 reinforced this finding, showing that more
computers are connected to the internet at schools statewide (89.63%), than at the 11
schools considering either the average (67.95%) or the median (78.82%). Again, these
results show that schools statewide and in the study sample trailed schools nationally
which reported 98% of instructional computers having internet access (Gray, Thomas, and Lewis, 2010).
Table 3 showed that the 11 schools in the study have a higher percentage of
computers located in classrooms considering both the average (73.45%) and the median
(80.23%) than the statewide average (65.33%). Nationally, 56% of instructional
computers were located in classrooms and 23% were located in computer labs (Gray,
Thomas, and Lewis, 2010). Given, though, that both the teacher surveys and principal surveys found computers located in classrooms and computers located in labs to be
equally efficacious, it is not clear that the data in Table 3 can be used to demonstrate
either a positive or negative correlation between technology and student achievement.
Table 4 revealed that the 11 schools have a slightly higher percentage
(average=49.25%, median=54.17%) of computers that are more than 4 years old than the
average of schools statewide (46.90%). Likewise, the 11 schools have a slightly lower
percentage (average=12.01%, median=9.92%) of computers that are less than 1 year old
than schools statewide (12.47%) or schools across the nation (15%) according to NCES
(Gray, Thomas, and Lewis, 2010). Looking just at averages, Table 5 showed that more
computers are scheduled to be retired statewide (6.80%) than for the sample schools
(4.30%), and significantly more computers are scheduled to be added statewide (8.46%)
than in the sample schools (3.05%). The averages for the sample schools were skewed, though, by the fact that 6 schools reported they were retiring no computers and 8 schools reported they were adding no computers.
Shifting from technology hardware to technology support, Table 6 showed that
the average school site hardware fix time was very slightly longer in the sample schools
than for schools statewide, but the calculations for both worked out to be less than 2 days.
These results proved to be more positive than the national data, which revealed that only 24% of schools reported that it took less than 2 days to get a computer repaired, 45% reported that it took 2-5 days, and 31% reported that it took longer than 5 days (Gray, Thomas, and Lewis, 2010).
Table 7 showed that the average FTE school site level technical support staffing
in the 11 sample schools trailed the FTE staffing for schools statewide for both on-site
certificated support (statewide=0.35, sample=0.07) and on-site classified support
(statewide=0.79, sample=0.18). Adding the certificated support and classified support
together yields a total technical support staffing FTE of 0.25, which is very close to the
average technical support staffing FTE of 0.21 reported in the principal surveys, thus
corroborating and triangulating that data.
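That cross-check amounts to simple arithmetic on the Table 7 columns. A minimal sketch (FTE values transcribed from Table 7; variable names are the editor's own):

```python
# On-site technical support FTE per 1,000 students for the 11 schools,
# transcribed from Table 7 (schools A-1 through K-11).
certificated = [0.00, 0.00, 0.10, 0.20, 0.00, 0.00, 0.00, 0.00, 0.00, 0.50, 0.00]
classified   = [0.00, 0.56, 0.00, 0.00, 1.00, 0.20, 0.24, 0.00, 0.00, 0.00, 0.00]

cert_avg = sum(certificated) / len(certificated)  # about 0.07
class_avg = sum(classified) / len(classified)     # about 0.18
total_fte = cert_avg + class_avg                  # about 0.25, near the 0.21
                                                  # reported in the surveys
print(round(cert_avg, 2), round(class_avg, 2), round(total_fte, 2))
```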
Likewise, Table 8 showed that the average FTE school site level technology
curriculum support staffing in the 11 sample schools trailed the FTE staffing for schools
statewide for both on-site certificated curriculum support (statewide=0.38, sample=0.18)
and on-site classified support (statewide=0.28, sample=0.10). Once again, this
corroborates and triangulates data gathered from principals who had very little to say
about staff development in support of effectively implementing educational technology.
The principal surveys revealed that most felt technology curriculum support was the
responsibility of the district and if they spent any site funds, it was very minimal.
Conclusions
The data revealed that, of the eleven principals surveyed using both quantitative and qualitative items, only three believed that technology “in and of itself” raised student achievement. However, all but one of the surveyed principals believed that the use of educational technology did “help” to raise student achievement, leading to their schools exiting from program improvement. Therefore, the results from these principal surveys
appear to indicate that the utilization of educational technology resources is indeed
supportive of school level efforts to raise student achievement and exit Program
Improvement.
Having established that educational technology can help to raise student
achievement and exit from Program Improvement, both principals and teachers at the
eleven successful schools were surveyed to determine what specific educational
technology resources could help to raise student achievement. The principals and
teachers were surveyed regarding the efficacy of many different technology hardware
resources and they agreed that many of these resources helped to raise student
achievement, with LCD projectors and document cameras topping the list, followed
closely by high speed internet connectivity and wireless internet connectivity. When
surveyed about different technology software resources, many respondents believed in
the efficacy of leveled reading programs and computer based learning programs.
However, the highest degree of agreement among both principals and teachers was to the
belief that technology proved invaluable in analyzing student achievement data and in
supporting data driven decision making (DDDM).
When surveyed about technology support, teachers indicated that staff
development and hands-on training in technology was the most important factor in
helping to raise student achievement. Most principals though had little to say about staff
development in support of effectively implementing educational technology. Most
principals felt this was the responsibility of the district and if they spent any site funds, it
was very minimal. Principals reported receiving most tech support and maintenance
from the district technology office and on-site tech coordinators and coaches.
Most principals were able to identify how much money they spent annually on
technology out of site funds, with the average being $64 per student. However, less than
half of the surveyed principals felt the $250 per student proposed by Odden and
Picus was adequate. While $64 per student is much less than the $250 proposed by
Odden and Picus, the results appear to reinforce the belief of principals that funding and support for technology should come from district rather than site funds, and that more funds should be provided.
Results from the California School Technology surveys were mixed. While many
individual schools in the study showed stronger implementation and support for
educational technology than schools statewide, the average results for the eleven schools
in the sample for this study usually trailed the statewide averages. However, in the critical area of the number of students per computer, the eleven sample schools had a median ratio lower than the state average, which supported a positive correlation between increasing educational technology and increasing student achievement.
CHAPTER FIVE
SUMMARY, CONCLUSIONS, AND IMPLICATIONS OF THE FINDINGS
Introduction
Working under both the federal No Child Left Behind Act (NCLB) and the state
Public Schools Accountability Act (PSAA), schools in California face great pressure to
teach students to high standards based upon yearly standardized test scores. If schools
and districts do not meet goals for Adequate Yearly Progress (AYP) under the federal
law, they face increasingly severe sanctions through the Program Improvement process.
Likewise, if schools and districts do not meet goals for increasing their Academic
Performance Index (API) under the state law, they face increasingly severe sanctions
through the School Assistance and Intervention Team (SAIT) and District Assistance and
Intervention Team (DAIT) processes.
Adequately funding schools has become particularly important, because goals
have to be met not only school-wide, but also within significant subgroups such as
English Language Learners and Economically Disadvantaged students. This means that the amount of resources, and how they are allocated, needs to vary to effectively address the differing needs of individual schools, and that considerations of adequacy must identify funding that is sufficient and of an appropriate nature to ensure all students meet high academic standards. The current state budget crisis, which will continue into the
2010-2011 school year, has made this extremely difficult as districts are just trying to
stay financially solvent.
The key concept that Odden and Picus (2008) encourage states to consider in
creating school finance policy is “adequacy”, which they define as “the provision of a set
of strategies, programs, curriculum, and instruction, with appropriate adjustments for
special-needs students, districts, and schools, and their full financing, that is sufficient to
teach students to high standards.” Odden and Picus (2008) and Baker, Taylor, and
Vedlitz (2003) have summarized many approaches taken by different states in conducting
educational adequacy studies. Regardless of the approach used or the state being
examined, adequacy studies culminate in a list of recommended educational changes and
the total amount of finances required to adequately fund and implement those changes.
For this study, the focus was on the provision of technological resources including
hardware, software, networking, training, school-wide technology-based academic
programs, technology supported data analysis, and other educational technologies, with
appropriate adjustments for special-needs students, districts, and schools, and their full
financing, that is sufficient to support teaching students to high standards, as part of a
larger adequacy model.
Overview of the Problem
While the vast majority of the recommendations in these adequacy studies are
research based and have been adjusted for the special needs of individual states with the
input of professional judgment panels, there are key areas of expenditures such as central
office administration and site administration for which there is very little empirical data
for determining an adequate level of funding. Average historical expenditures are often
used to determine funding levels in these areas, regardless of their efficacy.
Technology is one of these areas on which there is a great deal of money spent,
but limited research on what an adequate amount would be to support student
achievement or what those expenditures should look like. For example, in the Arkansas adequacy study (Odden, Picus, & Fermanich, 2003), a nominal amount of $250 per student was given regardless of whether the student was in elementary, middle, or high school.
Relative to higher-profile core instructional components of the expenditure matrices, such
as teacher compensation, there was little explanation of how the technology figure was
generated, how it was to be expended, or how it was to be adjusted for students, schools,
or districts with special needs. Expenditures for non-core instructional areas such as
technology, maintenance, and central office administration in the prototypical schools are
often given much less empirical support and, at times, are determined by simply
averaging the amounts spent in each area up to the time of the adequacy study.
Purpose of the Study
While technology, in and of itself, is unlikely to raise student test scores or play
more than a supporting role to key educational components such as class size and teacher
quality, expenditures on technology do play a role in achieving an overall adequate
educational system. The purpose of this study was to use a successful schools model to
more accurately determine what types of educational technology resources can help to
raise student achievement, and what amount of technology funding is adequate to
purchase those resources for prototypical elementary schools in future adequacy studies.
In particular, this study attempted to determine an adequate level of funding for
technology for public elementary schools in the state of California.
Research questions addressed the relationship between funding for technology
and student achievement, as well as specific expenditures on technology, including the
following:
1. Overall, does the utilization of educational technology resources support
school level efforts to raise student achievement and exit from program
improvement?
2. What types, amounts, configurations and uses of educational technology
resources support school level efforts to raise student achievement?
3. What additional support, funding, and budgets should be provided to support
school level efforts to raise student achievement?
4. Do state technology surveys and state achievement test data indicate a positive
correlation between spending money on technology and high student
achievement?
Review of Methodology
Utilizing a successful schools model, school level experts in the form of
principals and teachers were surveyed to begin establishing a baseline for adequate levels
of technology and funding. Successful schools were identified as those elementary
schools in California that significantly raised student achievement for two consecutive
years, as measured by 2006 or 2008 STAR test results, in order to exit Program
Improvement status from Year 4 or Year 5 under the federal No Child Left Behind Act in
2006 or 2008. Once the successful schools were identified, site administrators were
surveyed to assess the types and amounts of technology being used and how much was
expended in this area over the past three years. Samples of teachers at these same schools
were surveyed to further ascertain the impact of technology on student achievement.
Assessment results for these schools are published annually and are a key
component of the state’s accountability system. The state also requires all public schools
to complete, submit, and publish annual surveys on the types of technology available to
students and staff for instructional purposes. Expenditure patterns on technology by
these successful schools in California were checked for correlations with increasing
student achievement to further establish an empirical basis for adequately funding
technology in California.
Findings
Summary of the Findings for Research Question Number One:
Overall, does the utilization of educational technology resources support school level
efforts to raise student achievement and exit from program improvement?
In an effort to answer this first research question as to whether or not the
utilization of educational technology supported school level efforts to raise student
achievement and exit program improvement, a series of quantitative and qualitative
questions were posed to principals in a survey questionnaire instrument. These questions
were clustered into 3 topic areas of 1) Program Improvement Background Information, 2)
Instructional Leadership in Technology, and 3) Exiting Program Improvement.
The results from these principal surveys appear to strongly support the premise
that the utilization of educational technology resources did aid school level efforts to
raise student achievement and exit program improvement. This was evident within each
cluster of questions and across all the clusters. While the NCES
national survey (Gray, Thomas, and Lewis, 2010) provided “data on the availability and
use of educational technology in public elementary schools”, it did not address the
efficacy of educational technology for increasing student achievement and therefore did
not provide points of comparison for this research question.
Ten of the nineteen quantitative items in these three cluster areas were specifically
designed to answer this research question regarding the efficacy of educational
technology to raise student achievement and exit program improvement. On average,
approximately 62% of the surveyed principals strongly agreed or agreed with these
statements. Conversely, on these same 10 items, an average of only 23% disagreed or
strongly disagreed. An average of 15% did not know how to respond to these 10 items or
felt they were not applicable to their school.
Only 27% of principals strongly agreed or agreed with the statement, “has the
expansion of your site’s use of technology been undertaken with the expectation that it
would, in and of itself, raise student achievement?” By contrast, of the 10 quantitative
items regarding the efficacy of educational technology, the highest agreement (91% of
principals strongly agreeing or agreeing) came on the statement, “has the expansion of
your site’s use of technology been undertaken with the expectation that it would help to
raise student achievement?”
These results matched the researcher’s expectations based upon his own
experience as a site administrator and a review of the literature on the topic of
educational technology. No matter how engaging and dynamic educational technology
becomes, it is unlikely to push aside more fundamental components of raising student
achievement, such as good first instruction by highly qualified teachers. Just the same,
educational technology can supplement and support that good teacher instruction by
making it more accessible and interesting to students, raising their level of engagement.
Educational technology can also help teachers evaluate their teaching by analyzing
student achievement results and guiding adjustments to their instructional practices based
upon those assessment results. This realization led to the second research question.
Summary of the Findings for Research Question Number Two:
What types, amounts, configurations and uses of educational technology resources
support school level efforts to raise student achievement?
In an effort to answer this second research question as to what types, amounts,
configurations and uses of educational technology resources support school level efforts
to raise student achievement, a series of quantitative and qualitative questions were posed
to principals and teachers in survey questionnaire instruments. Questions for principals
and teachers were clustered into two topic areas: (1) Technology Hardware and
(2) Technology Software/Programs, with a third cluster of questions on the principal
survey instrument focused on Data Analysis and Data Driven Decision Making (DDDM).
As noted in the findings from the first research question, the results from the
principal surveys appear to strongly support the premise that the general utilization of
educational technology resources is supportive of school level efforts to raise student
achievement and exit program improvement. When surveyed about specific technology
resources being used to increase student achievement to answer the second research
question, the results were differentiated with the level of agreement from principals
ranging from a low of 18% to a high of 100%. The results from teachers when surveyed
about the same specific technology resources being used to increase student achievement
ranged from a low of 11% to a high of 92%.
Technology Hardware
When surveyed about specific technology hardware, 100% of the principals
strongly agreed or agreed that, “LCD projectors in classrooms helped to raise student
achievement.” Likewise, 91% of principals strongly agreed or agreed that, “document
cameras for LCD projectors helped to raise student achievement.” A significant number
of principals (73% or more) strongly agreed or agreed that several other technology
hardware resources also raised student achievement including: “desktop student
computers in classrooms”, “desktop student computers in a computer lab”, “desktop
teacher computers”, “laptop teacher computers”, and “high speed internet connectivity”.
Conversely, only 18% of the principals agreed that, “laptop student computers in a
portable lab,” “video equipment”, or “simpler technology devices” helped to raise student
achievement.
Quantitative data on technology hardware was also gathered from both primary
teachers and intermediate teachers to triangulate the results gathered from principals.
The results were very similar in terms of which technology hardware resources they
strongly agreed or agreed helped to raise student achievement. LCD projectors and
document cameras were at the top of all 3 surveys with agreement rates exceeding 80%.
However, over 80% of both primary and intermediate teachers felt just as strongly that
“Laptop teacher computers” and “High speed internet connectivity helped raise student
achievement.” Additionally, 86% of intermediate teachers agreed that, “Wireless internet
connectivity helped raise student achievement.”
While the results gathered from principals and teachers were very similar, which
strengthened their validity, the researcher thinks principals would be wise to note the
small perceptual differences between themselves and their teachers. Teachers more
strongly agreed that educational technology resources that they use every day, such as
teacher laptop computers, high speed internet connectivity, and wireless internet
connectivity helped to raise student achievement. When principals have to make tough
budget decisions in making technology purchases, they may be tempted to cut back on
these items because they may not be used directly by students. However, these resources
may be crucial in helping teachers to prepare and deliver high quality instruction, which
in turn leads to higher student achievement.
The national survey of educational technology in public schools (Gray, Thomas,
and Lewis, 2010) looked at many of the same technology hardware resources as those
examined in the principal and teacher surveys used to gather data in eleven public
elementary schools in California for this study. While the national survey did not directly
measure the efficacy of the various types of educational technology resources, the
researcher asserts that the differing prevalence of each resource in classrooms across the
country may indicate varying degrees of belief in the capacity of each of these
technology resources to increase student achievement.
Therefore, the finding that a large percentage of both principals and teachers in
this study agreed that LCD projectors helped to raise student achievement would appear
to be validated by the national study, which found LCD projectors present in 97% of
elementary schools across the country. Conversely, while a large
percentage of both principals and teachers in this study also agreed that document
cameras attached to LCD projectors helped to raise student achievement, the national
study revealed that document cameras are present in only 49% of elementary schools
across the country. Responses regarding wireless networks were very similar with 61-
86% of principals and teachers agreeing that wireless internet connectivity helped to raise
student achievement and the national survey revealing that 69% of elementary schools in
the country had wireless network access for all or part of the school. In this study of
eleven successful California elementary schools, only 11-43% of principals and teachers
agreed that laptop computers in portable labs helped to increase student achievement, and
22-45% agreed that interactive whiteboards helped to increase student achievement. The
researcher speculated that these low agreement rates may simply be due to these
technologies not being present in the eleven schools for evaluation, given the large
numbers of respondents marking “Don’t Know or N/A”. This contrasts with the NCES
(Gray, Thomas, and Lewis, 2010) national survey results which showed that 58% of
elementary schools in the country had laptops on carts and 71% had interactive
whiteboards. Given the much higher use of interactive whiteboards such as SmartBoards
and Promethean boards across the country, and support for this technological resource in
the literature for increasing the engagement of digital natives, the researcher feels this is
an underutilized resource in the sample schools and could represent a means for helping
to further increase student achievement.
Technology Software/Programs
When surveyed about specific technology software/programs, 100% of the
principals strongly agreed or agreed that, “data analysis and Data Driven Decision
Making programs (e.g. EduSoft, OARS, etc.) helped to raise student achievement”.
Principals also ranked “leveled reading programs (e.g. Accelerated Readers, Read 180)”
the second highest with 73% strongly agreeing or agreeing that these programs helped to
raise student achievement. Of the eleven surveyed principals, 64% also agreed that
“computer assisted learning”, “California standards based assessment and tutorial
programs (e.g. Study Island)”, and “skills reinforcement software programs (e.g. Math
Facts, Math Blaster)” helped to raise student achievement. However, follow-up open-
ended items regarding computer assisted learning revealed that the use of this specific
technology resource was widely varied and inconsistent, both between schools and within
schools. Again, this may represent an underutilized resource that could have an even
greater impact on increasing student achievement if it were used more purposefully and
systematically to address specific standards with specific students. At the opposite end of
the spectrum, 36% or less agreed that “leveled math assessment and tutorial programs
(e.g. Accelerated Math, Learning.com)”, “ELD programs (e.g. Rosetta Stone)”,
“electronic textbooks and supplements to core curriculum”, and “multimedia programs
(e.g. United Streaming, Discovery Education)” helped to raise student achievement.
Again, quantitative data on technology software/programs was also gathered from
both primary teachers and intermediate teachers to triangulate the results gathered from
principals. The results were very similar in terms of which technology software/program
resources they strongly agreed or agreed helped to raise student achievement. “Data
analysis and Data Driven Decision Making programs” was at the top of all 3 surveys with
agreement rates exceeding 80%. However, just as there were differences between
principals and teachers regarding technology hardware, the same held true with
technology software/programs. Between 67% and 83% of both primary and intermediate
teachers felt nearly as strongly that “Leveled reading programs” and “Multimedia
programs (e.g. United Streaming) helped raise student achievement.” While 73% of
principals also agreed that “leveled reading programs” helped to raise student
achievement, it was interesting that only 36% of principals agreed that “multimedia
programs” helped to raise student achievement. Again, the researcher feels that
principals should pay close attention to perceptual differences such as this. Multimedia
programs such as United Streaming (now Discovery Education) are instructional tools
used by teachers to better engage the interests of digital native students, as discussed in
the literature. These programs help to build background knowledge before delving into
instruction and make new concepts more accessible to EL students by providing more
visuals.
Data Analysis and Data Driven Decision Making (DDDM)
A third and final cluster of 9 quantitative questions was posed only to principals
and focused on “Data Analysis and Data Driven Decision Making (DDDM)” in trying to
answer the second research question of, “what types, amounts, configurations and uses of
educational technology resources support school level efforts to raise student
achievement?” Given that 100% of principals agreed with the efficacy of DDDM when
asked about its use in the technology software/programs cluster of questions, it was not
surprising that the results regarding the efficacy of DDDM to raise student achievement
came back very strong in this cluster of questions with 91% of principals strongly
agreeing or agreeing to 6 of the 9 items, including “Overall, has the technology which
was employed and utilized within your system of DDDM been a significant factor in
raising student achievement?” Follow up questions revealed that technology was critical
to the successful implementation of DDDM, including specific programs such as Online
Assessment and Reporting System (OARS), Data Director, and EduSoft. This strongly
reflected the findings of the 2010 NCES national study of educational technology in U.S.
public schools which showed that 88% of elementary schools “used their district network
or the internet to provide standardized assessment results and data for teachers to
individualize” and plan instruction (Gray, Thomas, and Lewis, 2010). These results also
replicated the findings from a parallel study conducted by George Szeremeta (2009), in
which schools exiting from Year 4 and 5 of Program Improvement in 2007 identified
DDDM as the single most important technology supported factor in raising student
achievement.
Rather than relying solely on hand-graded paper and pencil assessments for data
analysis, all eleven schools indicated that their DDDM did utilize some form of
computer-based, scored, or tracked student assessments in language arts or math, but the
frequency at which these tests were administered and DDDM utilized varied greatly. The
survey results showed that these data analysis programs were very helpful to teachers and
administrators in monitoring student progress towards mastery of grade level standards.
However, access to such data may also be very helpful to other key stakeholders in the
educational process, such as students and parents. Regrettably, like the schools in
Szeremeta’s (2009) study of 2007 schools, no schools in this sample offered such access
to students and parents.
Summary of the Findings for Research Question Number Three:
What additional support, funding, and budgets should be provided to support school level
efforts to raise student achievement?
The findings from the first two research questions established that educational
technology in general can be used to increase student achievement and exit program
improvement, and identified the specific technologies that were most effective in doing so. The
third research question was designed to clarify and specify, “What additional support,
funding, and budgets should be provided to support school level efforts to raise student
achievement?” Quantitative and qualitative data was gathered from the eleven school
sites using the principal survey instrument and the teacher survey instrument. Questions
for each school on both the principal surveys and the teacher surveys were clustered into
the focus area of Technology Support. A second cluster of questions on only the
principal survey instrument focused on Educational Technology Costs, Funding, and
Budgets.
Technology Support
When surveyed about Technology Support, the highest level of agreement among
both primary and intermediate teachers came to the statement that, “Staff development
and hands-on training in technology helped raise student achievement”, with 78% of
primary teachers and 64% of intermediate teachers strongly agreeing or agreeing. There
was also relative agreement, 61% of primary teachers and 64% of intermediate teachers,
to the statement that, “District tech support helped raise student achievement.”
Conversely, less than 39% of both primary teachers and intermediate teachers agreed that
either “a certificated tech coach” or “a classified computer aide” helped to raise student
achievement. This could be a significant finding, given the personnel costs associated with
such site positions. However, given the high number of responses marked as “Don’t
Know” on these two items, these findings may reflect the absence of these positions at
these school sites, rather than a lack of efficacy. This matched the results
from the national educational technology survey (Gray, Thomas, and Lewis, 2010) which
showed that only 27% of elementary schools in the country reported having full time staff
responsible for technology support and/or integration.
Both qualitative and quantitative items showed that principals reported receiving
tech support and maintenance from a variety of sources including “district technology
office” (91%), “on-site tech coordinator/coach” (73%), and “private firm” (36%). These
results very closely matched the national study of educational technology in U.S. public
schools (Gray, Thomas, and Lewis, 2010) in which 85% of respondents strongly agreed
or agreed that “District level technology staff provided technical support” and 70% of
respondents strongly agreed or agreed that “School level technology staff provided
technical support”.
Nationally, the percentage of schools agreeing that “technical support for
educational technology is adequate” ranged from 60% in high poverty schools to 74% in
low poverty schools (Gray, Thomas, and Lewis, 2010). The results from the eleven
California principals surveyed for this study split the difference with 64% strongly
agreeing or agreeing that “tech support for [the] school’s hardware, software, and
networks on site [had] been adequate.” However, 36% of the principals disagreed that
tech support had been adequate, with no neutral responses. This was one of the few
quantitative items in all of the surveys to which there were no neutral responses, which
would seem to indicate that principals felt very strongly about tech support at their sites.
It is likely that these were the same 4 out of 11 principals (36%) who strongly agreed or
agreed that, “insufficient or delinquent tech support for [their] school [had] been a
concern and an impediment to the successful implementation of instructional strategies
designed to raise student achievement.”
Potentially compounding any weaknesses in tech support to set up and maintain
school technology resources was the finding that only 45% of the principals agreed that
their site had set a replacement cycle for computers and peripherals. This could prove to
be problematic as older machines typically require greater maintenance to keep running,
and upgrades to run newer programs. Even worse, only 27% of the principals expected
that funding would be in place to meet the future expenditures associated with projected
hardware replacement cycles. Even so, 82% of the principals agreed that they
had staggered purchases of computers and peripherals in an effort to spread the
replacement costs associated with hardware replacement cycles. Most principals reported
desktop computers to be older in their replacement cycle (3-5 years), and laptop
computers and LCD projectors to be newer in their replacement cycles (2 years or less).
These findings would appear to reflect the perceived values placed on these different
technology resources that principals and teachers reported in the principal and teacher
surveys. In other words, technology resources that were perceived to have a greater
positive impact on increasing student achievement were a higher budget priority and were
being kept more current.
Educational Technology Costs, Funding, and Budgets
The second topic investigated in trying to answer the research question was
Educational Technology Costs, Funding, and Budgets. Principals reported an extremely
wide range of annual site expenditures on educational technology, stretching from a
minimum of $5,000 to a maximum of $150,000, with an average of $40,667. Student
enrollment at each of these elementary schools ranged from a minimum of 445 to a
maximum of 820, with an average of 631. This resulted in an average annual per pupil
expenditure on educational technology of $64. This was interesting, given that only 5 of
the 11 principals (45%) agreed that the $250 per student proposed by Picus and Odden
was adequate to fund educational technology at the elementary school level. This closely
matched national sentiment: when asked whether “Funding for educational technology is
adequate,” 40% of principals nationwide agreed and 60% disagreed (Gray, Thomas, and
Lewis, 2010). Likewise, only 45% of the California principals agreed that technology
was a formal part of their Alternative Governance Plan. Perhaps all of this reflected
tough budget priority decisions being made and a desire for more general fund support
from the districts.
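The $64 per-pupil figure follows directly from the averages reported above. A brief sketch of the arithmetic, using only numbers reported in this chapter, also makes plain the gap between actual spending and the $250-per-student benchmark (the code is illustrative, not part of the study's methodology):

```python
# Average annual site-level technology spending and enrollment, as
# reported by the eleven surveyed principals in this study.
avg_site_expenditure = 40667  # dollars per school per year
avg_enrollment = 631          # students per school

per_pupil = avg_site_expenditure / avg_enrollment
print(f"Per-pupil spending: ${per_pupil:.0f}")  # about $64

# Shortfall against the $250-per-student figure from the Arkansas
# adequacy study (Odden, Picus & Fermanich, 2003).
shortfall = 250 - per_pupil
print(f"Shortfall vs. $250 benchmark: ${shortfall:.0f}")  # about $186
```

Viewed this way, the sample schools were spending roughly a quarter of the proposed adequacy figure on technology.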
The range of site technology support positions varied from a minimum of none (3
sites) to a maximum of half-time, with an average of 0.21 FTE, most being categorically
funded. Most principals had little to say about staff development in support of effectively
implementing educational technology. Most felt this was the responsibility of the district
and that any site funds spent were minimal. These findings are significant in that they
could represent both a source of untapped potential for increasing student achievement
and a point of conflict with teachers, who ranked “staff development and hands-on
training in technology” as the most important technology support factor in helping to
raise student achievement.
Few principals who have proven themselves to be effective instructional leaders
by leading their schools out of Program Improvement would ever think of putting newly
adopted textbooks into teachers’ hands without thorough training and staff development
in how to properly implement the curriculum as designed to maximize student
achievement. Yet, how many of these same principals routinely purchase new
educational technology resources, place them in classrooms, and expect teachers to begin
utilizing them and integrating them into instruction immediately, with little or no
training? Such practices do not take into account what the literature reveals: most
classroom teachers today are still digital immigrants, and many may actually be
technophobic.
LCD projectors and document cameras were both rated very highly by both principals
and teachers for increasing student achievement, yet it is not unusual to see these
resources set aside in a classroom while a teacher continues to use an overhead projector
with transparencies. Such underutilization of classroom technology is even more
prevalent with more complicated devices such as interactive whiteboards. The researcher
strongly suggests that one of the most important findings from this study that principals
should heed is that teachers are practically pleading for staff development and hands-on
training in technology to help increase student achievement. When it comes to funding
for technology, this would appear to be money very well spent.
Summary of Findings for Research Question Number Four:
Do state technology surveys and state achievement test data indicate a positive
correlation between spending money on technology and high student achievement?
Data from the California School Technology Surveys for each of the eleven
schools was gathered in an attempt to triangulate and corroborate the data collected from
the principal survey instrument and the teacher survey. It was also collected to see how
the use of 8 different educational technology resources in these schools compared to the
use of these same resources in schools across California. There were also many points of
comparison between the California School Technology Surveys and the national study of
educational technology in U.S. public schools conducted by the National Center for
Education Statistics (Gray, Thomas, and Lewis, 2010). Higher relative use of these 8 key
educational technology resources, as determined by the California Department of
Education, in the eleven schools in this study would support a positive correlation
between spending money on educational technology and high student achievement.
At first, the data gathered from these 11 elementary schools that had exited
Program Improvement did not appear to support such a positive correlation. Across the
board on all 8 measures, the calculated averages pointed to a negative correlation.
However, the data from many of the individual schools still pointed to a positive
correlation between technology and student achievement. The researcher speculated that
a few outliers were skewing the data for this relatively small sample of 11 schools
compared to the number of all schools statewide. The researcher therefore also
calculated the median for each measure of educational technology, which showed how
sensitive the data was to the small sample size. Both figures were presented in the
tables.
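The sensitivity of the mean to outliers in a sample of this size can be illustrated with a short sketch. The ratios below are hypothetical stand-ins, not the study's actual school-level figures, chosen only so that the summary statistics resemble those reported in Table 1:

```python
from statistics import mean, median

# Hypothetical student-to-computer ratios for 11 schools. The two high
# values play the role of the outlier sites the researcher suspected;
# all figures are illustrative, not actual study data.
ratios = [2.8, 3.1, 3.4, 3.5, 3.6, 3.6, 3.8, 4.0, 4.2, 10.5, 11.5]

print(f"mean:   {mean(ratios):.2f}")    # pulled upward by the two outliers
print(f"median: {median(ratios):.2f}")  # stays near the typical school
```

With two outlier sites, the mean lands near 4.9 even though the typical school sits near 3.6, which mirrors why the study's average (4.91) and median (3.59) told different stories.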
Table 1 revealed that the average ratio of students to computers for the 11 schools
(4.91) exceeded the average for the state (4.11). However, 7 of the 11 schools had a
lower ratio than the state, resulting in a lower median ratio of 3.59. This lower median
ratio was the single most important piece of data from all of the tables demonstrating a
positive correlation between
spending money on educational technology and high student achievement. There were,
however, more students per internet connected computer at the 11 schools
(average=10.82, median=5.39) compared to the statewide average (4.59) using either
method. The average for these sample schools and the statewide average both exceeded
the national ratio of students to computers with internet access which is 3.2 to 1 (Gray,
Thomas, and Lewis, 2010). Table 2 reinforced this finding, showing that more
computers are connected to the internet at schools statewide (89.63%), than at the 11
schools considering either the average (67.95%) or the median (78.82%). Again, these
results show that schools statewide and in the study sample trailed schools nationally
which reported 98% of instructional computers having internet access (Gray, Thomas,
and Lewis, 2010).
Table 3 showed that the 11 schools in the study have a higher percentage of
computers located in classrooms considering both the average (73.45%) and the median
(80.23%) than the statewide average (65.33%). Nationally, 56% of instructional
computers were located in classrooms and 23% were located in computer labs (Gray,
Thomas, and Lewis, 2010). However, given that both the teacher surveys and principal
surveys found computers located in classrooms and computers located in labs to be
equally efficacious, it is not clear that the data in Table 3 can be used to demonstrate
either a positive or negative correlation between technology and student achievement.
Table 4 revealed that the 11 schools have a slightly higher percentage
(average=49.25%, median=54.17%) of computers that are more than 4 years old than the
average of schools statewide (46.90%). Likewise, the 11 schools have a slightly lower
percentage (average=12.01%, median=9.92%) of computers that are less than 1 year old
than schools statewide (12.47%) or schools across the nation (15%) according to the
National Center for Education Statistics (Gray, Thomas, & Lewis, 2010). Looking just
at averages, Table 5 showed that more computers are scheduled to be retired statewide
(6.80%) than in the sample schools (4.30%), and significantly more computers are
scheduled to be added statewide (8.46%) than in the sample schools (3.05%). The
averages for the sample schools were skewed, though, by the fact that 6 schools reported
they were retiring no computers and 8 schools reported they were adding no computers.
Shifting from technology hardware to technology support, Table 6 showed that
the average school site hardware fix time was very slightly longer in the sample schools
than for schools statewide, but the calculations for both worked out to be less than 2 days.
These results proved to be more positive than the national data, which revealed that only
24% of schools reported that it took less than 2 days to get a computer repaired, 45%
reported that it took 2-5 days, and 31% reported that it took longer than 5 days (Gray,
Thomas, & Lewis, 2010).
Table 7 showed that the average FTE school site level technical support staffing
in the 11 sample schools trailed the FTE staffing for schools statewide for both on-site
certificated support (statewide=0.35, sample=0.07) and on-site classified support
(statewide=0.79, sample=0.18). Adding the certificated support and classified support
together yields a total technical support staffing FTE of 0.25, which is very close to the
average technical support staffing FTE of 0.21 reported in the principal surveys, thus
corroborating and triangulating that data.
Likewise, Table 8 showed that the average FTE school site level technology
curriculum support staffing in the 11 sample schools trailed the FTE staffing for schools
statewide for both on-site certificated curriculum support (statewide=0.38, sample=0.18)
and on-site classified support (statewide=0.28, sample=0.10). Once again, this
corroborates and triangulates data gathered from principals who had very little to say
about staff development in support of effectively implementing educational technology.
The principal surveys revealed that most felt technology curriculum support was the
responsibility of the district and that, if they spent any site funds at all, the amount was
minimal.
Conclusions
The data revealed that, of the eleven principals surveyed using both quantitative
and qualitative items, only three believed that technology “in and of itself” raised student
achievement. However, all but one of the surveyed principals believed that the use of
educational technology did “help” to raise student achievement, leading to their schools
exiting from program improvement. Therefore, the results from these principal surveys
appear to indicate that the utilization of educational technology resources is indeed
supportive of school-level efforts to raise student achievement and exit Program
Improvement. These findings replicated those of the parallel study conducted by
George Szeremeta (2009).
Having established that educational technology can help to raise student
achievement and exit from Program Improvement, both principals and teachers at the
eleven successful schools were surveyed to determine what specific educational
technology resources could help to raise student achievement. The principals and
teachers were surveyed regarding the efficacy of many different technology hardware
resources, and they agreed that many of these resources helped to raise student
achievement, with LCD projectors and document cameras topping the list, followed
closely by high-speed internet connectivity and wireless internet connectivity. When
surveyed about different technology software resources, many respondents believed in
the efficacy of leveled reading programs and computer based learning programs.
However, the highest degree of agreement, among both principals and teachers, was with
the belief that technology proved invaluable in analyzing student achievement data and in
supporting data-driven decision making (DDDM). This replicated the most significant
finding from George Szeremeta’s parallel study (2009).
When surveyed about technology support, teachers indicated that staff
development and hands-on training in technology was the most important factor in
helping to raise student achievement. Most principals, though, had little to say about staff
development in support of effectively implementing educational technology. Most
principals felt this was the responsibility of the district and that, if they spent any site
funds, the amount was very minimal. Principals reported receiving most tech support and
maintenance
from the district technology office and on-site tech coordinators and coaches.
Most principals were able to identify how much money they spent annually on
technology out of site funds, with the average being $64 per student. However, less than
half of the surveyed principals felt the $250 per student proposed by Odden and
Picus was adequate. These findings did not replicate those of the Szeremeta (2009)
parallel study, in which most principals could not articulate how much they spent on
technology yet felt the Odden/Picus figure was adequate. While $64 per student is much
less than the $250 proposed by Odden and Picus, the results appear to reinforce the belief
of principals that funding and support for technology should come from district, not site,
funds, and that more funds should be provided.
Results from the California School Technology surveys were mixed. While many
individual schools in the study showed stronger implementation and support for
educational technology than schools statewide, the average results for the eleven schools
in the sample for this study usually trailed the statewide averages. However, in the
critical area of the number of students per computer, the eleven sample schools had an
average ratio lower than the state average, which supports a positive correlation between
increasing educational technology and increasing student achievement.
Implications
Like Szeremeta’s 2009 parallel study, this study focused on a sample of 11 public
elementary schools and their use of educational technology to raise student achievement
and exit from Program Improvement under No Child Left Behind. However, the
researcher disagrees with Szeremeta’s (2009) assertion that, “the broader implications of
the study could be somewhat limited in their application to other schools which lay
outside the Program Improvement environment.” All Title I schools face a mandate to
have 100% of their students score proficient or above on state tests by 2014 under No
Child Left Behind (NCLB). Likewise, even non-Title I schools should feel a moral
obligation to continue increasing student achievement. Therefore, the technology
resources and practices delineated in this study, such as using technology to support
data-driven decision making (DDDM), should be replicated in all schools. Schools with
high rates of student achievement that are not under pressure from NCLB may choose to
focus on cultivating 21st century skills as advocated in the literature, such as using the
internet to conduct research and PowerPoint to create and share presentations.
Using educational technology purposefully to increase student achievement and
using it to address 21st century skills both represent appropriate uses in all schools,
though the particular emphasis may vary depending on each school’s particular situation.
Either way, this study revealed that educational technology has an important role to play
in elementary schools and should be properly funded. It should be noted, though, that
schools may already be utilizing educational technology resources to the best of their
ability given the current constraints that exist. As noted in the literature, schools may
need to significantly shift their paradigm if they are to fully realize the benefits of a
technology-embedded and technology-driven instructional program. This means that
rather than delivering curriculum and instruction in a teacher-centered traditional manner
with technology layered over the top, schools may need to shift towards a
student-centered and student-directed curriculum in which technology is the main vehicle
for learning and students conduct inquiry and investigation in real time, using devices
such as smartphones to build knowledge and understanding.
Recommendations
This study showed that technology positively impacts student achievement and
shed some light on how financial resources should be allocated to support these positive
outcomes. If state legislators are going to take such findings into account, they must make
sure enough general fund money is budgeted to purchase adequate technological
resources after major expenses such as personnel are covered. Too often, schools have
had to rely on categorical funds and grants to fund educational technology resources.
However, many of those categorical funds are now being swept by districts during the
current budget crisis, and grants often have many strings attached which may not reflect
the priorities found in this study for increasing student achievement. In responding to the
current Robles-Wong school funding lawsuit, legislators should be sure to include
funding for technology as part of the eventual lawsuit settlement.
Given the stated desire of teachers for greater staff development and hands-on
training in using educational technology, the California Commission on Teacher
Credentialing, and similar credentialing institutions in other states, should require
coursework, or even formal exams, to better ensure teachers can properly utilize
hardware and software and integrate them into a standards-based instructional program.
Schools and districts should also make ongoing staff development in educational
technology a higher priority.
In addition to providing guidance to policymakers, the results of this study should
prove invaluable to practitioners as they try to eke out ever-increasing levels of student
achievement as required under No Child Left Behind mandates. This study should help
administrators spend money on the right types of hardware and software, in the optimal
configurations, with appropriate support (such as staff development) to maximize the
positive impact on student achievement. Rather than simply asking students to open their
textbooks to read passages and work problems, teachers should use the results of this
study to help them make concepts and curriculum come alive for students, using critical
technology hardware such as LCD projectors and document cameras. The results of this
study regarding technology software such as leveled reading programs and
computer-based learning should also reinforce the notion that technology can be an
invaluable tool for differentiating instructional opportunities to provide remediation and
interventions for students struggling to meet grade-level standards, while at the same
time providing enrichment opportunities for students who need to work above grade-level
standards if they are to reach their full potential.
Suggestions for Further Research
This study demonstrated that educational technology helped to increase student
achievement by identifying successful schools as those that had exited from Year 4 or 5
of Program Improvement under No Child Left Behind, then surveying principals and
teachers at those schools for their subjective opinions. A more objective approach may
be to identify both successful and unsuccessful schools and then identify differences in
the utilization of educational technology. California already ranks all public schools into
deciles, with 1 being the lowest and 10 being the highest, based upon performance on the
California Standards Test. Co-variables such as socio-economic status and parent
education levels can be controlled for by comparing only schools in similar decile bands.
If there
were no significant differences in the utilization of educational technology between
decile 1 and decile 10 schools in such a study, or if technology was used less in the
successful decile 10 schools, then it would be difficult to attribute increases in student
achievement to educational technology. Conversely, if educational technology was used
more or differently in the decile 10 schools, then it would point to a positive correlation
between technology and increasing student achievement and would help to identify
specific technology resources that were beneficial.
This study found that the average per-pupil expenditure on educational
technology was $64 per year. This figure was derived only from site expenditures at the
11 schools in the study sample. Based upon the wide range of responses and the number
of non-responses, the researcher feels the site expenditures may have been somewhat
subjective and may have been underreported. Future studies may focus more on
identifying adequate funding for educational technology by identifying successful
schools in a similar manner, but more accurately determining current expenditures by
examining actual site budgets and district budgets to determine the total expenditures
required to implement, maintain, and support technology in such successful schools.
Current expenditures may not necessarily be adequate. Shortcomings in areas such as
hardware replacement cycles, computer maintenance, speed of repairs, and staff
development could be priced out and added to current expenditures to try to determine an
adequate funding level. This final figure could then be compared to the Odden/Picus
figure of $250 per student to more objectively determine whether this level of funding is
adequate.
BIBLIOGRAPHY
Abramson, L. (2009, June 15). A Tale of Technology in Two School Districts. National
Public Radio. Retrieved June 15, 2009, from http://www.npr.org/templates/story
Augenblick, J., Myers, J., Anderson, A. (1997). “Equity and Adequacy in School
Funding.” Future of Children 7(3), pp. 63-78, Winter 1997.
Baker, B., Taylor, L., Vedlitz, A. (2005). Measuring Educational Adequacy in Public
Schools.
BBC News. (2008, September 17). Pupils test multi-touch screens. Retrieved
September 22, 2008, from
http://news.bbc.co.uk/go/pr/fr//2/hi/uk_news/education/7621213.stm
Boster, F.J., Meyer, G.S., Roberto, A.J., and Inge, C.C. (2002). A Report on the Effect
of the United Streaming Application on Educational Performance. Retrieved
November 28, 2005 from United Learning, Discovery Education at:
http://www.unitedlearning.com/Images/streaming/evaluation.pdf
California School Boards Association (2004). Key features of Williams settlement made
public. California School Boards Association, August 18, 2004.
Chaker, A.M. (2009, July 22). An Apple for Your Teacher. The Wall Street Journal.
Retrieved July 23, 2010, from
http://online.wsj.com/article/SB100014240529702049009045743041
40278264598.html
Chea, T. (2009, August 11). California names digital textbooks that meet standards.
MercuryNews.com. Retrieved August 12, 2009, from
http://www.mercurynews.com
Christensen, C., Horn, M., and Johnson, C. (2008, August 11). Creative Disruption:
How to Change the Way Kids Learn. Forbes.com. Retrieved August 18, 2008,
from http://www.forbes.com/forbes/2008/0811/081_print.html
Clune, W. (1994). The Shift from Equity to Adequacy in School Finance. Educational
Policy, 8(4), pp. 376-94, December 1994.
Conley, T. and Picus, L.O. (2003). Oregon’s Quality Education Model: Linking
Adequacy And Outcomes. Educational Policy. 17 (5). November 2003.
586-612.
Consortium on School Networking. (2010). Data-driven decision making FAQ.
Retrieved March 24, 2010, from http://3d2know.cosn.org/FAQ/html
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with Data: How high-
performing school systems use data to improve instruction for elementary
students. Center on Educational Governance, Rossier School of Education,
University of Southern California. Retrieved March 24, 2010, from
http://www.newschools.org/files/AchievingWithData.pdf
Dede, C. (1994). The Technologies Driving the National Information Infrastructure:
Policy Implications for Distance Education. Commissioned by the Southwest
Regional Laboratory (SWRL) in connection with the U.S. Department of
Education’s Evaluation of Star Schools, October 1994.
Dede, C. (1995). Testimony to the U.S. Congress, House of Representatives, Committee
on Science and Committee on Economic and Educational Opportunities. Joint
Hearing on Educational Technology in the 21st Century. Oct. 12, 1995.
Dede, C. (1998). Learning with Technology. 1998 yearbook of the Association for
Supervision and Curriculum Development, ed. (1998).
EdSource (1996). A Primer on Proposition 98. EdSource, October 1996.
EdSource (2003a). About the No Child Left Behind (NCLB) Act of 2001. EdSource,
May 2003.
EdSource (2003b). The Basics of California’s School Finance System. EdSource,
August 2003.
EdSource (2003c). School Finance Highlights 2003-2004. EdSource, November 2003.
Fisher, B. (2007, August 6). Teaching Technology Growing in Importance. The
Enquirer. Retrieved August 8, 2007, from http://news.enquirer.com/apps/pbcs.dll
Frier, S. (2009, June 5). California schools see distant future for textbooks. The
Sacramento Bee. Retrieved June 10, 2009, from
http://www.sacbee.com/topstories/v-print/story/1920872.html
Gartner. (2003, April). Why Total Cost of Ownership (TCO) Matters.
Giokaris, G. (2001). Class notes and handouts. Education Policy & Administration
613, University of Southern California, Fall 2001.
Gordon, M. (2008, August 6). Arizona educators embrace trend of technology in their
curriculum. The Arizona Republic. Retrieved August 7, 2008, from http://www.
azcentral.com/arizonarepublic/local/articles/2008/08/06/20080806bts=classtecho
Gragg, W. (2009, July 26). More, younger children take online classes: State OKs 3
electronic course providers: Classes start in 3rd grade. Waco Tribune-Herald.
Retrieved July 27, 2009, from
http://www.wacotrib.com/news/content/news/stories/2009/07/26/07262009_
wac_onlineschools.html
Gray, L., Thomas, N., and Lewis, L. (2010). Educational Technology in U.S. Public
Schools: Fall 2008 (NCES 2010-034). U.S. Department of Education, National
Center for Education Statistics. Washington, DC: U.S. Government Printing
Office. Retrieved May 3, 2010, from
http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2010034
Hadderman, M. (1999). Equity and Adequacy in Educational Finance. Eric Digest, 129,
August 1999.
Hanushek, E. (1994). “A Jaundiced View of Adequacy in School Finance Reform.”
Educational Policy, 8(4), pp. 460-69, December 1994.
Hanushek, E. (1996). “The Productivity Collapse in Schools.” In William Fowler (Ed.),
Developments in School Finance, 1996. Washington D.C.: National Center for
Education Statistics 97(535).
Hechinger, J. (2009, June 12). Data-Driven Schools See Rising Scores. The Wall Street
Journal. Retrieved June 15, 2009, from
http://online.wsj.com/article/SB124475338699707579.html
Hitlin, P. & Rainie, L. (2005, August). Teens, technology, and school. Data memo.
Washington, D.C.: Pew Internet & American Life Project.
Kaestner, R. (2006, November). Value of investment in technology: Simple questions,
difficult to answer. School Business Affairs, 72. Retrieved March 25, 2010, from
http://www.Edtechvoi.org/resources/ASBO_Nov06_SBA_Article_Investment-in-
Technology.pdf
Khadaroo, S.T. (2009). Schools tap ‘21st century skills’ to prepare students for a
fast-changing future; teachers are reaching beyond the R’s. The Christian Science
Monitor, January 8, 2009.
Kulik, J.A. (1994). Meta-analytic studies of findings on computer-based instruction. In
E.L. Baker and H.F. O’Neil, Jr. (Eds.). Technology assessment in education and
training. Hillsdale, NJ: Lawrence Erlbaum.
Lake Washington School District. (2004, Spring). Click!: Connections for Lifetime
Learning.
Lang, L., Torgesen, J.K., Vogel, W., Chanter, C., Lefskey, E., & Petscher, Y. (2009).
Exploring the relative effectiveness of reading interventions for high school
students. Journal of Research on Educational Effectiveness, 2, 149-175.
Lizama, J.A. (2009, April 13). Schools turn to online testing for standards of learning.
Richmond Times Dispatch. Retrieved April 14, 2009, from
http://www.timesdispatch.com/rtd/news/local/education
Lockwood, A. (1996). “Productive Schools: Perspectives from Research and Practice.”
New Leaders for Tomorrow’s Schools, 3(1), Fall 1996.
McKenzie, J. (2001, March). How teachers learn technology best. From Now On: The
Educational Technology Journal. 10 (6). Retrieved August 8, 2007, from
http://www.fno.org/mar01/howlearn/html
Murphy, R., Penuel, W., Means, B., Korbak, C., & Whaley, A. (2001). E-DESK: A
Review of Recent Evidence on the Effectiveness of Discrete Educational
Software. Menlo Park, CA: SRI International.
National Commission on Excellence in Education (1983). A Nation at Risk: The
Imperative for Educational Reform.
Norris, C. & Soloway, E. (2004). Envisioning the Handheld-Centric Classroom. Journal
of Educational Computing Research, 30 (4), 281 – 294.
Oakes, J.; Blasi, G.; Rogers, J. (2003). Accountability for Adequate and
Equitable Opportunities to Learn. To appear in Kenneth Sirotnik, Ed.,
Accountability Run Amok, New York: Teachers College Press.
Odden, A. (1997). How to rethink school budgets to support school transformation.
Getting better by design series, Volume 3. Arlington, VA: New American
Schools.
Odden, A. (2000). New and Better Forms of Teacher Compensation Are Possible.
Phi Delta Kappan, 81, 361-366.
Odden, A. & Archibald, S. (2001). Reallocating Resources: How to Boost Student
Achievement Without Asking for More. Corwin Press, Thousand Oaks, CA.
Odden, A. (2003). “Equity and Adequacy in School Finance Today.” Forthcoming in
Phi Delta Kappan, Fall 2003.
Odden, A.; Fermanich, M. & Picus, L. (2003). A State of the Art Approach To
School Finance Adequacy in Kentucky. Report prepared for the Kentucky
Department of Education.
Odden, A.; Picus, L. & Fermanich, M. (2003). An Evidence-Based
Approach to School Finance Adequacy in Arkansas. Final Report: September 1,
2003.
Odden, A. & Picus, L. (2008). School Finance: A Policy Perspective, 4th
Edition. New York: McGraw-Hill.
O’Dwyer, L.M., Russell, M., Bebell, D., and Tucker-Seeley, K.R. (2005, January).
Examining the relationship between home and school computer use and students’
English/Language Arts test scores. The Journal of Technology, Learning, and
Assessment, 3(3). Retrieved March 9, 2009, from
http://www.bc.edu/research/intasc/studies/USEIT/pdf/USEIT_r10.pdf
Olson, L. (1999). Pulling in Many Directions. Education Week, 19(12), 27-30.
Palfrey, J. & Gasser, U. (2008). Born Digital: Understanding the First Generation of
Digital natives. New York: Basic Books.
Papalewis, R. (2004). Struggling middle school readers: Successful, accelerating
intervention. Reading Improvement, 41(1), 24-37.
Park, J. (2004). School Finance. Education Week on the Web, updated Sept. 22, 2004.
Partnership for 21st Century Skills. (2005). Road to 21st century learning: A
policymaker’s guide to 21st century skills. Washington, D.C. Retrieved March
19, 2010, from
http://www.21stcenturyskills.org/images/stories/otherdocs/P21_Policy_Paper.pdf
Picus, L.O. (2000). Setting Budget Priorities. American School Board Journal, 187(5),
May 2000, pp. 31-33.
Picus, L.O. (2001). “Educational Governance in California: Defining State and Local
Roles.” In Jon Sonstelie and Peter Richardson (Eds.), School Finance and
California’s Master Plan for Education. San Francisco, CA: Public Policy
Institute of California, pp. 9-27.
Picus, L.O.; Odden, A. & Fermanich, M. (2003). A Professional Judgment Approach To
School Finance Adequacy in Kentucky. Report prepared for the Kentucky
Department of Education.
Picus, L.O. (2004). Class Notes and Handouts. Educational Policy and Analysis 516,
University of Southern California, Fall 2003.
Public Broadcast System (2001). “The Story of American Education”.
Public Policy Institute of California (2003). Great Expectations: Reconciling California’s
Academic Standards and School Resources. Research Brief, Public Policy
Institute of California, October 2003, Issue #78.
Reschovsky, A. and Imazeki, J. (2001). “Achieving Educational Adequacy Through
School Finance Reform.” Journal of Education Finance 26(4), 373-396.
Reutter, H. (2010, March 8). Nebraska schools preparing for standardized reading test.
The Grand Island Independent. Retrieved March 23, 2010, from
http://www.theindependent.com
Ricadela, A. (2008, December 16). Rethinking Computers in the Classroom. Business
Week. Retrieved December 19, 2008, from
http://www.businessweek.com/print/technology/content/dec2008/tc20081215_37
1267.htm
Robelen, E. (1999). The Evolving Federal Role. Education Week, 19(12), 34-35.
Roschelle, J., Pea, R., Hoadley, C., Gordin, D. & Means, B. (2000). Changing How and
What Children Learn in School with Computer-Based Technologies. The Future
of Children and Computer Technology, 10(2).
Rose, H. (2001). “The Concept of Adequacy and School Finance.” In Jon Sonstelie and
Peter Richardson (Eds.), School Finance and California’s Master Plan for
Education. San Francisco, CA: Public Policy Institute of California, pp. 29-46.
Sable, J. and Plotts, C. (2009). Data File: Common Core of Data Public
Elementary/Secondary School Universe Survey: School Year 2007-08 (NCES
2009-306). U.S. Department of Education, National Center for Education
Statistics. Washington, DC: U.S. Government Printing Office. Retrieved June 9,
2010, from
http://nces.ed.gov/pubs2010/2010306/table/table_02.asp?referrer=report
Sachs, Emily (2001). Voters Pass Colton School Bond. The Sun, September 26, 2001.
Samuels, D. (2010, February 1). Palo Alto School District to spend million in bond
funds on technology. The Mercury News. Retrieved February 4, 2010, from
http://www.mercurynews.com/breaking-news/ci_14313254?nclick_check=1
Sanders, M. (2008, August 12). Should laptops be issued to middle schoolers? Omaha
World-Herald. Retrieved August 13, 2008, from
http://www.omaha.com/print_friendly.php?u_mod=story
Schmoker, M. (1999). Results: The Key to Continuous Improvement (2nd ed.).
Alexandria, VA: Association for Supervision and Curriculum Development.
Schrag, P. (2004). Peter Schrag: Williams deal – Better California schools by inches.
The Sacramento Bee, August 18, 2004.
Schugurensky, D. (2002). “1965: Elementary and Secondary School Act, the ‘War on
Poverty’ and Title I.” Selected Moments of the 20th Century. The Ontario Institute
for Studies in Education of the University of Toronto.
Sivin-Kachala, J. (1998). Report on the effectiveness of technology in schools, 1990-
1997. Software Publishers Association.
Slavin, R., Cheung, A., Groff, C., & Lake, C. (2008). Effective reading programs for
middle and high school students: A best-evidence synthesis. Reading Research
Quarterly, 43(3), 290-322.
Sonstelie, J. (2001). “Is There a Better Response to Serrano?” In Jon Sonstelie and
Peter Richardson (Eds.), School Finance and California’s Master Plan for
Education. San Francisco, CA: Public Policy Institute of California, pp. 155-186.
Spring, J. (2001). Conflict of Interests: The Politics of American Education, 4th Edition.
McGraw-Hill Higher Education, July 2001.
Sullinger, J. (2008, December 8). Kansas tries a new tactic for helping children to read.
The Kansas City Star. Retrieved December 12, 2008, from
http://www.kansascity.com/115/v-print/story/928504.html
Szeremeta, G. (2009). The Role and Value of Educational Technology in California
Fourth and Fifth Year (2006-2007) Program Improvement Elementary Schools
That Achieved AYP Growth Targets. University of Southern California.
Tartakoff, J. (2008). Educators urged to adapt to student’s tech lifestyles. Seattle Post-
Intelligencer, December 3, 2008.
Tsantis, L. and Keefe, D. (1996, Fall). Reinventing Education. ASHA, the Magazine of
the American Speech-Language-Hearing Association, 38(4), 38-41.
Tyack, D. (1999). Democracy in Education-Who Needs It? Education Week, 19(12),
42-44.
U.S. Department of Education. (2001). Public law print of PL 107-110, the No Child
Left Behind Act of 2001. Retrieved March 22, 2010, from
http://www.ed.gov/policy/elsec/leg/esea02/107-110.pdf
U.S. Department of Education (2009, June 26). U.S. Department of Education Study
Finds that Good Teaching can be Enhanced with New Technology. Retrieved
June 29, 2009, from
http://www.ed.gov/print/news/pressrelease/2009/06/06262009.html
U.S. Department of Education (2010a). Department of Education Information Related to
the Economic Recovery Act of 2009. Retrieved June 9, 2010, from
http://www2.ed.gov/policy/gen/leg/recovery/index.html
U.S. Department of Education (2010b). Race to the Top Fund. Retrieved June 9, 2010,
from http://www2.ed.gov/programs/racetothetop/index.html
U.S. Department of Education, National Center for Education Statistics. (2010). Digest
of Education Statistics, 2009 (NCES 2010-013).
VanSlyke, D.; Tan, A.; Orland, M. (1994). School Finance Litigation: A Review
of Key Cases. Prepared for The Finance Project, December 1994.
Wenglinsky, H. (1998). Does it compute? The relationship between educational
technology and student achievement in mathematics. Princeton, NJ: Educational
Testing Service, Policy Information Center.
WestEd. (2000, July). School funding: From equity to adequacy. Policy Brief. Retrieved
September 1, 2004, from http://www.wested.org/cs/we/view/rs/180
APPENDIX A
TEACHER SURVEY
ADEQUATE FUNDING FOR TECHNOLOGY STUDY – TEACHER SURVEY
Your school was selected for a professional judgment study due to exiting year 4 or 5 of Program Improvement (PI) in
2006 or 2008. While many factors such as excellent instruction, standards-based curriculum, and targeted intervention
programs must have all contributed to your school’s success, this study is focused on your perceptions of what role, if
any, technology played in increasing student achievement and raising test scores at your school. Please consider the
use of technology both while working to exit PI and since you have exited PI. Findings will help educators at the
school, district, and state levels make informed decisions on how much and what types of technology should be funded.
Please indicate, by an X, your agreement or disagreement with each of the following statements using the following
scale as it relates to you, where (4) is “Strongly Agree”, (3) is “Agree”, (2) is “Disagree”, (1) is “Strongly Disagree”,
and (0) is “Don’t Know”. Please select only one response per statement.
Technology Hardware
(4) Strongly Agree | (3) Agree | (2) Disagree | (1) Strongly Disagree | (0) Don’t Know
1. Desktop student computers in a computer lab helped
raise student achievement.
2. Desktop student computers in classrooms helped raise
student achievement.
3. Laptop student computers in a portable lab helped
raise student achievement.
4. Desktop teacher computers helped raise student
achievement.
5. Laptop teacher computers helped raise student
achievement.
6. LCD projectors in classrooms helped raise student
achievement.
7. Document cameras for LCD projectors helped raise
student achievement.
8. Interactive computer display equipment/programs (e.g.
Smartboards, Learning Pads) helped raise student
achievement.
9. Video equipment (e.g. camcorders, digital cameras)
helped raise student achievement.
10. High speed internet connectivity helped raise student
achievement.
11. Wireless internet connectivity helped raise student
achievement.
12. Simpler technology devices (e.g. LeapPads, Franklin
Spellers) helped raise student achievement.
Technology Software/Programs
(4) Strongly Agree | (3) Agree | (2) Disagree | (1) Strongly Disagree | (0) Don’t Know
13. California standards based assessment and tutorial
programs (e.g. Study Island) helped raise student
achievement.
14. Computer assisted learning and skills reinforcement
software programs (e.g. Math Facts, Math Blaster) helped
raise student achievement.
15. Leveled reading programs (e.g. Accelerated Reader,
Read 180) helped raise student achievement.
16. Leveled math assessment and tutorial programs (e.g.
Accelerated Math) helped raise student achievement.
17. ELD programs (e.g. Rosetta Stone) helped raise
student achievement.
18. Electronic textbooks and supplements to core
curriculum helped raise student achievement.
19. Multimedia programs (e.g. United Streaming) helped
raise student achievement.
20. Data analysis and Data Driven Decision Making
programs (e.g. EduSoft, OARS) helped raise student
achievement.
Technology Support
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) Don’t Know
21. Staff development and hands-on training in
technology helped raise student achievement.
22. A certificated tech coach helped raise student
achievement.
23. A classified computer aide helped raise student achievement.
24. District tech support helped raise student
achievement.
Open-Ended Items
Please list any additional types, configurations, or brands of hardware that helped to raise student achievement.
Please list any additional software/program types or brands that helped to raise student achievement.
Please list any additional technology support or factors that helped to raise student achievement.
Teacher Grade Level (Please check one)
Primary (K-3) _______
Intermediate (4-6) _______
APPENDIX B
PRINCIPAL SURVEY
ADEQUATE FUNDING FOR TECHNOLOGY STUDY – PRINCIPAL SURVEY
Your school has been selected for a professional judgment study as a result of your site’s success in raising
student achievement and exiting year 4 or 5 of Program Improvement (PI) in 2006 or 2008. While many
factors such as excellent instruction, standards-based curriculum, and targeted intervention programs must
have all contributed to your school’s success, this study is focused on your perceptions of what role, if any,
technology has played in increasing student achievement and raising test scores at your school. Please
consider the use of technology helping to exit PI and since you have exited PI. Findings will help
educators at the school, district, and state levels make informed decisions on how much and what types of
technology should be funded, after fully funding teaching positions.
Please indicate, by an X, your agreement or disagreement with each of the following statements using the
following scale as it relates to you, where (4) is “Strongly Agree”, (3) is “Agree”, (2) is “Disagree”, (1) is
“Strongly Disagree”, and (0) is “Not Applicable (N/A) or Don’t Know ”. Please select only one response
per statement. For open-ended items please briefly respond (with just a few words, phrases, or sentences)
in the space provided if completing the survey electronically or on the back of the page if completing a
hardcopy.
Program Improvement Background Information
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) N/A or Don’t Know
1. In which subgroups, and subject areas, was your
school not making Adequate Yearly Progress
(AYP) causing your school to progress to Year 4
or Year 5 of Program Improvement (PI)?
2. What were some of the key factors/programs
(not necessarily technology related) that helped
your school to raise student achievement and exit
P.I.?
3. To what extent was technology integrated into
or supportive of these factors/programs?
4. Do you credit the integration of technology into
the core curriculum as an important element of
raising student achievement?
4a. If so, did anyone do “detailed curriculum
design work” to support successful integration?
4b. If so, who did the work?
4c. If so, was release time required?
4d. If so, how was it funded?
5. Did the use, and integration, of technology help
you to make AYP with your particular struggling
subgroup and subject?
5a. If you agree or strongly agree, please briefly
explain how.
Technology Hardware
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) N/A or Don’t Know
6. What types and brands of technological hardware have been utilized at your school during the past three years, particularly as they relate to raising student achievement?
6a. Have desktop student computers in a computer
lab helped to raise student achievement?
6b. Have desktop student computers in classrooms
helped to raise student achievement?
6c. Have laptop student computers in a portable lab
helped to raise student achievement?
6d. Have desktop teacher computers helped to raise
student achievement?
6e. Have laptop teacher computers helped to raise
student achievement?
6f. Have LCD projectors in classrooms helped to
raise student achievement?
6g. Have document cameras for LCD projectors
helped to raise student achievement?
6h. Have interactive computer display
equipment/programs (e.g. Smartboards, Learning
Pads) helped to raise student achievement?
6i. Has video equipment (e.g. camcorders, digital
cameras) helped to raise student achievement?
6j. Has high speed internet connectivity helped to
raise student achievement?
6k. Has wireless internet connectivity helped to
raise student achievement?
6l. Have simpler technology devices (e.g. LeapPads, Franklin Spellers) helped to raise student achievement?
Technology Software/Programs
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) N/A or Don’t Know
7. What types and brands of technological software have been utilized for the past three years, particularly as they relate to raising student achievement?
8. Has computer assisted learning (aka computer
assisted instruction or computer augmented
instruction) been used at your site to raise student
achievement?
8a. How widespread (e.g. what percentage of classrooms) has this use of technology been?
8b. How frequently has this use of technology been
employed in classrooms?
8c. How has this use of technology employed in
the classroom varied by grade level?
8d. Have California standards based assessment
and tutorial programs (e.g. Study Island) helped to
raise student achievement?
8f. Have skills reinforcement software programs
(e.g. Math Facts, Math Blaster) helped to raise
student achievement?
8g. Have leveled reading programs (e.g.
Accelerated Reader, Read 180) helped to raise
student achievement?
8h. Have leveled math assessment and tutorial
programs (e.g. Accelerated Math, Learning.com)
helped to raise student achievement?
8i. Have ELD programs (e.g. Rosetta Stone)
helped to raise student achievement?
8j. Have electronic textbooks and supplements to
core curriculum helped to raise student
achievement?
8l. Have multimedia programs (e.g. United
Streaming, Discovery Education) helped to raise
student achievement?
8m. Have data analysis and Data Driven Decision
Making programs (e.g. EduSoft, OARS, etc.)
helped to raise student achievement?
Instructional Leadership in Technology
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) N/A or Don’t Know
9. As the de facto instructional leader of your site,
have you deliberately expanded the use of
technology at your site?
9a. If so, has the expansion of your site’s use of
technology been undertaken with the expectation
that it would, in and of itself, raise student
achievement?
9b. If so, has the expansion of your site’s use of
technology been undertaken with the expectation
that it would help to raise student achievement?
9c. If so, has facilitating your students’ mastery of “21st century skills” been a factor in your expansion of educational technology at your site?
9d. If so, has the expansion of your site’s use of
technology been undertaken with an expectation
that it would require dedicated professional
development time and dollars?
Technology Support
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) N/A or Don’t Know
Insufficient or delinquent tech support – which frequently cannot keep pace with the needs of the nation’s school technology infrastructure – is a widespread and contentious issue in many school districts. Indeed, it is frequently cited as an impediment to the effective implementation of educational technology within our public educational system. With these thoughts in mind, please answer the following tech support related questions.
10a. Please briefly describe the entity you have
depended upon for tech support of your hardware,
software, and networks.
10b. Has an on-site tech coordinator/coach
provided some, or all of your tech support and
maintenance?
10c. Has your district technology office provided
some, or all, of your tech support and
maintenance?
10d. Has a private firm provided all, or some, of
your tech support and maintenance?
10e. Has tech support for your school’s hardware,
software, and networks on site been adequate?
10f. Has tech support for your school’s hardware,
software, and networks on site been inadequate?
10g. Has insufficient or delinquent tech support for
your school been a concern and an impediment to
the successful implementation of instructional
strategies designed to raise student achievement?
11. Please briefly list any additional types of
technological support (I.T., technology coaches,
lab aides, etc.) that may have been utilized at your
site for the past three years.
12. Has your site set a replacement cycle for your
computers and peripherals?
12a. If so, what is the approximate length of time
your site has set for this hardware replacement?
12b. Is funding currently in place, or do you expect
that funding will be available, in order to meet the
future expenditures associated with meeting your
projected hardware replacement cycle?
12c. Has your site endeavored to stagger your
purchases of computers and peripherals in an effort
to spread the replacement costs associated with
hardware replacement cycles?
12d. Please briefly describe where your site’s
computers and peripherals are in their replacement
cycle (e.g. 20% of computers one year old, half of
printers over 5 years old, all LCD projectors
purchased 3 years ago, etc.)
Data Analysis and Data Driven Decision Making (DDDM)
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) N/A or Don’t Know
13. Has data analysis or DDDM been employed in
an effort to raise student achievement?
If you agree or strongly agree that data analysis or DDDM has been employed in an effort to raise
student achievement, please describe the use and importance of the following.
13a. Has technology been employed in facilitating
these efforts?
13b. If technology has been employed, please
briefly describe how and the specific programs
employed.
13c. Has your use of data analysis or DDDM
included computer-based, scored, or tracked
student assessments, and if so, how frequently
were these tests administered and data analysis or
DDDM utilized?
13d. If your use of data analysis or DDDM has
included computer-based assessments, what
subject matter did these tests and DDDM include?
13e. Has data analysis or DDDM employed in an
effort to raise student achievement varied by grade
level? If so, how?
13f. Has the technology which was employed
within your DDDM system allowed your teaching
staff to more closely monitor their students’
progress?
13g. If so, has this led to increased student
achievement?
13h. Has the technology employed within your
DDDM system led to increased, or more effective,
collaboration between grade level teachers?
13i. If so, has this led to increased student
achievement?
13j. Has the technology employed within your
system of DDDM allowed students to monitor
their own academic progress?
13k. If so, has this led to increased student
achievement?
13l. Overall, has the technology which was
employed and utilized within your system of
DDDM been a significant factor in raising student
achievement?
Educational Technology Costs, Funding, and Budgets
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) N/A or Don’t Know
The purpose of this study is to develop a research based estimate of the resources needed for a school
level technology program that will support instructional programs aimed at dramatically improving
student performance. The study uses a school finance adequacy model developed by Lawrence O. Picus
and Allen Odden. This model relies on research on what works in schools to identify resources schools
need to ensure all children have a chance to perform at high levels. The goal is to take the Picus
technology “costing-out” model and compare the estimated resources with real world site level budgets
and funding. The Picus/Odden technology “costing-out” model identifies direct technology costs as the
direct costs of purchasing, upgrading, and maintaining computer technology hardware and software.
Staffing and staff development costs are captured elsewhere in the model. Under this narrowly defined
assessment of technology costs, the Picus/Odden model finds annual costs per student are approximately
$250 for the purchase, update, and maintenance of hardware and software. The Picus/Odden model thus
asserts that the $250 figure is sufficient to purchase, upgrade, and maintain computers, servers, operating
systems and productivity software, network equipment, and student administrative system and financial
systems software, as well as other equipment such as copiers. This figure is said to be sufficient to cover
medium priced student administrative and financial systems software packages.
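As an illustration of the costing-out arithmetic described above, the comparison between the model’s estimate and a site’s actual spend can be sketched as follows (the school enrollment and spending figures are hypothetical; the $250 per-student figure is the Picus/Odden estimate):

```python
# Sketch: compare a site's actual annual technology spend against the
# Picus/Odden costing-out estimate of ~$250 per student per year for
# purchasing, upgrading, and maintaining hardware and software.
# (Staffing and staff development are costed elsewhere in the model.)

PICUS_ODDEN_PER_STUDENT = 250  # dollars per student per year

def adequacy_gap(enrollment: int, actual_annual_spend: float) -> float:
    """Return actual spend minus the model's adequate-funding estimate.

    Negative: the site spends less than the adequacy target.
    Positive: the site spends more.
    """
    estimate = enrollment * PICUS_ODDEN_PER_STUDENT
    return actual_annual_spend - estimate

# Hypothetical elementary school: 600 students, $120,000 actual spend.
print(adequacy_gap(600, 120_000))  # -30000: $30,000 under the estimate
```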
14. Please briefly answer, as best you can, the following questions that relate to the above aspects of
educational technology expenses, funding, and budgets. Please base your answers on averages over the
two (2) previous years before the current budget crisis.
14a. In your professional opinion, is the
Picus/Odden $250 per student proposed dollar
figure for funding educational technology at the
elementary school level adequate?
14b. On average, looking at your site budgets,
what are your approximate annual expenditures on
technology including hardware, software,
programs, network connectivity, etc. (please
exclude staffing, staff development, and tech
support)? How many students are enrolled at your
school?
14c. What technology related staff positions exist
at your site? What is their full time equivalency
(FTE)?
14d. How are your technology related staff
positions funded and what is their approximate
annual expense?
14e. What are your approximate annual expenses
for staff development related to technology? How
is this funded?
14f. What is your district’s (or site’s) projected
annual budget, per student, for technology support?
How is it funded?
15. During your site’s Program Improvement
status did your site draft an annual technology plan
or budget, or was technology a formal part of your
Alternative Governance Plan?
15a. Looking back, what was the total approximate
annual dollar figure attached to the technology
aspects of your Program Improvement efforts?
15b. Did your school have the benefit of a local
bond issue or grant designed to assist in funding
your implementation and use of technology?
Exiting Program Improvement
(4) Strongly Agree    (3) Agree    (2) Disagree    (1) Strongly Disagree    (0) N/A or Don’t Know
16. In conclusion - based upon your success in guiding your school out of program improvement, and your site’s experiences concerning the effectiveness of educational technology in that environment - how would you advise other educational professionals, who may be administrators and instructional leaders, concerning the value and effectiveness of technology in raising student achievement?
16a. Educational technology was an essential
element of my overall school site program
improvement plan or Alternative Governance Plan.
16b. Educational technology was an important
aspect of my overall school site program
improvement plan or Alternative Governance Plan.
16c. Educational technology was not an important
aspect of my overall school site program
improvement plan or Alternative Governance Plan.
16d. Educational technology had a significant
impact on raising student achievement at my
school site.
16e. Educational technology played only a
minor/supporting role in raising student
achievement at my school site.
16f. Educational technology had little or no effect
on raising student achievement at my school site.
16g. Educational technology played a significant
role in my site’s success in raising student
achievement and made a significant contribution in
helping our school exit from program
improvement.
16h. Educational technology played a
minor/supporting role in my site’s success in
raising student achievement and made a small
contribution in helping our school exit from
program improvement.
16i. Educational technology played little or no role in my site’s success in raising student achievement and made little or no contribution in helping our school exit from program improvement.
16j. The best efforts of my staff, and the
implementation of my program improvement plan,
might not have been effective in raising student
achievement, without the incorporation of an
effective educational technology component.
Open-Ended Items
Please list any additional types, configurations, or brands of hardware that helped to raise student
achievement.
Please list any additional software/program types or brands that helped to raise student achievement.
Please list any additional technology support or factors that helped to raise student achievement.
Do you have any other comments concerning educational technology, or is there anything else you would
like to add?
Thank you for expending some of your valuable time to complete this survey. As a former site principal I
know how precious little time you have for activities such as this. I also know that as a site principal, I
spent a lot of site categorical money on technology expenditures. My hope is that your thoughtful response
will help fellow administrators to figure out if these expenditures actually help to increase student
achievement in these times of high accountability. I also hope your responses will inform administrators on
which specific technology expenditures will help them get the most bang for the buck in these tumultuous
budget times.
APPENDIX C
DISTRICT NOTIFICATION LETTER
July 21, 2009
XXXX XXXXX, Superintendent
XXXXX Unified School District
XXXXX Avenue
XXXXX, CA 9XXXX
Dear Superintendent XXXXX,
I am writing to seek your permission to include one of the schools within your district – XXXXX
Elementary – in my doctoral study. I would like to include XXXXX in my study because they significantly
increased student achievement and exited from Year 4 or 5 of Program Improvement (P.I.) in 2006 or 2008.
As the former principal of one of the ten California elementary schools (XXXXX in XXXXX U.S.D.) that
also exited from Year 4 of P.I. in 2006, I appreciate what an accomplishment this was.
As a doctoral candidate at the University of Southern California, under the guidance of Dr. Lawrence Picus
at the Rossier School of Education, I am currently conducting a successful school study, focusing on
elementary schools in California that exited from Year 4 or 5 of P.I. in 2006 or 2008. While I know many
factors affect a school’s success in exiting from P.I., I am narrowing the focus of my study to look at what
effect educational technology played, if any, in helping to raise student achievement.
As XXXXX Elementary is one of only a few schools which are contained within the statewide population
of my study, I urgently need a few minutes of time to survey the principal. I would also like to incorporate
a brief paper survey with at least two classroom teachers at the school (one or more representing the lower grades, and one or more representing the upper grades), and (should one exist at the school level) a
dedicated site level technology coordinator. As a fellow busy site administrator, I have sympathetically
designed the principal survey to take less than 30 minutes. The teacher paper survey should take less than
10 minutes. I am hoping that the relative calm between school years will be the perfect time to complete
this task.
In order to conduct this research with your school, I am seeking an approval letter on your letterhead.
To maintain confidentiality, your school district, the individual schools, and the participants will remain
anonymous. Each of the above will be identified only by a letter code.
Again, the main purpose of this study is to determine to what extent, if any, technology has played a role in
helping to increase student achievement, as demonstrated by your site’s ability to raise standardized test
scores, and therefore exit from Year 4 or 5 of P.I. The ultimate goal of this study is to better define existing
estimates of the resources needed for a school level technology program that will support instructional
programs aimed at dramatically improving student achievement. I also hope the information gathered will
provide guidance in how my district spends federal stimulus money. I know the school may have been
asked to participate in several studies. Therefore, I will try to make this as quick and painless as possible.
Thank you for your time and consideration of this request. For your convenience I have attached an
approval letter form which can be copied to your district’s letterhead. I would greatly appreciate it if you
could please have a member of your staff return the signed approval letter in the self addressed envelope I
have included, or send me a fax of this approval letter at (XXX)XXX-XXXX. I look forward to the opportunity
to learn about the instructional leadership strategies and technology used by Mr. XXXX XXXXX and his
staff at XXXXX Elementary School. Please contact me should you require further information regarding
the nature of this study, or with any questions you may have related to my request.
Sincerely,
Jason B. Angle
Doctoral Candidate, University of Southern California
It is my understanding that this document is seeking my approval for the surveys of the various
staff members of XXXXX Elementary School, as described below. I have been apprised that this
research will be conducted by Jason Angle, a doctoral candidate from the University of
Southern California under the guidance of Dr. Lawrence Picus, at the Rossier School of
Education. I also understand that all of the research data to be collected will be part of a
successful school study which has been designed to examine educational technology,
instructional leadership, and an elementary level school’s demonstrated ability to raise student
achievement.
I hereby grant permission to the above researcher to survey the school site principal. I also grant
permission to allow the incorporation of a brief paper survey with classroom teachers at the
school site (one or more representing the lower grades, and one or more representing the upper
grades), and (should one exist at the school level) a site level technology coordinator. I have been
assured that the principal survey and interview will require no more than 30 minutes to complete,
and the teacher paper survey will require no more than 10 minutes to complete.
I have also been assured that in order to maintain confidentiality, the names of the school district,
the individual schools, and the participants will remain anonymous; with each of the above being
identified only by a letter code.
I have been apprised that: 1) the sole purpose of this study is to determine to what extent, if any, technology has played a role in helping to increase student achievement, as demonstrated by the site’s ability to raise standardized test scores, and therefore exit from year four or five of Program Improvement; and 2) the ultimate goal of this study is to better define existing estimates of the resources needed for a school level technology program that will support instructional programs aimed at dramatically improving student achievement.
Acknowledging the above, I hereby grant permission to the above doctoral candidate to conduct
the school technology research project described above.
District: XXXXX Unified School District
Superintendent (or designee): __________________________________________________
Signed _________________________________________________________
Date Signed ________________________
Please return document to researcher in the self addressed envelope, or fax to (XXX)XXX-XXXX,
Attention Jason Angle.
APPENDIX D
PRINCIPAL SURVEY COVER LETTER
June 9, 2009
XXXXX Elementary School
Dear Principal XXXXX,
In 2006, your school met or exceeded all AYP targets under No Child Left Behind and your school exited
Program Improvement from Year 4. As one of only 10 schools in the entire state of California to exit
Program Improvement from Year 4, your school has been selected to participate in a professional judgment
study – Adequate Funding for Technology in Elementary Schools.
The purpose of this study is to determine what types and formats of technology, if any, have been used in
these 10 schools to help increase student achievement and raise test scores to exit Program Improvement
from Year 4. As a fellow educator in one of these schools, I fully understand that technology is just one
piece of the puzzle in raising student achievement. Technology just happens to be the piece of the puzzle
that I am investigating! In responding to the questions, please do not limit your responses to just the
technology that you used to exit Program Improvement. Please also consider any technology that you may
have used since exiting program improvement to continue increasing student achievement. This study will
be used to make adjustments to state level adequacy studies used to guide state, district, and school level
officials in making funding decisions in areas such as technology.
As the former principal of one of these 10 schools, I understand that your time is valuable and that you may
have already participated in several studies. As a doctoral student, I would truly appreciate it if you would
donate just 30 minutes of your time to assist me in gathering data for my study. I know that the principal
survey looks big and scary, but most questions that are not multiple-choice simply require a yes/no
response or just a few brief words to answer. You can glance at your site budgets to answer money
questions, or just give me an estimate. You can complete this survey electronically at your convenience
and e-mail it back to me, or print a hard copy to complete by hand (I can send you a self-addressed and
stamped envelope). If it would make it easier, I would also be happy to interview you in person or over the
phone. Your participation is completely voluntary and no individual identifying information will be
included in the results. In addition to ensuring that your responses and individual identifying information
remains anonymous, I will follow all guidelines and regulations as stipulated by the University of Southern
California’s Institutional Review Board. Your superintendent has also given permission for your school to
participate in this study.
Just as your individual time and effort made an important contribution to your school’s success, so will
your individual time and effort make an important contribution to this study. Thank you for your
participation!
Sincerely,
Jason Angle
Director, Elementary Instruction
(Former Principal, XXXXX Elementary School)
XXXX Unified School District
Doctoral Student, University of Southern California
Abstract
While technology, in and of itself, is unlikely to raise student test scores, expenditures on technology do play a role in achieving an overall adequate educational system. The purpose of this study was to use a successful schools model to survey site principals and site teachers to seek their professional opinions on the role educational technology played in increasing student achievement and exiting from Year 4 or Year 5 of Program Improvement in 2006 or 2008.
Asset Metadata
Creator: Angle, Jason B. (author)
Core Title: Adequate funding for educational technology
Contributor: Electronically uploaded by the author (provenance)
School: Rossier School of Education
Degree: Doctor of Education
Degree Program: Education (Leadership)
Degree Conferral Date: 2010-08
Publication Date: 08/10/2010
Defense Date: 06/29/2010
Publisher: University of Southern California (original), University of Southern California. Libraries (digital)
Tag: adequate, Educational Technology, elementary, funding, OAI-PMH Harvest
Place Name: California (states)
Language: English
Advisor: Picus, Lawrence O. (committee chair), Hentschke, Guilbert C. (committee member), Nelson, John L. (committee member)
Creator Email: anglefamily5@rocketmail.com, angljb@fusd.net
Permanent Link (DOI): https://doi.org/10.25549/usctheses-m3374
Unique identifier: UC1447195
Identifier: etd-Angle-3985 (filename), usctheses-m40 (legacy collection record id), usctheses-c127-384633 (legacy record id), usctheses-m3374 (legacy record id)
Legacy Identifier: etd-Angle-3985.pdf
Dmrecord: 384633
Document Type: Dissertation
Rights: Angle, Jason B.
Type: texts
Source: University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Repository Name: Libraries, University of Southern California
Repository Location: Los Angeles, California
Repository Email: cisadmin@lib.usc.edu