THE ROLE AND VALUE OF EDUCATIONAL TECHNOLOGY IN CALIFORNIA
FOURTH AND FIFTH YEAR (2006-2007) PROGRAM IMPROVEMENT
ELEMENTARY SCHOOLS THAT ACHIEVED AYP GROWTH TARGETS
by
George Raymond Szeremeta
A Dissertation Presented to the
FACULTY OF THE ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2009
Copyright 2009 George Raymond Szeremeta
DEDICATION
This dissertation is dedicated to all those who have walked beside me, supported me, and
sacrificed along this long journey. I wish to sincerely thank, and express my love
to, my family: My wonderful, loving wife Shih Chun and our truly, truly incredible son
Dylan Chu. (Your daddy will always love you, BIG – BIG – BIG!) My Parents, Sally
Curtis, Robert Curtis, and George Szeremeta Sr. My brother, William Szeremeta, and
his family. My sister, Victoria Lane, and her family. I wish to also express my love and
gratitude to all of the writers, artists, philosophers, professors and intellectuals, musicians
and poets, Zen masters, revolutionaries, physicists, friends and acquaintances – some that
I have had the honor to meet or know, some that have lived in my age, and many who
lived in an age before mine – who have touched my life and have guided me in making
sense of existence and the world we find ourselves inhabiting. Special messages of love
and gratitude I would like to express to the following two gentlemen: To a warm,
compassionate human being, my professor, my mentor, and a person concerned, and
involved, enough in humanity to be a social activist in the long tradition of those who
have inhabited this world. A person who cares enough, in the tradition of those great
men and women who came before him, not only to speak out, but to attempt to right
those injustices and injuries one witnesses being perpetrated toward others; Dr. Carlos
Tejeda. Thank you, sir. And to Dr. Michael Calabrese, my beloved English professor,
from whom I learned a great many things. You frequently made me laugh. Sometimes,
almost, made me – and others – cry, and forged me into a better writer. Thank you, sir.
ACKNOWLEDGEMENTS
This dissertation would never have come to fruition
without the generous support and indispensable guidance of
my dissertation chair, Dr. Lawrence Picus.
I would also like to acknowledge Dr. Guilbert C. Hentschke and Dr. John Nelson,
for the time, insights, and wisdom which they generously provided
through service as committee members.
Thank you, gentlemen.
TABLE OF CONTENTS
DEDICATION
ACKNOWLEDGEMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT
CHAPTER ONE: OVERVIEW OF THE STUDY
Introduction
Statement of the Problem
Purpose for Undertaking the Research Study
Research Questions
Importance of the Research Study
Methodology
Assumptions
Limitations
Delimitations
Definition of Terms
Organization of the Study
CHAPTER TWO: REVIEW OF THE LITERATURE
Introduction
Why Adequacy? – Why Now?
History of School Finance in America
School Finance and the American Common School Model
Shifting Responsibilities Involving Public School Funding
Origins of State Level School Finance Equity Litigation
The Federal Role in Education and School Finance
Evolution of School Finance Litigation at the State Level
Origins of School Finance Adequacy Movement
Defining Adequacy and Assessing the Impact
of Funding Levels in Schools
Evolution of the Adequacy Movement in School Finance
School Finance Adequacy Methodology
The Input, Professional Judgment or Consensus, Approach
The Economic Cost-Function Approach
The Successful School / District Approach
The Evidence-Based or School Reform Approach
Alternative Perspectives on the Various Methodologies,
the Various Classifications, and History of Adequacy Studies
Economies and Dis-Economies of Scale
Educational Productivity
Summary of School Finance Methodologies
Educational Technology Resource Allocation
Technology Resource Allocation and Student Achievement
Technology Resource Allocation and its Relationship
to Instruction and Teachers
21st Century Literacy
Technology Mandates Under No Child Left Behind (NCLB) Act of 2001
Data Driven Decision Making (DDDM)
Costing Out Methodologies and Cost / Benefit Analysis
Total Cost of Ownership (TCO)
Value on Investment (VOI)
CHAPTER THREE: METHODOLOGY
Introduction
Research Population
Instrumentation
Data Collection
Data Analysis
Summary
CHAPTER FOUR: DATA ANALYSIS AND
INTERPRETATIONS OF THE FINDINGS
Introduction
Sample and Population
Findings for Research Question Number One:
How Should Technology be Deployed and Employed
at the School Site Level, in Keeping With Best Practice?
Summary of Findings for Research Question Number One:
How Should Technology be Deployed and Employed
at the School Site Level, in Keeping With Best Practice?
Findings for Research Question Number Two:
Does Technology, Deployed and Employed in Keeping With Best Practice,
Provide School Sites an Effective Vehicle for Raising Student Achievement?
Summary of the Findings for Research Question Number Two:
Does Technology, Deployed and Employed in Keeping With Best Practice,
Provide School Sites an Effective Vehicle for Raising Student Achievement?
Findings for Research Question Number Three:
What Levels of Resource Allocations are Required to Establish, and Sustain,
an Effective Technology Program at the School Site Level?
Summary of the Findings for Research Question Number Three:
What Levels of Resource Allocations are Required to Establish,
and Sustain, an Effective Technology Program at the School Site Level?
California Department of Education 2007 School Technology Survey Data
CHAPTER FIVE: SUMMARY, CONCLUSIONS,
AND IMPLICATIONS OF THE FINDINGS
Overview of the Problem
Purpose of the Study
Methodology
Sample and Population
Data Collection
Data Analysis
Findings by Research Question:
Research Question One
Research Question Two
Research Question Three
Conclusions
Broader Implications and Limitations
Implications
Recommendations
Suggestions for Further Research
REFERENCES
LIST OF TABLES
Table 1: School Site Student to Computer Ratios
Table 2: School Site Computers Connected to Internet
Table 3: School Site Computer Location
Table 4: Age of School Site Computers
Table 5: Expected Change in School Site Computer Availability
Table 6: Average School Site Hardware Fix Time
Table 7: School Site Level Technical Support Staffing
Table 8: School Site Level Curriculum Support Staffing
LIST OF FIGURES
Figure 1: Technology Integration into Curriculum and Student Achievement
Figure 2: Technology, AYP and Struggling School Site Subgroups
Figure 3: Technology Curriculum Integration and Student Achievement
Figure 4: Expansion of School Site Technology Program
Figure 5: Technology as Stand Alone Solution to Student Achievement
Figure 6: Can Technology Help to Raise Student Achievement
Figure 7: Students and Mastery of 21st Century Skill Sets
Figure 8: Was DDDM Employed to Raise Student Achievement
Figure 9: Teaching Staff Use of DDDM and Student Achievement
Figure 10: Technology, DDDM, Teacher Collaboration and Student Achievement
Figure 11: Student Use of DDDM Data and Student Achievement
Figure 12: Educational Technology as Important Aspect of Improvement Plan
Figure 13: Educational Technology as Essential Element of Improvement Plan
Figure 14: Educational Technology as Not an Important Aspect of Plan
Figure 15: Technology Had a Significant Impact on Student Achievement
Figure 16: Technology Played Minor Role in Raising Student Achievement
Figure 17: Technology Had Little or No Effect on Student Achievement
Figure 18: Technology Played Significant Role and Made Significant Contribution
Figure 19: Technology Played Minor Role and Made Small Contribution
Figure 20: Technology Played Little or No Role and Contribution
Figure 21: Absence of Technology in Improvement Plan
and Student Achievement
Figure 22: Twelve Site Lower Grade Teacher Response Summary
Figure 23: Twelve Site Upper Grade Teacher Response Summary
Figure 24: Technology and Resource Component Required
for Professional Development
ABSTRACT
The wisdom, expense, and relative cost effectiveness of employing computers and
other technology within California public schools, in an effort both to raise student
achievement and to equip our students with 21st century skills, have been, and continue
to be, a contentious issue. Researchers, educational professionals, school board members,
and stakeholders often remain at odds with one another about the presence, best practice,
and effectiveness of computers, and other aspects of technology, within our schools.
The purpose of this successful school study was to examine and gauge, by means of a
14-page quantitative and qualitative survey, the various professional opinions of school
site principals concerning the implementation, best practice, and effectiveness of
technology in raising student achievement. In addition, a two-page technology survey
was completed by one upper grade elementary school teacher and one lower grade
elementary school teacher at each site.
The population selected for the study consisted of California public schools that were
designated by the California Department of Education (CDE) as being fourth year
Program Improvement (PI) schools pursuant to Title I of No Child Left Behind (NCLB).
Schools which have been designated as PI, year four, have failed to make Adequate
Yearly Progress (AYP) for a period of at least five years. Such a situation is not only a
serious concern to students, parents and other stakeholders, but this status has serious
repercussions for the school site administration and staff. The districts which are
responsible for such schools are required by federal law to draft a restructuring plan for
the following school year which would be implemented should the site once again fail to
meet their AYP growth target. The restructuring such schools face entails fundamental
reforms, and a major reorganization of the site, that includes the site’s staffing and
governance. The sample selected from this population of year four PI schools was
narrowed from this group of schools to include only California elementary schools that
were successful in exiting from PI by meeting their AYP for the 2006-2007 school year.
This study was guided by three fundamental research questions which were posed in
order to advance the knowledge base involving school finance research associated with
adequacy issues relating to the presence and use of technology in schools: (1) How
should technology be deployed and employed, at the school site level, in keeping with
best practice? (2) Does technology, deployed and employed in keeping with best practice,
provide school sites an effective vehicle for raising student achievement? (3) What levels
of resource allocations are required to establish, and sustain, an effective technology
program at the school site level?
The overarching findings related to the three research questions were: (1) A best
practice belief that technology must be well and truly integrated within, and across, the
curriculum in order to be an effective tool, or vehicle, for raising student achievement; (2)
Data Driven Decision Making (DDDM) was an essential element of each of the 12 school
site improvement plans examined, and facilitated student achievement; (3) Funding
mechanisms or revenue streams associated with technology resource allocations are, at
best, fragmented and, at worst, a feast-or-famine affair that does not allow for the kind
of long-range, year-to-year planning and replacement cycles critical to sustaining an
effective technology program at the school site level; (4) Finally,
technology is an important resource; however, it represents but a single resource among
many which must be present and productive within the school site in order to foster our
ability to teach, and facilitate our students' ability to master, their grade-level curriculum.
Looking to the future, one may conclude that the widespread implementation of best
practice procedures may allow researchers to establish more accurate long-range resource
allocation figures for planning, and thus a more accurate picture of the technology
resources required to establish, and sustain, an effective technology program that can be
relied upon to raise student achievement.
CHAPTER ONE
OVERVIEW OF THE STUDY
Introduction
The previous decade within the school finance community saw a push to establish
quantifiable linkages in K-12 funding systems between given funding levels, which may
be thought of as inputs, and productivity, which may be thought of as outputs. As a
result, what would eventually come to be called the adequacy movement
within school finance was born. With the birth of this movement came a quest to link
given levels of funding, and how those funds are being allocated, to various measures of
student achievement.
Adequacy based school finance studies inquire into the relative strengths, or the
weaknesses, of these correlations between various inputs and outputs. In this respect,
educational outcomes, most often defined by student achievement and student attainment,
have most frequently been measured in terms of a student’s mastery of grade-level
curricular and content standards, which in turn are most frequently measured through
standardized testing measures, which in turn are frequently associated with broader
accountability systems.
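The first step of the input-output analysis described above can be illustrated with a minimal sketch. The per-pupil funding figures and achievement scores below are hypothetical, not data from this study; the example simply shows how an adequacy-style inquiry begins by gauging the strength of the correlation between a funding input and an achievement output.

```python
# Minimal sketch of an input-output correlation analysis.
# Funding and score figures below are hypothetical illustrations,
# not data from this study.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Inputs: per-pupil funding (dollars); outputs: mean achievement score.
funding = [6800, 7200, 7500, 8100, 8600, 9000]
scores = [310, 325, 322, 340, 355, 358]

# A value of r near +1 indicates a strong positive association
# between the funding input and the achievement output.
r = pearson_r(funding, scores)
print(f"r = {r:.2f}")
```

An actual adequacy study would, of course, control for many other school-site resources and student characteristics; the sketch isolates only the correlational core of the inquiry.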
Although the previous decade has seen the development, and widespread
implementation of, various approaches which all attempt to define and quantify
educational adequacy, this research study is most closely aligned with the state-of-the-art,
or evidence-based studies, that are most frequently associated with the work of Odden
and Picus. Although classified under the broader umbrella of resource cost studies, such
school finance research studies have shifted the emphasis from the search for a means to
measure equity and gauge remedies for differences attributable to the fiscal capacities of
various school districts, to that of school finance research which is more concerned with
the study of resource allocations associated with the implementation of sound, research-based
instructional strategies designed to facilitate student achievement and allow all
students, except perhaps the most severely disabled, to meet prescribed grade-level
standards.
Although evidence-based, or state-of-the-art, adequacy studies have now been
undertaken in a number of states, these early studies have been somewhat limited by
basing overall per-pupil funding needs primarily upon a set of core instructional program
expenses reconstituted from state averages. It is problematic, and a weakness, that
non-core expenses such as technology, safety and security, transportation, district
administration, maintenance, and facilities have heretofore been taken as fixed costs and
simply accepted by researchers in their calculations of the overall per-pupil dollar
amounts arrived at within a given adequacy study. This shortcoming of state-of-the-art
and evidence-based methodology represents a compromise which, heretofore, was
necessitated by a dearth of research concerning how these various allied educational
expenses can best be quantified and then rolled into a given adequacy study for a given
state, district, or school site.
A knowledge gap therefore exists in educational practice and in the literature. One
such gap in school finance knowledge centers on the question of how best to define, and
quantify, technology-related resource allocation requirements and related expenses, in
order to supplement established methodologies which have already ascertained core
instructional per-pupil adequacy funding and resource allocation levels.
This knowledge is an important step in providing a more complete and accurate
assessment of required school site funding levels that are associated with meeting various
state and federal mandates called for in the nation’s current accountability and standards
based educational environment and political climate.
Statement of the Problem
Public Education is an expensive proposition. Recent years have seen widespread,
severe, and repeated budget crises at the state level across the nation. Indeed, people are
frequently surprised to learn that the largest single line item contained within any given
state budget is public education. In such times, governors, legislators, educational leaders,
and stakeholders increase calls for fiscal accountability, efficiency, and often must seek
cost cutting measures to cope with leaner budgets.
Picus (2000, p. 75) observes that, “Despite the large sums of money spent annually for
K-12 education, we know remarkably little about how those funds are used at the
individual student – and school – level”. In spite of countless research studies, and
amidst an atmosphere in which the long-term national trend is one of ever increasing
expenditures on the part of the nation’s state and federal governments, it is disturbing that
the literature concerning school finance remains inconclusive, and has been unable to
quantify the needs associated with the required allocation levels of various student
resources and the associated adequate per-pupil funding levels.
The problem thus is that, in spite of these large quantities of research, and the vast
sums of data which exist concerning current state educational funding levels on a per
pupil basis, far less is known about adequate funding levels at the state level, and very
little is known about such needs at the district or school site level. The evidence-based,
or state-of-the-art, method with which this dissertation is aligned therefore seeks to better
understand, and strives to quantify, adequate levels of per pupil funding using the district
and school site as the units of analysis.
Odden and Picus (2008) have advocated an analysis of various school expenditures,
disaggregated by the traditional organization of age and grade-level categories, with the
school site as the unit of analysis. They also assert that the collection of resource data at
the school level is equally important in relation to state and federal education policy.
They point to the fact that “Many states” across the nation “still spend between 25 and 33
percent more for high school students than elementary school students” and therefore
reason that “more detailed expenditure figures are preferred, especially expenditures by
program and school level” (Odden & Picus, 2008, pp. 56-57).
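As a minimal illustration of the disaggregation Odden and Picus call for, the sketch below computes per-pupil expenditures by school level for a hypothetical district. The enrollment and expenditure figures are invented, chosen only so that the high school premium over elementary spending falls within the 25 to 33 percent range the quotation describes.

```python
# Hypothetical district figures illustrating per-pupil spending
# disaggregated by school level (not data from this study).
spending = {
    "elementary": {"enrollment": 2400, "expenditure": 19_200_000},
    "middle":     {"enrollment": 1100, "expenditure": 9_350_000},
    "high":       {"enrollment": 1500, "expenditure": 15_600_000},
}

# Per-pupil expenditure for each school level.
per_pupil = {
    level: d["expenditure"] / d["enrollment"] for level, d in spending.items()
}

# High school spending premium over elementary, as a percentage.
premium = (per_pupil["high"] / per_pupil["elementary"] - 1) * 100

for level, amount in per_pupil.items():
    print(f"{level:>10}: ${amount:,.0f} per pupil")
print(f"high school premium over elementary: {premium:.0f}%")
```

A district-wide per-pupil average would mask exactly this premium, which is why the disaggregated view is preferred for adequacy work.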
A more detailed analysis of site-level per-pupil expenditures, disaggregated by type of
school and incorporating data on the various technology-related expenses, would
generate a more accurate dollar figure of overall funding in the quest to quantify an
adequate education. This is therefore clearly the direction in which school finance
research should proceed to expand knowledge of practice and make a meaningful
contribution to the existing body of school finance literature.
A gap in knowledge currently exists in the literature since school finance studies have,
to date, traditionally been associated with research conducted primarily at the state and
school district level, with districts being the most common unit of analysis. Such a
narrow research focus is problematic since researchers in school finance
currently, as Odden and Picus (2008) state, know a great deal about how much our
schools spend for salaries, benefits, contracts, and so forth, but relatively little about
expenditures by “function” which encompass allied costs in the form of central district
administration costs, maintenance and operations expenses, transportation, safety and
security, and technology.
Furthermore, to date, when researchers have attempted to quantify adequate per pupil
funding levels by means of such evidence based or state-of-the-art methodologies, such
studies have traditionally been conducted by defining and re-calculating expenses
associated with core instructional programs, and have proceeded by simply accepting the
existing data associated with other related, or non-core, expenses such as district
administration, maintenance and operations, facilities, food services, transportation,
special education, safety and security, and technology. A far more accurate calculation
of per pupil funding requirements and adequacy levels is therefore plausible once these
traditionally assumed allied expenses can also be defined and quantified.
In the context of our nation’s current standards and accountability movement in which
draconian sanctions can attach at both the student and school level when they do not meet
prescribed achievement levels and goals, it is imperative that educators, administrators,
politicians, business leaders, and parents be armed with a clearer understanding of
whether or not the nation’s schools are receiving adequate levels of resource allocation,
and the associated adequate per-pupil funding levels, in order to meet the diverse needs
of a diverse student body of young Americans.
Educational policy in such a national climate demands that future research provide
educational stakeholders with the ability to correlate, and juxtapose, inputs in the form of
dollars and resources, with outputs, in the form of meeting existing state and federal
standards and ensuring satisfactory levels of student achievement. Research which can
provide stakeholders with the first half of this equation is therefore imperative.
Once the issue of site-level, per-pupil funding levels and adequacy has been quantified,
continued research can, and almost certainly will, advance practice and the literature by
defining and exploring the nature of the relationship, and by endeavoring to quantify the
correlations which exist between inputs and outputs. This has been lacking or absent in
the literature and research, but is imperative in order to guide effective educational policy
and improve practice.
Purpose for Undertaking the Research Study
The purpose of this study is to define and quantify the nature of, and required levels
of, technological resources at the school site level in order for students to meet standards
and prescribed levels of educational attainment. Taken in concert with aligned and
similar studies currently being conducted as thematic dissertations at the University of
Southern California under the supervision of Dr. Lawrence Picus, the goal of this study is
to advance the literature and improve practice by providing a more accurate picture of
school site level, per pupil funding adequacy by defining, and quantifying, aspects of
those related educational costs which have heretofore been undefined, and have
traditionally remained unquantified, in resource-based or state-of-the-art studies and
research.
Research Questions
This study addresses and answers three research questions:
1. How should technology be deployed and employed, at the school site level, in
keeping with best practice?
2. Does technology, deployed and employed in keeping with best practice, provide
school sites an effective vehicle for raising student achievement?
3. What levels of resource allocations are required to establish, and sustain, an
effective technology program at the school site level?
Importance of the Research Study
This research study provides policymakers and stakeholders the ability to more
thoroughly define, and quantify, adequate levels of per pupil funding by defining, and
quantifying, those levels of technological resources necessary to facilitate student
learning and achievement, which are required for the mastery of curriculum and state
frameworks, and to meet sets of grade level appropriate state delineated standards, as
demonstrated through various assessment measures.
This study is important because it adds to the literature by providing the first of two
critical components involved in educational finance studies, namely inputs, or those
levels of resources prescribed for adequate levels of technology required to facilitate
student achievement. Ultimately, school finance research, and the literature, should
progress toward a linkage of such data with that of future studies dedicated to the
correlation, and juxtaposition, of input data with data concerning outputs in the form of
standardized testing results and other student achievement statistics.
Politicians, the courts, and taxpayers have grown weary of unbridled expenditures in
K-12 education which have lacked a sound methodology through which the relative
effectiveness or ineffectiveness of such expenditures can be quantified. Witness the rise
of recent accountability measures and the standards movement. This study advances the
literature, and practice, by helping to close the knowledge gap which exists between
school finance methodology and the trend toward a standards-based accountability
movement.
This study therefore seeks to define and quantify technology-related requirements
and expenses, alongside overall per-pupil adequacy figures, in order to provide a more
accurate picture of the necessary funding levels associated with meeting the various
mandates called for in today’s educational environment and political climate.
Some have argued that money matters very little, or not at all, in relation to K-12
student achievement, attainment, and educational outcomes. The frontier of research
related to school funding is therefore related to generating knowledge, and advancing
best practice, that will one day enable the nation’s educators, politicians, policymakers,
and other stakeholders to make decisions informed by a research-based
understanding of the correlations between inputs, in the form of funding levels, and
resource allocation, and outcomes, in the form of indicators such as a mastery of
standards, demonstrations of competency through standardized testing, drop-out rates,
admission and degree program completion rates relating to higher education, and lifetime
earning potential.
Although reams of input data certainly exist, in the form of per pupil funding levels, at
the federal, state, and even district level, there is a gap within the literature relating to
resource allocation at the school site level. This gap is further complicated by the widely
accepted belief that resource needs almost certainly differ across the various levels of
K-12 schooling: primary, middle, and high schools.
A fundamental step to correlating school finance inputs with educational outputs,
would involve the advancement of a reliable methodology that will allow stakeholders to
define, and then quantify, adequate levels of per-pupil funding at the school site level.
This research study, which is being undertaken in concert with similar thematic
dissertations under the oversight of Dr. Lawrence Picus, endeavors to expand the body of
knowledge pertaining to site level, per pupil adequacy levels, by considering those
technology needs that constitute an integral, and essential, part of a modern day K-12
education. For the purposes of this study, this has been defined as those technology
needs that are necessary for students to meet prescribed grade-level standards, meet the
career and professional standards associated with the 21st century workplace, prepare
students to fulfill their civic responsibilities, and facilitate academic success for those
who continue their studies in higher education.
Methodology
A mixed methods research approach was utilized in order to examine technology
resource allocations, usage, and effectiveness relative to student achievement, in twelve
elementary schools located in the state of California. This was accomplished by means
of two distinct survey instruments: (a) a researcher-designed survey in the form of a
self-report questionnaire which was used to collect data from a sample; (b) an existing
California Department of Education School Technology Survey, which California
schools complete for the state annually, and is retrievable from a state database.
The population of the study consisted of California public schools that were
designated by the California Department of Education (CDE) as being fourth year
Program Improvement (PI) schools pursuant to Title I of No Child Left Behind (NCLB).
The sample selected from this population of Year 4 PI schools was narrowed to include
only those California elementary schools which were successful in exiting from PI by
meeting their AYP for the 2006-2007 school year.
The researcher-designed school site survey instruments comprised a 14-page principal
survey, a two-page upper grade teacher survey instrument, and a two-page lower grade
teacher survey instrument. The principal survey instrument contained both qualitative
and quantitative questions: 45 open-ended questions and 27 structured, so-called
forced-choice or fixed-response, scaled-construct questions, for a grand total of 72
questions. The teacher survey instruments contained three open-ended questions and 25
structured, so-called forced-choice or fixed-response, scaled-construct questions, for a
grand total of 28 questions. The CDE School
Technology Surveys, on the other hand, contain only quantitative data.
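The composition of the researcher-designed instruments can be tallied as a quick consistency check; the counts below are taken directly from the figures reported in the text.

```python
# Question counts for the researcher-designed survey instruments,
# as reported in the methodology section.
instruments = {
    "principal": {"open_ended": 45, "scaled_construct": 27},
    "teacher":   {"open_ended": 3,  "scaled_construct": 25},
}

# Grand total of questions per instrument.
totals = {name: sum(parts.values()) for name, parts in instruments.items()}
print(totals)
```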
The implementation of the study was concurrent, and a concurrent triangulation
strategy was employed to confirm, cross-validate, and corroborate the findings. When
consideration was given to the nature of the quantitative and qualitative data involved in
the study, the researcher decided to set an equal priority between the two. Integration
between the two types of data occurred at the data collection stage, the data analysis
stage, and the interpretation stage of the study.
Assumptions
In conducting this study the researcher operated under the assumption that the survey
subjects would respond honestly, and to the best of their ability, without prejudice. It was
assumed that the school site technology data sets attained from the California Department
of Education (CDE) School Technology Surveys were accurate, and did in fact contain
data for the year 2006-2007 as indicated. Finally, the research was conducted under the
assumption that the data attained from the 12 elementary school sites that agreed to
participate in the technology surveys would have a degree of applicability and
generalizability to similar schools and districts across the country.
Limitations
A primary limitation of this study involves its descriptive nature. Survey data were
collected from 12 school sites over a period of five months. The researcher made a good
faith attempt to include each of the 28 elementary schools contained within the
population of the study. Ultimately however, the cooperation of only 12 of the schools
comprising the population was obtained. Each of these 12 schools agreed to
participate in the study, and followed through by returning the completed surveys to the
researcher. The study therefore was limited to survey data acquired from a total of 12
California elementary schools representing a total of nine school districts.
Delimitations
The sites represented within this study were all NCLB Title I Program Improvement,
Year Four, elementary schools. The data contained within this study should be
considered specific to K-6 or K-8 schools. Generalizations drawn from this
study should therefore be limited to other primary grade school sites. Additionally, since
funding levels vary between states, the findings may not be generalizable beyond
California.
Definition of Terms
21ST CENTURY LITERACY
21st century literacy is the set of abilities and skills where aural, visual and digital literacy
overlap. These include the ability to understand the power of images and sounds, to
recognize and use that power, to manipulate and transform digital media, to distribute
them pervasively, and to easily adapt them to new forms (The New Media Consortium,
2005).
TECHNOLOGY INTEGRATION
Encompassing more than merely using computers in a school setting, technology
integration corresponds to the incorporation of technology in lessons to transform
learning experiences for students. It thus involves redesigning curricula and aligning
lesson content and learning objectives with effective instructional practices to maximize
technology’s benefits (Waddoups, 2004, p. 2).
INSTRUCTIONAL TECHNOLOGY (IT)
The hardware and software used primarily for improving learning, including but not
limited to computers, content-specific software, programs and applications such as the
Internet, smartboards, and videoconferencing (Jazzar & Friedan, 2007).
ADEQUACY
An approach to school funding that begins with the premise that the amount of funding
schools receive should be based on some estimate of the cost of achieving the state’s
educational goals. This approach attempts to answer two questions: How much money
would be enough to achieve those goals and where would it be best spent? (EdSource,
2008).
PROGRAM IMPROVEMENT (PI)
A formal designation for Title I-funded schools and LEAs that do not make AYP for two
consecutive years in specific areas. Title I funds are federal funds provided under the
NCLB Act of 2001. There are required services and/or interventions that schools and
LEAs must implement during each year they are in PI. A school will exit PI when it
makes AYP for each of two consecutive years (California Department of Education,
2007).
TITLE I SCHOOL
A Title I school receives federal Title I funds. Title I, Part A, of the NCLB Act of 2001 is
the largest federal program supporting elementary and secondary education. This
program is intended to help ensure that all children have the opportunity to obtain a high-
quality education and to reach proficiency on challenging state content standards and
assessments. Title I provides flexible funding that may be used to provide additional
instructional staff, professional development, extended-time programs, and other
strategies for raising student achievement in high-poverty schools. Title I schools that do
not make AYP may face NCLB corrective actions (California Department of Education,
2007).
NO CHILD LEFT BEHIND (NCLB)
The No Child Left Behind (NCLB) Act of 2001 is a federal law enacted in January 2002
that reauthorized the Elementary and Secondary Education Act (ESEA). It mandates that
all students (including students who are economically disadvantaged, are from racial or
ethnic minority groups, have disabilities, or have limited English proficiency) in all
grades meet the state academic content standards for proficiency in ELA and
mathematics by 2014. Schools must demonstrate “Adequate Yearly Progress” (AYP)
toward achieving that goal (California Department of Education, 2007).
ADEQUATE YEARLY PROGRESS (AYP)
Under NCLB, all states are required to develop and implement a single, statewide
accountability system that will ensure all public schools make their Adequate Yearly
Progress (AYP) toward the federal goal that all students perform at the proficient or
above level in English-language arts (ELA) and mathematics by 2014. Under AYP
requirements, schools and LEAs are required to meet criteria in four areas: participation
rate, percent proficient (also known as Annual Measurable Objectives or AMOs), API as
an additional indicator, and graduation rate (if applicable) (California Department of
Education, 2007).
CALIFORNIA DEPARTMENT OF EDUCATION (CDE)
The California Department of Education (CDE) is the state agency that oversees
California’s public school system (California Department of Education, 2007).
TOTAL COST OF OWNERSHIP (TCO)
The business world, where multiyear budgeting and planning are common, looks upon
technology as a tool to increase productivity. Business models often calculate the
approximate total cost of ownership for technology initiatives through formulas that
incorporate both expenditures on hardware and software and maintenance, replacement,
training, and all aspects of the business that are impacted by the core system (Institute for
the Advancement of Emerging Technologies at AEL, 2008).
Organization of the Study
Chapter one introduces the study. It provides an overview of the
issues surrounding technological resource allocation, at the elementary school level, and
a summary of the relevance and importance of this study.
Chapter two provides a historical review of the school finance movement, and
documents the shift from a focus on equity to one of adequacy, which ultimately leads to
a discussion of resource allocation, and this study's focus on primary grade investments
in technological programs.
Chapter three provides an explanation of the appropriateness of the methodologies
employed in the study. The population of the study was defined as all California NCLB
Title I, Year Four, Program Improvement elementary schools. The sample was defined
as California NCLB Title I, Year Four, Program Improvement elementary schools which
were successful in raising student achievement to meet their Adequate Yearly Progress
(AYP) growth targets and escape Program Improvement for the 2007-2008 school year. Finally, an
explanation of the various strategies employed in the collection, and analysis, of data
used to answer the research questions upon which the study was based, is also provided.
Chapter four presents the data collected in the course of the study, provides an
analysis of the data collected, and states the findings as they relate to the research
questions that were posed.
Chapter five provides a summary of the study, and a review of the findings. A
conclusion is drawn, implications are asserted, and suggestions for additional research are
presented.
CHAPTER TWO
REVIEW OF THE LITERATURE
Introduction
This chapter presents a summary of the literature in the domain of school finance. An
emphasis has been placed, herein, upon the adequacy movement. Issues involving school
finance and funding adequacy are often quite contentious. Debates continue to rage over
the plethora of school finance systems across the nation and how much money is required
to meet the responsibilities, and educational mandates, called for by the widely varying
language contained in various state constitutions. This has resulted in literally dozens of
school finance lawsuits across the nation. As America entered the new millennium, most
discussions surrounding, and the majority of litigation related to, school finance have
centered on defining adequacy and on quantifying the nature and levels of resources
required to achieve it, across diverse settings and diverse student sub-groups.
Judicial mandates, political pressure, and the accountability and standards movement
are all causal forces attributable to the recent call to seek quantifiable data and other
knowledge which will serve to illuminate unanswered, or even unaddressed, questions
which pertain to issues surrounding:
Whether or not per-pupil funding levels are adequate to meet state constitutional
mandates, and allow students to master various state curricular and content
standards, which are today assessed through federally mandated standardized
testing;
How the various resources those funds provide are best allocated and ultimately
utilized;
And the question of whether or not these per-pupil funding levels are related to,
and correlated with, student outcomes, and if so, to what extent.
Over the course of the last decade and a half, the domain of school finance has seen a
rather dramatic shift from questions and concerns over equity, to those primarily based
upon concerns over the levels of resources allocated, and the levels of resources required,
to ensure the success of our nation’s schools and students. Along the way, scholars,
researchers, educational leaders, educational professionals, politicians, and the courts have
struggled to define the term adequacy and arrive at consensus over the various rights,
responsibilities, and standards commensurate with such a definition.
The literature reveals that Federal and Supreme Court rulings have largely placed the
onus concerning questions relating to educational equity and adequacy squarely at the
state level. Rights and responsibilities delineated by the language and terminology
contained within individual State constitutions tend to vary widely and therefore call for
varied levels, and standards, of educational attainment and outcomes. The incredibly
diverse nature of the fifty states, north, south, east and west, and diverse nature of school
districts, rural and urban, wealthy and impoverished, across the nation, add complexity to
the issue. Additionally, a diverse range of student needs related or pertaining to socio-
economic status, second language learner status, or special education needs, further
complicates the quest to define and quantify adequacy in public school finance.
The literature also reveals that the standards and accountability movement, both at the
state and federal level, is inextricably intertwined with, and has even
served to drive, the school finance, educational adequacy movement. In turn, one
question frequently arises within the literature: does the term adequacy in
relation to education pertain simply to inputs (funding levels and resources), or outputs
(related to educational attainment and standardized testing), or does it pertain to issues
involving both?
Why Adequacy? – Why Now?
The previous decade within the school finance community witnessed a drive for the
establishment of quantifiable linkages between K-12 funding systems, and given levels of
funding which may be thought of as inputs, and productivity, which may be thought of as
outputs. Thus, what would eventually come to be called the adequacy movement within
school finance was born. Odden and Picus (2008) have described the birth of this
movement within the school finance community as a quest to link inputs, defined first as
a given amount of funds and second as how those funds are utilized to purchase various
educational resources, to outputs, which may be defined as various measures of student
achievement.
The previous dozen years have also seen school finance research strive to uncover
latent connections between the adequacy of funding and productivity. School finance
studies have been performed seeking to uncover, and document, the existence of this
causal relationship.
In this respect, adequacy-based school finance studies endeavor to quantify the relative
strength, or weakness, of such correlations between various inputs and outputs.
Educational outcomes, most frequently defined as student achievement and
student attainment, are typically measured in terms of student mastery of
grade-level curricular and content standards, which in turn are assessed through
standardized tests frequently associated with broader accountability systems.
History of School Finance in America
School Finance and the American Common School Model
The 19th century concept of a state-sponsored, free public education, as championed
by Horace Mann and others, has become deeply woven into the social fabric of America.
So many of the fundamental tenets of America, that of America as a meritocracy, and an
egalitarian, democratic society, are closely linked to the existence of a sound K-12
education which serves the full spectrum of our citizenry. However noble the American
common school model was in concept, in reality our public education system has always
been inherently unequal due to funding issues.
Local control over education has long been a celebrated aspect of the American
educational system. However, for better or worse, partnered with this framework under
which our nation’s schools function largely as local entities governed through systems of
local control, are local funding mechanisms. Unfortunately, a reliance upon such local
funding mechanisms, also tends to link a given school’s revenue levels, to the relative
level of local wealth represented within any given community, usually in the form of
property values.
Shifting Responsibilities Involving Public School Funding
The civil rights movement of the sixties launched concerns over educational equality.
This in turn initiated a future trend within states toward the implementation of statewide
educational standards, and as a result, began a trend away from the existing systems of
school finance which had dominantly depended upon local revenues. During this period,
concerns over the significant disparities which existed in funding levels, from district to
district within a given state, led to the widespread implementation of various foundation
funding formulas or equalization schemes. Such formulas and schemes, composed of a
combination of state and local revenues, were designed to help mitigate what many saw
as unequal educational opportunities tied to unequal educational resources (Lefkowits,
2004).
Foundation funding formulas were early attempts, perhaps more accurately described
as primitive attempts, at appropriating adequate funding to ensure that every child in a
given state received a minimum level of funding for education, whether they attended a
school located within a wealthy, or impoverished, community. Inequities, nonetheless,
continued to exist between school districts which represented impoverished communities,
and those which represented the more affluent, which could at will generate additional tax
revenue over and above the state’s foundation funding levels. In a perverse twist, one
which represents a prime example of the law of unintended consequences, foundation
funding formulas also created a disincentive for many local communities to raise taxes in
support of their own schools. The driving force was, of course, the nature of the new
funding mechanisms in which the treasury of any given state would now cover any fiscal
gap, up to the prescribed foundation funding level.
Origins of State Level School Finance Equity Litigation
With such early school finance reform measures being largely ineffective at leveling
available resources between the most affluent and the most impoverished of school
districts in a given state, an eruption of school finance equity litigation began in the late
sixties. In 1971 the California Supreme Court decision in Serrano v. Priest I [5 Cal.3d
584, 96 Cal.Rptr.601, 487 P.2d 1241 (Cal.1971)], which was the first of a series of three
such school finance suits, would literally change the face of school finance reform by
providing a model for similar state level suits across the nation. In the subsequent 1976
Serrano II Supreme Court decision [557 P.2d 929, 939 (Cal.1976)], the high court held
that “substantial disparities in expenditures per-pupil among school districts cause and
perpetuate substantial disparities in the quality and extent of availability of educational
opportunities” (18 Cal.3d 728 L.A. No. 30398). The high court rendered a ruling which
found that the state’s school finance system “fails to provide equality of treatment to all
the pupils in the state” and offered a judgment that “equality of educational opportunity
requires that all school districts possess an equal ability in terms of revenue to provide
students with substantially equal opportunities for learning” (18 Cal.3d 728 L.A. No.
30398).
The Federal Role in Education and School Finance
The nature and methodology of public school funding has grown in complexity since
the nineteen-fifties. The desegregation of the nation’s public schools which followed the
Supreme Court’s ruling in Brown v. Board of Education of Topeka [347 U.S. 483 (1954)]
sent ripples through the nation’s educational system. The civil rights movement of the
nineteen-sixties facilitated a dialog which led to an acknowledgement by the federal
government of inequities and injustices in many areas of American society, including
education. As a result President Johnson instituted the War on Poverty which included
the landmark Elementary and Secondary Education Act of 1965 (ESEA).
The federal role, and the nature of its involvement, in education has changed greatly
over the course of the last forty years. Following on the heels of the U.S. Civil Rights
Act of 1964, President Lyndon Johnson greatly expanded the federal role in education,
and the funding of American schools, through the Elementary and Secondary Education
Act of 1965. This landmark piece of federal education legislation continues to shape
American school policy and procedure, and has further increased Washington’s
involvement in the funding of the nation’s public schools, through its most recent
reauthorization under President George W. Bush’s No Child Left Behind Act of 2001.
This latest federal educational “mandate” has served to increase the average percentage
level of funding contributed from Washington, in the typical American public school, to
approximately 8.3% of the school’s total budget (U.S. Department of Education, 2005).
Looking back, during the 1990-91 school year, the federal funding level stood at only
5.7% (U.S. Department of Education, 2005). To put these rapidly increasing federal
funding levels in perspective, U.S. Department of Education (2005) statistics reveal that,
as of the 2004-2005 school year, there has been an increase of 105% over 1990-91
funding levels; a 58% increase over 1996-1997 federal funding levels; and a 40%
increase in funding levels over the 1998-1999 school year.
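Since the underlying dollar amounts are not given here, the growth figures above can be illustrated with an indexed sketch (an assumption for illustration only: the 1990-91 federal funding level is arbitrarily set to 100, and the other index values are backed out from the reported percentages):

```python
# Index the 1990-91 federal funding level at 100 (hypothetical baseline).
base_1990_91 = 100.0

# 2004-05 funding is reported as a 105% increase over 1990-91:
level_2004_05 = base_1990_91 * (1 + 1.05)

# The same 2004-05 level is reported as a 58% increase over 1996-97
# and a 40% increase over 1998-99, so those indexes can be recovered:
level_1996_97 = level_2004_05 / 1.58
level_1998_99 = level_2004_05 / 1.40

print(round(level_2004_05, 1))  # 205.0
print(round(level_1996_97, 1))  # 129.7
print(round(level_1998_99, 1))  # 146.4
```

The sketch simply shows that the three reported growth rates are mutually consistent only if 1996-97 and 1998-99 funding sat at roughly 130% and 146% of the 1990-91 level, respectively.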
The Reagan era national education report, A Nation at Risk, which was released in
1983, sounded alarm bells which established a climate of concern about the quality and
effectiveness of American schools. This in turn set the stage for a series of precedent
setting decisions which would endeavor to rectify these perceived shortcomings and
inadequacies within our nation’s public school system.
The last half century thus has seen a rather dramatic, and an ongoing, shift in the
burden and responsibility concerning the funding of most of our nation’s public school
systems. Although varying widely from state to state, the general trend has been one of
shifting from a preponderance of educational dollars which flowed from local funding
closely tied to local property taxes, to one in which greater funding levels flow from state
coffers. Over the last couple of decades, state funding levels of public schools within
California have also risen rather dramatically and, according to one source, currently
stand at approximately 61% of a California school’s budget, with roughly 11% coming
from federal funds and 21% coming from local property taxes (Ed-Data, 2007).
However, U.S. Department of Education (2005) statistics assert that, nation-wide,
roughly 83 cents out of every dollar spent on K-12 education is attributable to a
combination of state and local sources; with approximately 45.6% contributed from state
funding sources; 37.1% contributed from local governments; 8.3% contributed from
federal funding; and the remaining 8.9% attributable to various private sources, and
primarily directed to private rather than public schools.
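The nationwide shares quoted above can be cross-checked with simple arithmetic (a minimal sketch; the figures come directly from the cited U.S. Department of Education statistics):

```python
# Nationwide K-12 funding shares (percent), per the cited 2005 statistics.
shares = {
    "state": 45.6,
    "local": 37.1,
    "federal": 8.3,
    "private": 8.9,
}

total = sum(shares.values())
state_and_local = shares["state"] + shares["local"]

print(round(total, 1))            # 99.9 (approximately 100%, allowing for rounding)
print(round(state_and_local, 1))  # 82.7, i.e. roughly 83 cents of every dollar
```

The check confirms the internal consistency of the quoted figures: state and local sources together account for about 83 cents of every K-12 education dollar.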
However, while the federal government has been receptive, sometimes even eager, to
increase its role and involvement in legal, judicial, policy, and funding matters that
concern sub-groups of students defined as minority, migrant, impoverished, special
education, bilingual, and limited English proficient, with the Title I and federal Head
Start program being two good examples of this involvement and level of commitment,
the federal government has been reserved, or even reticent, in regard to increasing its role
in other aspects of U.S. public schools. The reluctance of the federal courts to enter into
deliberation over school finance and funding matters related to the question of equity in
school funding, is of prime concern to any discussion of public school finance in
America.
The United States Supreme Court’s 1973, 5-4, ruling in San Antonio v. Rodriguez,
[411 U.S. 1 (1973)] which overturned a lower federal court decision, essentially stated an
opinion that education is not a right guaranteed under the nation’s constitution. Justice
Lewis Powell, in writing the majority opinion of the Court, offered the following:
Since the Court now suggests that only interests guaranteed by the Constitution
are fundamental for purposes of equal protection analysis, and since it rejects the
contention that public education is fundamental, it follows that the Court
concludes that public education is not constitutionally guaranteed. It is true that
this Court has never deemed the provision of free public education to be required
by the constitution. Indeed, it has on occasion suggested that state supported
education is a privilege bestowed by a state on its citizens.
This silence on the part of the highest court of the land concerning the subject of
school funding equity in American public schools, essentially shifted the discussion, the
battle, and the burden, to the state level. As a result, over the past thirty years we have
seen a flurry of litigation, and as might be expected, numerous equity, and eventually
adequacy, studies conducted at, and focused upon, the state level.
Evolution of School Finance Litigation at the State Level
With school finance litigation effectively now relegated to the state level, the seventies
witnessed a number of school finance cases heard in a number of states across the nation.
These early school finance suits, however, were not based upon the concerns over school
funding adequacy commonly seen today, but upon constitutional concerns raised
over substantial disparities in funding levels between districts
within a given state. With many plaintiffs receiving favorable verdicts within
state courts, states responded by beginning to establish education equalization funding
programs, seeking to comply with, or preempt, judicial mandates by attempting to close
funding disparities between districts.
In the intervening years of Serrano, between 1971 and 1976, another important school
finance reform decision was rendered in the New Jersey Supreme Court ruling, Robinson
v. Cahill [303 A.2d 273, 295 (N.J.1973)]. Because this state high court decision was
rendered only weeks after the devastating U.S. Supreme Court ruling in the Rodriguez
case, it holds great importance in the history of school finance litigation for a number of
reasons. The decision was important in preserving school finance reform litigation, at
least at the state level, by invoking the education clause of the New Jersey constitution,
which mandates that the state provide a “thorough and efficient” public education system
(Lefkowits, 2004). The ruling thus was instrumental in opening the door for future
litigation, based upon challenges associated with the various state education clauses, in
other states across the nation.
Origins of School Finance Adequacy Movement
The landmark Rose v. Council for Better Education heard by the Kentucky State
Supreme Court essentially began as a school finance equity case, however it came to
launch the school finance adequacy movement when the high court found “every part and
parcel” of the entire educational system, within the state of Kentucky, in violation of the
state constitution (Schrag, 2003). The ruling in this case was remarkable because,
in the reform that produced the Kentucky Education Reform Act (KERA) of 1990, the
high court provided the legislature with well-crafted, well-defined definitions of the
heretofore vaguely defined term of educational adequacy.
Basically, the Kentucky high court held that an adequate education would offer to the
student body of the entire state, an opportunity for “sufficient capacity” in seven rather
broad categories which included: oral and written communication skills; and knowledge
of economic, social, and political systems (Lefkowits, 2004). Rose v. Council for Better
Education and KERA are remarkable for the school finance adequacy precedents they set
in defining educational adequacy, which provided a model for future state level adequacy
school finance litigation and reform, across the nation.
In 1989 when the Kentucky Supreme Court ruled in favor of the plaintiffs in Rose v.
Council for Better Education [790 S.W. 2d 186 (1989)], it became a landmark public
education suit which shifted the direction and focus of school finance activism, litigation,
and politics from equity to that of adequacy. In finding that the state’s education system
was in violation of the state constitution, a legal precedent was set that, in effect, opened
the floodgates to state level litigation, based upon adequacy arguments, across the nation
(Minorini & Sugarman, 1998). The high court of Kentucky did this by putting forth two
mandates: first that the state set prescribed levels of outcomes, or educational goals, that
the state’s schools should strive to reach; and secondly, that the state legislature would
earmark adequate funding levels to allow all students to reach the performance outcomes
and educational goals which the state would prescribe (National Conference of State
Legislatures [NCSL], 2004a).
Defining Adequacy, and Assessing the Impact of Funding Levels in Schools
Such a mandate at the birth of the school finance adequacy movement might have
been a much more straightforward undertaking if educational researchers had been able
to agree upon the role, and relative importance, that funding levels play in regard to
student achievement and student attainment. They do not. Over the years, as school
finance adequacy lawsuits have been brought in various states across the nation, this has
placed many legislators, and many legislative bodies, who have been charged with the
creation, and funding, of an adequate school system within their state, in a rather
awkward position. On the one hand they have been faced with judicial mandates to fix
their school systems and fund them to “an adequate” level; and on the other hand, as they
look to educational research for sound data upon which to predicate guidance, they are
faced with contradictory evidence, opinions, and professional judgment on
basic and fundamental questions such as:
Where do the rather substantial dollars invested by taxpayers in K-12 education,
stand to produce the greatest return on investment?
How much money is required to meet judicial mandates, and satisfy constitutional
requirements, for a sound basic education?
Do increased, or varied, levels of funding which are invested in K-12 education
matter at all?
Research, however, is inconclusive, and opinion and professional judgment vary
widely as to the nature of the relationship between funding levels and achievement.
Studies range from those claiming that money matters little, or even not at all, to those
which argue that money certainly, on some level, must be a factor (NCSL, 2004a).
The issue remains however, one of defining and quantifying the relationship between
educational inputs and educational outputs.
Paul E. Barton, the director of the Policy Information Center at the Educational Testing
Service, writing in the preface of one landmark study (Wenglinsky, 1997), offered the
following perspective on this question:
Since the publication of Equality of Educational Opportunity by James S.
Coleman in 1965, there has been an enduring discussion of whether differences in
the expenditure of resources on the schools makes any difference in educational
achievement. No consensus has yet been reached, despite the fact that some of
our best academic minds have focused on the question.
Wenglinsky’s research points out that, although courts and legislatures have
been actively engaged in efforts to reduce funding disparities between various school
districts (equity cases), and over the last fifteen years to raise aggregate spending in all
districts in their states (adequacy), beginning with the 1971 Serrano v. Priest suit, “little
agreement exists as to which expenditures and resources are most likely to improve
student performance or whether resources matter at all” (Wenglinsky, 1998).
Wenglinsky (1998) echoes the findings of the Coleman Report, which really opened
the national dialog and debate over the importance of school funding levels, by asserting
that “resources made little difference to achievement once background characteristics of
students were taken into account”. Wenglinsky acknowledges that what really
amounts to a stalemate has manifested itself within the various school funding studies
conducted over the previous thirty-some years. However, the research
and conclusions of Wenglinsky, and his own school funding recommendations, are both
highly respected and carry great weight within the field. Wenglinsky offers the following
opinions based upon studies which were focused upon fourth and eighth grade students
and instructional programs.
A positive correlation was documented to exist, whereby increases in expenditures
tended to yield the following positive effects upon the achievement of fourth grade
students within a couple of areas which are interrelated:
Increases in expenditures related to instruction, and expenditure increases directed
toward school district administration, tend to lead to increases in teacher-student
ratios;
Increases seen in teacher-student ratios tend to yield the effect of raising
average achievement scores in mathematics.
A positive correlation was documented to exist, whereby increases in expenditures
tended to yield the following positive effects upon the achievement of eighth grade
students within a couple of areas which are interrelated:
Increases in expenditures related to instruction, and expenditure increases directed
toward school district administration, tend to lead to increases in teacher-student
ratios;
Increases seen in teacher-student ratios tend to reduce problem behaviors and
improve the social environment of the school;
A lack of problem behaviors among students and a positive social environment
tend to raise average achievement in mathematics.
A negative correlation (or the lack of a correlation) was found to exist, whereby variations
in other expenditures and resources, beyond those discussed above, could not be positively
correlated with variations in achievement. Such variables in Wenglinsky’s
findings included:
Capital outlays which take the form of funding dedicated to facility construction
and maintenance;
School-site level administration;
The level of educational attainment of a teaching staff.
Of particular interest to evidence-based school finance adequacy research, however,
are Wenglinsky’s findings and viewpoints related to a productivity approach to school
reform. Over the years, as various state courts and state legislatures have wrestled with
balancing school funding levels (inputs) in the pursuit of high levels of student
achievement and attainment (outputs), many involved in the reform movement took
note that the increased levels of educational expenditure frequently demanded by the
courts did not always yield the tangible increases in student achievement that many
policymakers and stakeholders had predicted, or expected to witness.
As a result, many legislators, policymakers, and stakeholders began to question the
status quo philosophy of the school funding reform movement of the time (one that held
that simply providing more money would remedy the problem) and began to advocate
that funds be earmarked and reallocated to those instructional programs, and other areas
of K-12 education, that they believed would yield the greatest gains in both the quality
of instruction and educational outcomes. Such attempts to directly link funding to
educational outcomes are more important than ever in a world in which the standards
and accountability movement has become de rigueur.
The evidence-based, or state-of-the-art, approach to school finance adequacy flows
from this logic: it approaches the funding adequacy debate by identifying the
instructional programs that research has shown to be most effective, and then seeking
the levels of funding required to implement those programs. Wenglinsky’s (1998) study
therefore offers a measure of support for such a methodology, since it suggests that
some dollars matter more than others in K-12 education. Likewise, Wenglinsky’s study
suggests that courts and legislatures involved in school funding reform should begin to
concern themselves with issues of productivity by identifying various educational
priorities and then directing appropriate levels of funding (Wenglinsky, 1998).
Evolution of the Adequacy Movement in School Finance
Today, in 2008, we can look back over the previous eighteen years of school finance
litigation, tracing the impetus behind, and the evolution of, the adequacy movement that
has come to dominate the politics and policy of our nation’s educational landscape.
Such adequacy based school finance litigation hinges upon the education clauses that
exist, in various forms, within the constitutions of most of the nation’s fifty states.
The wording of each state’s education clause varies to a greater or lesser degree, and
therefore sets differing constitutionally driven standards and responsibilities. The
wording of these clauses may call, for example, for a state’s school system to provide a
“thorough and efficient” education, or a “uniform and appropriate” educational system
throughout the state.
Thus, looking back over the past fifteen years, some thirty states in the union have
faced litigation related to the adequacy, and the constitutionality, of their educational
systems in relation to the methods and means by which their schools are financed, with
the plaintiffs triumphing in two-thirds of those cases (NCSL, 2004a). Adequacy based
school finance litigation therefore revolves around the concept that, although prior
equity based school finance litigation may have reduced variations in spending, each
state also bears responsibility for providing its citizenry with a basic level of funding, which is of a
sufficient level, to provide each son and daughter in the state with a proper K-12
education (NCSL, 2004b).
The nation’s current standards and accountability movement only reinforces the
imperative that each of the fifty states provide such basic and sufficient levels of K-12
funding to each and every child. Some today argue that if states, districts, and even the
federal government are mandating that every child demonstrate mastery of standards
and curriculum, as measured through annual high-stakes, grade-level standardized tests
and high school exit exams, should not a commensurate, sufficient, and adequate level
of funding exist in each and every district, at each and every school, for each and every
child? And, if such a commensurate, sufficient, and adequate level of funding is lacking
for any given sub-group of students, can schools, states, and the government, in good
faith, then hold those students, their teachers, and their schools responsible for a lack of
achievement or educational attainment? These are some of the pressing questions
driving the nation’s school finance adequacy movement which, make no mistake,
currently resides squarely within a standards and accountability educational climate.
So, what constitutes an “adequate” education and an “adequate” educational system?
How do politicians, the courts, policymakers, educators, and other stakeholders define
this term? And how do these stakeholders begin to assess constitutional compliance?
Fundamentally, an educational system that is judged “adequate” should offer all
students, except perhaps those afflicted with the most severe learning disabilities or
impediments, the opportunity to meet prescribed educational standards and goals.
Such a concept of adequacy thus encompasses two prime aspects and components: the
educational outcomes which are commonly associated with definitions of student
achievement and attainment; and the commensurate levels of educational funding which
are appropriate in order to facilitate such levels of student achievement and attainment
(Odden & Picus, 2008).
School Finance Adequacy Methodology
The evidence-based approach, also commonly referred to as the state-of-the-art
approach, is one of four primary school finance methodologies that have been utilized
over the last dozen or so years to arrive at a determination of adequacy in K-12 funding.
The other three competing approaches or methodologies are (Odden & Picus, 2008):
The input, professional judgment or consensus, approach;
The economic cost-function approach;
The successful school, or district, approach.
The overarching premise behind all of the above methodologies, when judging whether
a given school, district, or state funding system meets the threshold of “adequacy,” rests
upon the fundamental belief that a given level of funding must be sufficient to procure
the resources required to teach all students, save perhaps the most severely disabled, to
the proficiency standards of a given school district and the state in which it resides
(Odden, Fermanich, & Picus, 2003).
Therefore what distinguishes today’s adequacy studies and methodologies from prior
work conducted in the field of school finance, work such as equity based research, is the
concerted effort to link school funding levels (inputs), with student mastery of grade-
level specific curricular standards, through systematic methods of assessment (outputs).
Adequacy based school finance studies thus differ in that they seek to correlate student
achievement and attainment data with data concerning funding levels.
In today’s standards-based educational climate, such school finance methodologies hold a
distinct advantage in that they allow stakeholders to arrive at data driven conclusions.
Such a correlation of data may be used to guide more informed policy decisions, which
are based upon the relative strengths, weaknesses or shortcomings, successes or failures
of a given educational system which is operating under a given school finance system, at
a given level of funding.
Each of these adequacy study methodologies takes a somewhat different approach to
the question of how to define, model, and quantify adequacy in school funding. Each, it
could be argued, offers its own strengths, weaknesses, and biases. Each methodology
may also perceive, and even frame, questions of adequacy in slightly different, or even
unique, terms.
The influence and subsequent impact of the reauthorization of the Elementary and
Secondary Education Act, under the Bush administration’s No Child Left Behind Act of
2001, certainly should not be overlooked or minimized as yet another driving force
behind the continuing shift away from an equity based school finance focus, and toward
research and studies now performed under the auspices of adequacy methodologies.
Few, if any, in the education profession today are unaware of the federal government’s
enormous influence in today’s standards and accountability movement.
The Input, Professional Judgment or Consensus, Approach
Basically, in employing an input or professional judgment approach to determine
adequacy in a given school finance system, states have typically assembled panels of
educational leaders, and others possessing high levels of expertise in their fields, in
order to take advantage of their collective wisdom, experience, and expertise. Such
panels come together to determine what they, in their expert opinions, consider to
represent effective educational strategies, and in turn make judgments concerning the
staffing and other resources required to implement such a program (Odden & Picus,
2001). Once determined, the elements of such a program are priced-out, and all of the
elements are then combined to reach a figure held to reflect the adequate fiscal base for
a given school site. The final step in the calculation of an adequacy level under this
methodology involves adjustments to the adequate fiscal base that reflect: first,
geographical cost variations, such as those represented by district size and rural versus
urban environments; and second, the expenses related to procuring the additional
resources required to meet the exigent needs of special sub-groups of students, such as
those who live in poverty, those who are English language learners, and those in need
of special education (Picus, Odden, & Fermanich, 2003).
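The arithmetic behind these pricing-out and adjustment steps can be illustrated with a brief sketch. Every program element, dollar figure, geographic factor, and sub-group weight below is invented purely for demonstration and is not drawn from any actual professional judgment study.

```python
# Illustrative sketch of the professional judgment pricing-out process.
# All figures are hypothetical.

# Per-pupil cost of each panel-identified program element
program_elements = {
    "core teaching staff": 4200.0,
    "instructional materials": 350.0,
    "professional development": 250.0,
    "pupil support services": 600.0,
}

# Step 1: combine the priced-out elements into an adequate fiscal base
base = sum(program_elements.values())

# Step 2: adjust for geographical cost variation (e.g., a small rural district)
geographic_factor = 1.10

# Step 3: add weighted costs for special sub-groups of students
poverty_weight, ell_weight, sped_weight = 0.25, 0.20, 0.90
pct_poverty, pct_ell, pct_sped = 0.40, 0.15, 0.10

adjusted = base * geographic_factor
adjusted += base * (poverty_weight * pct_poverty
                    + ell_weight * pct_ell
                    + sped_weight * pct_sped)
print(round(adjusted, 2))  # prints 7128.0 with these invented figures
```

The two adjustment steps mirror the two categories of adjustment described above: a multiplicative geographic factor, and additive weights for special student sub-groups.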
The perceived strengths and weaknesses of the professional judgment approach both
stem from the fact that, like the evidence-based approach of Odden and Picus, it offers
some measure of linkage between a given funding level and what are perceived to be
effective educational programs and strategies (Odden & Picus, 2003). In this vein, one
common criticism directed toward the professional judgment methodology is that it
draws its conclusions as to the relative effectiveness, or ineffectiveness, of a given
educational strategy or program not upon sound educational research, which critics
hold offers educators sound data and quantifiable links between a given educational
strategy and student outcomes, but strictly upon what may, or may not, reflect best
practice, as determined simply by educational practitioners held to be experts (Odden &
Picus, 2008). A criticism has thus been leveled that the input and professional
judgment approaches lack statistical precision (WestEd, 2000).
State government sponsored studies employing the professional judgment approach
methodology have been conducted in Wyoming (1997), Illinois (1998), Oregon
(1997/2000), Kansas (2001), Maryland (2001), Kentucky (2003), North Dakota (2003),
Maine, and California (on-going). Similar, special interest group sponsored, professional
judgment studies have likewise been conducted in South Carolina (2000), Maryland
(2001), Nebraska (2002), Indiana (2002), Montana (2002), Colorado (2003), Missouri
(2003), Kentucky (2003), and New York (on-going). Individual researchers, meanwhile,
have also conducted their own professional judgment adequacy studies in Wisconsin
(2002) and Washington (2003).
A more sophisticated input methodology, the Resource Cost Model developed by
Chambers and Parrish, was utilized to calculate a foundation spending level for the
state school systems of Illinois and Alaska; however, at the time of this writing, the
proposals have yet to be implemented (Odden & Picus, 2008).
The Economic Cost-Function Approach
The economic cost function approach endeavors to quantify the question of what level
of per-pupil funding is required to attain a given level of student outcomes. The process
involves the implementation of an econometric technique, namely cost functions, and the
use of regression analysis. To this end, there are three elements which are generally
assigned as independent variables: the first is a given school district’s characteristics; the
second is the demographics and other characteristics of that district’s student body; and
finally, the performance outcome target levels. The dependent variable in this equation is
in turn represented by per-pupil funding levels (Odden et al., 2003). The result is held to
yield a figure which would represent an adequate per-pupil funding level for an average
district. Such a figure could then be further modified in relation to exigent
circumstances related to student need, the relative costs involved in procuring specific
educational resources, and variations in economies of scale. One would thus expect
expenditure levels to rise as one seeks higher
levels of student outcomes. Furthermore, projections of expenditure levels for large
urban districts, under the economic cost-function methodology, typically tend to run
double or even triple the base, per-pupil, over the expenditure figures projected for an
“average” district (Odden & Picus, 2008). Although this methodology is said to be
gaining popularity with economists, critics argue that it relies upon complex statistical
analyses, which are difficult for many in education to understand, and in turn explain to
stakeholders, and that it may rely upon assessment data that may, or may not,
accurately measure the desired level of student outcomes (WestEd, 2000).
Cost function studies directed by independent researchers have also been conducted in
Wisconsin (1997/2001) and Texas (1999/2001) by Reschovsky and Imazeki.
Meanwhile, New York saw research conducted by Duncombe and Lukemeyer
(2000/2003) which juxtaposed a cost function analysis with a resource cost study
related to staffing issues, and an empirical identification study (Baker, Taylor, &
Vedlitz, 2005). Although cost-function studies have been conducted in the states listed
above, to date, none of these states has seen fit to adopt this school finance adequacy
methodology (Odden & Picus, 2008).
The Successful School / District Approach
This school finance adequacy methodology begins by setting an acceptable or desired
threshold of student outcomes. Next, state standardized testing data is utilized to identify
districts that have yielded the desired level of student outcomes. From that group of
“successful” districts, those that also possess demographics and characteristics
resembling the state average are selected, and finally calculations are undertaken to
produce an average figure reflecting the per-pupil funding levels of these districts
(Odden & Picus, 2008).
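The steps just outlined (set an outcome threshold, identify districts that meet it, keep those resembling the state demographic average, and average their per-pupil funding) can be sketched in a few lines. The district names, scores, demographic figures, and dollar amounts below are all hypothetical.

```python
# Minimal sketch of the successful-district calculation, with invented data.
districts = [
    {"name": "A", "score": 82, "pct_low_income": 0.41, "per_pupil": 7900},
    {"name": "B", "score": 75, "pct_low_income": 0.44, "per_pupil": 7400},
    {"name": "C", "score": 81, "pct_low_income": 0.15, "per_pupil": 9500},
    {"name": "D", "score": 68, "pct_low_income": 0.42, "per_pupil": 6900},
    {"name": "E", "score": 79, "pct_low_income": 0.47, "per_pupil": 7600},
]

OUTCOME_THRESHOLD = 78          # desired level of student outcomes
STATE_AVG_LOW_INCOME = 0.43     # hypothetical state average
DEMOGRAPHIC_TOLERANCE = 0.05    # what counts as "resembling" the average

# Steps 1-2: keep districts that meet the outcome threshold
successful = [d for d in districts if d["score"] >= OUTCOME_THRESHOLD]

# Step 3: of those, keep districts whose demographics resemble the state average
comparable = [
    d for d in successful
    if abs(d["pct_low_income"] - STATE_AVG_LOW_INCOME) <= DEMOGRAPHIC_TOLERANCE
]

# Step 4: average per-pupil funding across the comparable successful districts
adequacy_estimate = sum(d["per_pupil"] for d in comparable) / len(comparable)
print(adequacy_estimate)  # prints 7750.0 with these invented figures
```

Note how district C, despite meeting the outcome threshold, is excluded by the demographic filter; this is the mechanism by which the approach screens out high-performing but demographically atypical districts.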
Advocates argue that the successful district approach represents a methodology that is
simple, straightforward, and understandable, and that it has the attribute of being based
upon data documenting levels of success “already in evidence at the identified
‘successful’ districts.” Critics, on the other hand, argue that there is currently
somewhat limited district expenditure data available upon which to base such
estimates, and that this approach is also reliant upon data from assessments that may
not measure the desired performance outcomes (WestEd, 2000).
The successful schools approach has been employed in state government sponsored
educational adequacy studies in the states of Mississippi (1993), Illinois (1996), Ohio
(1997), New Hampshire (1998), Louisiana (2001), Illinois (2001), Kansas (2001), and
Maryland (2001). Similar, special interest group sponsored studies employing the
successful schools methodology have been conducted in Colorado (2003) and Missouri
(2003) (Baker et al., 2005, pp. 29-31).
State sponsored successful schools research studies were conducted by Augenblick
and Meyers, Inc. in the states of New Hampshire and Illinois. District level studies were
conducted by Augenblick and Meyers, Inc. in Colorado, Illinois, Kansas, Mississippi,
Missouri, and Ohio. Coopers and Lybrand conducted a successful schools study which
was sponsored by the Illinois State Board of Education (Baker et al., 2005, pp. 29-31).
School site level studies, based upon the successful schools approach, were conducted
by Augenblick and Meyers, Inc. in New Hampshire, in Louisiana, and in Maryland. In
Colorado, Kansas, Maryland, and Missouri, Augenblick and Meyers conducted
side-by-side, simultaneous adequacy analyses juxtaposing the successful schools
approach with the professional judgment methodology (Baker et al., 2005, pp. 29-31).
The Evidence-Based or School Reform Approach
This school finance adequacy methodology strives to identify, and then attach a dollar
value to, the implementation of research and evidence-based educational strategies. One
of the strengths of this approach is that its researchers aggregate the costs associated with
such strategies in order to attain adequacy figures for the state, the district, and even the
school site level. The two cornerstones of the evidence-based methodology are the
identification of, and the calculation of, the fiscal demands which are associated with the
implementation of the research and evidence-based educational strategies which are most
likely to produce increases in student outcomes (Odden & Picus, 2008). Thus, one of
the strong suits of the evidence-based adequacy approach is that it offers
administrators, and others responsible for, involved in, or concerned with the budget
process, the kind of sound fiscal guidance that is applicable and relevant to school site
level decisions about how best to invest the available resources in order to achieve
desired outcome levels. In the words of one leading evidence-based researcher, Allen
Odden, what matters most in school reform is the existence of a “laser-like focus” on
improved performance on the part of school reformers (WestEd, 2000).
The evidence-based methodology might be characterized as eclectic, or perhaps called
a hybrid, since its origins can be traced to the absorption, and blending of, various
elements originally associated with the high-performance school approach, and the whole
schools designs which are associated with the New American Schools models (Odden &
Picus, 2008). As this evidence-based adequacy methodology has continued to mature
and evolve, a new term, the state-of-the-art approach, has come into vogue and is now
frequently associated with such techniques for determining adequacy.
Fundamentally, what the evidence-based, or state-of-the-art, approach offers to
educational leaders, politicians, the courts, and other stakeholders is a methodology
which: seeks to identify the various ingredients that data, research, and best practice
assert offer the greatest likelihood of raising outcomes; then strives to determine, as
accurately as possible, the funding levels commensurate with the implementation of
each of those ingredients; and ultimately seeks to calculate an all-inclusive, overall
funding level required to implement the sum of programs which, hopefully, will lead to
the desired outcome levels (Odden & Picus, 2008).
Advocates of the evidence-based or state-of-the-art approach would assert that two of
the greatest strengths associated with this methodology are that it: offers schools a
concrete blueprint for improving practice and raising outcomes; and likewise provides
administrators and other stakeholders with a clear picture of what resources are being
acquired relative to their funding levels (WestEd, 2000). Critics, on the other hand,
would argue that there is mixed evidence of success for many “research” or “evidence-
based” educational strategies, and would assert that there is mixed evidence as to the
“transferability” of given educational strategies across districts (WestEd, 2000).
State government sponsored, evidence-based adequacy studies have been conducted in
New Jersey (1998), Kentucky (2003), and Arkansas (2003). In New Jersey, Allen R.
Odden, in association with the University of Wisconsin and the Consortium for Policy
Research in Education, conducted state level research; while in the state of Arkansas,
Lawrence Picus and Associates conducted similar research and analysis. In Kentucky,
Lawrence Picus and Associates conducted a side-by-side, simultaneous adequacy
analysis which juxtaposed the evidence-based approach with the professional judgment
methodology (Baker et al., 2005).
Alternative Perspectives on the Various Methodologies,
the Various Classifications, and History of Adequacy Studies
Odden and Picus (2008) have classified school finance adequacy studies into four
general methodologies: the input or professional judgment approach; the successful
district approach; the cost function approach; and the evidence-based approach. Baker et
al. (2005) assert that, in the decade and a half following the Rose v. Council for Better
Education decision, public school adequacy studies have settled into three broad
categories of methodology: average expenditure studies; resource cost studies; and
statistical modeling studies.
In reality, the philosophies and methodologies of the successful district, the successful
schools, and the modified successful schools approaches place them very much within
the methodology of an average expenditure study; both the professional judgment and
evidence-based studies can be seen as falling within the definition of a resource cost
study; and the cost function and production function approaches can be seen as falling
within the statistical modeling category of K-12 public school finance adequacy
studies.
Generally speaking, Baker et al. (2005) have found that, when comparing the various
approaches named above, which span the three broad categories of adequacy studies, it
is the successful schools methodology that tends to produce the lowest estimates of the
funding levels required to provide an adequate education. Comparisons of the various
methodologies also reveal that the professional judgment methodology tends to call for
the highest funding levels to provide educational adequacy.
Economies and Dis-Economies of Scale
When considering the application of various adequacy approaches across districts and
across states, Baker et al. (2005) caution educators, policymakers, and other groups of
stakeholders, that the adequacy funding levels generated by the professional judgment
methodology often vary to a large extent. The researchers add a caveat concerning
variations in the cost of providing services attributable to district characteristics (such
as district size), and to the provision of special, or additional, services to student
sub-groups with special educational needs and requirements.
An example is given in which professional judgment studies conducted in Nebraska
and Missouri were juxtaposed. In the former, a district with a student body of four
hundred pupils generated costs some forty percent above the minimum; while in the
latter, a district with a student body of three hundred and sixty-four pupils generated
costs a mere nine percent above the established minimum. When one considers this
peculiarity of the professional judgment approach, a case can be made for the cost
function methodology, as it may tend to provide a more consistent adequacy figure
across districts. Baker et al. (2005) look to the
final two of the approaches discussed above, the evidence-based and the successful
schools studies, and seem to lament that they have, at least to date, failed to manifest such
mechanisms within their methodology.
In considering resource-oriented and performance-oriented adequacy studies, Baker et
al. (2005) assert that such adequacy study methodologies can be split into two major
categories based upon the types of data that provide the basis of their analysis. The
authors offer a brace of analogies, commonly invoked in school finance adequacy
discussions, to lend clarity: with resource-oriented analysis, one knows the mode of
transportation one is going to take, but is not exactly sure where one is going; with
performance-oriented analysis, one knows where one is going, and even has a good
idea of how much money it should require to get there, but is not quite sure of the best
way to get there.
Approaches which tend to focus upon educational resources (inputs), that is to say
those studies which prescribe inputs (resources) toward the attainment of identified sets
of outcomes (performance goals), may be considered to fall under the heading of
resource-oriented methodologies. Such a focus would include specific resource inputs
such as teaching staff, and other instructional resources such as computers and the allied
software programs which are necessary for the implementation, or support, of educational
programs. In contrast, Baker et al. (2005) classify those approaches which tend to focus
upon accepted measures of performance goals (outcomes), as being performance-oriented
methodologies. In light of our nation’s seemingly omnipresent standards and
accountability movement within education, such measures understandably hold great
interest for, and may be of prime concern to, today’s educational leaders, the courts,
politicians and legislators, and other stakeholders.
Two examples of what may be classified as pure performance-oriented analysis are the
successful schools approach and the cost function approach. Two examples of what
may be classified as purely resource-oriented analysis are the professional judgment
and the evidence-based approaches. Also employed in
school finance adequacy research are approaches which may best be described as blended
or hybrid methodologies. The modified successful schools approach and production
function analysis are two such methodologies which, by definition, involve analysis from
the realms of both performance outcomes, and inputs relating to resources and funding
levels (Baker et al., 2005).
Educational Productivity
In 2008, stakeholders and policymakers seek research, data, and expertise that can help
them quantify the levels of funding required to ensure that students can meet the
expectations set for them today by standards-based curricula. Stakeholders and
policymakers seek, from school finance experts, sage advice as to how and where to
best invest taxpayer dollars. More frequently than ever before, they are demanding,
when and wherever possible, empirical evidence and data showing how various levels
of funding, the manner in which those funds are utilized, and the nature of the various
resources those funding levels eventually procure, can be quantified in relation to
outcomes.
In essence, what policymakers and stakeholders desire of school finance researchers
today, involves the rather Herculean task of generating data to connect the dots between:
educational funding levels, and the expenditure levels that are tied to various educational
strategies and other resources; and the end product, educational attainment and student
achievement; as measured against today’s prescribed grade-level standards, by means of
today’s mandated accountability mechanisms (Odden & Picus, 2008).
Hanushek and Associates (1994) argued at the time that educational productivity
really involves a determination of how to induce ever higher levels of educational
outcomes through present levels of funding and resource allocation. In light of the fiscal
crisis currently facing many of the nation’s fifty states, such an outlook or approach to
school finance reform makes a great deal of sense.
When discussing educational productivity, still others claim that, even in today’s
difficult fiscal climate, the nation’s educational institutions are awash with money.
Such a viewpoint argues that the real issue facing our nation’s schools, and their funding
mechanisms, is one of “low levels of system performance” generated with “relatively
large levels of funding” (Hanushek & Associates, 1994). Such a productivity
perspective supports an evidence-based adequacy methodology, since it defines the
“core problem” facing K-12 public education in this country as determining how the
nation’s schools can most effectively reallocate current funding levels to secure the
most promising resources, in support of the most promising instructional strategies,
which will, hopefully, lead to increases in system performance. An evidence-based
adequacy approach fits well with such a perspective, since the methodology looks to
research-based evidence to support the implementation of instructional strategies that
will generate high levels of outputs.
Such inquiries and debates concerning educational productivity would appear to bring
the argument full circle, since the question “does money matter” seems tacitly
answered in today’s educational environment and school funding climate, given an
ever increasing focus upon fiscal restraint, and upon Solomon-like wisdom and
Midas-like judgments concerning how and where educational expenditures can best be
utilized to acquire the most effective resources in support of the most effective
instructional strategies. Such a quest for efficiency, effectiveness, results, and
performance outcomes, on the part of stakeholders and policymakers, argues for the
importance of how and where educational dollars are invested. Such concerns
therefore seem to make the answer to the question “does money matter,” in relation to
public education and school funding, self-evident, and in the affirmative.
Summary of School Finance Methodologies
Although the last fifteen years have seen the rapid creation, development, and
widespread implementation of various approaches that attempt to define and quantify
educational adequacy, this dissertation is most closely aligned with the state-of-the-art,
or evidence-based, studies most frequently associated with Dr. Lawrence Picus and Dr.
Allen Odden, which may be classified beneath the broader umbrella of resource cost
studies. As advocates of the state-of-the-art, or evidence-based, approach to public
school finance research, Dr. Picus and Dr. Odden have been leaders in school finance
research that has shifted the emphasis from the search for means to measure equity,
and to gauge remedies for differences attributable to the fiscal capacities of various
school districts, to school finance research concerned with the study of resource
allocations associated with the implementation of sound, research-based instructional
strategies designed to facilitate student achievement and allow all students, except
perhaps the most severely disabled, to meet prescribed grade-level standards.
Although these evidence-based, or state-of-the-art, adequacy studies have now been
undertaken in Arkansas, Kentucky, and New Jersey, each of these studies has been
somewhat limited by basing its assessment of overall, per-pupil funding needs
primarily upon a set of core instructional program expenses which are re-constituted, on
the basis of state averages, by the researchers. Non-core expenses such as technology,
safety and security, transportation, district administration, maintenance, and facilities
have, heretofore, simply been taken as fixed costs and accepted by the researchers in their
calculations of the overall, per-pupil dollar amounts arrived at within a given adequacy
study; this is certainly, as the researchers are the first to admit, a weakness, and it is
highly problematic. This shortcoming within the state-of-the-art and evidence-based
methodology represents a compromise which, heretofore, has been necessitated by a
dearth of research concerning how these various allied educational expenses can best be
quantified and then rolled into a given adequacy study for a given state, district, or
school site.
A knowledge gap therefore currently exists within educational practice, and a gap
currently exists within the literature. This gap in school finance knowledge concerns
how best to define, and quantify, technology related resource allocation requirements,
and the related expenses, within the existing data and methodologies associated with
ascertaining core instructional per-pupil adequacy funding and resource allocation
levels. The pursuit of such knowledge is important in order to provide a more complete,
and accurate, assessment of the funding levels required to meet the various state and
federal mandates called for in the nation’s current accountability and standards based
educational environment and political climate.
Educational Technology Resource Allocation
Technology Resource Allocation and Student Achievement
The year four report of the CEO Forum on Education and Technology (2001, pg. 1), a
five year partnership between business and education leaders committed to assessing and
monitoring progress toward the integration of technology into American K-12 public
schools, offered the following admonishment in regard to the “holy grail” issue of student
achievement, educational technology, and the unrecognized, unacknowledged, and
undocumented range of positive impacts technology has already delivered to the nation’s
K-12 student body:
Some critics wrongly dismiss the investment in educational technology as wasted
when test scores do not immediately improve. These critics do not consider that
technology was not deployed to fulfill educational objectives or that these
assessments do not accurately measure educational objectives. This is a
dangerous mistake. Our nation is already experiencing some of the benefits of
technology in education.
In reflecting upon the wisdom or validity of the statement above, consider that the CEO
Forum (2001, pg. 1) viewed this report as a “culmination and synthesis” of five years of
“exploration” on the impact of educational technology, and offered the following
counsel, and optimism, concerning the power of technological resources to effect change
within our national education system:
…the definition of student achievement must be broadened to include the 21st
century skills that will be required for students to thrive in the future. In the same
ways that technology helped revitalize American business, education technology
offers great promise for improving education.
In reflecting upon the state and status of educational technology in their year four final
report, and at the conclusion of their existence as a body, the CEO Forum (2001)
articulated four key findings:
Education technology can improve student achievement;
Technology can have the greatest impact when integrated into the curriculum to
achieve clear, measurable educational objectives;
Assessment is not currently aligned with educational objectives, or adequately
measuring 21st century skills;
Measurement and continuous improvement strategies have not been widely
implemented in schools and districts.
There is no doubt that the body of research concerning educational technology and
student achievement is at best mixed, and perhaps better termed as contentious.
However, it is quite important to keep in mind that if we are measuring student
achievement strictly in terms of the current, federally mandated, high stakes instruments
of standardized testing, this amounts to a very narrow definition of student achievement.
When stakeholders engage in any discussion concerning student achievement in relation
to educational technology, they must keep in mind that the current crop of standardized
testing instruments was never designed to measure the range of 21st century
educational benefits that technology has undoubtedly already delivered to our nation’s
K-12 student body.
Technology Resource Allocation and its Relationship to Instruction and Teachers
Marshall (2002, pg. 19) asserts that there is good evidence supporting the premise that
educational technology complements what great teachers tend to do naturally. He offers
that “technology is not the solution for all that ails” by suggesting that “we must
increasingly rely upon the teacher’s expertise to craft blended opportunities for students
to learn”. The U. S. Department of Education, Web Based Education Commission (2000,
pg. 39) expounds a similar belief in the crucial role of the teacher in manifesting the
effective implementation and use of technological resources in the classroom:
It is the teacher, after all, who guides instruction and shapes the instructional
context in which the Internet and other technologies are used. It is a teacher’s
skill at this, more than any other factor, that determines the degree to which
students learn.
The commission’s report also ties the critical role of our nation’s teachers to an
imperative for a sound technology professional development program:
We must train the nation’s teachers—and the principals and administrators who
lead them—or investments in high tech educational resources will be wasted.
Teachers are the key to effective use of web-based tools and applications, but first
they must become skilled at using them.
In the white paper “Learning With Technology: Evidence That Technology Can and
Does Support Learning”, Marshall (2002, pg. 18) furthers this argument involving the
essential nature of teachers, and the imperative for sound professional development, by
insisting that the implementation and use of technology must be in accord with best
practice in order to be effective:
Effective classroom use involves planning and purposeful application of
technology and the content it delivers to learning objectives and instructional
pursuits. In the classroom, this responsibility falls largely on the teacher. The
teacher is the gatekeeper – to instruction, technology, and learning.
In another white paper, generated pursuant to an extensive review and analysis of the
published research involving technology integration and its effect upon learning
outcomes for the nation’s K-8 students, Waddoup (2004) asserts that, in addition to
sound technology design, effective technology implementation, and ongoing integration,
the teacher, and his or her ability to engage in sound curriculum design leading to the
successful integration of technology, is critical to the effectiveness of technological
resource allocations. Waddoup, who compiled and evaluated the
findings of 34 different studies which met the standards associated with either a gold or a
silver level research study, arrived at “11 recurring themes” he considered associated
with the use of technology to improve student learning, which were “synthesized” into
the “four technology integration principles” noted above. The critical role that the
teacher plays in the effectiveness of technology resource allocation is addressed in the
first of the four Waddoup (2004, p.4) categories, Teachers:
Teachers, not technology, are the key to unlocking student potential and fostering
achievement. A teacher’s training in, knowledge of, and attitudes toward
technology and related skills are central to effective technology integration.
Technology is the tool whose master greatly shapes the outcome. In the hands of
a poorly trained master, technology is ineffectual, a blunt instrument or worse.
The findings above, which define the teacher as playing a critical role in the
ultimate success of technology resource allocation, are echoed by Michael Fullan, who
asserts that “the more powerful that technology becomes, the more indispensable good
teachers are” (Fullan, 1998).
However, the research of Larry Cuban (Cuban, 1999, 2002, 2004; Cuban, Kirkpatrick
& Peck, 2001) highlights existing problems that the nation’s teaching force faces
involving the absence of the meaningful and sustained professional development, and of
the critical release time, required to design and then integrate technology within a given
curriculum. Cuban’s research, undertaken at the turn of the millennium, indicates that
almost half of America’s teachers never use computers within their classrooms; fewer
than 20% of the nation’s teachers are serious users of computers within their classrooms
(several times a week); and three to four in ten are said to be only occasional users
(approximately once a month). Why? First, Cuban points to the widespread practice
whereby teachers are seldom consulted for their opinions concerning their classroom
technology requirements, during either the selection or the implementation process.
Second, Cuban points to the abysmal, or even non-existent, levels of professional
development afforded to teachers in order to gain proficiency with both the technology,
and the various integration strategies, required to make the use of the technology
practical and efficient.
It is important to put the K-12 educational technology environment during the period
in which Cuban was conducting his survey of teacher use of technology into perspective.
In its first year report, “School Technology Readiness: From Pillars to Progress”, the
CEO Forum on Education and Technology reported that roughly sixty percent of
American schools lacked the technology required to prepare students for the 21st
century, and that only 3% of the nation’s schools had achieved complete integration of
technology into classrooms (CEO Forum on Education & Technology, 1997).
21st Century Literacy
Although questions concerning the existence of a positive relationship, or correlation,
between technology and student achievement within the traditional classroom
instructional model and curriculum are an important consideration for stakeholders, they
are by no means the only measure by which stakeholders should judge the value, worth,
and wisdom of incorporating technology into our nation’s public school system.
Another, perhaps even more compelling, argument for the inclusion of technology in our
schools involves the beliefs of legions of stakeholders, including political and business
leaders, who today would assert that equipping our students with the 21st century tools,
and skill sets, they will require to live full and productive lives within an ever more
technologically centered society and culture is of prime importance and should be a
prime directive.
When one surveys the nature of the concerns and opinions of these political and
business leaders as they relate to public education, a cynic might even arrive at the
conclusion that their interests are far less related to what is in the best interest of the
individual than to the need for a more highly skilled and competitive workforce and,
ultimately, to the strength of the nation’s economy and its competitiveness within a
global economy. These concerns are certainly present at the federal level, where
numerous studies have been conducted, countless panels or commissions have been
assembled, and an estimated three billion dollars annually is being invested in
educational programs involving what has been coined by Judith A. Ramaley, the former
director of the National Science Foundation’s education and human-resources division,
as STEM: science, technology, engineering, and math (Editorial Projects in Education,
2008). However, the most fervent calls to action may be taking place at the state level,
where many of the nation’s governors, whose states have been hard hit by job losses in
the manufacturing sector, are currently focused on thinking about “what the 21st century
economy is going to look like in their states, and how the K-12 systems in their states can
contribute to that economy” (Cavanagh, 2008, p. 10).
One example of this level of concern is illustrated by a policy guide issued by the
National Governors Association (NGA) (2007a) on K-12 STEM education. The guide
urged the nation’s governors to promote the following three key public education goals
within their states:
Align state K–12 standards, assessments, and requirements with postsecondary
and workforce expectations for what high school graduates know and can do;
Examine and increase the state’s internal capacity to improve teaching and
learning, including the continued development of data systems and new models to
increase the quality of the K–12 STEM teaching force;
Identify best practices in STEM education and bring them to scale, including
specialized schools, effective curricula, and standards for Career and Technical
Education (CTE) that prepare students for STEM-related occupations.
The overarching philosophy guiding these state level policy recommendations is one
of preparing our students for the 21st century cultural, occupational, and societal
environment in which they will be forced to exist and, hopefully, prosper. The NGA
Center for Best Practice (2008, pg. 1) characterized their educational goals in terms of the
need to structure educational practice to achieve the following outputs:
This requires schools with challenging math and science courses for every student
that emphasize how math, science, and technology shape our world and where
faculty and students work “outside the school walls” to investigate solutions to
real world problems. These schools will prepare all students for success after
high school, regardless of whether they specialize in STEM fields or not.
Political leaders, however, are not the only stakeholders who are asserting that the
nation’s public schools must provide our student body with access to technological
resources, and the related instruction required, in order to assure proficiency leading to a
mastery of 21st century skills. A 2006 public opinion poll commissioned by the NGA,
and conducted by the renowned pollster Dr. Frank Luntz, found that nine out of ten
Americans hold a belief that if our nation fails to innovate, our children and our economy
will be left behind (NGA, 2007b). The Luntz polling found that, while Americans believe
we have the most innovative nation in the world at the moment, still ahead of China and
Japan, they see America losing its edge in this area over the next twenty years unless we
begin to act today. This was reflected in the response that Americans believe other
nations are more committed to education (NGA, 2007b). Queried as to “what action
would have the most positive impact on our economy”, 46% responded “encouraging and
supporting innovation in our schools and businesses” (NGA, 2007b, pg. 2).
The Luntz (2007) NGA poll, Americans Talk Innovation: A Survey for the National
Governors Association, also yielded the following stakeholder responses when queried as
to the importance of innovation, the role of technology, and its place in public education.
When asked “What technology or invention over the past two decades or so do you think
is the single best example of innovation that has improved your quality of life”, 41%
asserted that it was computers – laptops – computer software; while 15%, the second
highest response, named the internet – broadband. Presented with the question, “When
you hear the word ‘innovation’ what do you think of”, 63% chose technology; 36%
chose science; 31% chose education; 25% chose computers; 19% chose business; and
13% chose manufacturing. When asked, “Which of the following areas would you most
like to see the greatest attention to innovation”, the largest response registered by the poll
was a 56% figure which asserted the field of education, while healthcare scored the
second highest response at 53%, followed by technology, which scored a 24% response.
When Luntz framed one question, “Which is closest to your opinion”: “The best action to
increase global competition is to encourage more innovation in American education,
manufacturing and technology” or “The best action to increase global competition is to
protect American jobs and industries by tightening the rules and increasing the costs for
foreign companies to sell their products”, 68% of Democrats, and 64% of Republicans,
chose innovation and technology over protectionism. The message of educational
stakeholders, in the form of the American
public, is clear: they favor reform and innovation in our public schools and believe that
technology and 21st century skills are required in order for our children, and our nation,
to compete in the global economy. The Luntz polling instrumentation employed for the
innovation attitude study included two instant response dial sessions and a nationwide
public opinion survey of 750 people, and had a margin of error of +/- 3.7% (NGA, 2007c).
A similar survey, commissioned by the Partnership For 21st Century Skills (2007),
supports the accuracy of the Luntz survey in gauging the overwhelming support of
Americans for the presence of technology, and the need to engender 21st century skills,
within today’s, and tomorrow’s, public schools. This national survey was conducted by
Peter D. Hart Research Associates and Public Opinion Strategies, encompassed 800
registered voters polled between September 10th and 12th of 2007, and had a margin of
error of +/- 3.46%. To summarize the survey:
Ninety-nine percent of those surveyed believe that teaching 21st century skills is
important to our nation’s future economic success;
Eighty-eight percent, almost nine in ten of those surveyed, hold the belief that
schools can, and should, incorporate computer and technology skills, and other
related 21st century skills;
Eighty percent of those surveyed believe that the types of things students need to
learn in their K-12 schools are different from those of just 20 years ago;
Sixty-six percent of those surveyed hold the belief that our nation’s schools need
to integrate a broader range of skills into the curriculum because today’s students
need to learn more than the basics of reading, writing, and math;
Fifty-three percent of those surveyed believe that schools should place an equal
emphasis on 21st century skills and basic skills.
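As a side note on the polling figures above, the reported margins of error are consistent with the standard 95% confidence formula for a simple random sample. The short check below is illustrative only; the formula is a textbook statistical approximation, not a description of either pollster's actual methodology:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% confidence margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hart Research / Public Opinion Strategies survey: 800 registered voters
print(round(margin_of_error(800) * 100, 2))  # 3.46, matching the reported +/- 3.46%

# Luntz/NGA survey: 750 respondents (reported as +/- 3.7%)
print(round(margin_of_error(750) * 100, 2))  # 3.58, close to the reported figure
```

The p = 0.5 default is the conservative (worst-case) assumption pollsters conventionally use when reporting a single margin of error for an entire survey.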
Many other local, state, and federal organizations are aligned with, and support, the
call for a broader role for technology within American schools. Likewise, academic
policy research also tends to support the call to expand the presence, and role, of
technology, and widely supports the premise that there is an imperative for 21st century
skills within our schools, amidst a burgeoning global economy which is increasingly
competitive. In “Transforming Learning for the 21st Century: An Economic Imperative”,
Dede, Korte, Nelson, Valdez, and Ward (2005, pg. 14) submit that the challenges facing
K-12 education in 2008 involve a cocktail of imperatives which concern the well being of
both the individual and the nation, calling for a new approach to curriculum and learning:
Quality of life, personal and national security, and economic development are the
principal drivers for the agendas of most nations today. A 21st century education
is the fundamental driver for achieving these interrelated goals.
Blomstrom, Kokko, and Sjoholm (2002) point out that there is no need to speculate as to
the relative effectiveness of a 21st century focus in K-12 education since results have
already been witnessed in such countries as South Korea, Ireland, and Finland
that have “advanced their global economic reach by investing in education that leverages
learning technologies to prepare sophisticated workers”. Dede, et al. (2005) propose the
formation of “powerful regional partnership and networks to align education and
economic development” to meet the 21st century needs of both students and commerce.
They offer the following 21st century K-12 educational initiatives surrounding what they
have characterized as “an intersection of economic and educational development”:
Preparing students for 21st century work and citizenship;
Helping teachers and principals prepare their students for 21st century work and
citizenship;
Building a cadre of business, policy, and education leaders;
Creating a standards-based technology infrastructure;
Sponsoring research.
Technology Mandates Under The No Child Left Behind (NCLB) Act of 2001
No Child Left Behind (NCLB) includes a mandate that each state must be able to show that
“every student is technologically literate by the time the student finishes the eighth grade,
regardless of the student’s race, ethnicity, gender, family income, geographical location,
or disability” (U.S. Department of Education, 2001, pg. 1672). Our nation’s public
schools, at least those that accept Title I funding, are thus, by federal law, committed to
investments in 21st century technology skills. Therefore, those who would continue to
engage in provincial arguments debating the wisdom, or judiciousness, of educational
dollars invested in the arena of technology, especially arguments based largely upon the
perpetual debate over, and quest for, an “irrefutable”, “undeniable” link between
increases in “student achievement” and technology resource allocation, are engaging in
an argument which is, by and large, moot.
In the spirit of, and in keeping with, this federal mandate, there are currently 26 states
which have created and enacted distinct, stand-alone technology standards. Sixteen
states currently have technology standards which are integrated into standards for other
subjects; fifteen of those sixteen states, in fact, have gone so far as to embed their
technology standards into the four core subjects of history, English, math, and science.
Finally, there are six states which have elected to create a matrix that includes both
strategies of implementing technology standards. Only Delaware, Iowa, and the District
of Columbia still lack any technology standards. To date, however, Arizona, Georgia,
North Carolina, Pennsylvania, and Utah stand as the only states to also incorporate a
state-administered assessment component to gauge their students’ competencies related
to the technology standards that they have enacted (Education Week, 2008).
Data Driven Decision Making (DDDM)
With the implementation of the No Child Left Behind (NCLB) Act of 2001, which
instituted the modern era standards and accountability movement, the collection, analysis,
tracking, and reporting of educational data has become an imperative in American
schools. A quote from a Mid-continent Research for Education and Learning (2003, pg.
1) report sets the stage nicely for a discussion about the place of Data Driven Decision
Making in today’s K-12 educational environment: “Educators in schools that sustain
improvement know that gut feelings, instincts, and anecdotes are poor substitutes for
empirical data when important decisions need to be made”. In the same vein, the
Consortium on School Networking (CoSN), through its Data Driven Decision Making
initiative, Vision to Know and Do, asserts that the mandates of NCLB necessitate that
administrators today distance themselves from antiquated decision making practices, and
embrace a data driven decision making culture through which policy and instructional
strategy decisions flow, in order to provide successful instructional strategies that lead to
increases in student achievement and other desired outcomes. The Consortium on School
Networking (2008a) points to the wisdom and benefits of such a shift in decision making
strategy by stating the obvious, “Decisions in school districts have been made according
to tradition, instinct, and regulations”, and by identifying the perks of DDDM: “More
access to better information enables educational professionals to test their assumptions,
identify needs, and measure outcomes”. The Consortium on School Networking (CoSN)
argues that such a linkage, one between DDDM and the modern accountability and
standards movement, is not only a natural fit, but one that has become essential now that
the state’s school sites are ranked and evaluated under California’s Adequate Yearly
Progress (AYP) accountability program and are, for better or worse, often perceived and
labeled as either successful or unsuccessful based upon that methodology. The
Consortium on School Networking (2008a) states this belief succinctly: “With the right
data at the right time to inform decisions about resources, grouping, and instruction,
schools are more likely to meet their Adequate Yearly Progress (AYP) requirements and
comply with NCLB”.
A broad, rather simplistic definition of DDDM is “the use of data analysis to inform
when determining course of action involving policy and procedures” (Picciano, 2006, p.
79). For the narrower purposes of this study, which is focused upon technology resource
allocations and student achievement, perhaps a better definition of DDDM would be “the
use of student assessment data and relevant background information, to inform decisions
related to planning and implementing instructional strategies at the district, school,
classroom, and the individual student levels” (California Learning Resource Network,
2008).
DDDM is important within today’s standards and accountability based K-12
educational environment since “research shows that if instructional plans at the state,
county, district, school, classroom, and individual student levels are based on assessment
information relevant to the desired outcomes for students, the probability is increased that
they will attain these desired learning outcomes” (California Learning Resource Network,
2008).
The allure and the promise of DDDM, within our standards and accountability based
K-12 educational environment, is as follows. DDDM facilitates the ability of teachers
and administrators to align curriculum, standards, and instruction. It allows student
assessment measures, which have been aligned to state standards and other desired
student outcomes, to be archived, and progress to be tracked. DDDM also allows
educators to input a wealth of student level data beyond these academic assessment
measures, student demographics, socioeconomic status, behavior, attendance, parental
involvement, etc., for a more complete picture of student achievement. Finally, DDDM
66
allows educators to design and implement interventions when and where they are needed
to bring students up to proficiency in any academic areas in which they may be falling
behind. This is an invaluable tool for the classroom teacher and for the principal.
This alignment between state standards and the assessment measures employed through
DDDM allows both teachers and administrators to identify knowledge gaps or academic
subject areas where students are not meeting standards, and then design an appropriate
intervention for a student, a class, or an entire grade level. The ability to illuminate such
academic needs in a student, or a group of students, design a tailored academic
intervention, and then retest to assure academic success, is an invaluable tool, a “cycle of
inquiry”, for increasing the probability that learning is occurring at a school site, that
student outcome goals are being met, and that a given school will indeed meet its required
AYP (California Learning Resource Network, 2008).
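The “cycle of inquiry” described above can be sketched as a simple data routine. The sketch below is hypothetical: the scores, the proficiency cutoff, and the intervention label are all invented for illustration, and no particular DDDM product or assessment is implied:

```python
# Hypothetical sketch of a DDDM "cycle of inquiry": flag students below a
# proficiency cutoff on a standards-aligned assessment, assign an intervention,
# then retest. All names, scores, and thresholds here are invented.
PROFICIENCY_CUTOFF = 350  # hypothetical scaled-score cutoff

students = [
    {"name": "A", "score": 320},
    {"name": "B", "score": 410},
    {"name": "C", "score": 298},
]

# Step 1: identify students not meeting the standard.
needs_intervention = [s for s in students if s["score"] < PROFICIENCY_CUTOFF]

# Step 2: assign a tailored intervention to each flagged student.
for s in needs_intervention:
    s["intervention"] = "small-group reteach"

# Step 3: after the intervention, retest and re-check proficiency
# (these post-assessment scores are stand-ins for real retest data).
retest_scores = {"A": 355, "C": 340}
still_below = [n for n, score in retest_scores.items() if score < PROFICIENCY_CUTOFF]
print(still_below)  # students who need another cycle -> ['C']
```

The point of the sketch is the loop itself: identify, intervene, retest, and repeat for any student still below the cutoff.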
Interestingly, since DDDM student level aggregated data concerning one’s strengths
and weaknesses is commonly made available to students under such programs, in order
for them to continually monitor their own academic standing, the CoSN research report,
Vision to Know and Do: The Power of Data as a Tool in Educational Decision Making
(2008b), has additionally cited the benefits of such data manipulation skills by students
“as a way to prepare students for the 21st century”.
Significant impediments and barriers to the effective use of DDDM data can and do
exist, however, as was reported in both the 2004 CoSN commissioned survey conducted
by Grunwald and Associates and the CoSN Data Driven Decision Making initiative.
What is not surprising is the refrain commonly heard when discussing most
school technology projects of any kind: a lack of professional development time and
dollars; a hodgepodge of hardware and software to work with; and a situation in which
one must make the best of outdated equipment, systems and networks, due to inadequate
replacement cycles. The CoSN (2008a) research data found that: fifty percent of
respondents cited a “lack of training” as a barrier; forty two percent identified
“interoperability”, or “systems that are unable to share or exchange data”, as a barrier;
thirty one percent of respondents cited “outdated technology” or so called “legacy
systems” as a barrier; and twenty two percent of respondents cited a “user interface”
which was “too complicated to understand” as a barrier.
Datnow, Park & Wohlstetter (2007, pp. 5-7), in an examination of “performance-driven”
school systems, engaged in case studies focused upon four school systems that
had been “identified as leaders in data-driven decision making”. Their research
concluded that, although each and every school district, and school, approached DDDM
differently, “sustaining a culture of continuous improvement through the use of data-
driven decision making requires a continual investment in data management resources,
including both human and social capital”.
As a part of its agenda, CoSN, through its Data Driven Decision Making
initiative, Vision to Know and Do, has now created an on-line, district level, self
assessment tool for educational administrators. To view or use the CoSN Data Driven
Decision Making tool, visit: http://3d2know.cosn.org/assessment/survey.cfm.
Costing Out Methodologies and Cost/Benefit Analysis
Case studies and research show that in the past, and even to this day, many if not most
K-12 organizations faced with the prospect of implementing a technology program do so
with a fragmented, short-sighted approach that is too narrow in its consideration of the
investments required to sustain such a program. Such organizations
typically fail to take into consideration the long term, protracted costs required, especially
in the areas of network maintenance and professional development. School
administrators are however increasingly aware of, and coming to depend upon, a costing
out methodology borrowed from the business world, called Total Cost of Ownership
(TCO) that seeks to include all costs, short term and protracted, required to fund a
successful, sustained technology program. Another aligned and interrelated business
world analysis, Return on Investment (ROI), has also recently been adapted to the K-12
environment, and has been given the educational goal related term of Value of
Investment (VOI).
Rich Kaestner (2007), the director of the Total Cost of Ownership and Value of
Investment Projects for the Consortium for School Networking (CoSN) in Washington,
DC, poses the following questions surrounding technology resource allocations to K-12
school administrators:
Is your school district leveraging its technology effectively?
Do the costs justify the benefits?
What does your technology infrastructure cost in terms of money and time?
Kaestner then laments that, in his experience, few administrators are capable of
answering these fundamental questions with any degree of certainty. Kaestner therefore
asserts that most school administrators tend to base their technology decisions upon
"perceptions of value", rather than upon a solid understanding, in any "measurable
terms", of the "real costs" of developing and maintaining computer networks or of the
expected benefits surrounding their planned and proposed school technology projects.
This begs the question: what methodologies or tools are currently available to the
school administrator to quantify the beliefs, theories, and perceptions surrounding
the costs and benefits associated with technology resource allocation? Total Cost of
Ownership (TCO) and Value of Investment (VOI) are two interrelated and complementary
concepts which can offer school administrators such a means to an end.
Total Cost of Ownership (TCO)
Total Cost of Ownership (TCO) is a concept and a methodology that was borrowed
from the corporate and business world whereby organizations endeavor to measure all of
the costs associated with the implementation, and the long term maintenance of,
technology resources. Kaestner (2006) explains that TCO allows the administrator to
examine three major expense categories, “annualized technology costs, direct labor, and
indirect labor”. He also offers the following definitions of these three categories:
Annualized technology costs as, "amortized costs of client desktop/laptop
computers and devices, network equipment, servers, software, printers, supplies
and external service providers";
Direct labor costs as, "burdened costs for all personnel who have responsibility
for buying, implementing, maintaining, and managing the technology
infrastructure". This category includes "Teachers and other school staff who
provide tech support", as well as "outsourced services";
Indirect labor costs as, "time users spend in training and dealing with system and
application issues that affect productivity".
Total Cost of Ownership is therefore said to be an essential concept and methodology
for answering the fundamental question of what a technology infrastructure is costing
an organization.
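As a rough illustration of how Kaestner's three expense categories combine into a single annual figure, the following sketch computes a TCO estimate from hypothetical inputs. All dollar amounts, useful-life assumptions, and the even (straight-line) amortization are illustrative choices by this writer, not figures drawn from CoSN's tools or from the study:

```python
# Illustrative TCO calculation. All figures are hypothetical; the three
# groupings follow Kaestner's (2006) categories: annualized technology
# costs, direct labor, and indirect labor.

def annualize(purchase_cost, useful_life_years):
    """Amortize a capital purchase evenly over its useful life."""
    return purchase_cost / useful_life_years

# Annualized technology costs (capital items amortized, plus recurring items)
annualized_tech = (
    annualize(120_000, 4)   # desktop/laptop computers on a 4-year cycle
    + annualize(30_000, 5)  # network equipment and servers on a 5-year cycle
    + 8_000                 # annual software licenses
    + 3_000                 # printers and supplies
    + 6_000                 # external service providers
)

# Direct labor: burdened costs for all personnel who buy, implement,
# maintain, and manage the infrastructure (including outsourced support)
direct_labor = 45_000

# Indirect labor: estimated value of time users spend in training and
# dealing with system and application issues that affect productivity
indirect_labor = 12_000

total_cost_of_ownership = annualized_tech + direct_labor + indirect_labor
print(f"Annual TCO: ${total_cost_of_ownership:,.0f}")
```

The point of the sketch is that the visible hardware line items are only part of the total; in this hypothetical case, labor accounts for more than half of the annual cost.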
Value on Investment (VOI)
Value on Investment (VOI) is a concept and methodology that was also borrowed
from the corporate and business world; however, Kaestner (2007) points out that the
yardstick by which one measures costs and benefits in the business world, versus in the
academic world, is quite different:
Businesses use processes like Return on Investment and Net Present Value to
project costs and benefits of proposed projects with an eye to increasing bottom-
line or top-line dollars. In contrast, educators focus on addressing non-monetary
goals and mandates such as improved student performance, equity and 21st
century skills.
Therefore, in practice, once a given organization has performed its TCO analysis,
which examines an established or planned technology program's costs in their entirety,
a VOI analysis is next employed in order to juxtapose those projected costs against the
related benefits of said technology program. Value on Investment is therefore said to
be essential to answering the fundamental question of what is the best direction, or
manner, for an organization to proceed with its future programs and investments in
technology. In VOI one is faced with the difficult task of attempting to correlate
technology resource allocations to student achievement. VOI goes a step beyond the
traditional Return on Investment (ROI) model from the business world by also taking
into consideration a range of qualitative factors associated with student achievement.
Such qualitative benefits, it has been asserted, "need to positively affect school
mission, goals, mandates or other imperatives" (edtechvoi.org, 2006). Some examples of
such positive qualitative benefits would include: the relationship between desired
outcomes, curriculum, and technology; an alignment with a given school's objectives;
equity issues, ensuring learning across all subgroups of students; and providing 21st
century skills. To assist school administrators in performing TCO and VOI assessments
for their organizations, the Consortium for School Networking (CoSN) has developed
three tools, which are available online at: http://classroomtco.cosn.org/gartner_intro;
http://www.edtechvoi.org/methodology/; and
http://www.edtechvoi.org/assessment/index.cfm.
CoSN-Gartner K-12 TCO Tool (a full scale TCO analysis tool)
CoSN Project Cost Estimator Tool (an abbreviated TCO analysis tool)
CoSN K-12 VOI Assessment Tool
The Institute for the Advancement of Emerging Technologies at AEL offers K-12 school
administrators access to yet another online, web- and Excel-spreadsheet-based K-12 TCO
calculator at http://129.71.174.252/tcov2/start.cfm.
CHAPTER THREE
METHODOLOGY
Introduction
The purpose of this study is to define and quantify the nature of, and required levels
of, technology resources at the school level in order for students to meet standards and
prescribed levels of educational attainment. Principals, and select members of their
teaching staff, were queried as to their professional opinions concerning the
implementation, best practice, and the effectiveness of technology resource allocations in
raising student achievement.
Taken in concert with aligned, and similar, studies currently being conducted as
thematic dissertations at the University of Southern California under the supervision of
Dr. Lawrence Picus, the goal of this study is to advance the literature and improve
practice by providing a more accurate picture of school site level per pupil funding
adequacy by defining, and quantifying, aspects of those related educational costs which
have heretofore been undefined, and have traditionally remained unquantified, in
resource-based or state-of-the-art studies and research.
This successful school study was conducted to examine these levels of technology
resource allocations, their usage, and their effectiveness relative to student achievement,
in a select group of elementary schools located in the state of California. This was
accomplished by means of two distinct survey instruments: (a) a researcher designed
survey, in the form of a self-administered questionnaire, which was used to collect data
from a sample; (b) a School Technology Survey (STS), that is a joint undertaking
between the California Department of Education (CDE) and the California Technology
Assistance Project (CTAP), which California schools complete for the state annually and
is retrievable from a state database. The researcher asserts that this mixed methods,
successful school study research approach was appropriate for this study. The school site
questionnaires, which were cross referenced with the California Department of Education
(CDE) database during the analysis phase of the study, were seen as the most likely
method of instrumentation to provide the researcher with the desired breadth, and depth,
of data and other information concerning technology resource allocations at these school
sites.
This study was conducted at the school site level and answers the three research
questions:
1. How should technology be deployed and employed, at the school site level, in
keeping with best practice?
2. Does technology, deployed and employed in keeping with best practice, provide
school sites an effective vehicle for raising student achievement?
3. What levels of resource allocations are required to establish, and sustain, an
effective technology program at the school site level?
This chapter restates the purpose of the study and provides a description of the school
sites examined. It also provides a detailed description of the survey instrumentation, the
methods by which the survey instrumentation was implemented and by which the data
was collected, and detailed information concerning the processes by which data analysis
was conducted in order to answer the research questions posed.
Research Population
The population selected for the study consisted of 362 California public elementary
schools that were designated by the California Department of Education (2008) as being
fourth or fifth year Program Improvement (PI) schools pursuant to Title I of No Child
Left Behind (NCLB) during the 2006-2007 school year. Schools which have been
designated as PI, year four, have failed to make Adequate Yearly Progress (AYP) for a
period of at least five years.
The researcher selected this population for the study because of the dire academic
situation of these elementary schools, and the high-pressure environment that this group
of principals and their teaching staffs were facing. A school site in year four PI status is not
only a serious concern to students, parents and other stakeholders, but this status has
serious repercussions for the school site administration and staff. The districts which are
responsible for such schools are required by federal law to draft a restructuring plan for
the following school year, which these school sites would be forced to implement, should
the site once again fail to meet their AYP growth target. The restructuring such schools
face entails fundamental reforms, and a major reorganization of the site, that includes the
site’s staffing and governance.
The researcher hypothesized that these principals, as the de facto instructional leaders
of their sites, would be desperate for resources and strategies that could boost student
achievement and raise the AYP of their schools. Undoubtedly, they would be seeking and
reaching out for proven methodologies, examples of best practice, research driven
educational strategies, and additional resources. The researcher therefore wanted to
know if technologically based resources entered into the equation. Did such resources
ultimately assist these principals, and these school sites, to raise student achievement,
meet their AYP, and exit program improvement?
The sample was narrowed from this population of year four or five PI schools to include
only the 28 California elementary schools that succeeded in exiting PI by meeting their
AYP growth targets at the conclusion of the 2006-2007 school year. Twenty-five of these
elementary schools were invited to participate in the research study. However, the
district administrators who oversee thirteen of the sites declined to allow their
schools to participate. Twelve California year four or year five PI elementary schools
ultimately agreed to participate in the study. This study therefore collected and analyzed
school site level data at 12 elementary schools, as described above, scattered across the
state of California. The 12 schools that ultimately agreed to participate in this study are
contained within a total of nine distinct California school districts. Each of the 12
elementary school sites was subsequently forwarded the survey instruments, and all 12
of the school sites that agreed to participate completed the survey instruments and
returned them to this researcher. The participants of this study were the 12 school
site principals, and the two teachers at each site whom the principal was asked to
select to participate in the study. The principal was asked to select one "upper
grade” and one “lower grade” teacher at their site who they considered to be a
“technology leader” or well versed in educational technology.
Instrumentation
The study employed researcher-designed, self-administered survey questionnaires that
were mailed to the participating school sites scattered across California, completed at
the sites, and returned to the researcher over a five-month period. This process unfolded in
the following manner. Once the approval for the study had been received from each of
the nine school district administrative offices, a large United States Postal Service
(USPS) Priority Mail envelope was sent to the principal at each of the 12 school sites.
The packet the researcher assembled contained:
A letter of introduction directed to the principal that detailed the focus of the
study, the method by which school sites were selected, and a request for their
participation. The introductory letter also informed the principal that the
confidentiality of all participants, the school sites, and the districts that
participated in the study would be protected. The letter explained that all schools,
all staff members, and all school districts discussed in the study would only be
referenced through a completely random assignment of the various letters of the
alphabet. The introductory letter also requested that the principal select one upper
and one lower grade teacher at their school site, who they judged to be either a
leader or quite proficient with educational technology, to complete a teaching
staff survey questionnaire;
A copy of a researcher designed district level permission form that had been
signed by either the superintendent or area administrator granting permission for
the school site to participate in the study;
The principal survey questionnaire instrument that included instructions on the
various procedures by which the researcher desired the survey be filled out;
Two teacher survey questionnaire instruments, one upper grade survey
questionnaire and one lower grade survey questionnaire, that included instructions
on the various procedures by which the researcher desired the surveys be filled
out;
A self-addressed, pre-paid, USPS Priority Mail large Tyvek envelope in which to
return the completed survey instruments to the researcher.
The design and composition of the researcher-designed school site survey
instruments were as follows. Each school site received a 14 page principal survey
questionnaire instrument, a two page upper grade teacher survey questionnaire
instrument, and a two page lower grade teacher survey questionnaire instrument. The
principal survey instrument contained both qualitative and quantitative questions. The
principal survey contained 45 open-ended questions and 27 structured, so-called
forced-choice or fixed-response, scaled-construct questions, for a grand total of 72
questions. The fixed-choice questions included within the principal survey utilized a
six-level continuous scale: "strongly agree"; "agree"; "disagree"; "strongly
disagree"; "don't know"; and, "not applicable". The teacher survey instruments contained three open-
ended questions and 25 structured, so-called forced-choice or fixed-response, scaled-
construct questions, for a grand total of 28 questions. The fixed-choice questions
included within the two teacher survey questionnaires utilized a five-level continuous
scale: "strongly agree"; "agree"; "disagree"; "strongly disagree"; and, "don't know".
Each of the survey questionnaire instruments asked a wide range of questions related to
technology resource allocation. The questions probed for information concerning
implementation, use, best practice, and effectiveness in relation to raising student
achievement. Hardware, software, curriculum integration, tech support, Data Driven
Decision Making (DDDM), professional development, replacement cycles, and
technology budgeting are just a sampling of the various areas of inquiry explored within
the surveys.
In addition to the survey questionnaire instruments, the researcher performed a
structured record review of school site level data from a California Department of
Education (CDE) online database. Contained in this database is the School Technology
Survey (STS), which all California schools complete annually. The STS, which is a joint
undertaking between the CDE and the California Technology Assistance Project (CTAP),
provides a wealth of quantitative data on levels of technological resources, their
implementation, connectivity, support staffing, and usage at each school site.
The researcher juxtaposed the qualitative and quantitative data gathered from the three
survey questionnaire instruments, provided at the school site level, with the quantitative
data available for the school site in the CDE database containing their School Technology
Survey, during the analysis phase of the study.
The survey questionnaire instruments utilized within this study were validated through
professional consultation to ensure their appropriateness for the elementary school level
educational setting, and to assure that the instruments could be relied upon to accurately
gather, and gauge, useful data and other information that would ultimately answer the
research questions that were posed in the study. This validation came by means of an
administrator who was the principal of an elementary school site that had been in year
four of Program Improvement (PI) for the 2005-2006 school year, and had employed
educational technology to raise student achievement, and meet the school’s AYP growth
targets, in order to escape from PI at the conclusion of the 2005-2006 school year.
Data Collection
Non-probability sampling, specifically in the form of purposive sampling, was utilized
in order to identify school sites within the population that met specific criteria. The
criteria for selection included the following:
Elementary level, public school sites, located within the state of California;
Fourth, or fifth, year level of program improvement (PI) status;
District administrators amenable to allowing their school(s) to participate in the
study;
School site principals amenable to participation in the study.
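The purposive-sampling logic above can be sketched as a simple filter over school records. The dictionary field names below are hypothetical, chosen only to mirror the four stated criteria; they do not correspond to any actual CDE data fields:

```python
# Hypothetical school records; field names are illustrative only.
schools = [
    {"name": "School A", "level": "elementary", "state": "CA",
     "pi_year": 4, "district_consented": True, "principal_consented": True},
    {"name": "School B", "level": "elementary", "state": "CA",
     "pi_year": 2, "district_consented": True, "principal_consented": True},
    {"name": "School C", "level": "elementary", "state": "CA",
     "pi_year": 5, "district_consented": False, "principal_consented": True},
]

def meets_criteria(school):
    """Purposive-sampling filter mirroring the study's four criteria:
    elementary level in California, fourth- or fifth-year PI status,
    and consent from both the district and the site principal."""
    return (school["level"] == "elementary"
            and school["state"] == "CA"
            and school["pi_year"] in (4, 5)
            and school["district_consented"]
            and school["principal_consented"])

sample = [s["name"] for s in schools if meets_criteria(s)]
print(sample)
```

In this toy data, only School A passes all four criteria: School B is excluded by PI year, and School C by lack of district consent, which parallels how the study's candidate pool of 28 schools narrowed to 12 participants.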
The data collection methodology of the study employed a mixed methods approach.
The qualitative aspect of the research design involved the collection of narrative data, and
entailed data analysis through the coding of data, and the production of an inductive,
verbal synthesis.
The quantitative aspect of the research design involved the collection of numerical data,
and entailed data analysis by means of a deductive, statistical methodology. The mixed
methods, quantitative and qualitative, approach to the research design was employed
within the following instruments:
A self-administered, researcher designed, principal survey questionnaire
instrument, composed of open-ended and fixed response elements, to collect data
from a sample;
A self-administered, researcher designed, teacher survey questionnaire
instrument, composed of open-ended and fixed response elements, to collect data
from a sample;
A structured record review of school site level data from a California Department
of Education (CDE) online database that allowed the researcher access to data
provided from the school site, by means of a School Technology Survey (STS),
that is completed by all California schools, on an annual basis.
Data Analysis
The implementation of the study was concurrent, and a concurrent triangulation
strategy was employed to confirm, cross-validate, and corroborate the findings. When
consideration was given to the nature of the quantitative and qualitative data involved in
the study, the researcher decided to set an equal priority between the two. Integration
between the two types of data occurred at the data collection stage, the data analysis
stage, and the interpretation stage of the study. During the data analysis and
interpretation stages, the data gathered through the survey instruments at each of the 12
school sites was correlated and juxtaposed in order to provide answers to the three
research questions posed by the study.
Once the researcher had received the data from each of the participants representing
the 12 school sites, the following steps were undertaken to analyze the data:
The data contained within the principal and teacher survey instruments was
organized;
In order to begin the analysis of the survey questionnaire’s quantitative, forced-
choice responses, calculations were performed as to the number of responses by
participants who responded “strongly agree”, “agree”, “disagree”, “strongly
disagree”, “don’t know”, or “not applicable” to each of the 27 forced-choice
questions contained on the principal surveys, and the 25 forced-choice questions
contained on the teacher surveys;
In order to begin the analysis of the survey questionnaire’s qualitative, open-
ended descriptive narrative responses, the data was coded, and an inductive,
verbal synthesis was produced for each of the 45 open-ended questions contained
on the principal surveys, and the three open-ended questions contained on the
teacher surveys;
Having completed the above calculations for each of the three survey instruments
returned from each school site, calculations were next undertaken, in the same manner,
concerning the cumulative responses across all twelve school sites;
In order to analyze the open-ended responses, the qualitative data was combed for
the presence of themes and patterns apparent in the responses of both principals
and teachers;
In order to analyze the forced-choice responses, the data was entered into a Microsoft
Excel spreadsheet format in order to create frequency charts for each of the 27
principal survey questions, and each of the 25 teacher survey questions. These
frequency charts were then used to determine the various trends, relationships,
and patterns which existed within the data;
A final extensive review of all of the data that had been compiled was performed,
and a final summarization of the data was produced, in order to establish the
findings, which were subsequently interpreted in order to generate the study’s
various conclusions.
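The forced-choice tallying described in the steps above amounts to building a per-question frequency table over the fixed response scale. A minimal sketch, using Python's `collections.Counter` in place of the Excel spreadsheets the study actually used, and with made-up responses rather than the study's data:

```python
from collections import Counter

# The six-level scale used on the principal survey instrument
SCALE = ["strongly agree", "agree", "disagree",
         "strongly disagree", "don't know", "not applicable"]

# Hypothetical responses from the 12 principals to one forced-choice question
responses_q1 = ["agree", "strongly agree", "agree", "disagree", "agree",
                "don't know", "agree", "strongly agree", "agree",
                "not applicable", "agree", "disagree"]

counts = Counter(responses_q1)
# Frequency table in scale order, zero-filling any unused categories
frequency = {level: counts.get(level, 0) for level in SCALE}
for level, n in frequency.items():
    print(f"{level:>17}: {n}")
```

Repeating this tally for each of the forced-choice questions, first per site and then cumulatively across all twelve sites, yields the frequency charts from which trends and patterns were read.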
Summary
Chapter three has presented a detailed description of the research methodology that was
employed to answer the study's research questions. The various processes and procedures
associated with the collection, organization, and analysis of the data compiled for the
study were discussed in depth. The following
chapter presents the results of this study’s data collection, analysis, and a discussion of
the findings.
CHAPTER FOUR
DATA ANALYSIS AND INTERPRETATIONS OF THE FINDINGS
Introduction
This chapter presents an analysis of the data the researcher collected during the course
of the study. The purpose of the study was to investigate the role and value that
educational technology played at Fourth and Fifth Year, Program Improvement (PI)
elementary schools, that were able to meet their Adequate Yearly Progress (AYP) growth
targets. Data were requested from, and collected at, 12 California elementary school
sites that were successful in raising student achievement and were therefore able to
exit from Program Improvement status at the conclusion of the 2006-2007 school year. The
study addressed the following three research questions:
1. How should technology be deployed and employed, at the school site level, in
keeping with best practice?
2. Does technology, deployed and employed in keeping with best practice, provide
school sites an effective vehicle for raising student achievement?
3. What levels of resource allocations are required to establish, and sustain, an
effective technology program at the school site level?
To find answers to these three questions, a mixed methods research analysis approach
was followed in order to examine school site levels of technology resource allocations,
their usage, and their effectiveness relative to student achievement. Data collection was
facilitated through the use of two distinct survey instruments: (a) a researcher designed
survey in the form of a self-report questionnaire which was used to collect data from a
sample; (b) an existing California Department of Education School Technology Survey,
which California schools complete for the state annually, and is retrievable from a state
database. The school site survey instruments included the following three components:
(a) a 14 page principal survey; (b) a two page upper grade teacher survey; (c) and a two
page lower grade teacher survey. The principal survey instrument contained a mix of
quantitative and qualitative questions: 27 open-ended questions and 44 structured,
so-called forced-choice or fixed-response, scaled-construct questions, for a grand
total of 71 questions. The teacher survey instruments contained three open-ended
questions and 25 structured, so-called forced-choice or fixed-response, scaled-construct
questions, for a grand total of 28 questions.
The CDE School Technology Surveys, which were retrieved online, contained
quantitative data exclusively.
The data, in the form of the quantitative answers gathered through each of the three
survey instruments from each of the 12 school sites, was entered into an Excel
spreadsheet format and then tabulated. Figures were then generated from the data in
relation to each of the individual quantitative questions posed to the participants. Data, in
the form of the qualitative answers gathered through each of the three survey instruments
from each of the 12 school sites, was transcribed and compiled in accordance with each
of the individual qualitative questions posed to the participants. To assist in illuminating
any latent belief patterns, opinions, or philosophies that exist, and might be revealed by
engaging in a comparison or a juxtaposition of responses from survey question to survey
question, the researcher has provided the reader with a system by which the individual
participant sites, and districts, have been given randomly selected letter and number
identity codes. This system is in keeping with the ethical treatment of participants and
the commitments the researcher made to the school sites as to protecting their anonymity.
The implementation of the study was therefore concurrent, with a concurrent
triangulation strategy employed to confirm, cross-validate, and corroborate the findings.
Equal priority was given to the quantitative and qualitative data involved in the study.
Integration between the two types of data occurred at the data collection stage, the data
analysis stage, and the interpretation stage of the study.
During the data analysis and interpretation stage, the data gathered from the site
survey instruments and the CDE School Technology Surveys was correlated and juxtaposed
in order to provide accurate answers to the three research questions posed by the study.
Sample and Population
During the 2006-2007 school year, a grand total of 2,216 K-12 public schools were
classified as Program Improvement (PI) sites under California's Adequate Yearly
Progress (AYP) school accountability program. Of that number, 692 K-12 schools were
classified as being in Fourth Year or Fifth Year PI status, the most dire and serious
status, since these school sites had failed to make their AYP growth targets for five
years and therefore were facing reorganization and
restructuring. Of those 692 schools, 362 were elementary schools. Of these 362
California elementary schools, only 28 sites were successful in raising student
achievement and meeting their AYP growth targets in order to exit from PI status at the
end of the school year. Since so few had been successful in this endeavor, the
researcher's curiosity was piqued, and a deliberate decision was therefore made to
select this particular group of Fourth and Fifth Year PI elementary schools as the
sample for this research study, in order to ascertain the role and value of educational
technology in raising student achievement. Twenty-five California elementary schools
were subsequently invited,
through their district offices, to participate in the study. The districts which represent
thirteen of these schools declined to participate in the study. This yielded a sample of 12
California elementary schools represented by a total of nine school districts.
The 12 principal participants completed survey questionnaires that posed both
quantitative and qualitative questions, beginning with a few school site background
information queries, followed by an exhaustive line of questioning which encompassed
three broad areas of inquiry: (1) Implementation, integration, and usage of educational
technology; (2) perceived effects of educational technology upon raising student
achievement; (3) levels of educational technology resource allocations. Some of the
areas of interest included beneath these three broad areas of inquiry were: Professional
development; technical support; software; hardware; Data Driven Decision Making
(DDDM); and technology plans, budgets, and funding mechanisms.
The teacher survey instrument posed a mix of 25 quantitative and three qualitative
questions. The teacher survey instrument, however, unlike the principal questionnaire,
contained questions framed exclusively in the context of whether or not a given element
that is typically found in a technology program helped to raise student achievement. There
were three broad categories: (1) Technology hardware; (2) Technology
software/programs; (3) Technology support. There were two teachers surveyed at each
of the 12 school sites, one upper grade teacher, and one lower grade teacher. The
principals at each site were asked to select two teachers from their school staff who were
technology leaders, or individuals they perceived to be proficient in educational
technology, to complete the teacher survey instruments. This therefore yielded data
gathered through a total of 24 teacher survey instruments.
The three survey questionnaire instruments utilized within this study were validated
through professional consultation in order to ensure their appropriateness for the
elementary school level educational setting, and to assure that the instruments could be
relied upon to accurately gather, and gauge, useful and relevant data and any other
pertinent information that would answer the research questions posed in the study. This
validation came by means of an administrator who was the principal of a California
elementary school site that had been in year four of Program Improvement (PI) for the
2005-2006 school year, and had employed educational technology to raise student
achievement and meet the school’s AYP growth targets, in order to escape from PI at the
conclusion of the 2005-2006 school year.
Findings for Research Question Number One:
How should technology be deployed and employed at the school site level, in keeping
with best practice?
To answer the research question which addresses how technology should be
implemented and utilized at the school site level in keeping with “best practice”, the
quantitative data gathered through each of the principal survey instruments from the 12
school sites was entered into a database. Excel spreadsheets were utilized to group and
tally the answers to each of the 44 quantitative questions contained on the principal
survey from across the 12 school sites. Once the grouping and tallying of this
quantitative data was complete, Excel was also utilized to produce the figures which
visually depict the frequency of the fixed-response, scaled-construct answers provided by
the principals at the 12 school sites. The principal participants provided the researcher
with data gathered through a six-level response scale: “strongly agree”; “agree”;
“disagree”; “strongly disagree”; “don’t know”; and “not applicable”. In addition to
these six response categories, the researcher added one additional category,
“unanswered”, to the Excel spreadsheets and figures, since some of the questions went
unanswered by the various participants.
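The grouping-and-tallying procedure described above can be sketched in code (a minimal Python illustration of the logic only; the actual analysis was performed in Excel, and the example answers below are invented, not study data):

```python
from collections import Counter

# The six fixed response categories from the principal survey, plus the
# "unanswered" category added by the researcher for blank responses.
CATEGORIES = ["strongly agree", "agree", "disagree", "strongly disagree",
              "don't know", "not applicable", "unanswered"]

def tally_question(responses):
    """Tally one survey question's answers across the 12 principals.

    Blank or missing answers are counted under "unanswered".
    """
    cleaned = [r.strip().lower() if r and r.strip() else "unanswered"
               for r in responses]
    counts = Counter(cleaned)
    # Report every category, including those no principal selected.
    return {category: counts.get(category, 0) for category in CATEGORIES}

# Invented example: 12 answers to one question, two of them left blank.
answers = ["agree"] * 8 + ["strongly agree", "disagree", "", None]
frequencies = tally_question(answers)
```

The resulting frequency dictionary corresponds to what each figure in this chapter depicts for a single survey question.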
Qualitative data was collected from the 12 principal participants through 27 open-ended
questions. Each participant’s descriptive comments were transcribed and compiled under
the individual questions contained on the survey instrument.
In presenting the data and an interpretation of the findings associated with the first
research question, the quantitative and qualitative data have been interspersed and
juxtaposed in order to provide a clearer and more accurate portrayal of the judgments,
perspectives, and opinions of the various participants concerning the ability, and the
effectiveness, of educational technology to positively affect student achievement.
The principal participants were asked two inter-related, qualitative, open-ended
questions. The first question was: “What were some of the key factors/programs that
helped your school to raise student achievement and exit PI?” Although the depth and
breadth of responses gathered through this survey question are presented in their entirety
within the second research question section of chapter 4, these participant responses also
yield insights that serve to answer the first research question pertaining to best practice
associated with the implementation and use of a school site’s technology resource
allocations. It is more than interesting, it is striking, that the greatest number of
responses attributable to any single arena in which technology resources were allocated
is directly or indirectly associated with Data Driven Decision Making (DDDM). The
following represents a synopsis of the range of responses linked to DDDM, as gathered
by the principal survey instrument:
“Ongoing assessment and data analysis.” “Teacher weekly collaboration and
planning.” (principal H, district 7)
“Interventions provided throughout the day for At-Risk learners.” (principal F,
district 6)
“Effectively using data to drive instruction.” (principal J, district 8)
“Moved from staff meetings every other week to grade level meetings each
week.” (principal J, district 8)
“Staff development.” (principal L, district 9)
“Intervention program.” (principal L, district 9)
“Data analysis to drive instruction.” “Weekly collaboration meetings (by grade
level).” “Intervention programs.” (principal L, district 9)
“Major technology update; Computer lab – Fast Forward software; Kaleidoscope;
Data Days for grade level data exchange and planning; collaboration.” (principal
K, district 9)
“Collaboration.” (principal D, district 4)
“Targeted, focused instruction.” (principal D, district 4)
“In school intervention system.” (principal A, district 1)
“Voyager intervention program.” (principal B, district 2)
“Targeting (focusing) areas of instruction.” (principal B, district 2)
“Frontloading. Monitor (closely) student progress and needs. Set interventions
according to student needs.” (principal C, district 3)
One will notice that all of the above instructional strategies are made possible by, and
flow from, the implementation and utilization of technology resource allocations invested
in a school wide DDDM program. That is to say, these instructional strategies were
facilitated by virtue of the fact that an investment of technology resource allocations was
directed into a school wide DDDM program, in keeping with best practice. In fact, as
discussed in chapter 2, research has shown that all of the above instructional strategies
are truly in keeping with the very philosophy and tenets of a school wide DDDM
program.
The second of the two qualitative open-ended questions was: “To what extent was
technology integrated into or supportive of these factors/programs?” Although, once
again as with the first question, the depth and breadth of responses gathered through this
survey question are presented in their entirety within the second research question section
of chapter 4, the participant responses also yield insights that serve to answer the first
research question pertaining to best practice associated with the implementation and use
of a school site’s technology resource allocations. In addition to DDDM, other prominent
arenas in which the principal participants implied, if not explicitly asserted, that best
practice usage of technology had been employed were: computer labs and mobile laptop
carts; classroom instructional delivery media such as Smart-boards, LCD projectors,
auditory amplification systems, etc.; as a vehicle to provide background or remedial
knowledge; and when utilized holistically, wherein there had been a complete integration
of technology across the curriculum in concert with grade level content standards. The
following represents a synopsis of the range of
responses which are closely linked to best practice usages of technology, as gathered by
the principal survey instrument:
“Data “harvesting” …coordinate, plan, and track intervention programs”
(principal L, district 9)
“Data meetings every 6-8 weeks. OARS reports by teacher, grade level, student,
to analyze specific student areas of need.” (principal B, district 2)
“All teachers were provided laptops and wireless access. This provided support
for assessment, planning (collaborative), utilizing EL support strategies…”
“United Streaming” “Power Point” (principal H, district 7)
“Technology played a big part in supporting our success.” “United Streaming to
provide background knowledge for CORE.” “Camera Doc.” “Smart-boards.”
(principal J, district 8)
“Technology was a key factor in making the change.” “Computer lab –
computers update, programs/software purchased, Smart-boards in every room,
3rd/4th/5th two roving laptop carts purchased, classroom computers (3 each)
updated.” "Each teacher has laptop and training.” (principal E, district 5)
“Address grade level technology standards.” “Lexia for all CELDT 1&2.”
“Integrate technology across curriculum.” “Regular lab schedule for all grade
levels.” (principal G, district 7)
“Success-Maker” “Lab for elementary students.” “Smart-boards for grades 7/8.”
(principal A, district 1)
This is not to say that principal participant belief in the positive link between technology
resource allocations and a given school site’s recent reversal of fortunes in regard to
student achievement and AYP was universal, unanimous, or unwavering, as these three
responses indicate:
“Technology in the primary grades is very limited. Therefore, its integration into
these factors is basically non-existent. The upper grades (4-6) have more access
to technology through Promethean Technology. Teachers are learning how to use
this instructional tool and the integration of technology is still limited.” (principal
C, district 3)
“Technology is integrated into some of our interventions. It is not a strong part of
our base program, but supports supplemental programs.” (principal I, district 7)
“Very minimally.” (principal F, district 6)
The principal participants were asked a quantitative question inquiring into the
importance of integrating technology into the core curriculum, in the context of raising
student achievement: “Do you credit the integration of technology into the core
curriculum as an important element of raising student achievement?” Figure 1 provides
an illustration as to the beliefs of the 12 principal participants in this regard.
Figure 1: Technology Integration into Curriculum and Student Achievement
The 12 principal participants were asked a series of three inter-related questions
based upon this question: “Did the use of technology at your site designed
to raise student achievement include computer assisted learning (aka computer assisted
instruction or computer-augmented instruction)?” The participants were then directed,
“If you agree or strongly agree, please briefly elaborate on the following aspects…” The
first of the three questions then inquired: “How widespread was this use of technology in
the classroom?” The following represents a synopsis of the range of responses that were
gathered by the principal survey instrument:
“K-6” (principal G, district 7)
“Most classrooms.” “Two computer labs.” (principal B, district 2)
“Weekly lab.” (principal A, district 1)
[Figure 1 chart, 12 School Site Principal Survey: “Do you credit the integration of
technology into the core curriculum as an important element of raising student
achievement?”]
“Whole school.” (principal E, district 5)
“All classes schedule time for computer lab, including kindergarten and
preschool.” “Teachers have 4-6 computers in classrooms.” “Over eighty wireless
computers teachers can check-out.” “Teachers use Smart-boards and United
Streaming regularly.” (principal J, district 8)
“All classrooms are required to use projectors and SM [Success-Maker].”
(principal K, district 9)
“Not much. About 30% of my teachers use the management systems
consistently.” (principal L, district 9)
“Technology is used in all classrooms.” (principal I, district 7)
The second of the three questions then inquired: “How frequently was this use of
technology employed in the classroom?” The following represents a synopsis of the
range of responses that were gathered by the principal survey instrument:
“Daily.” (principal G, district 7) (principal B, district 2)
“Success-maker –daily.” (principal A, district 1)
“Daily.” (principal E, district 5)
“Daily and in majority of classrooms. Camera Document cameras used
frequently.” (principal J, district 8)
“Daily in all lessons.” (principal K, district 9)
“Daily, for some teachers, 10%.” (principal L, district 9)
“Daily.” (principal I, district 7)
The third of the three questions then inquired: “Did this use of technology employed
in the classroom vary by grade level? (If so, how did it vary?)” The following represents
a synopsis of the range of responses that were gathered by the principal survey
instrument:
“Yes.” (principal G, district 7)
“Yes. It varied by level of teacher knowledge of the program. We’ve had 2 years
of staff development on AR [Accelerated Reader] to increase usage.” (principal
B, district 2)
“By group of student.” (principal A, district 1)
“Yes. Rotation of students: Some teachers have scheduled rotation; Others use it
as a center; Others for specific groups of students.” (principal E, district 5)
“Depending on student need (background knowledge). Depending on teacher fear
of technology and moving past their fear.” (principal J, district 8)
“Some grade levels more than others, depending on the experience and comfort
level of the teachers.” (principal L, district 9)
“Yes. K – once weekly (very limited); 1st – all classrooms implement Waterford
daily; 2nd-6th – Lexia (varies daily).” (principal I, district 7)
The 12 principal participants were asked a series of inter-related questions based
upon this question: “Was “Data Driven Decision Making” or data analysis
employed in an effort to raise student achievement? If you agree or strongly agree that
DDDM or data analysis was employed in an effort to raise student achievement, please
describe the use and/or importance of the following…” The participant responses for two
of the survey questions from this set are presented in the context of the second research
question, in the following section, while two of the survey questions from this set are
presented here in the context of best practice. The first of the two questions inquires:
“If technology was employed, please describe the specific programs employed?” The
following represents a synopsis of the range of responses that were gathered by the
principal survey instrument:
“Measures, Excel, MW.” (principal D, district 4)
“Edusoft.” (principal I, district 7) (principal C, district 3) (principal G, district 7)
“District purchased services for above.” (principal L, district 9)
“EADMS.” (principal K, district 9)
“?” (principal F, district 6)
“Rx Net.” (principal E, district 5)
“Target Teach.” (principal E, district 5)
“Edusoft (w/ outside consultant).” (principal H, district 7)
“Excel, AZQ, OARS.” (principal A, district 1)
“OARS.” (principal B, district 2)
“Our student data system is Aeries.” (principal B, district 2)
“Accelerated Reader. (setting reading levels and recording results from reading
comprehension tests).” (principal C, district 3)
The second of the two questions inquires: “Did DDDM or data analysis employed in
an effort to raise student achievement vary by grade level …and if so, how?” The
following represents a synopsis of the range of responses that were gathered by the
principal survey instrument:
“Yes, according to grade level standards.” (principal D, district 4)
“Yes. K-2 – less emphasis. 3-6 – more emphasis, probably related to CST
levels.” (principal I, district 7)
“All grade levels used them for: Lesson planning; Intervention; Enrichment;
Sharing of successful practices; Parent conferences.” (principal L, district 9)
“No – Consistent across grades.” (principal K, district 9)
“All grade level use data analysis consistently because I provide tasks that must
be completed in order for teachers to earn Passport hours because we are Reading
First.” (principal J, district 8)
“?” (principal F, district 6)
“Yes. Some teachers did not use the system (Rx Net): Kinder and 1st because
there are no test scores.” (principal E, district 5)
“No.” (principal H, district 7) (principal A, district 1)
“I guess I should say yes, as Kindergarten only gets assessed twice a year – mid.
and end of year. The rest of the grades 1-6 get accessed every 6-8 weeks.”
(principal B, district 2)
“Instructional planning with administrators occurred to analyze data, arrange
interventions and instruction based on need, and plan support across each grade
level.” (principal C, district 3)
Summary of the Findings for Research Question Number One:
How should technology be deployed and employed, at the school site level,
in keeping with best practice?
The data gathered to answer the first research question, in respect to data driven
decision making (DDDM), revealed key examples of best practice technology use at the
12 school sites, in two different respects. The first of the two aspects pertains to the
manner in which the actual implementation, and the utilization of, an effective
technology based DDDM program was structured. The second of the two aspects
pertains to the various follow-up components which flow from, and are in keeping with,
the goals and the tenets of a successful DDDM program. The researcher is referring to
the various interventions, and the various adjustments to instruction and practice, which
are subsequently implemented following the analysis of student level assessment data
that has been gathered and reviewed. DDDM programs, therefore, when fully
implemented, also facilitate best practice in terms of both student achievement, through
various interventions, and in terms of instruction, through the reevaluation of, and the
adjustments which are made to, instructional strategies. The research data collected
supports a strong case for the effectiveness of both aspects of best practice
uses of DDDM programs.
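The analytical core of such a DDDM program, identifying areas of weakness in assessment data so that interventions can be targeted there, can be illustrated with a brief sketch (the standards, scores, and proficiency cutoff below are hypothetical illustrations, not drawn from the study data):

```python
# Toy illustration of DDDM-style analysis: flag content standards whose
# mean assessment score falls below a proficiency cutoff, so follow-up
# interventions and instructional adjustments can target them.
def flag_weak_standards(scores_by_standard, cutoff=0.6):
    """Return {standard: mean score} for standards below `cutoff` (0-1 scale)."""
    weak = {}
    for standard, scores in scores_by_standard.items():
        mean = sum(scores) / len(scores)
        if mean < cutoff:
            weak[standard] = round(mean, 2)
    return weak

# Invented grade-level assessment results, one score per student.
grade_level_data = {
    "place value":   [0.9, 0.8, 0.7],
    "rounding":      [0.4, 0.5, 0.6],
    "word problems": [0.5, 0.45, 0.55],
}
targets = flag_weak_standards(grade_level_data)
# Standards in `targets` would be scheduled for intervention blocks and
# revisited in grade-level collaboration meetings.
```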
The integration of technology within and across the curriculum was another frequently
cited example of technology resource allocations being implemented and utilized in
accordance with best practice. The researcher found the 12 principal participants to be
quite forthright in their reviews, or rebukes, of progress in this arena. The quantitative
and qualitative data which was gathered shows widespread acknowledgement of the
value associated with the integration of technology within and across the curriculum,
and a recognition that such implementation is in accordance with best practice
methodology. However, few, if any, of the principal participants made claims that such
levels of integration of technology resources had been achieved, school wide, at their
school sites. Insufficient levels of expertise within the teaching staff, insufficient levels
of staff development, staff resistance to changes in instructional practice, and even some
staff members’ fear of technology resources were cited as personnel related barriers to
achieving the implementation of technology resources within and across the curriculum
in accordance with best practice.
Another example in which technology resource allocations are implemented and
utilized in accordance with best practice occurs when a given school site has aligned all
instructional media and materials, all assessments, and all interventions with grade level
content standards and curricular frameworks. Instructional software, as utilized by
students both in intervention programs and in order to build background knowledge for
core subjects, was one arena in which technology met this threshold.
A more problematic aspect of best practice, one which arose from a review of the
research question data that had been gathered, involves the question of where computers,
specifically those that have been allocated for student use, should be housed at the school
site in order to be most effective in raising student achievement. The research data
gathered from the 12 school sites revealed some semblance of a pattern as to how
computers were deployed; however, the data ultimately fails to lead one to any
overarching consensus as to the best physical setting in which to situate such resources,
in keeping with best practice. Six of the school sites reported having structured at least a
portion of their technology resources into computer labs. One of these school sites
reported that they had recently established a new computer lab. One school site reported
that they had recently refurbished an existing computer lab. One of these school sites
reported having, not one, but two computer labs on campus. Three of the school sites
reported an investment in laptop computers and mobile carts. One of the school sites
reported an investment made in Alpha-smart computers. Given an absence of data
related to the actual expenses associated with the various deployment strategies that
were implemented at the 12 school sites, in relation to available levels of funding, the
researcher is unable to determine, though cannot help but speculate, whether the expense
ratio of computers to the number of students served favors the computer lab system of
deployment or the roving laptop computer cart system of deployment. Since this
particular issue of best practice seems inextricably intertwined with other questions
pertaining to given levels of, or even the availability of, funding, the researcher will
postpone any further discussion related to best practice, and the range of physical
deployment strategies associated with student computer use found within the 12 school
sites, until chapter five.
Findings for Research Question Number Two:
Does technology, deployed and employed in keeping with best practice, provide school
sites an effective vehicle for raising student achievement?
To answer the research question involving the effect of educational technology upon
student achievement, the quantitative data gathered through each of the three survey
instruments, from the 12 school sites, was entered into a database and processed in the
same manner described for the first research question: Excel spreadsheets were utilized
to group and tally the answers to each of the 44 quantitative questions contained on the
principal survey, and to produce the figures which visually depict the frequency of the
fixed-response, scaled-construct answers. As before, the six principal response
categories (“strongly agree”; “agree”; “disagree”; “strongly disagree”; “don’t know”;
and “not applicable”) were supplemented with an “unanswered” category, since some of
the questions went unanswered by the various participants.
The teacher survey instruments contained a total of 25 quantitative fixed-response,
scaled-construct questions. The teacher participants provided the researcher with data
gathered through a five-level response scale: “strongly agree”; “agree”; “disagree”;
“strongly disagree”; and “don’t know”. In addition to these five response categories, the
researcher added the same additional category, “unanswered”, to the Excel spreadsheets
and figures, since some of the questions went unanswered by the various participants.
Qualitative data was collected from the 12 principal participants through 27 open-ended
questions. Each participant’s descriptive comments were transcribed and compiled under
the individual questions contained on the survey instrument. The qualitative data
gathered from the three open-ended questions contained on the teacher survey was
transcribed and compiled in the same manner, with the additional step that this data was
compiled according to the participant’s upper grade or lower grade teacher status.
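This compilation scheme can be sketched as a simple grouping operation (a hypothetical Python illustration; the question identifiers and comments below are invented, not actual participant responses):

```python
from collections import defaultdict

def compile_comments(records):
    """Group open-ended comments by question, then by participant status.

    `records` holds (question_id, status, comment) tuples, where status
    distinguishes upper grade from lower grade teacher participants.
    Returns {question_id: {status: [comments]}}.
    """
    compiled = defaultdict(lambda: defaultdict(list))
    for question_id, status, comment in records:
        compiled[question_id][status].append(comment)
    return compiled

# Invented example records, not actual participant responses.
records = [
    ("Q1", "upper grade", "We use the computer lab weekly."),
    ("Q1", "lower grade", "Access is limited in the primary grades."),
    ("Q2", "upper grade", "Data reports guide our grade-level meetings."),
]
compiled = compile_comments(records)
```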
In presenting the data and an interpretation of the findings associated with the second
research question, the quantitative and qualitative data have been interspersed and
juxtaposed in order to provide a clearer and more accurate portrayal of the judgments,
perspectives, and opinions of the various participants concerning the ability, and the
effectiveness, of educational technology to positively affect student achievement.
The principal participants were asked two inter-related, qualitative, open-ended
questions. The first question was: “What were some of the key factors/programs that
helped your school to raise student achievement and exit PI?” The following represents a
synopsis of the range of responses that were gathered by the principal survey instrument:
“Ongoing assessment and data analysis.” “Teacher weekly collaboration and
planning.” (principal H, district 7)
“Effectively using data to drive instruction.” (principal J, district 8)
“Creating a computer lab.” (principal J, district 8)
“Teachers consistently getting to computer lab work on Study Island and Achieve
3000 standard based computer games.” (principal J, district 8)
“Data analysis to drive instruction.” “Weekly collaboration meetings (by grade
level).” “Intervention programs.” (principal L, district 9)
“Major technology update; Computer lab – Fast Forward software;
Kaleidoscope.” “Data Days for grade level data exchange and planning;
Collaboration.” (principal K, district 9)
“Building background knowledge (United Streaming).” (principal K, district 9)
“All lesson plans presented auditorally (amplification systems) and visually (LCD
projectors).” (principal K, district 9)
The second of the two qualitative open-ended questions was: “To what extent was
technology integrated into or supportive of these factors/programs?” The following
represents a synopsis of the range of responses that were gathered by the principal survey
instrument:
“Technology played an important role as it was a gateway into comprehension of
the subject matter” (principal D, district 4)
“Data “harvesting” …coordinate, plan, and track intervention programs”
(principal L, district 9)
“All teachers were provided laptops and wireless access. This provided support
for assessment, planning (collaborative), utilizing EL support strategies…”
“United Streaming” and “Power Point” (principal H, district 7)
“Technology is integrated into some of our interventions. It is not a strong part of
our base program, but supports supplemental programs” (principal I, district 7)
“Technology played a big part in supporting our success.” “United Streaming to
provide background knowledge for CORE.” “Camera Doc.” and “Smart-
boards.” (principal J, district 8)
“Very minimally” (principal F, district 6)
“Technology was a key factor in making the change.” “Computer lab –
computers update, programs/software purchased; Smart-boards in every room;
3rd/4th/5th two roving laptop carts purchased; classroom computers (3 each)
updated.” "Each teacher has laptop and training.” (principal E, district 5)
“Address grade level technology standards.” “Lexia for all CELDT 1&2.”
“Integrate technology across curriculum.” “Regular lab schedule -all grade
levels.” (principal G, district 7)
“Technology in the primary grades is very limited. Therefore, its integration into
these factors is basically non-existent. The upper grades (4-6) have more access
to technology through Promethean Technology. Teachers are learning how to use
this instructional tool and the integration of technology is still limited.” (principal
C, district 3)
“Data meetings every 6-8 weeks. OARS reports by teacher, grade level, student,
to analyze specific student areas of need.” (principal B, district 2)
“Success-Maker” “Lab for elementary students.” “Smart-boards for grades 7/8.”
(principal A, district 1)
“Technology was a factor and supported advancement in a number of ways: 1)
Student engagement – direct instruction; 2) Intervention programs.” (principal K,
district 9)
The 12 principal participants were asked two closely related questions concerning the
use and integration of technology. First, they were asked whether or not the use and
integration of technology helped their school to achieve its Adequate Yearly Progress
(AYP) growth target, in relation to those subject(s) and subgroup(s) that had been a
particular problem at their site. Second, the 12 principal participants were asked if the
integration of technology into the core curriculum was an important factor in raising
student achievement at their school sites. Figure 2 and Figure 3 provide an illustration of
the 12 principal participants’ beliefs in this regard.
Figure 2: Technology, AYP and Struggling School Site Subgroups
A follow-up, qualitative, open-ended question relative to the question presented in
figure 2 was also posed: “If you agree or strongly agree, please briefly explain how?” A
synopsis of the range of principal participant responses follows:
“Intervention programs.” “Direct instruction (projectors).” “Student engagement
(amplification system).” (principal K, district 9)
[Figure 2 chart, Principal Survey: “Did the use, and integration of, technology help you
to make AYP with your particular struggling subgroup and subject?”]
“Teachers use it to pull data; reading coaches and I used it to track progress on
benchmark assessments, intervention, etc.” (principal L, district 9)
“I agree that it played a part, but it was not the sole factor.” (principal D, district
4)
“We use both the Waterford program and Lexia.” (principal I, district 7)
“The technology provides our ELL students with background knowledge for
complete understanding of core.” (principal J, district 8)
“Data analysis of weak areas.” (principal A, district1)
“Helped us target students for teacher classroom modifications and student
intervention programs.” (principal B, district 2)
Figure 3: Technology Curriculum Integration and Student Achievement
The 12 principal participants were asked a series of four inter-related questions that
inquired as to any expansions that they may have undertaken to their technology
program, as the de facto instructional leader of the school site. The series of questions
were drafted to explore and assess the various beliefs and expectations that the principals
held concerning the role and the value that educational technology could play in raising
student achievement (Figure 4).
[Figure 3 chart, 12 School Site Principal Survey: “Do you credit the integration of
technology into the core curriculum as an important element of raising student
achievement?”]
Figure 4: Expansion of School Site Technology Program
The second in the series of questions was phrased to inquire if the principal
participants believed that technology, in and of itself, could raise student achievement
(Figure 5).
[Figure 4 chart, 12 School Site Principal Survey: “As the de facto instructional leader of
your site, did you deliberately expand the use of technology at your site?”]
Figure 5: Technology as Stand Alone Solution to Student Achievement
The third in the series of questions was phrased to inquire if the principal participants
believed that technology would simply help to raise student achievement (Figure 6).
Figure 6: Can Technology Help to Raise Student Achievement
[Figure 5 chart, 12 School Site Principal Survey: “If so, was the expansion of your site's
use of technology undertaken with the expectation that it would, in and of itself, raise
student achievement?”]
[Figure 6 chart, 12 School Site Principal Survey: “If so, was the expansion of your site's
use of technology undertaken with the expectation that it would help to raise student
achievement?”]
The fourth in the series of questions was phrased to inquire if the principal participants
expanded the school site technology program for reasons beyond accepted, standards
based measures of student achievement, namely in order to facilitate their students’
mastery of 21st century skill sets (Figure 7).
Figure 7: Students and Mastery of 21st Century Skill Sets
[Figure 7 chart, 12 School Site Principal Survey: “If so, was facilitating your students
mastery of "21st century skills" a factor in your expansion of educational technology at
your site?”]
The principal participants were also asked about the use of technology at the school
site to monitor student achievement, measure the effectiveness of classroom practice, and
then focus instructional strategies for interventions. The term data driven decision
making (DDDM) is commonly used in discussing this use of technology to raise student
achievement and focus the various resources of a school site in such a manner that AYP
growth targets are more likely to be achieved. Such DDDM technology resources are
commonly employed by both district and site administrators, teachers, and sometimes even
by students themselves in order to reveal academic strengths and weaknesses across the
range of grade level content standards and frameworks. The 12 principal participants
were asked a series of inter-related questions that inquired as to the role and value of a
technology based data driven decision making program which may have been
implemented at their school sites (Figure 8).
Figure 8: Was DDDM Employed to Raise Student Achievement
A follow-up, qualitative, open-ended question was then posed. The question was
framed in the following manner: “If you agree or strongly agree that DDDM or data
analysis was employed in an effort to raise student achievement, was technology
employed in facilitating these efforts? If so, how?” The following statements present a
synopsis of the range of responses that were gathered by the principal survey instrument:
“Yes, to run the reports.” (principal D, district 4)
[Figure 8 chart, 12 School Site Principal Survey: “Was "Data Driven Decision Making"
or data analysis employed in an effort to raise student achievement?”]
“Use of Edusoft.” (principal I, district 7)
“Use of web based reports such as those from CDE.” (principal I, district 7)
“Yes, DSAT, OARS, and EADMS were used.” (principal L, district 9)
“Teachers pulled data and use re-pulls for strengths and areas of concerns (lesson
planning).” (principal L, district 9)
“Reading coaches.” (principal L, district 9)
“For overall, grade level, strengths: to share successful teaching practices; for
grade level for interventions.” (principal L, district 9)
“Yes, inputting of data into system use of EADMS for school-wide data analysis.”
(principal K, district 9)
“District provided the results for school site’s use.” (principal F, district 6)
“The district SIS was used to collect reports and info.” (principal E, district 5)
“Yes, to access and manipulate data.” (principal H, district 7)
“Analyzing data by student, by grade, by subject.” (principal A, district 1)
“Yes. The Reading First coach scans the assessments and runs a variety of
reports – by grade, teacher, student. We then meet by grade level to discuss
successes and improvements that need to be made (teaching practices).”
(principal B, district 2)
“Through Edusoft teachers could print their trimester tests to analyze individuals
for proficiency, and also collaborate across the grade level team for proficiency
and instructional ideas.” (principal C, district 3)
“Assessment data results. We analyzed – ongoing basis – to target interventions
and plan instruction.” (principal G, district 7)
In a second question the 12 principal participants were asked if such a school site
DDDM program gave the teaching staff the ability to more closely monitor their
students’ progress, and therefore facilitated increases in student achievement (Figure 9).
Figure 9: Teaching Staff Use of DDDM and Student Achievement
[Figure 9 chart: responses of the 12 principal participants to the survey item “The presence of the technology which was employed within your DDDM system in order to allow your teaching staff to more closely monitor their students’ progress, led to increases in student achievement.”]
In a third question the 12 principal participants were asked if the use of a DDDM
program facilitated collaboration between the various members of the site’s grade level
teaching staff and therefore played a significant role in raising student achievement
(Figure 10).
Figure 10: Technology, DDDM, Teacher Collaboration and Student Achievement
In a fourth question the principal participants were asked if their DDDM program
enabled students to track their own academic progress, and therefore led to significant
increases in student achievement (Figure 11).
[Figure 10 chart: responses of the 12 principal participants to the survey item “The technology employed within your DDDM system, by facilitating collaboration between your grade level teachers, was a significant factor in raising student achievement.”]
Figure 11: Student Use of DDDM Data and Student Achievement
Another follow-up, qualitative, open-ended question was then posed. “How
significant a factor in raising student achievement, was the presence of the technology
which was employed and utilized within your system of DDDM?” The following
statements present a synopsis of the range of responses that were gathered by the
principal survey instrument:
“It was a significant factor because of the time saved and the value of the reports.”
(principal D, district 4)
“Essential.” (principal I, district 7)
“Very significant, as we were able to produce reports immediately and were able
to segregate data in a variety of ways.” (principal L, district 9)
“Very important – accessible to all!” (principal K, district 9)
[Figure 11 chart: responses of the 12 principal participants to the survey item “The use of DDDM technology as a tool to allow students to monitor their own academic progress led to significant increases in student achievement.”]
“It was a huge factor – teachers were able to learn specific grade level standards,
review and discuss test questions from unit assessments, and make decisions that
would drive instruction or provide re-teach to students based on needs.”
(principal J, district 8)
“?” (principal F, district 6)
“Extremely significant.” (principal E, district 5)
“It significantly increased access to data.” (principal H, district 7)
“Important – and becoming more crucial over time as we set assessment
benchmarks.” (principal A, district 1)
“It was very significant. The use of color laser printers is also key. It’s one thing
to talk about numbers, but to see bar graphs in purple, green, red and yellow is
quite impressive.” (principal B, district 2)
“Very – working from data very powerful in designing instruction based on need
and in goal setting with individual students in some classrooms. (principal C,
district 3)
The principal participants were queried as to the “types and brands of technological
hardware that have been utilized at your school during the past three years, particularly
as it relates to raising student achievement”. The following represents a synopsis of the
range of responses that were gathered by the principal survey instrument:
“PC desktops, pc tablets, Promethean Technology, Alpha Smarts, Active Slates,
Promethean whiteboards.” (principal C, district 3)
“Two computer labs, classroom computers, four computers in library.” (principal
B, district 2)
“Laptops, Smart-board (1), video cameras, document cameras.” (principal A,
district 1)
“Laptops, projectors, digital cameras.” (principal H, district 7)
“Apple notebooks and desktops.” “Smart Tech.” “NEC projectors.” (principal
E, district 5)
“Teachers utilize OARS (Online Achievement Results for Students) to review
student scores, provide interventions for students based on individual needs.”
(principal F, district 6)
“New computer lab, Smart-boards, camera document” (principal J, district 8)
“Smart-slates (boards), projectors, teacher amplification system (Redicat),
revamped computer lab, four student computers in classrooms.” (principal K,
district 9)
“Visualizers, projectors, computers” (principal D, district 4)
The principal participants were queried as to the “types and brands of technological
software that have been utilized at your school during the past three years, particularly as
it relates to raising student achievement”. The following represents a synopsis of the
range of responses that were gathered by the principal survey instrument:
“Waterford, Lexia” (principal I, district 7)
“Renaissance Learning, Lexia” (principal D, district 4)
“Management systems: Successmaker, A/R and OARS; EADMS; DSAT for
data.” (principal L, district 9)
“Waterford program in kindergarten.” “Fast-forward (Scientific Learning).”
“Success-maker.” “Accelerated Reader.” “Grade-book.” “EADMS(web based).”
(principal K, district 9)
“Kidbiz 3000 (Achieve 300).” “Study Island.” “United Streaming.” “Easy-
Grade Pro.” (principal J, district 8)
“Software accompanying Open Court & Harcourt Math.” (principal F, district 6)
“School House Rock.” “Learning Company.” “District provided programs/
websites.” (principal E, district 5)
“Edusoft (data management for assessment).” “Compass (In lab setting).”
“Power-Point.” (principal H, district 7)
“Success-maker and Accelerated Reader.” (principal A, district 1)
“Success-maker, Accelerated Reader, Baileys Book-house, Word (for reports).”
(principal B, district 2)
“Active-studio 3, Document Camera software, United Streaming.” (principal C,
district 3)
“Lexia.” (principal G, district 7)
The 12 principal participants were asked a detailed series of 10 questions concerning
their professional experiences, during the site’s status within program improvement,
regarding the relative effectiveness of educational technology in raising student
achievement. This series of questions was prefaced by a brief paragraph which framed
the questions that followed:
In conclusion, based upon your success in guiding your school out of program
improvement, and your site’s experiences concerning the effectiveness of
educational technology in that environment, how would you advise other
educational professionals, who may be administrators and instructional leaders,
concerning the value and effectiveness of technology in raising student
achievement?
The first set of three questions inquired as to the degree of importance educational
technology held within the overall site program improvement plan (Figure 12, Figure 13,
Figure 14).
Figure 12: Educational Technology as Important Aspect of Improvement Plan
Figure 13: Educational Technology as Essential Element of Improvement Plan
[Figure 12 chart: responses of the 12 principal participants to the survey item “Educational technology was an important aspect of my overall school site program improvement plan.”]
[Figure 13 chart: responses of the 12 principal participants to the survey item “Educational technology was an essential element of my overall school site program improvement plan.”]
Figure 14: Educational Technology as Not an Important Aspect of Plan
The second set of three questions inquired as to the relative impact, role, and effect
educational technology played in raising student achievement at the school sites of the 12
principal participants (Figure 15, Figure 16, Figure 17).
[Figure 14 chart: responses of the 12 principal participants to the survey item “Educational technology was not an important aspect of my overall school site improvement plan.”]
Figure 15: Technology Had a Significant Impact on Student Achievement
Figure 16: Technology Played Minor Role in Raising Student Achievement
[Figure 15 chart: responses of the 12 principal participants to the survey item “Educational technology had a significant impact on raising student achievement at my school site.”]
[Figure 16 chart: responses of the 12 principal participants to the survey item “Educational technology played only a minor role in raising student achievement at my school site.”]
Figure 17: Technology Had Little or No Effect on Student Achievement
The third and final set of three questions inquired as to the relative role and
contribution educational technology played in raising student achievement and helping
the school sites to exit from program improvement status (Figure 18, Figure 19, Figure
20).
[Figure 17 chart: responses of the 12 principal participants to the survey item “Educational technology had little or no effect on raising student achievement at my school site.”]
Figure 18: Technology Played Significant Role and Made Significant Contribution
Figure 19: Technology Played Minor Role and Made Small Contribution
[Figure 18 chart: responses of the 12 principal participants to the survey item “Educational technology played a significant role in my site's success in raising student achievement and made a significant contribution in helping our school exit from program improvement.”]
[Figure 19 chart: responses of the 12 principal participants to the survey item “Educational technology played a minor role in my site's success in raising student achievement and made a small contribution in helping our school exit from program improvement.”]
Figure 20: Technology Played Little or No Role and Contribution
The final fixed response quantitative question on the principal survey instrument
asked the 12 participants if their school site program improvement plans, and the best
efforts of their staff, would have been successful without the inclusion of an educational
technology component (Figure 21).
[Figure 20 chart: responses of the 12 principal participants to the survey item “Educational technology played little or no role in my site's success in raising student achievement and made little or no contribution in helping our school exit from program improvement.”]
Figure 21: Absence of Technology in Improvement Plan and Student Achievement
The principal survey instrument concluded with a final qualitative, open-ended
question: “Do you have any other comments concerning educational technology, or is
there anything else you would like to add?” The following statements present a synopsis
of the range of responses that were gathered by the principal survey instrument:
“I believe technology plays an important role in student achievement, however
technology is only as good as the person running it. Whatever “it” may be. As
educators we do what we can and realize the importance of training, data, and
accountability. We consistently work towards becoming more knowledgeable in
the area of technology.” (principal D, district 4)
“The use of technology was one of several factors that played a role in raising
student achievement; not solely, but weighted equally to the other factors.”
(principal L, district 9)
[Figure 21 chart: responses of the 12 principal participants to the survey item “The best efforts of my staff, and the implementation of my program improvement plan, might not have been effective in raising student achievement, without the incorporation of an effective educational technology component.”]
“Technology must be used – [one can] say you have a lot, and not use it. [These
are] two totally different issues. At [my site] I can honestly say, we use it.”
(principal J, district 8)
“This is a primary K-3 school site. Classrooms have 2-3 viable computer stations
in their classrooms along with a printer. Each classroom has an LCD projector
and an Elmo which each teacher uses daily for instruction. Email between
principal and staff members are used with frequency. Tutorials for reading and
math are available for student use. Accelerated Reader is accesses by all
students.” (principal F, district 6)
“We had a significant partnership with GEAR-UP / UCSD which funded a lab,
five laptops per class (grades 4-8) and video technology equipment (2003-2007).
This gave the students access to the internet, Success-maker, learning games,
UCSD tutors. We now have the hardware / software, grant is over.” (principal A,
district 1)
While the principal participant data, on its face, generally seemed to support the
hypothesis that educational technology has a positive effect on student achievement, the
data gathered from the upper and lower grade teacher participants seemed
overwhelmingly supportive of that belief. Since all of the teacher participant questions
were framed in terms of whether a given aspect or element typically found in a school
site technology program helped to raise student achievement, the data gathered from the
two grade-level-specific survey questionnaires is presented in its entirety within this
section of the results chapter, as it concerns the second research question involving
technology and student achievement.
Approximately 66% of all lower grade teacher responses were entered into the “agree”
or “strongly agree” columns, while 72% of all upper grade teacher responses were
entered into the “agree” or “strongly agree” columns. Less than 4% of all teacher
responses were entered into the “disagree” column, while only 1% of all teacher
responses were entered into the “strongly disagree” column. Twenty-four percent of all
upper grade teacher responses were entered into the “don’t know” column, while 29% of
all lower grade teacher responses were entered into the “don’t know” column. In the first
of the three broad categories of questions, Technology Hardware, a combined tally of
both the upper and the lower grade teacher responses revealed that only five responses
out of the total of 288 responses by the participants were entered into the “disagree” or
“strongly disagree” columns. In the second broad category, Technology Software/
Programs, a combined tally of both the upper and lower grade teacher responses revealed
that the ratio was slightly lower, with two responses out of the total of 192 responses by
the participants being entered into the “disagree” or “strongly disagree” columns. In the
third and final broad category, Technology Support, which included questions about the
ability of tech coaches, computer aides, staff development, and hands-on training to help
raise student achievement, the ratio was highest of all with a total of 26 responses out of
the total of 120 responses by the participants being entered into the “disagree” or
“strongly disagree” columns.
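As an arithmetic check, the ratios described above can be recomputed from the counts quoted in this paragraph. The sketch below (illustrative Python, not the study's actual Excel workflow) computes the share of “disagree” or “strongly disagree” responses within each of the three broad question categories:

```python
# (disagree + strongly disagree responses, total responses) per broad
# question category, using the counts reported in the text above.
tallies = {
    "Technology Hardware": (5, 288),
    "Technology Software/Programs": (2, 192),
    "Technology Support": (26, 120),
}

for category, (negative, total) in tallies.items():
    share = negative / total
    print(f"{category}: {share:.1%} disagree/strongly disagree")
# Hardware: ~1.7%; Software/Programs: ~1.0%; Support: ~21.7%
```

The Technology Support category stands out at roughly 21.7%, an order of magnitude above the other two.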
Figure 22 presents an overall summary of the responses from the combined 12 school
site, lower grade surveys. Figure 23 presents an overall summary of the responses from
the combined 12 school site, upper grade surveys.
Figure 22: Twelve Site Lower Grade Teacher Response Summary
[Figure 22 chart: summary of the 25 quantitative question responses from the 12 school site lower grade teacher surveys.]
Figure 23: Twelve Site Upper Grade Teacher Response Summary
In addition to the 25 quantitative questions discussed above, the teacher participants
were also posed three qualitative, open-ended questions. The first of the three questions
inquired about additional hardware, software, tech support, or other factors that helped
to raise student achievement. In addition to specific software program recommendations,
standards-based software was cited, as was DDDM. Professional development was
mentioned multiple times, as was the use of LCD projectors, scanners, and smart-slates.
New, faster computers were cited, as were Alpha-smart laptop computers. Website
resources, hosted by governmental, academic, and private enterprises were also cited.
There was one comment that “excellent” tech support helped to raise student
achievement. One specific comment, however, above all stood out: “Technology is an
expectation here, and we all use it daily.”
[Figure 23 chart: summary of the 25 quantitative question responses from the 12 school site upper grade teacher surveys.]
Summary of the Findings for Research Question Number Two:
Does technology, deployed and employed in keeping with best practice, provide school
sites an effective vehicle for raising student achievement?
When queried as to the key factors in raising student achievement to exit PI, over 50%
of responses attributed to the principal participants were directly, or indirectly, associated
with the implementation of, or expanded use of, technology resources at the school site
level. Similarly, the most frequently cited arena in which technology resources were said
to have been directed, was in related to, or associated with, school site level programs of
Data Driven Decision Making (DDDM). This overwhelmingly cited use of school level
technology resources is even more impressive when one considers the fact that in
addition to the various hardware and software resources which were deployed to gather,
compile, sort, and finally produce the various reports employed by administrators and
staff in order to direct instruction, there are many more closely related arenas in which
the related interventions are facilitated, and enabled, by the presence of technology
resources.
Such uses include: intervention programs – at both the classroom and student level;
grade level staff collaboration; and staff development. Each one of these school site
improvement measures is, of course, predicated upon the ability of the school site to
efficiently gather student level data related to their knowledge and mastery of academic
subject matter. As a practical matter, each of these school site improvement measures
could only occur by virtue of the school site level technologies which enabled ongoing
assessments through measures which have been carefully aligned with grade level
content standards, curriculum frameworks, and other desired outcomes. When one looks
at the various intervention measures, and the various collaboration and staff development
measures, which the principal participants strongly credited with raising student
achievement in order to exit PI, they are very well aligned with what research has shown
are the hallmarks of a successful DDDM program. That is to say that the various
measures undertaken in order to raise student achievement, and improve classroom
practice, flow from data gathered through a given school’s DDDM technology resources,
and are then subsequently implemented by means of additional technology resources
which have also been allocated at the school site.
The data that was gathered during the course of the research study shows that
technology was employed most often, and to the largest extent, within the realm of
DDDM, as well as within the implementation of the various school site improvement
measures which flowed from the program. These school site improvement measures,
which were frequently cited as being key factors in supporting the school site
improvement plan, included the arenas of student academic interventions, collaboration,
staff development, and within instructional practice or strategies.
After compiling both the quantitative and qualitative data gathered in the course of the
research study, it is evident that no other arena of technology resource allocation came
close to DDDM in the support and praise articulated for its role in facilitating student
achievement; DDDM consequently proved to be a key strategy in enabling the school
sites to exit from PI.
The research data also shows that student centered software, which was aligned with
both content standards and curricular frameworks, was frequently cited as being a
key factor in supporting many aspects of the various school site improvement plans,
especially as employed in order to provide background knowledge and facilitate student
academic interventions.
The research data reveals that classroom level investments in technology resources,
which function as instructional media, were also cited by many of the 12 principal
participants as being a key factor in supporting many aspects of their school site
improvement plan. Smart-boards, sound amplification systems, and LCD projectors were
items frequently mentioned.
Another key factor in raising student achievement in order for the school site to exit
PI, as identified by the 12 principal participants, was the integration of technology into
the core curriculum. The quantitative data shows that 75% of the principals responded by
selecting that they “agree” or “strongly agree” with that premise. However, the data also
indicates that such a viewpoint was not universal or unanimous, as the other 25% of the
principals responded by selecting that they “disagree” with that premise.
The principal participants from five separate schools, from five different school
districts, also indicated that a regular schedule of student access to computer labs was a
key factor in raising student achievement in order for the school site to exit PI. A
computer lab deployment strategy, however, was not the only methodology employed in
the 12 schools. The data shows that there were alternative deployment strategies which
were also employed to raise student achievement. The principal survey instrument shows
that there was also widespread support for the use of laptop computers and mobile carts,
as employed by some of the 12 school sites, as was support for Alpha-smart laptops in
the lower grades.
The findings from the teacher survey instruments also support the premise that
technology, deployed and employed in keeping with best practice, provides school sites an
effective vehicle for raising student achievement. The quantitative and qualitative data,
which was collected by means of the upper and lower grade teacher survey instruments,
and then combined, indicate an overwhelming belief on the part of the classroom
teachers, from the 12 participant sites, that the various components typically associated
with the best practice implementation of an educational technology program, “helped to
raise student achievement” at their school site.
Findings for Research Question Number Three:
What levels of resource allocations are required to establish, and sustain, an effective
technology program at the school site level?
To answer the research question involving the levels of resource allocations that are
required to establish, and sustain, an effective technology program at the school site level,
the quantitative data that was gathered through each of the principal survey instruments,
from the 12 school sites, was entered into a database. Excel spreadsheets were utilized to
group and tally the answers to each of the 44 quantitative questions contained on the
principal survey from across the 12 school sites. Once the grouping and tallying of this
quantitative data was complete, Excel was also utilized to produce the figures which
visually depict the frequency of the fixed response, scaled-construct, answers that were
provided by the principals at the 12 school sites. The principal participants provided the
researcher with data which was gathered through a six-point scale: “strongly
agree”; “agree”; “disagree”; “strongly disagree”; “don’t know”; and “not applicable”. In
addition to these six response categories, the researcher added one additional category,
“unanswered”, to the Excel spreadsheets and figures since some of the questions went
unanswered by the various participants.
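The grouping and tallying described above was carried out in Excel; as a minimal illustration of the same procedure, the sketch below (hypothetical response data, with category names taken from the survey scale) tallies the fixed responses to a single survey question:

```python
from collections import Counter

# The six survey response choices plus the researcher-added
# "unanswered" category, for seven categories in all.
CATEGORIES = [
    "strongly agree", "agree", "disagree", "strongly disagree",
    "don't know", "not applicable", "unanswered",
]

def tally_responses(responses):
    """Return a count for each category across one question's responses."""
    counts = Counter(responses)
    return {category: counts.get(category, 0) for category in CATEGORIES}

# Hypothetical responses from the 12 principals to one survey item.
example = ["agree"] * 6 + ["strongly agree"] * 5 + ["unanswered"]
print(tally_responses(example))
```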
Qualitative data were collected from the 12 principal participants through 27 open-
ended questions. Each participant’s descriptive comments were transcribed and
compiled in accordance with each of the individual questions that were contained on the
survey instrument.
In presenting the data and an interpretation of the findings associated with the third
research question, the quantitative and qualitative data have been interspersed and
juxtaposed in order to provide a clearer and more accurate portrayal of the judgments,
perspectives, and opinions of the various participants concerning the levels of resource
allocations that are required to establish, and sustain, an effective technology program at
the school site level.
The quantitative CDE School Technology Survey data, which the researcher retrieved
by means of an on-line state database, provided the study with baseline school site
technology resource allocation demographic information that was valuable in order to
cross-check and validate participant responses.
The larger purpose behind the third research question was to advance knowledge in
the field of school finance by gathering data that will help to develop a research based
estimate of the resources needed for a school level technology program that will support
instructional programs aimed at dramatically improving student performance. The
resource allocation related questions included within the principal survey instrument
questionnaire were informed by a school finance adequacy model developed by Odden
and Picus (2008). This model relies on research on what works in schools to identify
resources schools need to ensure all children have a chance to perform at high levels.
One goal of the researcher, in gathering data to answer the third research question, was to
take the Picus/Odden technology “costing-out” model and compare the projected
resource estimates with real world school site level budgets and funding.
The Picus/Odden costing-out model groups data by elementary, middle, and high
schools. The Picus/Odden model originally allocated approximately $250, per student,
for instructional materials and supplies (including textbooks). The original and the
inflated figures – as of 2005 – are $250.50/$285.57 for elementary level schools. The
Picus/Odden technology costing-out model identifies direct technology staffing costs
separately from direct technology costs, by incorporating the direct labor technology
expenses into their general professional development recommendations. Therefore, the
Picus/Odden model seeks to identify the direct costs related to purchasing, upgrading,
and maintaining computer technology hardware and software. Under this narrowly
defined assessment of technology costs, the Picus/Odden model finds annual costs per
student are approximately $250 for the purchase, update, and maintenance of hardware
and software. The Picus/Odden model asserts that the $250 figure is sufficient to
purchase, upgrade, and maintain computers, servers, operating systems and productivity
software, network equipment, and student administrative system and financial systems
software, as well as other equipment such as copiers. This figure is said to be sufficient
to cover medium priced student administrative and financial systems software packages.
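As a worked example of the figures above, the implied inflation adjustment between the original $250.50 elementary allocation and the 2005 figure of $285.57 can be computed directly, along with the school-level budget it implies (the enrollment below is hypothetical):

```python
original = 250.50       # original Picus/Odden per-student figure (elementary)
adjusted_2005 = 285.57  # inflation-adjusted figure as of 2005

# Implied cumulative inflation factor between the two figures.
inflation_factor = adjusted_2005 / original
print(f"inflation factor: {inflation_factor:.3f}")  # about 1.14 (a 14% adjustment)

# Total annual technology allocation for a hypothetical 400-student
# elementary school under the adjusted per-student figure.
enrollment = 400
total_budget = adjusted_2005 * enrollment
print(f"annual technology budget: ${total_budget:,.2f}")  # about $114,228
```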
The 12 principal participants provided data in response to the following interrelated
series of qualitative, open-ended questions concerning levels of technology resource
allocations, from each of their elementary level school sites, which are spread across
northern and southern California. The first question in the series of questions inquired
about per pupil funding levels dedicated to educational technology resources, in terms of
adequacy: “In your professional opinion, is the Picus/Odden $285.57 per student
proposed dollar figure for funding educational technology at the elementary school level
adequate, or inadequate?” The following statements encompass the range of responses
that were gathered by the principal survey instrument:
“It is adequate. I hope it becomes real.” (principal C, district 3)
“I’d love to have $285.57 per student!” (principal B, district 2)
“Unclear – we are a very small school which makes funding formulas skewed
(less than 300).” (principal A, district 1)
“Not sure.” (principal H, district 7)
“Inadequate.” (principal E, district 5) (principal K, district 9)
“Not known.” (principal F, district 6)
“Big schools (large enrollments) would benefit more and if it is yearly.”
(principal J, district 8)
“Yes” (principal L, district 9)
“Adequate” (principal I, district 7)
“This is probably an adequate amount for maintenance, but not for start-up costs.”
(principal D, district 4)
The second in the series of questions inquired about actual per pupil funding levels, at
the participant’s school site, that are dedicated to educational technology resources: “As
the principal of your site, do you possess projections – or actual dollar figures –
associated with per student funding levels for technology as described above? (If so, what
are those figures?)” The following statements encompass the range of responses that
were gathered by the principal survey instrument:
“No.” (principal C, district 3) (principal B, district 2) (principal A, district 1)
(principal H, district 7) (principal F, district 6) (principal K, district 9) (principal
L, district 9) (principal I, district 7) (principal D, district 4)
“Not at this time.” (principal E, district 5)
“I use projections provided. It is hard to budget for the whole year because of
carry-over coming late, or budget adjustments coming late in the year.” (principal
J, district 8)
The third in the series of questions inquired about technology related personnel
allocations, and how they are funded, at the participant’s school site: “In regard to
personnel allocations, how are your technology related staffing positions funded?”
The following statements encompass the range of responses that were gathered by the
principal survey instrument:
“We do not have personnel assigned to technology because we cannot afford their
salaries – full or part.” (principal C, district 3)
“District office pays – I’m not sure out of which budget.” (principal B, district 2)
“General fund.” (principal A, district 1) (principal D, district 4)
“Title I, LEP.” (principal H, district 7) (principal E, district 5)
“None allotted.” (principal F, district 6)
“District: tech support. Site: classified computer lab aide.” (principal K, district
9)
“Only pay 80 hours a year through Title I funds.” (principal L, district 9)
“Through categorical funds.” (principal I, district 7)
The fourth in the series of questions inquired as to the number of technology
related staff positions which exist at the participant’s school site: “How many technology
related staff positions exist at your site?” The following statements encompass the range
of responses that were gathered by the principal survey instrument:
“Within HEARTS, one part time position. Hearts is the after school program
funded by a grant.” (principal C, district 3)
“One.” (principal B, district 2) (principal H, district 7) (principal E, district 5) (principal
K, district 9) (principal D, district 4)
“0.” (principal A, district 1) (principal F, district 6) (principal J, district 8)
(principal L, district 9)
“Three. One is a tech person (3 days / month). One tech coach (shared with seven
other sites).” (principal I, district 7)
The fifth in the series of questions follows up on the previous question by inquiring as
to the number of the school site’s technology related staff positions reported above that
are funded as full time positions: “Of this number, how many are funded as full time
positions?” The various statements below encompass the range of responses that were
gathered by the principal survey instrument:
“0.” (principal C, district 3) (principal A, district 1) (principal F, district 6)
(principal J, district 8) (principal L, district 9) (principal D, district 4)
“One.” (principal B, district 2) (principal H, district 7) (principal E, district 5)
(principal K, district 9)
“One at site –tech aide.” (principal I, district 7)
The sixth in the series of questions once again follows up on the previous question by
inquiring as to the section of their budget in which funding for the aforementioned
technology staff position(s) shows up: “In what section of your budget do the dollars for
these full or part time positions show up?” The various statements below encompass the
range of responses that were gathered by the principal survey instrument:
“After school program.” (principal C, district 3)
“District office budget.” (principal B, district 2)
“0.” (principal A, district 1) (principal F, district 6) (principal J, district 8)
(principal L, district 9)
“Strategies for increased student performance.” (principal H, district 7)
“Title I & LEP.” (principal E, district 5)
“Classified personnel – Title I.” (principal K, district 9)
“Title I.” (principal I, district 7)
“General fund.” (principal D, district 4)
The seventh in the series of questions inquired as to technology related professional
development expenditures, and whether such expenditures were situated within the overall
professional development budget, or within the expenditure framework of their
technology program: “Do you include professional development expenses related to
educational technology – and its integration into the curriculum – within your framework
of technology expenditures or budget; or do you include these expenses into your overall
professional development budget?” The following statements encompass the range of
responses that were gathered by the principal survey instrument:
“No – not enough funds.” (principal C, district 3)
“Overall professional development budget.” (principal B, district 2) (principal E,
district 5) (principal J, district 8) (principal I, district 7)
“Overall figure – though currently very little is subscribed to technology.”
(principal F, district 6)
“No.” (principal L, district 9)
The eighth in the series of questions inquired as to budget projections, at either the site
or district level, for technology: “Does your site – or your district – project a budgeted
dollar figure, per student, for technology? (If so, what is that figure?)” The following
statements encompass the range of responses that were gathered by the principal survey
instrument:
“Not sure.” (principal C, district 3)
“No.” (principal B, district 2) (principal A, district 1) (principal H, district 7)
(principal F, district 6) (principal J, district 8) (principal I, district 7) (principal D,
district 4)
“Yes. Not sure” (principal D, district 4)
“District –yes. Site –no.” (principal E, district 5)
“?” (principal K, district 9)
“Not at my site. I don’t know at district level.” (principal L, district 9)
The ninth in the series of questions inquired as to the existence of annual technology
plans and budgets: “During your site’s Program Improvement status did your site draft
an annual technology plan or budget? (If so, what was the dollar figure?)” The following
statements encompass the range of responses that were gathered by the principal survey
instrument:
“There was a technology plan. Unknown” (principal E, district 5)
“There was a technology plan.” (principal D, district 4)
“Yes. $20,000” (principal I, district 7)
“No.” (principal L, district 9) (principal J, district 8) (principal A, district 1)
“I am not sure.” (principal K, district 9)
“No, when I arrived computers and technology [were] very out of date.”
(principal B, district 2)
“Yes, technology plan. No budget” (principal C, district 3)
“No budget.” (principal H, district 7)
The tenth in the series of questions follows up on the previous question by inquiring
as to the existence of budget projections, in the absence of a formal technology plan or
budget: “If you did not draft a formal technology plan or budget, did you project a dollar
figure necessary to meet your technology needs for a given period?” The various
statements below encompass the range of responses that were gathered by the principal
survey instrument:
“Yes, the projected amount was $65,000.” (principal D, district 4)
“Yes.” (principal I, district 7) (principal E, district 5)
“No.” (principal L, district 9) (principal J, district 8) (principal F, district 6)
(principal A, district 1)
The eleventh in the series of questions once again follows up on the previous question
by inquiring as to the principal participants’ ability to look back and place a dollar figure
on the technology related aspects of their school improvement plan, in the absence of a
formal budget or plan: “If not, can you now look back and attach a dollar figure to the
technology related aspects of your school improvement plan?” The various statements
below encompass the range of responses that were gathered by the principal survey
instrument:
“$18,000.” (principal L, district 9)
“I had money so I spent it on upgrading technology.” (principal J, district 8)
“Yes – approximately $460,000 was spent on technology hardware and software
during program improvement.” (principal E, district 5)
The twelfth in the series of questions inquired as to the existence of any bond
measures that assist in the funding of technology: “Did your school have the benefit of a
local bond issue designed to assist in funding your school’s implementation and use of
technology?” The various statements below encompass the range of responses that were
gathered by the principal survey instrument:
“No.” (principal D, district 4) (principal I, district 7) (principal L, district 9)
(principal J, district 8) (principal E, district 5) (principal H, district 7) (principal
A, district 1) (principal B, district 2)
“Yes.” (principal K, district 9)
“No. Our district received a grant and manages all funds received.” (principal C,
district 3)
The thirteenth, and final, in this series of questions inquired as to the percentage of the
site’s technology budget, or spending, that has been dedicated to professional development:
“What percentage of your technology related budget, or spending, has been dedicated to
professional development?” The various statements below encompass the range of
responses that were gathered by the principal survey instrument:
“About 10%.” (principal D, district 4) (principal H, district 7) (principal B,
district 2)
“Currently none – In the past contracts have included staff development for Lexia
and Waterford.” (principal I, district 7)
“None.” (principal L, district 9)
“What I have found is that professional development comes with the purchase of
technology related purchases.” (principal K, district 9)
“Because we were PI -10%.” (principal J, district 8)
“About 1%.” (principal F, district 6)
“This is done at district level.” (principal E, district 5)
“District level management of funds.” (principal C, district 3)
The 12 principal participants were also presented with one quantitative, fixed
response, scaled-construct question related to their school site’s technology program, and
professional development resource allocations. Subsequent to being asked, “As the de
facto instructional leader of your site, did you deliberately expand the use of technology
at your site?”, a closely related follow-up question was immediately posed that asked
whether or not any expansions in the use of technology at the school site had been
undertaken with an expectation that dedicated professional development time and dollars
would be required (Figure 24).
Figure 24: Technology and Resource Component Required for Professional Development
The 12 principal participants provided data in response to the following series of five
inter-related, qualitative, open-ended questions concerning their school site’s replacement
cycle for computers and peripherals. The first question in the series inquired if a cycle
had been set: “Has your site set a replacement cycle for your computers and peripherals?”
The various statements below encompass the range of responses that were gathered by
the principal survey instrument:
“Yes.” (principal G, district 7) (principal A, district 1) (principal E, district 5)
(principal K, district 9)
[Figure 24 chart: responses of the 12 school site principals to the question, “If so, was
the expansion of your site’s use of technology undertaken with an expectation that it
would require dedicated professional development time and dollars?”]
“No.” (principal C, district 3) (principal L, district 9)
“No not a definite one.” (principal B, district 2)
“Yes – based on funding availability.” (principal H, district 7)
“Laptops were replaced with 22 computer stations in the computer lab.”
(principal F, district 6)
“No formal plan.” (principal I, district 7)
“No, however we do replace 10 to 15 computers per year.” (principal D, district
4)
The second in the series of questions follows up on the previous question by
inquiring, if the school site has set a replacement cycle, as to the prescribed length of time
that has been decided upon: “What is the length of time your site has set for this hardware
replacement?” The various statements below encompass the range of responses that were
gathered by the principal survey instrument:
“3 years for lab. As needed for classroom.” (principal G, district 7)
“Probably 3-5 years.” (principal B, district 2)
“Seven years.” (principal A, district 1)
“Three to five years.” (principal H, district 7) (principal K, district 9)
“Every 7-10 years as the site financially support replacement.” (principal E,
district 5)
“Three to five years or depending on funds.” (principal J, district 8)
“Three to four years.” (principal L, district 9)
“Three to five years – past practice.” (principal I, district 7)
“Not set.” (principal D, district 4)
The third in the series of questions once again follows up on the previous question by
inquiring, if the school site has set a replacement cycle, whether funding is currently in
place, or whether there is an expectation that sufficient funding will be available, to meet
this replacement cycle: “Is funding currently in place, or do you expect that funding will be
available, in order to meet the future expenditures associated with meeting your projected
hardware replacement cycle?” The various statements below encompass the range of
responses that were gathered by the principal survey instrument:
“Uncertain budget for next year.” (principal G, district 7)
“No funding. We use school based or categorical monies for all computer related
updates, replacements.” (principal C, district 3)
“Some of it.” (principal B, district 2)
“Yes.” (principal A, district 1)
“?” (principal H, district 7)
“No funding is in place at this time. We ordered 10 computers this year, 1/6th of
what is needed.” (principal E, district 5)
“Funding may be limited in the upcoming year(s).” (principal F, district 6)
“None available.” (principal L, district 9)
“Funds are in place.” (principal J, district 8)
“Teacher computers are replaced by the district every 3-5 years (District pays for
this). Computer lab, every 5-7 years.” (principal K, district 9)
“We utilize categorical funds for technology replacement.” (principal I, district 7)
“We do not have funding in place.” (principal D, district 4)
The fourth in the series of questions inquired as to whether or not the school site
staggers its purchases of hardware in order to spread out replacement costs: “Does
your site endeavor to stagger your purchases of computers and peripherals in an effort to
spread the replacement costs associated with hardware replacement cycles?” The various
statements below encompass the range of responses that were gathered by the principal
survey instrument:
“Yes – under direction of tech coach.” (principal G, district 7)
“Yes.” (principal B, district 2) (principal A, district 1) (principal H, district 7)
(principal K, district 9) (principal I, district 7) (principal D, district 4)
“Yes – we could not run the school if we did not. There is simply insufficient
funds.” (principal E, district 5)
“We purchase peripherals as monies allow.” (principal F, district 6)
“Yes, as extra funding comes in.” (principal L, district 9)
The fifth, and final, in this series of questions inquired as to where the school’s hardware
stands relative to the site’s replacement cycle: “Where are your site’s computers
and peripherals in their replacement cycle? (Describe what percentage of your computers
and peripherals, are at what point of their replacement cycle)” The various statements
below encompass the range of responses that were gathered by the principal survey
instrument:
“50% new in 07/08. 50% new in 06/07.” (principal B, district 2)
“?” (principal A, district 1)
“1/6th of what is needed has been purchased.” (principal E, district 5)
“All of our classroom computers and lab computers are new within the last two
years.” (principal F, district 6)
“Computer lab: 35 computers, one year old. Teacher stations – new this year.
Smart Slates – two years old.” (principal K, district 9)
“25% of computers are +5 years old.” (principal I, district 7)
“We replace 18% of our technology hardware per year.” (principal D, district 4)
Summary of the Findings for Research Question Number Three:
What levels of resource allocations are required to establish, and sustain,
an effective technology program at the school level site?
The findings show that the 12 principal participants largely agreed that the
Picus/Odden $285.57 per student proposed dollar figure for technology was adequate. Two of
the principals indicated that they were “not sure” or it was “not known”. One principal
indicated that this was “unclear” and indicated a concern that the very small student
enrollment at their school site, less than 300, tends to skew funding formulas. One
principal indicated a belief that this per pupil figure was “probably” adequate for
maintenance, but not adequate for start-up costs. Only one of the 12 principal
participants indicated a belief that the Picus/Odden proposed dollar figure was
inadequate.
The data shows that not one of the 12 principal participants could articulate actual
dollar figures, or projections, associated with per pupil funding levels for technology at
their school site. However, one of the principals did indicate that “projections”,
presumably from the school district, were provided to the school site; no figures,
however, were provided on the research instrument.
The data concerning the existence of technology related staff positions, and the manner
in which those positions are funded, showed little consistency. Four schools indicated that
they do not have a single staff position, full or part time, allotted. Most school sites
indicated that they relied on a part time staff position. Less than 50% of the schools
indicated that their campus had allocations for a full time technology staff position on
site. The data shows that, when school sites had a tech position, a hodgepodge of
funding sources that ran the gamut from district funds, Title I, general fund, afterschool
programs, “Strategies for Increased Student Performance”, to categorical funding, were
all being tapped by one or more of the 12 participant school sites.
None of the participant school sites reported having professional development dollars
related to technology earmarked from whatever funds were available for technology
expenditures. The research data indicates that, when technology related professional
development does take place, it is funded out of the general professional development
budget.
While most, if not all, of the 12 principal participants were at least able to
acknowledge the existence of a school site technology plan, only one of the 12 school
sites indicated that they had a formal technology budget ($20,000). Only three of the 12
principal participants indicated that, in the absence of a formal budget, they had projected
a dollar figure necessary to meet their technology needs. Only one of the three principal
participants provided a figure, as requested ($65,000). When asked if they could look
back over the previous school year and affix a dollar figure to the technology related
aspects of their school improvement plan, only two of the 12 principal participants
provided the researcher with such figures. The first of the two principals indicated an
expenditure of $18,000, while the second of the two principals indicated that a rather
remarkable figure of $460,000 was invested in school site technology hardware and
software during the course of program improvement. The third, and final, response to
this question that was recorded on the principal survey instrument was rather pragmatic,
“I had money so I spent it on upgrading technology”. In a related funding and budget
question, only one of the 12 principal participants indicated that their school site had
benefitted from a local bond measure to assist in funding their technology program.
When the 12 principal participants were asked to provide data concerning the ratio of
technology related professional development expenditures to school site level technology
spending, a figure of 1% was provided from one school site, while two other school sites
indicated 10%, with one of the two sites commenting that this level of funding was
attributable to the fact that the site had been in program improvement. The principals at
two additional school sites indicated that this is a district responsibility. The final
principal response recorded on the survey instrument stated, “What I have found is that
professional development comes with the purchase of technology related purchases”.
Three of the participant school sites indicated that they had established a replacement
cycle for their computers. Six school sites indicated that they either lacked a plan, or any
such plan was based on funding availability, or they simply relied upon whatever funds
were available annually in order to replace a given number of their older computers each
year. When asked about the length of time that had been prescribed for a replacement
cycle of computers, three years, three to four years, and three to five years were the most
common responses. One of the 12 principals cited seven years, one principal cited a
seven to ten year cycle, and one principal responded that no replacement cycle had been
set. A few of these responses were qualified by the caveat that such a cycle was
dependent upon the availability of funds. Only two of the principal participants gave a
resounding “yes” to a follow-up question which inquired if replacement cycle funding
was already in place, or if there was an expectation that it would in fact be available.
Other responses ranged from, “some of it”, to “none available”, to “?”, to “uncertain
budget for next year”, etc. One principal stated, “No funding is in place at this time; we
ordered 10 computers this year, 1/6th of what is needed”. Seven of the school sites
indicated that they do in fact endeavor to stagger their purchases in order to spread-out
replacement costs. Three other schools indicated that any such plans are driven, by and
large by the availability of funding, more that in keeping with best practice.
California Department of Education 2007 School Technology Survey Data
The eight tables displayed below have been constructed by the researcher based upon
data sets gathered from the California School Technology Survey. Each of the tables is
composed of data that was compiled from each of the 12 school sites that participated in
this research study, subsequent to being retrieved from an on-line California Department
of Education database.
Table 1: School Site Student to Computer Ratios
Site / District
Students per
Computer
Students per
Internet
Connected
Computer
A-1 2.19 2.19
B-2 6.33 0.00
C-3 5.06 4.38
D-4 2.93 2.81
E-5 2.46 2.46
F-6 7.52 7.52
G-7 5.75 5.75
H-7 5.84 5.84
I-7 5.04 5.04
J-8 3.77 3.77
K-9 3.67 3.67
L-9 3.01 3.01
Statewide 4.11 4.59
Table 2: School Site Computers Connected to Internet
Site / District
% Computers
Connected to
Internet
% Computers
Not Connected to
Internet
A-1 100.00% 0.00%
B-2 0.00% 100.00%
C-3 115.65% -15.65%
D-4 104.29% -4.29%
E-5 100.00% 0.00%
F-6 100.00% 0.00%
G-7 100.00% 0.00%
H-7 100.00% 0.00%
I-7 100.00% 0.00%
J-8 100.00% 0.00%
K-9 100.00% 0.00%
L-9 100.00% 0.00%
Statewide 89.63% 10.37%
Table 3: School Site Computer Location
Site / District Classrooms Lab Library Other
A-1 65.66% 30.30% 4.04% 0.00%
B-2 50.00% 47.92% 2.08% 0.00%
C-3 86.39% 13.61% 0.00% 0.00%
D-4 42.86% 42.86% 14.29% 0.00%
E-5 80.47% 18.34% 1.18% 0.00%
F-6 100.00% 0.00% 0.00% 0.00%
G-7 73.68% 22.81% 3.51% 0.00%
H-7 71.55% 27.59% 0.86% 0.00%
I-7 84.47% 14.56% 0.97% 0.00%
J-8 84.11% 15.42% 0.47% 0.00%
K-9 68.12% 21.01% 5.80% 5.07%
L-9 81.82% 16.04% 2.14% 0.00%
Statewide 65.33% 25.55% 5.65% 3.48%
Table 4: Age of School Site Computers
Site / District < 1 Year 1‐2 Years 2‐3 Years 3‐4 Years 4+ Years
A‐1 50.81% 0% 0.81% 27.42% 20.97%
B‐2 20.69% 3.45% 30.34% 0.00% 45.52%
C‐3 8.84% 0.68% 51.02% 2.72% 36.73%
D‐4 no data no data no data no data no data
E‐5 0% 31.38% 31.80% 28.45% 8.37%
F‐6 0% 37.04% 0% 0% 62.96%
G‐7 1.75% 29.82% 0% 7.02% 61.40%
H‐7 6.03% 28.45% 18.10% 29.31% 18.10%
I‐7 0% 28.16% 18.45% 7.77% 45.63%
J‐8 19.16% 57.94% 12.15% 4.67% 6.07%
K‐9 2.90% 2.17% 0% 94.20% 0.72%
L‐9 4.28% 4.28% 1.60% 22.99% 66.84%
Statewide 12.47% 12.88% 13.58% 14.17% 46.90%
Table 5: Expected Change in School Site Computer Availability
Site / District
% Computers
Scheduled to
be Retired
% Computers
Expected to
be Added
% Net Gain
or Loss
A-1 no data no data no data
B-2 no data no data no data
C-3 19.73% 0.00% -19.73%
D-4 28.57% 28.57% 0.00%
E-5 2.09% 16.74% 14.64%
F-6 27.78% 55.56% 27.78%
G-7 61.40% 0.00% -61.40%
H-7 18.10% 0.00% -18.10%
I-7 45.63% 0.00% -45.63%
J-8 6.07% 28.04% 12.15%
K-9 5.07% 18.12% 13.04%
L-9 5.35% 9.63% 4.28%
Statewide 6.80% 8.46% 1.66%
Table 6: Average School Site Hardware Fix Time
(1 = Two hours; 2 = One day; 3 = Two to five days; 4 = One week; 5 = One month +)
Site / District
Hardware
Fix Time
A-1 3.00
B-2 2.00
C-3 4.00
D-4 2.00
E-5 2.00
F-6 4.00
G-7 4.00
H-7 4.00
I-7 4.00
J-8 4.00
K-9 3.00
L-9 3.00
Statewide 2.74
Table 7: School Site Level Technical Support Staffing (FTEs per 1,000 students)
Information is in units of full-time equivalent (FTE) personnel. Only teachers and
technical staff given specific assignments for technical support were counted. Only site
based personnel were included. District level technical support personnel were not
included, nor were curriculum or staff development positions related to technology.
Site / District
On-Site
Certificated
Support
On-Site
Classified
Support
A-1 0.37 0.00
B-2 0.00 0.00
C-3 0.00 0.00
D-4 0.00 0.00
E-5 0.00 1.70
F-6 0.00 0.00
G-7 0.00 0.00
H-7 0.00 0.00
I-7 0.00 0.00
J-8 0.25 0.00
K-9 0.00 3.94
L-9 24.91 3.56
Statewide 0.35 0.79
Table 8: School Site Level Curriculum Support Staffing (FTEs per 1,000 students)
Only site based staff personnel are included. Only curriculum or staff development
positions related to technology were included. District staff and technical support
personnel are not included.
Site / District
Curriculum
Support Staffing
(On-Site Certificated)
Curriculum
Support Staffing
(On-Site Classified)
A-1 0.74 0.00
B-2 0.00 1.09
C-3 0.00 0.00
D-4 0.00 0.00
E-5 0.00 1.70
F-6 0.00 0.00
G-7 0.00 0.00
H-7 0.00 0.00
I-7 0.00 0.00
J-8 0.25 0.00
K-9 0.00 0.00
L-9 0.02 0.00
Statewide 0.38 0.28
CHAPTER FIVE
SUMMARY, CONCLUSIONS, AND IMPLICATIONS OF THE FINDINGS
Overview of the Problem
Public education is an expensive proposition. This is a fact that has been further
exacerbated in recent years, in which we have seen widespread, severe, and repeated
budget crises at the state level across the nation. It should not come as a surprise that
during such times many governors, legislators, educational leaders, and stakeholders
frequently call for increased fiscal accountability and cost-cutting efficiency measures to
cope with leaner budgets.
Picus (2000) has observed that, “Despite the large sums of money spent annually for
K-12 education, we know remarkably little about how those funds are used at the
individual student – and school – level”. This dissertation therefore was conducted, in
concert with similar research conducted at the University of Southern California within a
thematic dissertation group chaired by Dr. Lawrence Picus, in order to provide a better
understanding of the various expenditures present at the school site level. The focus of
the research conducted for this dissertation is pertinent and timely since researchers in
school finance currently know a great deal about how much our schools spend for
salaries, benefits, contracts, and so forth, but know relatively little about expenditures
identified by a “function” such as those allied costs which are attributable to central
district administration, maintenance and operations, transportation, safety and security,
and technology (Odden & Picus, 2008).
Studies attempting to quantify adequate per pupil funding levels by means of evidence
based or state-of-the-art methodologies have traditionally proceeded by defining and
re-calculating expenses associated with core instructional programs, while simply
accepting the existing data associated with other related, or non-core, expenses such as
district administration, maintenance and operations, facilities, food services,
transportation, special education, safety and security, and technology. A far
far more accurate calculation of per pupil funding requirements and adequacy levels is
therefore plausible once these traditionally assumed allied expenses can also be defined
and quantified.
Within the current standards and accountability movement, draconian sanctions can
attach at both the student and school level when prescribed achievement levels and goals
are not met. It is therefore imperative that today’s educational professionals and the
various bodies of stakeholders be armed with a clearer understanding of whether or not
the nation’s K-12 schools have adequate resources, and are receiving adequate funding
levels, in order to meet the present needs of a diverse student body of young Americans.
Educational policy in such a national climate therefore demands that future research
provide educational stakeholders with the ability to correlate and juxtapose inputs in the
form of dollars and resources, with outputs in the form of a student body that meets
existing state and federal standards and other established levels of achievement. Thus, a
body of detailed and nuanced research which can provide stakeholders with the first half
of this equation is needed. Once the issue of site-level, per-pupil funding levels and
adequacy has been quantified, continued research can advance practice and research by
defining and exploring the nature of the relationship, and by endeavoring to quantify the
correlations which exist between inputs and outputs. This has been lacking or absent in
the literature and research, but is imperative in order to guide effective educational policy
and improve practice.
Purpose of the Study
The purpose of this study is to define and quantify the nature of, and required levels
of, technological resources at the school site in order for students to meet standards and
prescribed levels of educational attainment. In concert with other aligned, and
interrelated, research that was conducted within a thematic dissertation group at the
Rossier School of Education under the supervision of Dr. Lawrence Picus, the goal of this
study is to advance the literature and improve practice by providing a more accurate
assessment of school site level, per pupil funding adequacy. This is done by defining and
quantifying aspects of those related educational costs which have traditionally remained
undefined and unquantified in resource based or state-of-the-art studies and research.
Specifically, this study collected data to expand knowledge concerning site level, per
pupil adequacy levels, by considering the technology needs that constitute an integral,
and essential, part of a modern day K-12 education. For the purposes of this study, these
are defined as those technology needs that are necessary for a student to meet prescribed
grade level standards, as well as those that are necessary to prepare a student to fulfill
their professional and civic responsibilities, and facilitate success for those who continue
their studies in higher education. This research study therefore ultimately collected data
to more accurately define, and quantify, technology related resource requirements, and
expenses, with overall per-pupil adequacy levels in order to provide a clearer and a more
accurate picture of the necessary funding levels associated with meeting the various
mandates called for in today’s educational environment and political climate.
This research study addresses and answers three research questions:
How should technology be deployed and employed, at the school site level, in
keeping with best practice?
Does technology, deployed and employed in keeping with best practice, provide
school sites an effective vehicle for raising student achievement?
What levels of resource allocations are required to establish, and sustain,
an effective technology program at the school site level?
Methodology
Twelve principals, and two select members of their teaching staff, were queried as to
their professional opinions concerning the implementation, best practice, and the
effectiveness of technology resource allocations in raising student achievement. This
successful school study was conducted to examine these levels of technology resource
allocations, their usage, and their effectiveness relative to student achievement, in a select
group of elementary schools located in the state of California. This was accomplished by
means of two distinct survey instruments: (a) a researcher designed survey, in the form of
a self-administered questionnaire, which was used to collect data from a sample; (b) a
School Technology Survey (STS), that is a joint undertaking between the California
Department of Education (CDE) and the California Technology Assistance Project
(CTAP), which California schools complete for the state annually, and is retrievable from
a state database.
Sample and Population
The population selected for the study consisted of 362 California public elementary
schools that were designated by the California Department of Education (2008) as being
fourth or fifth year Program Improvement (PI) schools pursuant to Title I of No Child
Left Behind (NCLB) during the 2006-2007 school year. Schools which have been
designated as PI, year four, have failed to make Adequate Yearly Progress (AYP) for a
period of at least five years.
The sample selected from this population of year four or five PI schools was narrowed
to include only the 28 California elementary schools that were
successful in exiting from PI by meeting their AYP at the conclusion of the 2006-2007
school year. Twenty-five of the above California elementary schools were invited to
participate in the research study. However, district administrators who oversee thirteen of
the school sites represented in the population declined to allow their schools to
participate in the study. Twelve California year four or year five PI elementary schools
ultimately agreed to participate in the study.
This study therefore collected and analyzed school site level data at the 12 elementary
schools, described above, scattered across the state of California. The 12 schools that
ultimately agreed to participate in this study are contained within a total of nine
California school districts. The participants of this study were the 12 school site
principals and the two teachers at each site whom the principal was asked to select.
Each principal was asked to select one "upper grade" and one "lower grade" teacher at
their site whom they considered a "technology leader," or someone well versed in
educational technology.
Data Collection
The data collection methodology of the study employed a mixed methods approach.
The qualitative aspect of the research design involved the collection of narrative data and
entailed analysis through the coding of that data and the production of an inductive,
verbal synthesis.
The quantitative aspect of the research design involved the collection of numerical data
and entailed analysis by means of a deductive, statistical methodology. The mixed
methods (quantitative and qualitative) approach to the research design was employed
within the following instruments:
A self-administered, researcher-designed principal survey questionnaire,
composed of open-ended and fixed-response elements, used to collect data
from a sample;
A self-administered, researcher-designed teacher survey questionnaire,
composed of open-ended and fixed-response elements, used to collect data
from a sample;
A structured record review of school site level data from a California Department
of Education (CDE) online database, which allowed the researcher access to data
provided by each school site through the School Technology Survey (STS)
that all California schools complete on an annual basis.
Data Analysis
Once the researcher had received the data from each of the participants representing
the 12 school sites, the following steps were undertaken to analyze the data:
Data contained within the principal and teacher survey instruments were
organized;
The survey questionnaires' quantitative, forced-choice responses were tallied
according to the number of participants who responded "strongly agree,"
"agree," "disagree," "strongly disagree," "don't know," or "not applicable" to
each of the 27 forced-choice questions contained on the principal surveys, and
each of the 25 forced-choice questions contained on the teacher surveys;
The data gathered from the survey questionnaires' qualitative, open-ended
narrative responses were coded, and an inductive, verbal synthesis of each of
the 45 open-ended questions contained on the principal surveys, and of the
three open-ended questions contained on the teacher surveys, was produced;
Having completed the above calculations for each of the three survey instruments
returned from each school site, similar calculations representing the
cumulative responses across all twelve school sites were performed in the
same manner;
To analyze the open-ended responses, the researcher combed the qualitative data
for themes and patterns apparent in the responses of both principals and
teachers;
To analyze the forced-choice responses, data were entered into a Microsoft Excel
spreadsheet in order to create frequency charts for each of the 27 principal
survey questions, and each of the 25 teacher survey questions. These frequency
charts were then used to determine the various trends, relationships, and patterns
that existed within the data;
A final, extensive review of all of the compiled data, in its totality, produced an
overall summarization of the data and established the findings leading to the
research study's conclusions, which follow.
Findings by Research Question:
Research Question One
The first research question inquired: How should technology be deployed and
employed, at the school site level, in keeping with best practice? The findings reveal
that, by far, the most frequently cited arena in which technology resources were
successfully utilized was directly or indirectly associated with Data Driven Decision
Making (DDDM). It was striking that almost all of the instructional strategies described
by the school site principals were largely facilitated by, or flowed from, the school's
DDDM program. Likewise, the instructional strategies employed by the 12 school site
principals in an effort to raise student achievement were effective largely because
technology resources had deliberately been directed into a school-wide DDDM
program, in keeping with best practice. In fact, as discussed in chapter two, research has
shown that a large percentage of these instructional strategies are truly in keeping with
the very philosophy and tenets of a school-wide DDDM program. This is almost
certainly a prime factor behind the overwhelming level of support for DDDM on the part
of the 12 school site principals: on the survey question that specifically asked whether a
DDDM program was employed in an effort to raise student achievement, every single
response provided by the respondents was "agree" or "strongly agree."
The researcher, however, found one area of concern in this arena. Two questions
included within the school site principal survey inquired as to the ability of students to
utilize the site's DDDM program to monitor their own academic strengths,
weaknesses, and progress.
The responses were disappointing, since research suggests that the availability of such
DDDM academic data to students is one element of best practice
implementation and use. When asked whether "the technology employed within your
system of DDDM allowed students to monitor their own academic progress," only five
principals provided an affirmative response.
In a follow-up question, when asked whether "the use of DDDM technology as a tool to
allow students to monitor their own academic progress led to significant increases in
student achievement," only one of the 12 school site principals expressed a belief in the
affirmative.
Beyond being simply one element of these programs that is not in keeping with best
practice, the researcher views this particular aspect of the implementation and use of the
various school sites' DDDM programs as a squandered opportunity. The researcher
asserts that students could have benefited in two ways. First, students could have gained
motivation by being empowered through access to the same academic data that teachers
and administrators were utilizing through their DDDM programs. Second, this was also a
missed opportunity to provide the students at these school sites with an experience
through which they could have gained the kind of 21st century skills that educators and
stakeholders frequently profess to be valuable, and claim to support in our schools. A
significant degree of ambivalence was expressed when the 12 school site principals
were queried as to whether facilitating their students' mastery of 21st century skills was a
factor in the expansion of educational technology at their schools. In this instance, only
two of the school site principals replied that they "strongly agreed," combined with an
additional five who replied that they "agreed." On the other side of the coin, one school
site principal replied that they "disagreed," one replied that they "strongly disagreed,"
and one answered that the question was "not applicable."
In fairness, the researcher must restate that the 12 school sites, as year four
and year five program improvement schools, were facing draconian reorganization and
restructuring measures if they failed to raise student achievement and meet their school
AYP growth targets by the end of the school year. The stakes at these 12 schools were
high, and results needed to be immediate. Teachers and administrators in such a
high-stakes environment therefore understandably focused on immediate measures to
raise student achievement rather than on the mastery of 21st century skills.
The survey responses of the 12 school site principals surveyed in the course of this
research study show that the following items are perceived as key elements of the
best practice implementation, and use, of a successful DDDM program:
Leads to meaningful and fruitful collaboration between grade level staff members;
Confirms mastery of subject matter, or identifies deficiencies in it, at the grade
level, classroom level, and student level;
Allows teachers and administrators to pinpoint the need for timely, and focused,
instructional interventions at the grade level, classroom level, and student level.
The survey responses of the 12 school site principals surveyed in the course of this
research study show that the following items are perceived as key elements of the
best practice implementation, and successful use, of technology resources in general:
Technology resources should be integrated within and across the core curriculum;
Not only should such technology resources be incorporated into the core
curriculum, they should also be aligned with grade level standards and curricular
frameworks;
Rather than using technology resources simply for the sake of learning and using
technology, such resources should be utilized as a tool for teaching, and for
facilitating mastery of, the core curriculum, in pursuit of meeting grade level
standards and state curricular frameworks.
One caveat as to the best practice implementation, and use, of technology resources at
the elementary level should be highlighted. Many of the K-8 school site principals
pointed out that the use, and value, of technology resources varies significantly by
grade level. A key reason cited was the limited amount of testing performed within the
lower elementary grades, largely attributable to the fact that kindergarten and
first grade students are currently exempt from the standardized testing associated with
NCLB and the school site AYP targets. It was largely perceived that, since DDDM
programs play less of a role in the lower grades, and since lower grade students are too
young to fully utilize technology resources, the use and value of technology is less
pronounced in the lower two grade levels.
Finally, after a review of the opinions and data provided by the 12 school site
principals, no clear consensus emerged as to the best practice deployment of computers
designated for student use. The deployment of such computers at the 12 school sites
varied. The 12 school sites did have in common the fact that none of the schools had a
ubiquitous, one-to-one computer program. Likewise, none of the schools had classrooms
equipped with a sufficient number of computers for every student. The classrooms at the
12 school sites surveyed were typically equipped with a grand total of three or four
computers, linked to printers and with internet connections. This meant that
each of the school sites faced a critical decision as to how best to implement a practical
strategy for deploying an adequate number of computers to handle classroom projects or
assignments that utilized, mandated, or incorporated technology resources. Some of the
12 school sites opted for a computer lab, or even created two computer labs on campus.
In contrast, some of the schools pursued a mobile laptop cart strategy, in which
computers could simply be wheeled to the classroom, rather than dealing with the rather
herculean task of relocating two dozen or more elementary students across campus in an
orderly fashion. Some schools employed a combination of the two strategies. Which
strategy is more effective? Which strategy is more cost effective? The bottom line is
that, in a perfect world, money aside, ubiquitous computing would be the norm, and each
and every child would have their own computer. This makes the question of "best
practice" deployment of computers designated for student use less a question of best
practice and more an issue related to the perceptions of educational professionals, and
stakeholders, as to the role and value of technology within America's public schools.
Finally, an overwhelming majority of the 12 school site principals expressed a belief
that a lack of sustained funding levels, year in and year out, dictates, limits, or trumps
ideal uses of technology resources and related instructional programs.
Research Question Two
The second research question inquired: Does technology, deployed and employed in
keeping with best practice, provide school sites an effective vehicle for raising student
achievement?
Few, if any, of the 12 school site principals believed that technology, in and of itself,
could raise student achievement, and facilitate the meeting of the school site AYP growth
target. Yet, few, if any, of the 12 school site principals doubted that technology
contributed to raising student achievement, and was also a factor in the school site
meeting their AYP growth target.
The realm in which educational technology was most frequently credited as raising
student achievement was DDDM. This arena of technology resource allocation was
frequently cited as being a key factor in supporting the school site improvement plan.
The 12 school site principals saw value in dedicating technology dollars to DDDM
because the program facilitated student academic interventions, staff collaboration, and
staff development. Likewise, DDDM was credited with raising student achievement by
improving instructional strategies and practice.
The data show that no other arena of technology resource allocation was more highly
credited with helping to raise student achievement. DDDM was unanimously seen as
having facilitated student achievement, and was consistently cited by the 12 school site
principals as being a key strategy that enabled their school sites to exit from PI.
Student centered software, which was aligned with both content standards and
curricular frameworks, was also frequently cited as being a key factor in supporting many
aspects of the various school site improvement plans which led to raising student
achievement. This was said to be especially true when employed in order to provide
background knowledge, or employed to facilitate student academic interventions.
Investments in classroom level technology resources that function as instructional
media were also cited by many of the 12 school site principals as being a key factor in
supporting their school site improvement plans and raising student achievement.
Resources such as Smart-boards, sound amplification systems, and LCD projectors
were frequently mentioned.
The findings revealed a widespread belief that weekly or bi-weekly scheduling that
provided student access to computers on a class-wide basis, was a key factor in helping
the school sites to raise student achievement, and was broadly credited with assisting
their school site to exit PI.
There was, however, no consensus as to a best practice
deployment of computers designated for the use of students. Implementation strategies
therefore ranged from computer labs, to mobile laptop carts, to a combination of the two.
The findings support the premise that technology, deployed and used in keeping with
best practice, provided the school sites surveyed an effective vehicle for raising student
achievement. The data also reveal that classroom teachers at the 12 school sites
exhibited overwhelming support for educational technology, and supported the belief
that their school sites' educational technology programs had a positive effect on raising
student achievement.
Research Question Three
The third research question inquired: What levels of resource allocations are required
to establish, and sustain, an effective technology program at the school site level? The
findings show that the 12 school site principals surveyed largely agreed that the
Picus/Odden proposed dollar figure of $285.57 per student for technology was adequate.
One of the principals indicated that this was "unclear" and expressed a concern that the
very small student enrollment at their school site, fewer than 300 students, tends to skew
funding formulas. Another principal indicated a belief that "big schools" with large
enrollments would benefit more. One principal indicated a belief that this per pupil
figure was "probably" adequate for maintenance, but not for start-up costs. Only one of
the 12 principal participants indicated a belief that the Picus/Odden proposed dollar
figure was inadequate.
The findings show that not one of the 12 principal participants could articulate an
actual dollar figure, or a projection, associated with per pupil funding levels for
technology at their school site. One of the school site principals did indicate that
"projections," presumably generated by the school district, were provided to the
school site; however, no projections were provided on the research instrument.
Only one of the 12 school sites indicated that it had a formal technology budget.
Only three of the 12 principal participants indicated that, in the absence of a formal
budget, they had projected a dollar figure necessary to meet their technology needs. Only
two of the 12 principal participants stated that they could look back over the previous
school year and affix a dollar figure to the technology related aspects of their school
improvement plan. Estimates as to the dollars dedicated to school site technology
hardware and software during the course of program improvement ranged from a low of
$18,000 to the rather remarkable figure of $460,000.
Only three of the participant school sites indicated that they had established a
replacement cycle for their computers, and even those responses were qualified by the
caveat that the cycle was dependent upon the availability of funds. The findings
therefore indicate that such plans are driven, by and large, by the availability of funding
rather than by best practice.
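The scale of the Picus/Odden benchmark discussed above can be made concrete with a short calculation that compares a site's implied technology budget with a reported spending figure. The $285.57 per-pupil figure is taken from the study; the enrollment and spending values in the sketch below are invented examples, not data from the participating schools.

```python
# Picus/Odden proposed per-pupil technology funding figure (from the study).
PICUS_ODDEN_PER_PUPIL = 285.57

def benchmark_technology_budget(enrollment, reported_spending):
    """Compare a site's reported technology spending to the benchmark.

    Returns the benchmark dollar amount implied by enrollment, the
    reported spending, and any shortfall relative to the benchmark.
    """
    benchmark = PICUS_ODDEN_PER_PUPIL * enrollment
    return {
        "benchmark_dollars": round(benchmark, 2),
        "reported_dollars": reported_spending,
        "shortfall": round(max(0.0, benchmark - reported_spending), 2),
    }

# Invented example: a small school of 300 students would need roughly
# $85,671 per year to meet the benchmark; spending at the low end of the
# range reported in the study ($18,000) would fall well short of it.
result = benchmark_technology_budget(300, 18_000)
print(result)
```

Such a comparison is only as good as its inputs, which is precisely the difficulty the findings identify: most sites could not supply an actual dollar figure for their technology spending.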
Conclusions
The findings revealed a unanimous belief that DDDM was the arena of
school technology most highly utilized, and valued, as a key tool in helping
school sites to raise student achievement, reach their AYP goals, and escape from
Program Improvement status. The data show that DDDM
programs were used as a tool for the implementation of school site improvement plans.
A prime use of DDDM was seen in the identification of areas of deficiency at the grade
level, classroom level, and individual student level. Once deficiencies were
identified, DDDM was used to facilitate re-teaching and remedial instruction. Staff
member collaboration was also bolstered, as grade level, subject, or classroom needs for
additional professional development were identified and facilitated through the school
sites' DDDM programs.
This study supports the research which asserts that the best practice
implementation, and successful use, of educational technology resources at the school
site level should involve the following fundamental principles:
Technology resources should be integrated within and across the core curriculum;
Not only should such technology resources be incorporated into the core
curriculum, they should also be aligned with grade level standards and curricular
frameworks;
Rather than using technology resources simply for the sake of learning and using
technology, such resources should be utilized as a tool for teaching, and for
facilitating mastery of, the core curriculum, in pursuit of meeting grade level
standards and state curricular frameworks.
The findings revealed a widespread belief that weekly or bi-weekly scheduling that
provided student access to computers on a class-wide basis, was a key factor in helping
the school sites to raise student achievement, and was broadly credited with assisting
their school site to exit PI. There was, however, no consensus as to a best practice
implementation for the deployment of computers designated for the use of students.
Implementation strategies ranged from computer labs, to mobile laptop carts, to a
combination of the two.
The data demonstrate that few, if any, of the 12 school site principals surveyed
believed that technology, in and of itself, could raise student achievement and facilitate
the meeting of the school site AYP growth target. However, the data also show that few,
if any, doubted that technology contributed to raising student achievement and to the
school site meeting its AYP growth target. Therefore, the findings support the
hypothesis that technology, deployed and used in keeping with best practice, provided
the school sites surveyed an effective vehicle for raising student achievement. Although
support for this belief had a broader scope, support for educational technology was most
frequently, and fervently, articulated in relation to the various DDDM programs that had
been instituted at each of the 12 school sites.
The findings reveal that the range of expenses associated with the implementation
and maintenance of technology at the school site remains largely undefined and
unquantified. The data show that the 12 school site principals surveyed largely agreed
that the Picus/Odden proposed dollar figure of $285.57 per student for technology was
adequate; a single principal participant from the group of 12 indicated a belief that the
figure was inadequate. The findings, however, indicate that not one of the 12 principal
participants could articulate an actual dollar figure, or a projection, associated with per
pupil funding levels for technology at their school site. The findings show that only one
of the 12 school site principals indicated the existence of a formal technology budget,
and that only three of the 12 principal participants indicated that, in the absence of a
formal budget, they had projected a dollar figure necessary to meet their technology
needs. Surprisingly, the findings also show that only two of the principal participants
indicated that they could look back over the previous school year and affix a dollar
figure to the technology related expenditures of their school site improvement plan. In
summation, these school site level investments in educational technology programs
remain largely unbudgeted expenses, haphazardly funded, and therefore tend to remain
undocumented, even at the conclusion of the school year.
Broader Implications and Limitations
Broader implications extend beyond the sample and population of the study. At the
time of the study, there were 1,913 California K-8 schools in Program Improvement
status. Many of these schools, being faced with the same imperative to quickly raise
student achievement, can benefit from the experiences documented in this study. Indeed,
one wonders, under the current NCLB guidelines and policies, just how many schools
across the nation are facing the same dire need to raise student achievement in short
order. The researcher believes that for many of the schools which are currently
struggling to meet the mandates of NCLB, the experiences with educational technology
documented within this study could be invaluable.
The researcher believes that the broader implications of the study could be somewhat
limited in their application to schools that lie outside of the Program
Improvement environment. The 12 PI school sites surveyed were facing draconian
reorganization and restructuring measures if they failed to raise student achievement and
meet their school AYP growth targets by the end of the school year. Therefore, the
stakes at these 12 schools were very high, and results needed to be immediate. In such a
high-stakes environment, there was obviously a laser-like focus upon immediate
solutions, and therefore upon the implementation of the technology resources that held
the greatest promise to raise student achievement in short order. The researcher believes
that otherwise similar elementary schools, exempt from the pressures of
Program Improvement, may have opted for a very different approach to, or
implementation of, technology resources. That is to say, school sites that have already
achieved satisfactory levels of student achievement, as defined by the state of
California pursuant to NCLB, might implement technology in another fashion, or may
elect to direct technology resource dollars differently.
Implications
As previously stated, education is an expensive proposition; therefore, practitioners
such as principals who administer their own school site budgets are quite naturally
reluctant to invest precious resources in any programs that are unproven, or controversial, in
regard to their ability to raise student achievement. This research study is important since
it provides such practitioners with the experiences, and professional opinions, of other
practitioners who have already found success in raising student achievement with
educational technology.
Politicians, policymakers, parents, and other stakeholders will also benefit from the
research study since they will gain a sense of the circumstances under which educational
technology has proven to be effective in raising student achievement. Stakeholders will
be confronted with the current reality of the unbudgeted, haphazardly funded,
methodology commonly associated with educational technology programs. Likewise,
this study will be valuable to stakeholders if read as a cautionary tale warning that
shortsighted decisions concerning the funding of educational technology have the
potential to quickly erode, or destroy, the infrastructure of such programs and to
diminish the effectiveness of existing resources.
Recommendations
The researcher hopes that the 12 school sites that participated in this study, having
now escaped from Program Improvement status, will continue to make investments
in technology resources in the future. More to the point, the researcher hopes that these
school sites will continue to raise student achievement and will begin to entrust more
technology resources directly into the hands of their students. Investments in such
implementations of educational technology hold the promise of directly affecting
instructional practices and how students learn, and of facilitating the valuable 21st
century skills that will be of paramount importance in students' future careers and
professions. Even though facilitating a mastery of 21st century skills was not a priority
within the 12 school sites surveyed, there is no reason that, as these schools continue to
improve, they cannot move forward and continue to make progress in providing their
students with a quality, well rounded, 21st century educational experience.
The researcher believes it is important to recount the problem of inconsistent funding
that is frequently associated with educational technology programs. Educational
technology programs require stable, long-term fiscal planning in order to be effective.
A lack of such fiscal planning, or drastic, last-minute budget cuts, tends to play havoc
with the best practice implementation of technology resources. The results of such
shortsightedness are often profound and frequently lead to long term consequences.
It is still common for basic Total Cost of Ownership principles to be ignored, and for
replacement cycles to be stretched or non-existent. Erratic funding levels, from school
year to school year, at best erode, and at worst can destroy, the heart and soul of a school
site technology program. The researcher believes that it is imperative that administrators
and educational professionals, politicians and policymakers, and parents and stakeholders
come to understand that effective educational technology programs require sustained
levels of funding, from year to year.
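The Total Cost of Ownership principle discussed above can be made concrete with a simple calculation: a stable replacement cycle implies a predictable annual set-aside, rather than the erratic year-to-year funding the study describes. The sketch below is purely illustrative; the machine counts, unit costs, and cycle length are invented for demonstration.

```python
def annual_replacement_cost(computer_count, unit_cost, cycle_years):
    """Annual set-aside needed to replace every machine once per cycle.

    A site that budgets this amount each year can sustain its computer
    inventory indefinitely without last-minute funding scrambles.
    """
    if cycle_years <= 0:
        raise ValueError("cycle_years must be positive")
    return computer_count * unit_cost / cycle_years

# Invented example: 120 classroom computers at $800 each, replaced on a
# four-year cycle, require a sustained $24,000 per year.
print(annual_replacement_cost(120, 800, 4))
```

The point of the exercise is not the particular numbers but their predictability: only three of the twelve sites reported any replacement cycle at all, and even those were contingent on funds being available.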
Suggestions for Further Research
Further research into the perceptions and professional opinions of a sample of
school site principals, drawn from a population of "successful" California schools, would
be a logical progression following upon the work of this study. Sites that have been
named a California Distinguished School, or a National Blue Ribbon School, immediately
come to mind as a potential population. Such a study would be valuable, relevant, and
timely in that such a juxtaposition of perceptions and professional opinions, as to the
role and value of technology resources, holds the promise of bringing to light meaningful
philosophical and pedagogical differences. One might hypothesize that a sample of
schools which have already demonstrated high levels of student achievement, and high
AYP rankings, would begin to use technology resources in a manner more likely to
nurture their student bodies' mastery of 21st century skills.
A similar approach that would be a logical progression following upon the work of
this study would involve a juxtaposition of "successful" and "unsuccessful" school sites
through a concurrent collection of data. As such, one might undertake a study of
Program Improvement schools, to be compared and contrasted with a sample of
California Distinguished Schools. Such a study might inquire into perceptions and
professional opinions associated with the role and value of technology, with respect to
the facilitation of 21st century skills and the levels of technology resources actually in
the hands of, or at the disposal of, students.
REFERENCES
Baker, B.D., Taylor, L., & Vedlitz, A. (2005). Measuring educational adequacy in public
schools. Retrieved June 23, 2008, from Texas A&M University, Bush School of
Government & Public Service Web site: http://bush.tamu.edu/research/workingpapers/
ltaylor/measuring_edu_adequacy_in_public_schools.pdf
Blomström, M., Kokko, A., & Sjohölm, F. (2002). Growth and innovation policies for a
knowledge economy: Experiences from Finland, Sweden, and Singapore. Stockholm,
Sweden: Stockholm School of Economics. Retrieved June 27, 2008, from http://www-
1.mtk.ut.ee/doc/Ariartikkel.doc
California Department of Education. (2007, August). 2007 Adequate Yearly Progress
Report Information Guide. Retrieved June 30, 2008, from http://www.cde.ca.gov/ta/
ac/ay/documents/infoguide07.pdf
California Department of Education. (2008). Title I Program Improvement status data
files: Adequate yearly progress. Retrieved July 1, 2008, from http://www.cde.ca.gov/
ta/ac/ay/tidatafiles.asp
California Learning Resource Network (2008). Data-Driven Decision-Making and
Electronic Learning Assessment Resources (ELAR). Retrieved July 6, 2008, from
http://www.clrn.org/elar/dddm.cfm#A
Cavanagh, S. (2008). States heeding calls to strengthen STEM. Technology Counts 2008.
27(30), 10-23.
CEO Forum on Education & Technology. (1997, October). School technology readiness:
From pillars to progress. Retrieved June 28, 2008, from http://www.ceoforum.org/
reports.html
CEO Forum on Education & Technology. (2001, June). Key building blocks for student
achievement in the 21
st
century. Retrieved June 28, 2008, from http://www.ceoforum
.org/ reports.html
Consortium on School Networking. (2008a). Data-driven decision making FAQ.
Retrieved July 06, 2008, from http://3d2know.cosn.org/FAQ.html
Consortium on School Networking. (2008b). Vision to know and do: The power of data
as a tool in educational decision making. Washington, DC: Consortium for School
Networking.
Cuban, L. (1999, August 22). Don’t blame teachers for low computer use in classrooms.
Los Angeles Times. Retrieved June 28, 2008, from http://www.latimes.com
Cuban, L. (2002). Oversold and underused: Computers in the classroom. Cambridge, MA: Harvard University Press.
Cuban, L. (2004). The blackboard and the bottom line: Why schools can't be businesses. Cambridge, MA: Harvard University Press.
Cuban, L., Kirkpatrick, H., & Peck, C. (2001, Winter). High access and low use of technologies in high school classrooms: Explaining an apparent paradox. American Educational Research Journal.
Datnow, A., Park, V., & Wohlstetter, P. (2007). Achieving with data: How high-performing school systems use data to improve instruction for elementary students. Retrieved July 6, 2008, from http://www.newschools.org/files/AchievingWithData.pdf
Dede, C., Korte, S., Nelson, R., Valdez, G., & Ward, D.J. (2005). Transforming learning for the 21st century: An economic imperative. Retrieved June 28, 2008, from Harvard University, Harvard Graduate School of Education Web site: http://www.gse.harvard.edu/~dedech/Transformations.pdf
Editorial Projects in Education. (2008). Technology Counts 2008: STEM – The push to
improve science, technology, engineering, and mathematics. Retrieved June 27, 2008,
from www.edweek.org/go/tc08/
Education Week. (2008). Tracking U.S. Trends: States vary in classroom access to
computers and in policies concerning school technology. Retrieved June 27, 2008,
from http://www.edweek.org/ew/articles/2008/03/27/30trends.h27.html
Ed-Data. (2007, February). A guide to California's school finance system. Retrieved June 23, 2008, from http://www.ed-data.k12.ca.us/Navigation/fsTwoPanel.asp?Bottom=%2Fprofile%2Easp%3Flevel%3D04%26reportNumber%3D16
EdSource. (2008). Glossary. Retrieved June 30, 2008, from http://www.edsource.org/
glo.cfm
Edtechvoi.org. (2006, September). About the CoSN value of investment (VOI) initiative.
Retrieved July 04, 2008, from www.edtechvoi.org
Fullan, M. (1998). The three stories of educational reform: Inside; inside/out; outside/in.
Kappan Professional Journal. Retrieved June 28, 2008, from http://www.pdkintl.org/
kappan/kful0004.htm
Hanushek, E. & Associates. (1994). Making schools work: Improving performance and
controlling costs. Washington, D.C.: The Brookings Institution.
Institute for the Advancement of Emerging Technologies at AEL. (2008). Glossary.
Retrieved July 04, 2008, from http://129.71.174.252/tcov2/glossary.cfm
Jazzar, M. & Friedman, A. (2007). Highly effective IT leadership that promotes student
achievement. Retrieved June 28, 2008, from http://cnx.org/content/m14114/latest/
Kaestner, R. (2006, November). Value of investment in technology: Simple questions, difficult to answer. School Business Affairs, 72. Retrieved July 04, 2008, from http://www.edtechvoi.org/resources/ASBO_Nov06_SBA_Article_Investment-in-Technology.pdf
Kaestner, R. (2007, May). Gauging technology costs and benefits. The School Administrator. Retrieved July 04, 2008, from http://www.edtechvoi.org/resources/GaugingTechnology.PDF
Lefkowitz, L. (2004, March). School finance: From equity to adequacy. [policy brief].
Aurora, CO: Mid-Continent Regional Educational Laboratory.
Luntz, M. (2007). Americans talk innovation: A survey for the National Governors Association. Retrieved June 23, 2008, from http://www.nga.org/Files/pdf/WM07LUNTZ.PDF
Marshall, J.M. (2002, May). Learning with technology: Evidence that technology can,
and does, support learning. Retrieved June 23, 2008, from http://medialit.org/reading_
room/pdf/545_CIC ReportLearningwithTechnology.pdf
Mid-continent Research for Education and Learning. (2003). Sustaining school
improvement: Data driven decision making. Retrieved July 6, 2008, from http://www.
mcrel.org/PDF/LeadershipOrganizationDevelopment/5031TG_datafolio.pdf
Minorini, P.A., & Sugarman, S.D. (1998). Educational adequacy and the courts: The promise and problems of moving to a new paradigm. Retrieved September 01, 2004, from http://www.law.berkeley.edu/faculty/sugarmans/Adequacy.htm
National Conference of State Legislatures. (2004a). Adequacy and education finance.
National Center on Education Finance. Retrieved September 01, 2004, from http://
www.ncsl.org/programs/educ/ResearchReport.htm
National Conference of State Legislatures. (2004b). Education finance litigation: History, issues, and current status. National Center on Education Finance. Retrieved September 01, 2004, from http://www.ncsl.org/programs/educ/LitigationCon.htm
National Governors Association. (2007a). Building a Science, Technology, Engineering, and Math Agenda. Retrieved June 23, 2008, from http://www.nga.org/Files/pdf/0702INNOVATIONSTEM.PDF
National Governors Association. (2007b). Testimony of Dane Linn; director education
division, Center for Best Practices, National Governors Association. Retrieved June
23, 2008, from http://www.nga.org/Files/pdf/070322TESTIMONYLINN.PDF
National Governors Association. (2007c). Innovation America: a final report. Retrieved
June 23, 2008, from http://www.nga.org/Files/pdf/0702INNOVATIONSTEM.PDF
New Media Consortium. (2005). A global imperative: The report of the 21st century literacy summit. Retrieved June 28, 2008, from http://www.adobe.com/education/pdf/globalimperative.pdf
NGA Center for Best Practices. (2008). Promoting STEM education: A communications toolkit. Retrieved June 26, 2008, from http://www.nga.org/Files/pdf/0804STEMTOOLKIT.PDF
Odden, A., Fermanich, M., & Picus, L. (2003). A state of the art approach to school
finance adequacy in Kentucky. Retrieved September 01, 2004, from http://www.
schoolfunding.info/states/ky/KySEEKStudy.PDF
Odden, A.R. & Picus, L. (2001). Assessing SEEK from an adequacy perspective.
Retrieved September 01, 2004, from http://www.education.ky.gov/NR/rdonlyres/
ejwvpbd5agzvm2hc7y3a44zwhrw7n4z6qt21fnwtcos4z6xmysdj65mfesc31ekp7of
sbaup6dg5zhd3f62ybdonpe/SEEKAdqPerspective.doc
Odden, A.R., & Picus, L.O. (2008). School finance: A policy perspective (4th ed.). New York: McGraw-Hill.
Partnership for 21st Century Skills. (2007). Beyond the three Rs: Voter attitudes toward 21st century skills. Retrieved June 26, 2008, from http://www.21stcenturyskills.org/documents/P21_pollreport_singlepg.pdf
Picciano, A. (2006). Educational leadership and planning for technology (4th ed.). Upper Saddle River, NJ: Pearson.
Picus, L. O. (2000). Student-level finance data: Wave of the future? The Clearing House,
74 (2), 75-80.
Picus, L.O., Odden, A.R., & Fermanich, M. (2003). A professional judgment approach to school finance adequacy in Kentucky. Retrieved September 01, 2004, from http://www.education.ky.gov/cgi-bin/MsmGo.exe?grab_id=28377924&EXTR_ARG=&host_id=1&page_id=253&query=professional+judgment+approach&hiword=PROFESSIONAL+JUDGMENT+APPROACH+
Schrag, P. (2003). Final test: The battle for adequacy in America’s schools. New York:
New Press.
U.S. Department of Education. (2001). Public law print of PL 107-110, the No Child Left
Behind Act of 2001. Retrieved June 27, 2008, from http://www.ed.gov/policy/elsec/
leg/esea02/107-110.pdf
U.S. Department of Education. (2005). 10 facts about K-12 education funding. Retrieved
June 23, 2008, from http://www.ed.gov/about/overview/fed/10facts/10facts.pdf
U.S. Department of Education, Web-Based Education Commission. (2000). The power of the Internet for learning: Moving from promise to practice. Retrieved June 28, 2008, from http://www.ed.gov/offices/AC/WBEC/FinalReport/WBECReport.pdf
Waddoups, G.L. (2004). Technology integration, curriculum, and student achievement: A
review of scientifically based research and implications for EasyTech. Retrieved June
28, 2008, from http://www.learning.com/documents/learning.com_wp_summary.pdf
Wenglinsky, H. (1997). When money matters: How educational expenditures improve student performance and how they don't. Retrieved September 01, 2004, from http://www.ets.org
Wenglinsky, H. (1998). Does it compute? The relationship between educational
technology and student achievement in mathematics. Princeton, NJ: Policy
Information Center, Educational Testing Service.
WestEd. (2000, July). School funding: From equity to adequacy. Policy Brief. Retrieved
September 01, 2004, from http://www.wested.org/cs/we/view/rs/180
Abstract
The wisdom, expense, and relative cost-effectiveness of employing computers and other technology within California public schools, in an effort both to raise student achievement and to equip our students with 21st century skills, has been, and continues to be, a contentious issue. Researchers, educational professionals, school board members, and other stakeholders often remain at odds with one another about the presence, best practices, and effectiveness of computers, and other aspects of technology, within our schools.
Asset Metadata
Creator: Szeremeta, George Raymond (author)
Core Title: The role and value of educational technology in California fourth and fifth year (2006-2007) program improvement elementary schools that achieved AYP growth targets
School: Rossier School of Education
Degree: Doctor of Education
Degree Program: Education (Leadership)
Publication Date: 04/22/2009
Defense Date: 12/11/2008
Publisher: University of Southern California (original); University of Southern California. Libraries (digital)
Tag: AYP, Educational Technology, OAI-PMH Harvest, program improvement, school technology
Place Name: California (states)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Picus, Lawrence O. (committee chair); Hentschke, Guilbert C. (committee member); Nelson, John L. (committee member)
Creator Email: grs09@live.com, szeremet@usc.edu
Permanent Link (DOI): https://doi.org/10.25549/usctheses-m2100
Unique identifier: UC169608
Identifier: etd-szeremeta-2590 (filename), usctheses-m40 (legacy collection record id), usctheses-c127-224616 (legacy record id), usctheses-m2100 (legacy record id)
Legacy Identifier: etd-szeremeta-2590.pdf
Dmrecord: 224616
Document Type: Dissertation
Rights: Szeremeta, George Raymond
Type: texts
Source: University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Repository Name: Libraries, University of Southern California
Repository Location: Los Angeles, California
Repository Email: cisadmin@lib.usc.edu