The Effects of Technology on the Mathematical Achievement of Black and Latinx Students
Romeo A. Baldeviso
Rossier School of Education
University of Southern California
A dissertation submitted to the faculty
in partial fulfillment of the requirements for the degree of
Doctor of Education
May 2024
© Copyright by Romeo A. Baldeviso 2024
All Rights Reserved
The Committee for Romeo A. Baldeviso certifies the approval of this Dissertation
Alan G. Green
Erika A. Patall, Committee Co-chair
Adam Kho, Committee Co-chair
Rossier School of Education
University of Southern California
2024
Abstract
Utilizing technology to enhance mathematical instruction is crucial for improving academic
outcomes among Black and Latinx students in the United States, ensuring equitable access to
innovative learning tools and opportunities for achievement in STEM fields. This research
synthesis built on Hattie’s (2023) Visible Learning, specifically focusing on the effects of
technology on the mathematical achievement of Black and Latinx K–12 students in the United
States. To be included in the current synthesis, a study had to have been originally included in
the meta-analyses in Hattie’s (2023) Visible Learning, with the targeted sample being at least
40% Black and Latinx students. This meta-analysis of 50 studies, which focused on the effect of
technology on mathematics achievement among Black and Latinx students, revealed that the
pooled average effect size was positive (g = 0.523) and statistically significant. When the percentage of Black and Latinx students was analyzed as a moderator, the effect of technology was more positive for samples with a greater percentage of Black students and less positive for samples with a greater percentage of Latinx students; both moderation effects were statistically significant. Furthermore, there were no statistically significant differences in effectiveness across the types of technology interventions examined. Lastly, the percentage of students with special
education status did not statistically significantly moderate the effect of technology on
mathematics achievement.
Dedication
To José (Joe) Cipreano Ortega, 94, my beloved stepfather who passed away peacefully at his
home in Porter Ranch, California, on February 2, 2024. He was a constant source of
encouragement for me during this journey. After disappointing events, Dad would often ask his
children, “Did you do the best that you could do?” If the answer was yes, he would say, “Then,
that’s all you can ask of yourself.” We not only remember the comfort of these words but learned
to ask the same question of our spouses, partners, and children. Joe was known for being gentle
but strong, and friends of his children still speak highly of him. The members of his large family
and circle of friends knew him as a kind, humble, and grateful person who always treated us with
love and respect.
Acknowledgments
Completing this doctoral dissertation has been a journey filled with challenges, growth,
and invaluable support from numerous individuals whom I wish to acknowledge with profound
gratitude.
First and foremost, I extend my deepest appreciation to the co-chairs of my doctoral
committee, Drs. Adam Kho and Erika Patall. Your expertise, guidance, and commitment to my
academic achievement have been instrumental in shaping this research synthesis. Your insightful
feedback, scholarly mentorship, and dedication to excellence have propelled me forward, and for
that, I am immensely grateful.
I also extend my gratitude to Dr. Green, the esteemed member of my doctoral committee.
Your expertise in the field, thoughtful insights, and commitment to addressing educational
inequities have ignited my interest in this important area of study.
To my life-partner, Andrew, your unwavering love, support, and encouragement have
sustained me through the highs and lows of this doctoral journey. Your belief in me has been a
constant source of strength, and I am endlessly thankful for your presence in my life.
To my beloved family—my brothers, sister, mother, father, stepparents, and stepson—
your love, encouragement, and belief in my abilities have been the bedrock of my journey, and I
am deeply grateful for your presence in my life.
Table of Contents
Abstract.......................................................................................................................................... iv
Dedication....................................................................................................................................... v
Acknowledgments.......................................................................................................................... vi
List of Tables ................................................................................................................................. ix
List of Figures................................................................................................................................. x
Review of the Prior Literature ............................................................................................ 3
Technology and Mathematical Education .............................................................. 3
Definition of Terms and Types of Technology....................................................... 4
Theoretical Foundations for the Effects of Technology on Mathematics
Achievement ....................................................................................................................... 7
Constructivism........................................................................................................ 7
Cognitive Load Theory........................................................................................... 8
Zone of Proximal Development.............................................................................. 9
Authentic Learning and STEM Identity Formation.............................................. 11
Effects of Technology on Mathematical Achievement ........................................ 14
Factors Contributing to Variation in the Effects of Technology on
Mathematics Achievement.................................................................................... 16
Methods............................................................................................................................. 22
Literature Search................................................................................................... 22
Inclusion Criteria .................................................................................................. 26
Data Extraction ..................................................................................................... 27
Computing Effect Sizes ........................................................................................ 29
Analysis Strategy .................................................................................................. 29
Results............................................................................................................................... 30
Overall Effects for Technology Interventions ...................................................... 30
Publication Bias.................................................................................................... 31
Moderator Analysis: Percentage of Black and Latinx Students ........................... 31
Moderator Analysis: Percentage of Special Education Students.......................... 32
Moderator Analysis: Types of Technology .......................................................... 32
Discussion......................................................................................................................... 33
Summary of Key Findings.................................................................................... 33
Alignment of Findings With Theory and Prior Research ..................................... 34
Implications for Theory and Practice.................................................................... 36
Limitations............................................................................................................ 37
Recommendations for Future Research................................................................ 38
Conclusions........................................................................................................... 39
References..................................................................................................................................... 40
Tables............................................................................................................................................ 63
Appendix A: Coding Guide .......................................................................................................... 73
List of Tables
Table 1: Overall Effect of Technology Interventions 63
Table 2: Moderator Analysis, Publication Type 63
Table 3: Moderator Analysis, Percentage of Black and Latinx Students 64
Table 4: Moderator Analysis, Percentage of Special Education Students 64
Table 5: Moderator Analysis, Type of Technology 65
Table 6: Table of Studies and Characteristics Included in This Meta-Analysis 66
Appendix A: Coding Guide 73
List of Figures
Figure 1: Step 1 PRISMA Chart, Number of Meta-Analyses Retrieved 24
Figure 2: Step 2 PRISMA Chart, Number of Studies Screened and Coded 25
The Effects of Technology on the Mathematical Achievement of Black and Latinx Students
The United States currently faces a shortage of science, technology, engineering, and
mathematics (STEM) majors and graduates (Granovskiy, 2018; National Science Board, 2016).
At the same time, STEM occupations are expected to grow (U.S. Bureau of Labor Statistics,
In 2018, the U.S. Bureau of Labor Statistics estimated that close to 10,000 STEM jobs were unfilled. This twofold challenge requires STEM education in the
United States to be a top priority. According to the National Research Council (2011), this
priority must include expanding minorities’ participation in STEM and increasing STEM literacy
for all students. As the United States rapidly approaches a “majority-minority” tipping point
where non-Hispanic Whites will make up less than half of the U.S. population, projected by 2044
(Colby & Ortman, 2015), the need to address the “low participation, representation, engagement,
and inclusion in engineering and related STEM fields among underrepresented students” is
critical, because to do so will “enrich the intellectual capacity of the U.S. STEM workforce”
(Long & Mejia, 2016, p. 216).
The rapid advancement of technology in the 21st century has brought about significant
changes to the educational landscape, providing both new opportunities and challenges for
students and educators alike. As technology continues to permeate educational settings, a
growing need exists to understand its impact on students’ academic achievement, particularly in
underserved communities such as populations of students of color.
The National Council of Teachers of Mathematics (2000) has identified the importance of
investigating the potential benefits of technology on mathematics education as a critical focus for
student success and workforce readiness in the STEM fields. Past research has shown that Black
and Latinx students, as a group, have consistently underperformed compared with their White
and Asian counterparts in mathematical achievement, as demonstrated by standardized test
scores, course completion rates, and college enrollment (Lee et al., 2020). This has led to an
underrepresentation of Black and Latinx individuals, who comprise collectively only 13% of the
STEM workforce and 16% of all STEM undergraduate degree recipients (Rivers, 2017). This
achievement gap is rooted in a complex interrelationship of social, economic, and educational
factors, which has persisted for decades and presents a critical challenge to educational equity in
the United States (Ladson-Billings, 2006).
This research synthesis is based on existing research included in Hattie’s (2023) Visible
Learning synthesis of meta-analyses focused on the influence of technology on mathematics
achievement. Understanding how to reduce disparities in STEM achievement requires an
understanding of how technology influences the mathematical achievement of different groups in
isolation. Hattie (2023) acknowledged that the use of technology is effective for all students.
Across the more than 260 meta-analyses, the effects are more similar across all students than they are differentiated by student background. For example, the effects are similar across
preschool (d = 0.54), elementary (d = 0.44), and high schools (d = 0.30), as well as in math (d =
0.41), science (d = 0.33), language (d = 0.55), reading/literacy (d = 0.35), and writing (d = 0.42;
Hattie, 2023). Building on Hattie’s findings, this research synthesis aimed to provide a
comprehensive understanding of the relationship between technology interventions and
mathematical achievement specifically for Black and Latinx students. This research synthesis
also sought to better understand the extent to which technology can serve as a powerful tool to
address existing disparities and foster educational equity by examining the effects of technology
on the mathematical achievement of underrepresented minorities, and the types of technology
interventions that are more effective in raising mathematical achievement among Black and
Latinx students.
In the following sections, I begin by discussing the extant literature on the wide variety
and different types of technologies used in mathematics instruction to identify their impact on
Black and Latinx students’ mathematical achievement. Next, I provide an overview of the
theoretical frameworks underpinning the effects of technology on mathematics achievement. I
then review the empirical research and moderating features of the effects of technology on
student achievement, with special attention to the importance of and dependence on the learning
environment in which the technology is applied. Last, I describe the current synthesis, with a
further explanation of the methods I used for the empirical meta-analysis, including the process
for identifying literature, the criteria for inclusion in the study, the plan for data extraction, and
the analysis strategy.
Review of the Prior Literature
The use of technology in mathematics education has evolved over time, with various
tools and software being developed to enhance teaching and learning. These tools range from
simple calculators to more sophisticated computer-based programs such as GeoGebra, Desmos,
and Wolfram Alpha. The integration of technology in mathematics education has been
influenced by several factors, including the need to improve students’ mathematical
achievement, the changing nature of mathematics, and technological advancement (National
Council of Teachers of Mathematics, 2000).
Technology and Mathematical Education
Various technological tools and approaches have been used to support different aspects of
mathematics education. Technology enables educators to create and deliver content through online platforms, allowing students to access information and learning materials more flexibly and at their
own pace (Means, 2010). Interactive software, online quizzes, and other digital tools provide
students with opportunities to practice mathematical concepts and skills in a more engaging and
personalized manner. These tools often provide immediate feedback, enabling learners to
identify their errors and make corrections (Roschelle et al., 2010). Adaptive learning systems
tailor instruction to individual learners based on their unique needs and learning preferences.
This helps to ensure that students receive the appropriate level of support and challenge,
enhancing their overall learning experience (Woolf, 2010). Technology can also foster
collaboration and communication by providing online forums, chat rooms, and other virtual
spaces where students can discuss and solve mathematical problems.
Definition of Terms and Types of Technology
The term technology used here refers to computer technology (CT), which is computer
software rather than computer hardware. Four-function calculators and graphing calculators are
not included in this study because previous research has already systematically reviewed them. For example, Hembree and Dessart’s (1986) meta-analysis examined the effects of K–12
students’ calculator use. Focusing on students’ achievement and attitude, the researchers
analyzed 79 primary studies and concluded that the use of calculators increases student
achievement and confidence levels. Smith et al. (1997) conducted another meta-analysis that
supported and extended those results.
In the past several decades, different types of technology, ranging from computer-assisted
instruction to the Internet, have been developed and applied to enhance mathematics teaching
and learning. Researchers (Lou et al., 2001; Means, 1994) have classified various types of
computer technology into five main categories: (a) tutorial, (b) communication media, (c)
exploratory environment, (d) software tools, and (e) programming.
Tutorial describes technology programs that directly teach mathematics by setting up an
interactive environment where information, demonstration, and drill and practice are provided to
students (Lou et al., 2001). This type of technology includes, but is not limited to, computer-assisted instruction, mathematics games such as Math Blaster, and drill and practice software
such as A+Math, Math Facts in a Flash, Maple 13, and Math Realm (Ash, 2005; Clark, 2005;
Din & Caleo, 2000; Kalyuga & Sweller, 2005; Martindale et al., 2005; Phillips, 2001; Quinn &
Quinn, 2001; Schpilberg & Hubschman, 2003; Smith, 2002; Soeder, 2001; Zumwalt, 2001).
Communication media refers to tools or platforms that facilitate the exchange of
information, ideas, and knowledge. In the context of mathematical instruction, these technologies
can be used to enhance collaboration, problem solving, and learning among students and
teachers. For example, online forums and discussion boards allow students and teachers to
discuss mathematical concepts, share resources, and collaborate on problem solving. Another
example of communication media is learning management systems (LMS), which enable
teachers to share mathematical resources, facilitate discussions, and assess student progress.
LMS examples include Blackboard, Schoology, Canvas, and Moodle (Clark & Mayer, 2023).
Video conferencing tools, such as Zoom, Microsoft Teams, and Google Meet, are other
examples of communication media tools that allow students and teachers to interact in real time
through video and audio communication, fostering collaboration and problem solving (Oztok et
al., 2014).
Exploratory environments seek to encourage active learning through discovery and
exploration (Lou et al., 2001). Logo, simulations, and hypermedia-based learning programs are
examples of this type of technology (Berryman, 1999; Connell, 1997; Forde, 2003; Funkhouser,
2002; Shyu, 2000). Dynamic geometry software, such as GeoGebra and Cabri Geometry II,
facilitates the exploration of geometric concepts through interactive and dynamic visual
representations (Hohenwarter & Fuchs, 2004). An example of a hypermedia-based learning
program in mathematics is the “Adventures of Jasper Woodbury,” which uses video and
multimedia to provide problem scenarios that help students enhance their problem-solving and
critical-thinking skills (McKibbin, 2010).
Software tools serve the technological purpose of making teaching and learning fun,
effective, and efficient (Lou et al., 2001). Word processors, PowerPoint, spreadsheets,
Geometer’s Sketchpad, Cabri Jr., data analysis software, and various virtual manipulatives are
some examples of this type of technology (Carter & Smith, 2001; Clariana, 1996; Iskander &
Curtis, 2005; Lewis, 2004; Ling, 2004; Olkun, 2003; Page, 2002; Reimer & Moyer, 2005;
Ysseldyke & Tardrew, 2007). Included in this category is instructional management software,
which is software used for instructional purposes rather than solely for assessment purposes. For
example, Ysseldyke and his colleagues have studied the use of Accelerated Math, which allows
teachers to match instruction to an individual student’s skill level, provide appropriate practice,
monitor student progress, and give corrective feedback (Ysseldyke & Tardrew, 2007).
The use of programming (also referred to as coding) in mathematics instruction can help
students develop computational thinking skills and enhance problem-solving abilities. For
example, the Scratch programming language, developed by the MIT Media Lab, is a block-based visual
programming language designed for students to create interactive stories, games, and animations
(Resnick et al., 2009). Another example is the Wolfram Language, used in Mathematica and
Wolfram Alpha, which combines symbolic and numerical computation with programming
capabilities, making it a powerful tool for exploring and visualizing mathematical concepts
(Torrence & Torrence, 2019; Wolfram, 2013). Another popular example is Logo, which is a
language designed to teach programming concepts and geometry through the manipulation of on-screen “turtles” that draw shapes and patterns (Clements et al., 2001).
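The Logo “turtle” idea can be illustrated with Python’s standard turtle module, which imitates Logo-style turtle graphics. The short sketch below is only an illustration of the kind of geometry-through-programming activity described above (the exterior angle of a regular polygon is 360/n degrees); it is not drawn from any study in this synthesis.

    import turtle

    def draw_regular_polygon(n_sides, side_length=80):
        """Draw a regular polygon; the exterior angle at each vertex is 360/n degrees."""
        t = turtle.Turtle()
        for _ in range(n_sides):
            t.forward(side_length)      # draw one side
            t.right(360 / n_sides)      # turn through the exterior angle
        turtle.done()

    draw_regular_polygon(5)  # a regular pentagon; changing n_sides changes the turning angle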
Theoretical Foundations for the Effects of Technology on Mathematics Achievement
This review discusses the theoretical foundations of using technology to increase
students’ achievement in mathematics. Key theories and approaches include constructivism,
cognitive load theory, the zone of proximal development, and authentic learning and positive
STEM identity formation.
Constructivism
Constructivism, a learning theory developed by Piaget (2005) and Vygotsky (1978),
asserts that learners construct their own knowledge based on their experiences and interactions
with their environment. In the context of mathematical education, constructivism suggests that
students should be actively engaged in exploring mathematical concepts and solving problems,
rather than simply memorizing formulas and procedures. Constructivism involves students’
interpretation of knowledge and understanding from the experiences they encounter as active
learners (Slavin, 2000). Independent thinking can increase students’ understanding by allowing
them to reach beyond the goal of obtaining answers. Promotion of independent thinking includes
invented algorithms, solving one’s own problems, and question-asking. Invented ideas encourage
problem solving through student engagement (Warrington & Kamii, 1998). When integrated into
mathematics education, technology can provide interactive and dynamic learning environments
that promote the construction of knowledge and facilitate problem solving (Papert, 1980).
Several studies have demonstrated the effectiveness of technology in promoting constructivist
learning in mathematics (e.g., Hohenwarter & Fuchs, 2004; Taber et al., 2017). Examples
include virtual manipulatives and dynamic geometry software, such as GeoGebra, which have
been found to enhance students’ understanding of mathematical concepts and problem-solving
skills (Hohenwarter & Fuchs, 2004).
Another example of constructivism in the use of technology to improve mathematical
achievement is the use of interactive simulations. Simulations, such as those available on PhET
Interactive Simulations, provide students with a virtual laboratory for exploring mathematical
concepts. For example, students can use the simulation “Balancing Act” to explore the concept
of torque, or “Function Builder” to investigate functions and their graphs. In a study by Taber et
al. (2017), high school students who used PhET simulations to learn about electricity showed
significant gains in their conceptual understanding of the topic compared with a control group.
Cognitive Load Theory
Cognitive load theory, proposed by Sweller (1988), suggests that learning is most
effective when the cognitive load imposed on the learner’s working memory is minimized.
Cognitive load refers to the mental effort and resources required for information processing and
problem-solving tasks (Sweller, 1988). Cognitive load theory posits that individuals have limited
working memory capacity, which is responsible for actively processing and manipulating
information. Working memory acts as a temporary storage system that allows one to hold and
manipulate a small amount of information at a time. When cognitive load imposed by a task
exceeds the capacity of working memory, it can lead to difficulties in learning, comprehension,
and problem solving. Optimizing cognitive load is crucial to facilitating effective learning and
performance. When cognitive load is too high, learners may experience cognitive overload,
resulting in reduced understanding, increased errors, and diminished performance. Managing and minimizing cognitive load enhances learning outcomes and makes cognitive processing more efficient (Clark et al., 2011; Paas et al., 2003; Sweller, 2002).
Technology can help to manage cognitive load in mathematics education by reducing
extraneous cognitive load and promoting germane cognitive load. This can be achieved through
the use of visual representations, worked examples, and adaptive feedback (Paas et al., 2003).
Visual representations help students understand mathematical concepts by providing a visual and
interactive way to explore abstract ideas. For example, the app Desmos allows students to create
and manipulate graphs, which can help them understand relationships between variables.
Worked examples are step-by-step solutions to mathematical problems that can help students
understand problem-solving strategies. The app Mathspace provides students with interactive
worked examples that adapt to their individual learning needs. If a student is struggling with a
particular step in a problem, the app can provide additional hints or explanations. Adaptive
feedback helps students understand their mistakes and guides them toward better problem-solving
strategies. An app called ASSISTments provides students with adaptive feedback based on their
responses to homework problems. If a student makes a common mistake, the app can provide
targeted feedback to help them understand why the mistake was made and how to avoid it.
Several studies have shown the effectiveness of using technology in reducing cognitive load in
mathematical problem solving (e.g., Ayres & Sweller, 2005; Kalyuga, 2007).
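To make the adaptive-feedback idea concrete, the sketch below shows a simple rule-based feedback function of the kind described above. It is a hypothetical illustration only: the problem, the misconception table, and the messages are invented for this example and do not represent the actual ASSISTments or Mathspace implementations.

    def adaptive_feedback(problem, student_answer):
        """Return targeted feedback when an answer matches a known misconception."""
        if student_answer == problem["correct_answer"]:
            return "Correct. Nice work!"
        # A wrong answer that matches a known misconception gets a targeted hint.
        hint = problem["misconceptions"].get(student_answer)
        if hint is not None:
            return hint
        # Otherwise fall back to a generic prompt.
        return "Not quite. Re-read the problem and check each step."

    # Hypothetical item: 2/3 + 1/4, where a common mistake is adding numerators and denominators.
    fraction_item = {
        "correct_answer": "11/12",
        "misconceptions": {
            "3/7": "It looks like you added the numerators and the denominators. "
                   "Find a common denominator first.",
        },
    }

    print(adaptive_feedback(fraction_item, "3/7"))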
Zone of Proximal Development
The zone of proximal development (ZPD) was introduced by the Soviet psychologist Lev
Vygotsky (1896–1934) in the early 20th century. It refers to the difference between what a
learner can do independently and what they can do with the guidance and support of a more
knowledgeable other, such as a teacher or peer (Vygotsky, 1978). According to Vygotsky,
learning and development occur through social interaction and collaboration. The ZPD is the
range of tasks or skills that a learner is not yet able to accomplish independently but can achieve
with appropriate guidance or assistance. It encompasses the skills and knowledge that are within
the learner’s reach but still require support to master. The role of a more knowledgeable “other,”
such as a teacher, tutor, peer, or technology, is crucial in facilitating learning within the ZPD.
This knowledgeable “other” provides scaffolding, which involves tailoring instruction to the
learner’s current abilities and gradually reducing support as the learner becomes more proficient.
Through this process, learners can internalize and master new skills or concepts that were
initially beyond their independent capabilities (Tharp & Gallimore, 1991; Vygotsky, 1978;
Wertsch, 1991; Wood et al., 1976).
Technology can serve as a means to provide individualized scaffolding and guidance for
students, enabling them to work within their ZPD. Adaptive learning technologies and intelligent
tutoring systems can diagnose a student’s current level of understanding and provide targeted
instruction or feedback to help them progress. For example, researchers have found that
technology-enhanced scaffolding can be effective in improving students’ mathematical
achievement (Roschelle et al., 2000). Technology can facilitate the process of providing tailored
feedback, hints, or explanations in real time, allowing students to work through problems at their
own pace while still receiving the necessary support. Technology can also support educators with
valuable insights into each student’s learning process, enabling them to adjust their instructional
strategies to support students within their ZPD. Data collected from digital learning platforms
can help teachers identify areas where individual students may be struggling and modify their
instruction accordingly (Baker & Siemens, 2014). Kusmaryono and Kusumaningsih (2021)
acknowledged that mathematics is difficult for students to learn and argued that when students’
levels of cognitive development are still at a concrete stage, the abstract nature of mathematical
objects may cause learning difficulties. Every child has a zone of proximal development, and
they have the potential to develop toward optimal cognitive development as long as they are
directed and assisted by teachers, parents, and the use of technology (Kusmaryono &
Kusumaningsih, 2021).
Authentic Learning and STEM Identity Formation
STEM identity is defined as the way individuals make meaning of science experiences
and how society structures possible meanings (Carlone & Johnson, 2007; Hughes et al., 2013).
People who develop STEM identities are described as those who “think about themselves as
science learners and develop an identity as someone who knows about, uses, and sometimes
contributes to science” (Roberts et al., 2018, p. 3). STEM identity has been shown to have a
powerful role in an individual’s success in STEM educational environments, as well as on their
career goals and trajectories (Chemers et al., 2011; Perez et al., 2014; Simpson & Bouhafa,
2020). Historically, however, STEM identity formation for Black and Latinx students has been
hindered by the lack of representation in STEM fields, which predominantly consists of White
males (Alegria & Branch, 2015; Bernard & Cooperdock, 2018).
To better foster learning and living environments that help students develop positive
STEM identities, Singer et al. (2020) identified three important contributors to STEM identity
formation: (a) teaching for diversity and inclusion through exposure to role models (Johnson,
2012), (b) an individual’s sense of belonging to an educational institution and to the STEM
field(s) (Maton et al., 2016; Rainey et al., 2018), and (c) authentic learning experiences (Chi et
al., 2015; Wallace & Bodzin, 2017). With the emergence of a new set of technology tools,
students are better equipped to engage in more authentic learning experiences based on
experimentation and action.
Authentic learning experiences are those developed to align better with the way learning
is achieved in real-world environments (Herrington et al., 2014). Herrington and Oliver (2000)
suggested that authentic learning experiences should be designed around characteristics that
focus on real-world relevance over a long period of time (Bransford et al., 2012; Reeves &
Reeves, 1997); exposure to and collaboration with multiple roles and perspectives; reflection;
articulation of knowledge and learning, particularly through public presentation (Edelson, 1996);
scaffolding; and authentic assessment. These experiences focus on solving real-world problems
through role-playing, participation in virtual or face-to-face communities of practice, and case
studies. Authentic learning intentionally introduces multiple perspectives and interdisciplinarity
(Lombardi & Oblinger, 2007).
Through the use of the Internet and a variety of communication, visualization, and
simulation technologies, students are able to engage in authentic learning by reconstructing the
past, observing phenomena using remote instruments, and making valuable connections with
mentors around the world. With access to online research communities, learners are able to gain
a deeper sense of a discipline as a special “culture” shaped by specific ways of seeing and
interpreting the world. Learning becomes as much social as cognitive, as much concrete as
abstract, and becomes intertwined with judgment and exploration (Seely Brown, 1999).
For example, Singer et al. (2020) designed a course to introduce ideas of diversity and
inclusion in STEM, through various activities, as well as a semester-long authentic learning
experience class. The treatment class was more ethnically diverse, with approximately 40% of
the students identifying as non-White. In both classes, between 55% and 60% of the students identified as female (Singer et al., 2020). The treatment course created an authentic learning
experience via a semester-long collaboration between students and the director of the Abrams
Planetarium. The course was designed to focus on technology skill development and student-led
research experiences that mirrored real life, with a particular emphasis on diversity in science
and the space program in particular. Students worked in learning teams of four to five students to
develop their initial ideas and then pitched their proposals as an Adobe Flash presentation. The
final products were given in digital form to the planetarium to incorporate into its public displays
and/or for future youth programming.
To assess the effectiveness of the course, Singer et al. (2020) developed a STEM identity
survey that asked a series of short-answer questions throughout the semester. They also
administered the STEM identity survey to a control group that did not have a focus on diversity
and inclusion or an authentic learning project. Their results suggested that the focus on diversity
and the authentic learning project administered in the treatment group helped contribute to a
stronger STEM identity compared with the control group. Students in the treatment group
consistently showed more change in their STEM identities, with particular increases around their
identities as not only scientists, but scientists for whom their genders and ethnicities are an
important part of who they are (Singer et al., 2020).
Authentic learning can help students better judge the reliability of information, develop
and follow longer arguments, identify patterns, and become more flexible and dynamic
collaborators across disciplines and cultural boundaries (Lombardi & Oblinger, 2007). Studies
have also shown that authentic learning experiences may contribute to increased empathy
(Donnelly et al., 2019), increased understanding of the theoretical underpinnings of a field
(Smith et al., 2015), and the development of stronger STEM identities (Anthony et al., 2017;
Martin-Hansen, 2018; Mraz-Craig et al., 2018). Authentic learning experiences also contribute to
increased engagement and self-efficacy measures, particularly for Black and Latinx students
(Chemers et al., 2011). Tipton et al. (1996) further suggested that authentic learning experiences
using technology can provide educators with tools to address equity issues among Black and
Latinx students, improve student achievement, and provide support for students who learn in
different ways. In summary, technology interventions that utilize authentic
learning, such as programming, foster real-world problem solving with projects that put
mathematical concepts in context and allow for deeper understanding and retention (Billings,
1986; Emihovich & Miller, 1988; Lehrer & Randle, 1987; Yusuf, 1991, 1995).
Effects of Technology on Mathematical Achievement
Technology has been gaining acceptance as one set of instructional tools that can be used
effectively in schools (Liu, 2008). Compared with traditional teaching methods such as lectures
and discussions, technology-enhanced instruction offers the advantage of adapting materials to
each student’s needs (Gu et al., 2010). The National Council of Teachers of Mathematics (2000)
emphasized the importance of the use of technology in mathematics education, stating that
“technology is essential in teaching and learning mathematics; it influences the mathematics that
is taught and enhances students’ learning” (p. 11).
Although technology has considerable potential to influence mathematics achievement,
the presence of technology alone does not automatically produce desirable learning outcomes in
mathematics education (Clark, 1983; Li, 2004). Successful and effective use of technology for
the teaching and learning of mathematics depends on proven teaching and learning strategies that
come from a thorough understanding of the effects of technology on mathematics education
(Albright & Graf, 1992; Coley et al., 1997).
The increasingly widespread use of technology in mathematics education has created a
flurry of research studies focusing on technology’s success and effectiveness in elementary and
secondary education. Exploratory environments incorporating multimedia and the Internet have
been found to improve student engagement and motivation (Sokolowski et al., 2015). Dynamic
simulation environments, such as GeoGebra and Cabri Geometry, have been shown to enhance
students’ conceptual understanding and engagement with mathematical concepts (Hohenwarter
& Fuchs, 2004). Visualization environments such as Desmos help students better grasp abstract
concepts by providing interactive visual representations (Ebert, 2014). Simulation and
visualization environments offer additional opportunities for students to explore mathematical
ideas actively, further promoting engagement (Kohen et al., 2022).
By using technology as a tool to communicate and enhance learning, teachers are able to
create learning environments that work alongside traditional teaching methods to support student
learning (Wiske et al., 2005, as cited in Eyyam & Yaratan, 2014). The use of technology in mathematics has shown small to moderate effects on students’ mathematical achievement. An early study, Hartley’s (1977) dissertation, was the first meta-analysis
specifically looking at mathematical achievement of elementary and secondary students and
found that technology-based instruction raised student achievement by 0.4 standard deviation, or
from the 50th percentile to the 66th percentile (Hartley, 1977). Burns and Bozeman (1981) later
examined 40 primary studies focusing on mathematical achievement and concluded with similar
findings. Other extensive reviews of technology-based mathematics instruction (Kulik et al.,
1983; Niemiec & Walberg, 1985) each included a separate analysis for mathematical
achievement and also reported improved student learning. Research has also revealed that
technology can be a powerful tool when used to promote critical, analytic, and higher-order
thinking skills; provide drill and practice; and engage students in real-world problem solving
(Bitter & Pierson, 2001; Cemal Nat et al., 2011; Wiske et al., 2005).
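Hartley’s (1977) translation of an effect size into a percentile gain can be checked against the standard normal distribution: a student at the treatment-group mean, 0.4 standard deviations above the control-group mean, sits at roughly the 66th percentile of the control distribution. A minimal check, assuming normally distributed scores:

    from scipy.stats import norm

    effect_size = 0.4  # technology-based instruction effect reported by Hartley (1977)
    percentile = norm.cdf(effect_size) * 100
    print(f"about the {percentile:.0f}th percentile")  # about 66, i.e., a move from the 50th to the 66th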
Software tools incorporating multimedia and the Internet, such as virtual manipulatives,
are being used extensively to help build a mathematical foundation for students to understand
abstract concepts. Virtual manipulatives are usually replicas of real manipulatives and are
accessed via the Internet. A variety of studies have examined virtual manipulative tools in
mathematics classrooms and found positive impacts on student achievement and attitudes toward
mathematics (Char, 1989; Kaput, 2000; Kieran & Hillel, 1990). An advantage of virtual
manipulatives, according to Reimer and Moyer (2005), is the capability of connecting dynamic
visual images with abstract symbols, a limitation of regular manipulatives. For example, virtual
manipulatives could be effectively used to teach fraction concepts to elementary students (Suh et
al., 2005). Others have found that junior high school students benefited from using virtual pattern blocks, Platonic solids, and geoboards to explore geometric concepts (Reimer & Moyer, 2005).
Incorporating technology into mathematics instruction provides a variation in modes of
representation and delivery formats such as computer-assisted instruction, software, and video
anchors to promote engagement, improve student motivation and attitudes, and improve
mathematics achievement (Barron et al., 2002; Mulcahy et al., 2014). Educators now believe that
using a variety of technology interventions in the classroom is necessary and can improve
mathematics achievement and motivation (Eyyam & Yaratan, 2014).
Factors Contributing to Variation in the Effects of Technology on Mathematics
Achievement
This section addresses the factors contributing to the variation in the effects of
technology on mathematics achievement and specific areas examined in this meta-analysis.
Special Education Status
Mathematics competence is essential for all students, but a large number of students in
the United States demonstrate poor mathematics achievement (Swanson, 2006). As with
students’ reading disabilities, when math disabilities are present, they range from mild to severe.
We know that students with learning disabilities (LD) often experience a wide range of
mathematics difficulties ranging from basic skills, such as computation, to higher-order skills,
such as problem solving (Bryant et al., 2000; Geary, 2003). There is also evidence that children
manifest different types of disabilities in mathematics, including difficulty mastering basic number facts, arithmetic weakness, trouble with written symbols and concrete materials, difficulty mastering the language of math, and challenges with the visual-spatial aspects of math (Garnett, 1998). The
effect of technology on the mathematical achievement of students with learning disabilities in
mathematics will vary according to the range and type of mathematical difficulties experienced
by these students.
In addition to the wide range and specific types of mathematical difficulties experienced
by LD students, it is also important to highlight that students of color are diagnosed at a higher
rate and are overrepresented in special education. School systems developed “solutions” in an
effort to help struggling Black and Latinx students become successful in schools. Instead of
looking at cultural differences or inequality of resources in schools, the idea of using special
education placements arose. This practice continued through time and has led to the growing
problem of overrepresentation of minorities in special education programs (Kreskow, 2013). As
Ferri and Connor (2005) pointed out, what is most troubling is that special education, although
conceived as a way to provide support and access for previously excluded students,
“paradoxically participates in maintaining rather than minimizing inequities” (Ferri & Connor,
2005, p. 94).
Technology can be used separately or as a supplementary tool when combined with
traditional teacher-directed instruction (Devisir & Kalaimathi, 2016) and is commonly delivered
in modes such as drill and practice, tutorials, and simulations (Bhalla, 2013). Findings from
previous studies have indicated that technology can be a valuable supplementary teaching
method for students who are struggling to learn mathematics, including students with LD. For
example, technology provides students with additional practice opportunities with immediate
feedback (Bouck & Flanagan, 2009), helps them to develop a more positive attitude toward
learning mathematics (Parkhurst et al., 2010), and allows teachers ways to tailor instruction to
meet individual student needs (Slavin & Lake, 2008).
Technology affects mathematical achievement differently for students with LD relative to students without LD because of the wide range and different types of mathematical difficulties
experienced by students with LD. For example, an extremely handicapping math disability
derives from significant visual-spatial-motor disorganization and results in weak or absent understanding of math concepts, very poor number sense, and difficulty with pictorial
representations of numerals and signs on the page. For such students to develop an understanding
of math concepts, technology can be used to support repeated use of concrete teaching materials
(e.g., visual manipulatives). And because understanding visual relationships and organization is
difficult for these students, it is important to anchor visual constructions in repeated experiences
with structured materials that can be seen and moved around as they are talked about, such as in
the use of tutorials and exploratory environments (Garnett, 1998). Consistent with the hypothesis
that technology has a positive effect on special education students, Langone (1998) also studied
the effects of technology-based anchored instruction and found that the quality of examples used
by special education students to support their answers on essay tests improved. Special education
students were also able to maintain visual images over time and better understand the
instructional strategies learned during the course (Langone, 1998).
Types of Technology
Another factor that may contribute to the variation of the effects of technology on
mathematical achievement is the type of technology used in the intervention. As described
already, researchers (Lou et al., 2001; Means, 1994) have classified educational technology
interventions into five main categories: (a) tutorial, (b) communication media, (c) exploratory
environments, (d) software tools, and (e) programming.
The effects of technology on mathematical achievement may depend on the learning
environment to which the technology is applied. Some researchers have attributed student
academic success and attitudinal change to not only the use of technology itself but also the
embedded method of teaching developed by pedagogical reform (Toh, 2016). Two distinct
pedagogical approaches have been cited most frequently in the research studies: traditional and
constructivist teaching (Li & Ma, 2010). Li and Ma (2010) defined the traditional approach of
teaching as teacher-centered whole-class instruction and the constructivist approach as student-centered instruction that emphasizes strategies such as exploratory-based (inquiry-oriented)
learning, problem-based (application-oriented) learning, and situated cognition based on
constructivism. Types of technology that promote and incorporate this constructivist approach,
such as exploratory environments, software tools, and programming, will be more effective in
advancing mathematical achievement. Furthermore, software tools that work to reduce
extraneous cognitive load and promote germane cognitive load (Sweller, 1988) will also be more
effective in increasing mathematical achievement. Examples include virtual manipulatives,
worked examples, and adaptive feedback (Paas et al., 2003). Finally, types of technology that incorporate the role of a knowledgeable “other,” such as tutorials and communication media, can facilitate learning within the zone of proximal development (Vygotsky, 1978) and boost mathematical achievement.
Percentage of Black and Latinx Students
The percentage of Black and Latinx students was also used as a moderator. Because this study used the criterion that at least 40% of the sample population had to be Black, Latinx, or a combination of both, this moderator is important to analyze. For example, learning strategies,
such as authentic learning, not only promote positive STEM identity but can also contribute to
increased engagement and self-efficacy measures, particularly for Black and Latinx students
(Chemers et al., 2011). Tipton et al. (1996) further suggested that authentic learning experiences
using technology can provide educators with the tools needed to address equity issues among
Black and Latinx students, improve student achievement, and provide support for students who learn in different ways. In this meta-analysis, I hypothesized that the positive impact of technology interventions on the mathematical achievement of Black and Latinx students would increase as their percentage of the sample increased, because these students would benefit from the active
engagement and real-world problem solving inherent in technology applications such as
programming and tutorials.
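One common way to test such a moderator hypothesis is a meta-regression in which each study’s effect size is regressed on the percentage of Black and Latinx students in its sample, with inverse-variance weights. The sketch below, using statsmodels’ weighted least squares, is only an illustration of that general approach with invented inputs; it is not the analysis strategy described in the Methods section, and a full random-effects meta-regression would also add a between-studies variance component to the weights.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical inputs: Hedges' g, its variance, and the percentage of Black and
    # Latinx students for each study (values invented for illustration only).
    g = np.array([0.30, 0.55, 0.70, 0.45])
    var_g = np.array([0.02, 0.03, 0.05, 0.04])
    pct_black_latinx = np.array([45.0, 60.0, 85.0, 100.0])

    # Inverse-variance weighted least squares approximates a fixed-effect meta-regression.
    X = sm.add_constant(pct_black_latinx)
    model = sm.WLS(g, X, weights=1.0 / var_g).fit()
    print(model.params)   # intercept and slope: change in g per percentage point
    print(model.pvalues)  # significance test for the moderator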
The Current Synthesis
A significant theme from the 70 meta-analyses done by Hattie (2023) on the effects of technology on mathematics achievement is that explicit and intentional high-impact teaching methods utilizing effective learning strategies dominate the most effective interventions. Hattie (2023) argued that it is less the technology programs and textbooks, and more the effective integration of technology into daily teaching practices and student interactions, that has the larger impact on students’ achievement.
This research synthesis was based on existing research included in Hattie’s (2023) Visible
Learning. By aggregating and analyzing the findings of multiple empirical studies, this research
synthesis aimed to provide a comprehensive understanding of the relationship between
technology interventions and mathematical achievement for Black and Latinx students. This
meta-analysis extends the work of prior researchers by focusing on the effects of technology on
Black and Latinx students in the United States. The research questions this meta-analysis
addressed were as follows:
1. To what extent does the use of technology affect the mathematical achievement of
Black and Latinx students in the United States?
2. To what extent does the effect of technology on mathematical achievement among
Black and Latinx students vary depending on the type of technology-based
interventions used (tutorial, communication media, exploratory environments,
software tools, or programming)?
3. To what extent does the effect of technology on mathematical achievement vary
across special education status among Black and Latinx students?
4. To what extent does the percentage of Black and Latinx students moderate the effects
of technology on their mathematical achievement?
I hypothesized that racially diverse students’ mathematical achievement would be positively
influenced by the use of technology because it allows for the utilization of effective learning and teaching strategies. I also hypothesized that the effect of technology among students identified as having learning disabilities or receiving special education would be positive because technology allows students to be appropriately challenged while receiving necessary support. I
also hypothesized that technology would have a positive effect on mathematics achievement as
the percentage of Black and Latinx students in the sample increased. And finally, I hypothesized that programming and tutorials would be more effective than the other types of technology, again because of the
opportunities to utilize learning and teaching strategies.
Methods
This research synthesis was a meta-analysis based on Hattie’s (2023) groundbreaking
book Visible Learning, which resulted in the synthesis of more than 800 meta-analyses relating
to the influences on the achievement of school-aged students. This research synthesis was
specifically aimed toward the analysis and integration of research findings relative to the
effectiveness of technology interventions in the mathematical achievement of K–12 Black and
Latinx students in the United States. A comprehensive literature search and data screening were performed to obtain the data analyzed in this meta-analysis.
Literature Search
The studies included in this meta-analysis were drawn from Hattie’s (2023) synthesis of evidence-based research on what works in schools to improve learning. A Visible Learning
website organized the meta-analyses into areas that affect learning in the categories of students,
home, school, curricula, teachers, and teaching strategies. The Visible Learning website lists 19
meta-analyses in the Technology in Mathematics section, which falls within the domains of Technology, School, and Out-of-School, and defines the influence as the use of computer
technologies in the teaching of mathematics. Five meta-analyses were excluded from this
synthesis because they did not list the studies used in their meta-analyses (Athappilly et al.,
1983; Burns & Bozeman, 1981; Hsu, 2003; King, 1997; Kulik, 1983). Two meta-analyses were
omitted because they could not be located through the USC Library Database, Google Scholar,
ProQuest, ERIC, PsycINFO, or Google (Leong, 1981). This included exhausting the resources
of the USC Interlibrary Loan and Document Services (ILL). Two meta-analyses were omitted
because they were conducted exclusively outside the United States: Xie et al. (2020) in China
and Demir and Basol (2014) in Turkey. This resulted in the review of studies included in 10
meta-analyses.
These 10 meta-analyses were located using the USC Library Database. A list of the
studies included in each of the 10 meta-analyses was gathered in a Google Sheet on a Google
Drive folder for the technology in mathematics influence. Each study was searched first using
the USC Library Database. If the study could not be found in the USC Library Database, the
study was further searched using Google Scholar, ProQuest, ERIC, and PsychINFO. When
studies could not be located using these databases, an Interlibrary Loan and Document Delivery
(ILL) request was made through the USC library to locate the specific study. Most of the studies
were found using this method; however, there were a few that exhausted the search capabilities
of the ILL request. When a study was located, it was hyperlinked (or indicated as a hard copy) in
the “Data Screening - Technology in Mathematics” Google Sheets document, which also
included the influence (technology in mathematics), author(s), year of publication, and whether a
copy of the study was added to the “Individual Studies” Google folder.
The Visible Learning website specified that 911 studies were included in Hattie’s (2023)
meta-analyses for this influence. However, due to the reasons stated previously, 609 studies, or
66.8% of the original studies, were located on the list of meta-analyses on the Visible Learning
website. Of those 609 studies, 94 (or 14.4%) were duplicate studies, meaning that they were
found in more than one meta-analysis, and 29 studies (4.8%) were unavailable. Examples of
unavailable studies included an unpublished dissertation in Canada (Bamberger, 1984), a study
investigating the effects of Arabic-Logo instruction in Kuwait (Bu-Zebar, 1989), and a study
investigating the impact of personalized computer-assisted mathematics on Taiwanese students
(Chen & Liu, 2007). ILL requests were denied, were not returned, or exhausted all possible
searches for all of the unavailable studies. Of the 609 studies, 46 studies (7.6%) did not use a
sample of students in the United States. Only 108 (17.7%) of the studies included sample race
and ethnicity data. Of those 108 studies, 50 (8.2% of the 609) met the inclusion criteria for race and
ethnicity. Figure 1 and Figure 2 provide PRISMA charts of the literature search process (Tricco et
al., 2018).
Figure 1
Step 1 PRISMA Chart, Number of Meta-Analyses Retrieved
Figure 2
Step 2 PRISMA Chart, Number of Studies Screened and Coded
Inclusion Criteria
To be included in the current synthesis, a study had to have been originally included in a
meta-analysis in Hattie’s (2023) Visible Learning, the targeted sample had to have been located
in the United States, and the sample population had to have been at least 40% Black and Latinx
students. This information was commonly found in the abstract or methods section of the study.
If this information was missing or could not be found, the study was not included in this
synthesis. This information was recorded in the “Data Screening, Technology in Mathematics”
Google Sheets document and included whether the study sample consisted of U.S. students, the sample
size, and the percentage of White, Black, Hispanic, Asian, and other races in the study sample
population. Of the 609 studies, 50 (8.2%) met these criteria. Excluded were duplicate studies (54), studies that did not include sample race/ethnicity data (372), studies whose samples were outside the United States (46), studies that could not be located despite an exhaustive USC ILL search (29), studies that could not be retrieved (40), and 18 additional studies removed at the coding stage because they were not experimental or quasi-experimental, did not fit the sample criteria, did not assess the effect of technology on achievement, or did not have enough information to calculate an effect size.
Technology was operationalized to include only computer technology (i.e., disregarding
graphing calculators) and only software and not hardware. The various types of computer
technology were categorized as (a) tutorial, which are programs that directly teach mathematics
by setting up an environment where information, demonstration, and drill and practice are
provided; (b) communication media, which are tools or platforms that facilitate the exchange of
information, ideas, and knowledge; (c) exploratory environments, which encourage active
learning through discovery and exploration; (d) tools, which serve the technological purpose of
making teaching and learning fun, effective, and efficient; and (e) programming, which enhances
students’ computational thinking skills and problem-solving abilities. Note that there were no
studies in this meta-analysis that included communication media as the technology type, so this
type of technology was not analyzed.
Mathematical achievement was operationalized by using both standardized and non-standardized testing. Examples of standardized tests used to measure outcomes include the Stanford Achievement Test, the Comprehensive Test of Basic Skills (Billings, 1986), the California Standards Test (CST), and the California Achievement Test (CAT6; Boster et al., 2005). Non-standardized tests used to measure outcomes primarily consisted of pre- and posttest instruments (Anand & Ross, 1987; Bai et al., 2012; Barrow et al., 2009) and end-of-unit or end-of-grade examinations (Brown, 2000). In addition, there had to be enough information to compute an effect size for a study to be included in this research synthesis. Eligible designs included experiments and quasi-experiments with two or more comparison groups. Included studies and characteristics are provided in Table 6, Table of Studies and Characteristics Included in This Meta-Analysis.
Data Extraction
Graduate researchers and I extracted a variety of information from the studies that met
the inclusion criteria. The coding guide used in this synthesis is being used as part of a broader
project to re-synthesize all of Hattie’s influences and may be reviewed in full in Appendix A.
The coding guide consists of a variety of items related to the meta-analysis characteristics, report
characteristics, participant and sample characteristics, predictor influences, outcome measures,
research design, and the calculation of effect sizes. Meta-analysis characteristics included the
meta-analysis’s name. Coded report characteristics included the publication type, data sources,
the year the data was collected, and whether the report used an overlapping dataset. Coded
setting characteristics included the location of the study by region and school level. Coded
participant and sample characteristics included whether the sample was analyzed overall, as a subgroup, or both; percentages related to race and ethnic groups; grade level; gender; the percentage of the sample that was low-income or economically disadvantaged; the percentage of the sample with special education status; and the percentage of the sample that were English language learners.
Coded influence and predictor measures included the influence definition in the report
and how the influence was manipulated by the researcher. Coded outcome measures included the
outcome types, such as state standardized tests or GPA, outcome descriptions, the domain of the
outcome, the unit of analysis, the timing of the influence, and whether data collection was simultaneous or longitudinal. Choices for these categories may be seen in the “Outcome Measures”
section in the appendix. Coded research design and effect size information included the sample
size, the direction of the relationship between the influence and the outcome, and the type of
research design.
For experiments and quasi-experiments, the means, standard deviations, and sample sizes (N) were entered for both the treatment groups and the control groups. Because all of the studies were experiments or quasi-experiments, correlational studies were not coded. An online effect size calculator was used to calculate the standardized mean difference (d) and its variance (v). All studies included in this meta-analysis provided means, standard deviations, and sample sizes (N) for effect size calculations.
Coders and I were trained for several weeks preceding the commencement of coding the
articles included in this meta-analysis. Training included weekly meetings in which research
supervisors who are experts in meta-analysis ensured understanding of the coding guide, and coders practiced coding several papers as a group and then as individuals. Once an 80% agreement rate
between graduate student coders and research supervisors was established, we were allowed to
code independently. Following the weekly training sessions, I was the primary coder, and
another set of coders validated my codes for all reports that were included in this meta-analysis.
All discrepancies were noted and resolved through discussion. Any disputes were resolved with
further discussion with the dissertation chairs. Interrater reliability was assessed by comparing the codes of the primary coder (myself) with those of the validators; the resulting error rate was 0.87%.
Computing Effect Sizes
Effect sizes for experimental and quasi-experimental studies were calculated as
standardized mean differences (SMD) on mathematical achievement between treatment and
control groups. Effect sizes were calculated directly from the means, standard deviations, and
sample sizes for the intervention and control groups whenever possible. For studies that included
multiple treatment conditions compared with a single control condition, the effect size for each
intervention condition was calculated separately. All intervention effect sizes were converted to bias-corrected Hedges’ g, a standardized effect size that corrects for the slight positive bias present in effects estimated from small samples (Hedges, 1981).
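To make this computation concrete, the following is a minimal sketch in R (the language of the analysis packages described in the next section) of how a bias-corrected standardized mean difference can be computed with the metafor package. The group statistics shown are hypothetical and are not drawn from any included study.

```r
# Minimal sketch: computing Hedges' g (bias-corrected SMD) and its sampling
# variance from group means, standard deviations, and sample sizes.
# The numbers below are hypothetical and for illustration only.
library(metafor)

example <- data.frame(
  m1i = 78.4, sd1i = 11.2, n1i = 52,  # treatment group: mean, SD, n
  m2i = 72.1, sd2i = 12.0, n2i = 49   # control group: mean, SD, n
)

# measure = "SMD" applies the Hedges (1981) small-sample bias correction
es <- escalc(measure = "SMD",
             m1i = m1i, sd1i = sd1i, n1i = n1i,
             m2i = m2i, sd2i = sd2i, n2i = n2i,
             data = example)

es$yi  # Hedges' g
es$vi  # sampling variance of g
```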
Analysis Strategy
Data were meta-analyzed using the metafor and clubSandwich R packages (Pustejovsky,
2019; Viechtbauer, 2010). Random-effects modeling was used throughout the analyses. To
account for dependency between multiple effect size estimates within studies and guard against
potential model misspecification, a multi-level modeling approach was adopted in conjunction
with a robust variance estimator (RVE; Pustejovsky & Tipton, 2022). A random-effects model
was used to estimate the pooled effect size for the relationship between the technology
intervention and mathematics achievement. We also assessed the heterogeneity among effects,
indicated by the Q, τ², and I² statistics. Lastly, we reported 95% confidence intervals (CIs) for the
weighted average effect (Borenstein, 2019).
To further explain heterogeneity in the effect size estimates, we utilized mixed-effects
meta-regression models. We examined the effect of moderators in separate models. The
moderators we examined included the types of technology interventions used, the percentage of
Black and Latinx students, and the percentage of special education students. We also examined
the possibility of publication bias and funnel plot asymmetry by conducting an Egger’s
regression test (Egger et al., 1997), and used publication status as a moderator test.
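As a minimal sketch of this analysis strategy (not the exact script used in this study), the model below assumes a hypothetical data frame dat with one row per effect size and columns yi (Hedges’ g), vi (its sampling variance), study_id, and es_id.

```r
# Minimal sketch of the multilevel random-effects model with robust variance
# estimation (RVE). Column names (yi, vi, study_id, es_id) are hypothetical.
library(metafor)
library(clubSandwich)

# Effect sizes (es_id) nested within studies (study_id)
res <- rma.mv(yi, vi,
              random = ~ 1 | study_id / es_id,
              data   = dat,
              method = "REML")

summary(res)  # pooled average effect, Q statistic, variance components (tau^2)

# Cluster-robust (CR2) standard errors, tests, and 95% CIs, clustered by study
coef_test(res, vcov = "CR2", cluster = dat$study_id)
conf_int(res, vcov = "CR2", cluster = dat$study_id)
```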
Results
There were 50 studies (25 published and 25 unpublished) included in this research
synthesis of technology intervention studies containing 143 samples and 188 effect sizes. These
studies appeared between 1975 and 2010. The sample sizes ranged from seven to 3,625 with a
total sample size of 70,764.
Overall Effects for Technology Interventions
The first research question asked if the use of technology interventions has an effect on
the mathematical achievement of Black and Latinx students in the United States.
The pooled average effect size of the difference in mathematical achievement between groups receiving technology interventions and control groups was positive and statistically significant (g = 0.52, p < .001). The results shown in Table 1 indicate that technology interventions had a positive, statistically significant effect on the mathematical achievement of Black and Latinx students in the United States,
which is consistent with my hypothesis. This is a medium-size effect typical of education
interventions (Hattie, 2023).
Publication Bias
Publication bias was addressed in this study by using publication status as a moderator and by conducting Egger’s test (Egger et al., 1997). For Egger’s test, a contour-enhanced funnel plot was created with the funnel centered at 0; the y-axis showed the standard error, and the x-axis showed the effect sizes. The plot suggests that many studies are centered around the pooled average effect size, 0.523. The plot also shows that the studies with the larger effect sizes have smaller standard errors (i.e., they are more precise), which is the opposite of what would be expected if there were publication bias. The results from the modified Egger’s regression model indicated statistically significant evidence of funnel plot asymmetry for the experimental dataset (b = 1.73, SE = .33, t[186] = 5.33, p < .001). Additionally, the moderator analysis comparing published and unpublished reports indicated that the pooled effect size for unpublished reports was statistically significantly higher than that for published reports. The results of this moderator analysis are reported in Table 2.
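A minimal sketch of one way to implement such a modified Egger’s regression, reusing the hypothetical dat, yi, vi, study_id, and es_id names from the analysis sketch above, is to enter each effect size’s standard error as a moderator in the multilevel model.

```r
# Minimal sketch of a modified Egger's regression test for funnel plot
# asymmetry: the standard error of each effect size is added as a moderator.
# A slope reliably different from zero suggests asymmetry (small-study effects).
library(metafor)
library(clubSandwich)

dat$sei <- sqrt(dat$vi)  # standard error of each effect size

egger <- rma.mv(yi, vi,
                mods   = ~ sei,
                random = ~ 1 | study_id / es_id,
                data   = dat)

coef_test(egger, vcov = "CR2", cluster = dat$study_id)

# Funnel plot of the effect sizes against their standard errors
funnel(dat$yi, vi = dat$vi)
```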
Moderator Analysis: Percentage of Black and Latinx Students
When the percentage of Black students was examined as a moderator, the average effect size was 0.0753 when the percentage of Black students was zero, and the effect size increased by 0.0092 for every one-percentage-point increase in Black students, which was statistically significant. For the Latinx population, the average effect size was 0.6564 when the percentage of Latinx students was zero, and the effect size changed by –0.0071 for every one-percentage-point increase in Latinx students, which was also statistically significant. This is consistent with the hypothesis that the effect of
technology on the mathematical achievement of Black students would be more positive.
However, in the case of Latinx students, the effect was more negative, which was inconsistent
with the hypothesis. The results of these moderator analyses are reported in Table 3.
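A sketch of this kind of meta-regression, again using the hypothetical dat from above plus an assumed column pct_black holding the percentage of Black students in each sample, follows; the corresponding model for Latinx students would simply substitute that percentage.

```r
# Minimal sketch of a mixed-effects meta-regression with the percentage of
# Black students in each sample as a continuous moderator. The pct_black
# column name is hypothetical.
library(metafor)
library(clubSandwich)

mod_black <- rma.mv(yi, vi,
                    mods   = ~ pct_black,
                    random = ~ 1 | study_id / es_id,
                    data   = dat)

# Intercept: expected effect when pct_black = 0
# Slope: change in effect size per one-percentage-point increase
coef_test(mod_black, vcov = "CR2", cluster = dat$study_id)
```

Read this way, and taking the reported coefficients at face value, a sample that was 50% Black would have a predicted effect of roughly 0.0753 + 50 × 0.0092 ≈ 0.54, although such linear extrapolations should be interpreted cautiously.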
Moderator Analysis: Percentage of Special Education Students
When examining the percentage of students with special education status as another moderator, the average effect size was 0.1434 when the percentage of special education students
was zero, and there was a 0.0022 change in the effect size for every 1% increase in the special
education student population. However, this slope was not statistically different from zero.
Because this slope was not statistically significant, the finding was also inconsistent with the hypothesis that technology would have a more positive effect for special education students. The results of this
moderator analysis are reported in Table 4.
Moderator Analysis: Types of Technology
The types of technology interventions examined in this meta-analysis included tutorial
programs, exploratory environments, software tools, and programming (also known as coding).
Because no studies in this meta-analysis used communication media (as noted above), that category could not be analyzed as a moderator. To address the research question of which types of technology are most effective in promoting mathematical achievement, I used one of the technology interventions as the reference group (i.e., tutorial) and compared it with the other technology interventions (i.e., exploratory environments, software tools, and programming). This analysis showed no statistically significant differences between the types of technology used in the interventions. In other words, the comparison did not indicate that one type of technology intervention was more effective than any of the others. The results of this moderator analysis are reported in Table 5.
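A sketch of this comparison, assuming a hypothetical factor column tech_type with levels such as "tutorial", "exploratory", "tools", and "programming" in the dat data frame used above, is shown below; releveling the factor makes tutorial programs the reference group.

```r
# Minimal sketch of a meta-regression with type of technology as a categorical
# moderator, using tutorial programs as the reference group. The tech_type
# column and its level names are hypothetical.
library(metafor)
library(clubSandwich)

dat$tech_type <- relevel(factor(dat$tech_type), ref = "tutorial")

mod_type <- rma.mv(yi, vi,
                   mods   = ~ tech_type,
                   random = ~ 1 | study_id / es_id,
                   data   = dat)

# Each coefficient tests whether that technology type differs from tutorials
coef_test(mod_type, vcov = "CR2", cluster = dat$study_id)

# Omnibus robust Wald test of all type contrasts simultaneously
Wald_test(mod_type,
          constraints = constrain_zero(2:4),
          vcov = "CR2", cluster = dat$study_id)
```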
Discussion
This meta-analysis reports the effect of technology on the mathematical achievement of
Black and Latinx students in the United States. Using the evidence provided in 50 technology
intervention studies, the use of technology had varying effects on the mathematics achievement of Black and Latinx students, but the differences between technology types were not statistically significant. In other words, no one type of technology intervention was better than the others. However, when using the
percentage of Black and Latinx students as a moderator, effect sizes increased as the percentage
of Black students increased, and effect sizes decreased as the percentage of Latinx students
increased.
Summary of Key Findings
This meta-analysis makes several broad contributions to the current literature. First, it
shows that technology interventions have a positive and statistically significant average pooled
effect size on the mathematics achievement of Black and Latinx students in the United States. Second, it establishes that the effect of technology interventions on mathematics achievement is positively related to the percentage of Black students in a sample and negatively related to the percentage of Latinx students. In other words, as the percentage of Black students increases, the
effect size increases; however, as the percentage of Latinx students increases, the effect size
decreases. Third, it found no statistically significant differences between the types of technology
used in the interventions; that is, no type of technology intervention was statistically significantly better than the others. Last, it demonstrates that the relationship between technology interventions and the
mathematics achievement of special education students was no different from that of general
education students.
Alignment of Findings With Theory and Prior Research
Although the findings did not show that one type of technology was more effective than the others, as hypothesized, they did indicate that the effect sizes for the different technology interventions were quite different. Programming had an effect size of g = 0.899, which was statistically significant. Tutorials had an effect size of g = 0.549, which was also statistically significant. Software tools had an effect size of g = 0.371, which was also statistically significant. However, exploratory environments had an effect size of g = 0.339, which was not statistically significant.
Programming interventions were hypothesized to be more effective because of their
potential to reduce cognitive load by providing students with structured tasks and logical
frameworks within which to operate. This reduction in cognitive load was thought to be
especially beneficial for Black and Latinx students, who may face additional cognitive
challenges due to various socioeconomic and educational factors (Chambers-Richardson, 2023).
Additionally, programming fosters authentic learning experiences by immersing students
in real-world problem-solving scenarios, based on ideas of diversity and inclusion, which helps
students’ sense of belonging and the formation of positive STEM identities (Singer et al., 2020).
Through programming projects and applications, students encounter mathematical concepts in
context, allowing for deeper understanding and retention, connecting to learners’ lived experiences, and building upon the cultural assets of the students.
For example, Emihovich and Miller (1988) used Logo as the programming language with
Black first graders. Logo is different from other computer languages in that it places the student
in control by having the student program the actions of an object that is personalized and
familiar: the “turtle.” Logo gives students freedom to make discoveries on their own, and by
programming the “turtle,” students “acquire a sense of mastery over a piece of the most modern
and powerful technology and establish an intimate contact with some of the deepest ideas from
science, from mathematics, and from the art of intellectual model building” (Papert, 1980, p. 5).
Emihovich and Miller (1988) utilized all learning strategies (i.e., constructivism, cognitive load
theory, ZPD scaffolding, and authentic learning) to produce an effect size of d = 1.2523,
considered a large effect size in educational interventions (Hattie, 2023). The study also built
upon the learning strengths of Black students, such as high responsiveness to visual and auditory
stimuli and the desire to collaborate with and pass on information to peers. By encouraging
Black students to talk and share their ideas and to use the “turtle” as a concrete representation of
their thinking, a learning environment can be constructed that makes these students more aware
of their thought processes (Emihovich & Miller, 1988).
Further research should continue to explore and unpack the specific ways in which the
culturally responsive use of technology promotes positive STEM identity formation and why
Black students benefit from this and Latinx students do not. For example, Sharkness et al. (2010)
found that entering college with stronger high school preparation in both mathematical achievement and study skills/time management appears to set the stage for success in STEM-related majors. Similarly, participating in academic activities such as undergraduate research
programs, tutoring other students, programs to prepare for graduate school, and clubs related to
academic STEM majors also facilitated the promotion of a positive STEM identity. On the other
hand, facing socioemotional challenges, such as feeling overwhelmed, or feeling strong
competition for grades, had a negative impact on STEM identity formation (Sharkness et al.,
2010). Further research along these lines would provide a more direct connection to the experiences of Black and Latinx students.
Implications for Theory and Practice
Programming interventions were hypothesized to be effective in increasing mathematics
achievement because they align with the use of ZPD. By presenting students with tasks that are
slightly beyond their current level of proficiency but achievable with guidance, programming
activities scaffold learning experiences, facilitating incremental skill development. This targeted
approach to instruction ensures that students are appropriately challenged while receiving
necessary support, thereby maximizing learning outcomes. The results of this analysis indicated
that programming interventions that utilized one or more of these learning strategies (Billings,
1986; Emihovich & Miller, 1988; Lehrer & Randle, 1987; Yusuf, 1991, 1995) were more
effective than those that used none of these learning strategies. The other types of technology
interventions—tutorial programs, exploratory environments, and software tools—also exhibited
varying degrees of effectiveness. Tutorial programs, the most commonly used technology
intervention in this meta-analysis, often relied on ZPD scaffolding as the primary learning
strategy. The results from this analysis showed that tutorial programs that utilized ZPD
scaffolding and authentic learning (Dellario, 1987; Dynarski et al., 2007) were more effective than those that used ZPD scaffolding alone (Din & Caleo, 2000; Howell, 1996; Stevens, 1991;
Webster, 1990; Ysseldyke & Bolt, 2007; Ysseldyke et al., 2003).
Although ZPD scaffolding can be beneficial, its effectiveness may be hindered when it lacks the active engagement and authentic problem solving inherent in more interactive interventions. Exploratory environments offer opportunities for discovery and exploration, yet their
effectiveness appears to fluctuate. For example, Forde (2003) utilized all learning strategies and
had a large effect size. Others who used exploratory environments as the technology intervention
and used none or only some of the other learning strategies had less effective results (Lester, 1996;
Ogbuehi & Fraser, 2007; Rutherford et al., 2010). Exploratory environments are highly complex
technology interventions that require extensive teacher training, and results from this analysis
may reflect inadequate or insufficient teacher training, but further research needs to be done to
better understand this. Software tools, while potentially effective when combined with
appropriate learning strategies, also demonstrated mixed results. Software tools that utilized the
reduction of cognitive load along with ZPD scaffolding (Anand & Ross, 1987; Ku et al., 2007;
Page, 2002) were more effective than those that utilized only ZPD scaffolding (Berlin & White,
1986).
Limitations
This study was limited to a specific set of technology interventions, which included
tutorial programs, software tools, exploratory environments, and programming. This potentially
overlooks other innovative approaches that could influence mathematical achievement among
Black and Latinx students. For example, assistive technology tools for math, which are designed
to help students who struggle with computing, organizing, aligning, and copying math problems
down on paper, were not examined. Another example is artificial intelligence (AI), which has the
potential to offer personalized learning experiences. By analyzing data on each student’s
strengths and weaknesses, AI has the potential to create customized learning plans that cater to
each student’s needs.
Additionally, the small number of studies, particularly for exploratory environments (6),
software tools (5), and programming (5), was another limitation. A larger number of studies for each of these technology interventions could have yielded different effect estimates.
Recommendations for Future Research
The findings of this meta-analysis have several implications for educational practice and
future research. Educators and policymakers should prioritize the integration of programming
into mathematics curricula, particularly for Black students, to capitalize on its effectiveness in promoting mathematical achievement and positive STEM identity formation. Additionally,
practitioners should carefully consider the design and implementation of other technology
interventions, ensuring alignment with evidence-based learning strategies to maximize the
impact of technology interventions.
A notable finding from this meta-analysis was that there were no statistically significant
differences in the effectiveness of the different technology interventions used. There appeared to
be a trend showing differences in effectiveness, but it was not statistically significant, and further research could better reveal the reasons for these differences.
Future research could also investigate longitudinal studies examining the sustained
impact of technology integration on mathematical achievement among Black and Latinx
students. For example, is it more effective to apply a technology intervention throughout the school year or only for a semester? Or perhaps a technology intervention is effective for a certain period, but its effect diminishes over time.
Finally, future research could also investigate other potential moderators. For example,
investigating gender could be invaluable, particularly because only a third of the STEM
workforce is female (Rivers, 2017). Future research could also investigate the effect of
technology on the mathematical achievement of low-income students to again see if this would
carry across to increasing their participation in the STEM workforce (Rivers, 2017). Future
research could also investigate the effect of technology on English language learners, perhaps
revealing why the effect of technology interventions decreases as the percentage of Latinx students increases, given that some of these students may also be English language learners. The
language barrier may present a challenge to positive STEM identity formation for Latinx
students, but further research needs to be done to provide a better understanding.
Conclusions
By leveraging the insights gleaned from this meta-analysis, educators can make informed
decisions regarding the selection and implementation of technology interventions to support the
mathematical achievement of Black and Latinx students, ultimately fostering greater equity and
inclusion within mathematical education.
Examining the effects of technology on the mathematical achievement of Black and
Latinx students is key to understanding how to increase their participation and success in STEM
classrooms and careers. One could argue that, based upon current data and conditions, our most
pressing challenges and our most potent opportunities require cultivating STEM-rooted problem-solving skills among Black and Latinx students. Ten of the top 14 fastest-growing industries require some kind of STEM training (Deming & Noray, 2018). Yet, according to Bell et al. (2019), a research team that includes economist Raj Chetty, we are losing innovators and their breakthroughs every day, because those who could have highly impactful innovations are not being given the opportunities they deserve. Black and Latinx students are missing out on STEM
opportunities, so the planet is missing out on their brilliance (Milgrom-Elcott, 2020).
References
Albright, M. J., & Graf, D. (1992). Teaching in the information age: The role of educational
technology. New Directions for Teaching and Learning, 51 (Jossey-Bass Higher and
Adult Education Series). Jossey-Bass.
Alegria, S. N., & Branch, E. H. (2015). Causes and consequences of inequality in the STEM:
Diversity and its discontents. International Journal of Gender, Science and Technology,
7(3), 321–342.
*Anand, P. G., & Ross, S. M. (1987). Using computer-assisted instruction to personalize
arithmetic materials for elementary school children. Journal of Educational Psychology,
79(1), 72.
Anthony, A. K., Walters, L., & McGrady, P. (2017). Creating connections between authentic
research and the development of science identities in undergraduate marine biology
experiences. Florida Scientist, 80(2/3), 61–76.
Ash, J. E. (2005). The effects of computer-assisted instruction on middle school mathematics
achievement. ETD Collection for Tennessee State University (Paper AAI3187584).
https://digitalscholarship.tnstate.edu/dissertations/AAI3187584
Athappilly, K., Smidchens, U., & Kofel, J. W. (1983). A computer-based meta-analysis of the
effects of modern mathematics in comparison with traditional mathematics. Educational
Evaluation and Policy Analysis, 5(4), 485–493.
Ayres, P., & Sweller, J. (2005). The split-attention principle in multimedia learning. The
Cambridge Handbook of Multimedia Learning, 2, 135–146.
*Bai, H., Pan, W., Hirumi, A., & Kebritchi, M. (2012). Assessing the effectiveness of a 3‐D
instructional game on improving mathematics achievement and motivation of middle
school students. British Journal of Educational Technology, 43(6), 993–1003.
Baker, R., & Siemens, G. (2014). Learning analytics and educational data mining. In R. K.
Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 253–272). Cambridge
University Press.
Barron, A. E., Orwig, G. W., Ivers, K. S., & Lilavois, N. (2002). Technologies for education: A
practical guide. Libraries Unlimited.
Bamberger, H. J. (1984). The effect of Logo (turtle graphics) on the problem solving strategies
used by fourth grade children. Logo Foundation. https://el.media.mit.edu/logofoundation/resources/papers/pdf/research_logo.pdf
*Barrow, L., Markman, L., & Rouse, C. E. (2009). Technology’s edge: The educational benefits
of computer-aided instruction. American Economic Journal: Economic Policy, 1(1), 52–
74.
Bell, A., Chetty, R., Jaravel, X., Petkova, N., & Van Reenen, J. (2019). Who becomes an
inventor in America? The importance of exposure to innovation. The Quarterly Journal
of Economics, 134(2), 647–713.
*Berlin, D., & White, A. (1986). Computer simulations and the transition from concrete
manipulation of objects to abstract thinking in elementary school mathematics. School
Science and Mathematics, 86(6), 468–479.
Bernard, R. E., & Cooperdock, E. H. (2018). No progress on diversity in 40 years. Nature
Geoscience, 11(5), 292–295.
Berryman, H. G. (1999). The effects of technology education labs on third-grade math scores
[Doctoral dissertation, University of Sarasota]. LearnTechLib.
https://www.learntechlib.org/p/129314/
Bhalla, J. (2013). Computer use by school teachers in teaching-learning process. Journal of
Education and Training Studies, 1(2), 174–185.
*Billings, L. J., Jr. (1986). Development of mathematical task persistence and problem-solving
ability in fifth and sixth grade students through the use of Logo and heuristic
methodologies. Doctoral dissertation, Northern Arizona University.
Bitter, G. G., & Pierson, M. E. (2001). Using technology in the classroom. Allyn & Bacon.
Borenstein, M. (2019). Heterogeneity in meta-analysis. In H. Cooper, L. V. Hedges, & J. C.
Valentine (Eds.), The handbook of research synthesis and meta-analysis (pp. 453–470).
Russell Sage Foundation.
*Boster, F. J., Yun, J. A., Strom, R., & Boster, L. J. (2005). Evaluation of New Century
Education Software 7th Grade Mathematics Academic Year 2004-2005 at Grant Joint
Union High School District. Policy Information Report.
Bouck, E. C., & Flanagan, S. (2009). Assistive technology and mathematics: What is there and
where can we go in special education. Journal of Special Education Technology, 24(2),
17–30.
Bransford, J. D., Sherwood, R. D., Hasselbring, T. S., Kinzer, C. K., & Williams, S. M. (2012).
Anchored instruction: Why we need it and how technology can help. In D. Nix & R.
Spiro (Eds.), Cognition, education, and multimedia (pp. 115–141). Routledge.
*Brown, F. (2000). Computer assisted instruction in mathematics can improve students’ test
scores: A study. Research Report.
Bryant, D. P., Bryant, B. R., & Hammill, D. D. (2000). Characteristic behaviors of students with
LD who have teacher-identified math weaknesses. Journal of Learning Disabilities,
33(2), 168–177.
*Burns, P. K., & Bozeman, W. C. (1981). Computer-assisted instruction and mathematics
achievement: Is there a relationship? Educational Technology, 21(10), 32–39.
Bu-Zebar, A. M. (1989). The effects of Arabic Logo programming and problem-solving
instruction of mathematical and non-mathematical problem-solving strategies used by
Kuwaiti sixth-grade students. Doctoral Dissertation, University of Oregon.
*Campuzano, L., Dynarski, M., Agodini, R., & Rall, K. (2009). Effectiveness of reading and
mathematics software products: Findings from two student cohorts (NCEE 2009-4041).
National Center for Education Evaluation and Regional Assistance.
Carlone, H. B., & Johnson, A. (2007). Understanding the science experiences of successful
women of color: Science identity as an analytic lens. Journal of Research in Science
Teaching, 44(8), 1187–1218.
Carter, C. M., & Smith, L. R. (2001). Does the use of learning logic in algebra I make a
difference in algebra II? Journal of Research on Technology in Education, 34(2), 157–
161.
Cemal Nat, M., Walker, S., Bacon, L., Dastbaz, M., & Flynn, R. (2011). Impact of metacognitive
awareness on learning in technology enhanced learning environments. Available from
https://www.researchgate.net/publication/275638584_Impact_of_Metacognitive_Awaren
ess_on_Learning_in_Technology_Enhanced_Learning_Environments
Chambers-Richardson, I. L. (2023). Improving mathematical outcomes for African American and
Latinx students [Doctoral dissertation, University of Dayton]. OhioLINK.
http://rave.ohiolink.edu/etdc/view?acc_num=dayton1689681486541559
Char, C. A. (1989). Computer graphic feltboards: New software approaches for young children’s
mathematical exploration. San Francisco: American Educational Research Association,
7.
Chemers, M. M., Zurbriggen, E. L., Syed, M., Goza, B. K., & Bearman, S. (2011). The role of
efficacy and identity in science career commitment among underrepresented minority
students. Journal of Social Issues, 67(3), 469–491.
Chen, C. J., & Liu, P. L. (2007). Personalized computer-assisted mathematics problem-solving
program and its impact on Taiwanese students. Journal of Computers in Mathematics
and Science Teaching, 26(2), 105–121.
Chi, B., Dorph, R., & Reisman, L. (2015). Evidence & impact: Museum-managed STEM
programs in out-of-school settings. National Research Council Committee on Out-of-School Time STEM. National Research Council.
Clariana, R. B. (1996). Differential achievement gains for mathematics computation, concepts,
and applications with an integrated learning system. Journal of Computers in
Mathematics and Science Teaching, 15(3), 203–15.
Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational
Research, 53(4), 445–459.
Clark, D. L., Sr. (2005). The effects of using computer assisted instruction to assist high school
geometry students achieve higher levels of success on the Florida Competency
Achievement Test (FCAT). Doctoral Dissertation. Union Institute and University.
Clark, R. C., & Mayer, R. E. (2023). E-learning and the science of instruction: Proven
guidelines for consumers and designers of multimedia learning. John Wiley & Sons.
Clark, R. C., Nguyen, F., & Sweller, J. (2011). Efficiency in learning: Evidence-based guidelines
to manage cognitive load. John Wiley & Sons.
Clements, D. H., Battista, M. T., & Sarama, J. (2001). Logo and Geometry (Journal for Research
in Mathematics Education Monograph 10) [Unpublished manuscript]. Reston, VA.
Colby, S. L., & Ortman, J. M. (2015). Projections of the size and composition of the US
population: 2014 to 2060 (Report No. P25-1143). U.S. Census Bureau.
https://www.census.gov/library/publications/2015/demo/p25-1143.html
Coley, R., Cradler, J., & Engel, P. K. (1997). Computers and classrooms: The status of
technology in US schools. Policy Information Report.
Connell, M. L. (1997). Technology in constructivist mathematics classrooms. In Society for
Information Technology & Teacher Education International Conference (pp. 601–604).
Association for the Advancement of Computing in Education (AACE).
*Dellario, T. E. (1987). The effects of computer-assisted instruction in basic skills courses on
high-risk ninth-grade students. Doctoral Dissertation, Western Michigan University.
Deming, D. J., & Noray, K. L. (2018). STEM careers and the changing skill requirements of
work (No. w25065). National Bureau of Economic Research.
Demir, S., & Basol, G. (2014). Effectiveness of Computer-Assisted Mathematics Education
(CAME) over academic achievement: A meta-analysis study. Educational Sciences:
Theory and Practice, 14(5), 2026–2035.
Devisir, K., & Kalaimathi, H. D. (2016). Effect of CAI on the achievement of information
technology. Lulu.com.
*Din, F. S., & Caleo, J. (2000). Playing computer games versus better learning. Research
Report.
Donnelly, S., Dean, S., Razavy, S., & Levett-Jones, T. (2019). Measuring the impact of an
interdisciplinary learning project on nursing, architecture and landscape design students’
empathy. PloS One, 14(10), e0215795.
*Dynarski, M., Agodini, R., Heaviside, S., Novak, T., Carey, N., Campuzano, L., Means, B.,
Murphy, R., Penuel, W., Javitz, H., Emery, D., & Sussex, W. (2007). Effectiveness of
reading and mathematics software products: Findings from the first student cohort:
Report to Congress. National Center for Education Evaluation and Regional Assistance.
https://ies.ed.gov/ncee/pdf/20074005.pdf
Ebert, D. (2014). Graphing projects with Desmos. The Mathematics Teacher, 108(5), 388–391.
Edelson, D. C. (1996). Learning from cases and questions: The Socratic case-based teaching
architecture. The Journal of the Learning Sciences, 5(4), 357–410.
Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by
a simple, graphical test. BMJ, 315(7109), 629–634.
*Emihovich, C., & Miller, G. E. (1988). Effects of Logo and CAI on Black first graders’
achievement, reflectivity, and self-esteem. The Elementary School Journal, 88(5), 473–
487.
Eyyam, R., & Yaratan, H. S. (2014). Impact of use of technology in mathematics lessons on
student achievement and attitudes. Social Behavior and Personality, 42(1), 31S–42S.
Ferri, B. A., & Connor, D. J. (2005). In the shadow of Brown: Special education and
overrepresentation of students of color. Remedial and Special Education, 26(2), 93–100.
*Forde, T. B. (2003). The effects of technology-supported cognitive mathematical instruction on
African-American students characterized as at-risk for school failure [Doctoral
dissertation, Vanderbilt University]. LearnTechLib.
https://www.learntechlib.org/p/123847/
Funkhouser, C. (2002). The effects of computer-augmented geometry instruction on student
performance and attitudes. Journal of Research on Computing in Education, 35(2), 163–
175.
Garnett, K. (1998). Math learning disabilities. LD Online. https://www.ldonline.org/ldtopics/math-dyscalculia/math-learning-disabilities
*Gatti, G. G., & Petrochenkov, K. (2010). Pearson SuccessMaker math efficacy study: 2009–10
final report. Gatti Evaluation.
Geary, D. C. (2003). Learning disabilities in arithmetic: Problem-solving differences and
cognitive deficits. Handbook of Learning Disabilities, 199–212.
Granovskiy, B. (2018). Science, technology, engineering, and mathematics (STEM) education:
An overview (CRS Report R45223, Version 4). Updated. Congressional Research
Service.
*Gourgey, A. F. (1987). Coordination of instruction and reinforcement as enhancers of the
effectiveness of computer-assisted instruction. Journal of Educational Computing
Research, 3(2), 219–230.
Gu, J., Fu, J., & Tong, Y. (2010, August). Potential uses of computer to enhance reform of
mathematics teaching. In 2010 5th International Conference on Computer Science &
Education (pp. 828–830). IEEE.
Hartley, S. S. (1977). Meta-analysis of the effects of individually paced instruction in
mathematics. Doctoral Dissertation, University of Colorado at Boulder.
Hattie, J. (2023). Visible learning: The sequel: A synthesis of over 2,100 meta-analyses relating
to achievement. Routledge.
Hedges, L. V. (1981). Distribution theory for Glass's estimator of effect size and related
estimators. Journal of Educational Statistics, 6(2), 107–128.
Hembree, R., & Dessart, D. J. (1986). Effects of hand-held calculators in precollege mathematics
education: A meta-analysis. Journal for Research in Mathematics Education, 17(2), 83–
99.
Herrington, J., & Oliver, R. (2000). An instructional design framework for authentic learning
environments. Educational Technology Research and Development, 48(3), 23–48.
Herrington, J., Reeves, T. C., & Oliver, R. (2014). Authentic learning environments. In J.
Spector, M. Merrill, J. Elen, & M. Bishop (Eds.), Handbook of research on educational
communications and technology (pp. 401–412). Springer New York.
Hohenwarter, M., & Fuchs, K. (2004). Combination of dynamic geometry, algebra, and calculus
in the software system GeoGebra. In Computer Algebra Systems And Dynamic Geometry
Systems in Mathematics Teaching Conference (pp. 1–6). Available from
https://www.researchgate.net/publication/228398347_Combination_of_dynamic_geometr
y_algebra_and_calculus_in_the_software_system_GeoGebra
*Howell, C. A. (1996). A comparison of Chapter One middle school students who received
Jostens Integrated Learning instruction and those who received Chapter One services
only. Doctoral Dissertation, University of Georgia.
Hsu, Y. C. (2003). The effectiveness of computer-assisted instruction in statistics education: A
meta-analysis. Doctoral Dissertation, The University of Arizona.
Hughes, R. M., Nzekwe, B., & Molyneaux, K. J. (2013). The single sex debate for girls in
science: A comparison between two informal science programs on middle school
students’ STEM identity formation. Research in Science Education, 43, 1979–2007.
*Hunter, C. T. (1994). A study of the effect of instructional method on the reading and
mathematics achievement of Chapter One students in rural Georgia. Doctoral
Dissertation, South Carolina State University.
Iskander, W., & Curtis, S. (2005). Use of colour and interactive animation in learning 3D
vectors. Journal of Computers in Mathematics and Science Teaching, 24(2), 149–156.
Johnson, D. R. (2012). Campus racial climate perceptions and overall sense of belonging among
racially diverse women in STEM majors. Journal of College Student Development, 53(2),
336–346.
*Johnson-Scott, P. L. (2006). The impact of Accelerated Math on student achievement. Doctoral
Dissertation, Mississippi State University.
Kalyuga, S. (2007). Expertise reversal effect and its implications for learner-tailored instruction.
Educational Psychology Review, 19, 509–539.
Kalyuga, S., & Sweller, J. (2005). Rapid dynamic assessment of expertise to improve the
efficiency of adaptive e-learning. Educational Technology Research and Development,
53(3), 83–93.
Kaput, J. (2000). Technology as a transformative force in math education: Transforming
notations, curriculum structures, content and technologies. Technology and the NCTM
standards. National Council of Teachers of Mathematics.
Kieran, C., & Hillel, J. (1990). “It’s tough when you have to make the triangles angle”: Insights
from a computer-based geometry environment. Journal of Mathematical Behavior, 9(2),
99–127.
King, H. J. (1997). Effects of computer-enhanced instruction in college-level mathematics as
determined by a meta-analysis. Doctoral Dissertation, The University of Tennessee.
Kohen, Z., Amram, M., Dagan, M., & Miranda, T. (2022). Self-efficacy and problem-solving
skills in mathematics: The effect of instruction-based dynamic versus static visualization.
Interactive Learning Environments, 30(4), 759–778.
Kreskow, K. (2013). Overrepresentation of minorities in special education. Master’s Thesis, St.
John Fischer University.
*Ku, H. Y., Harter, C. A., Liu, P. L., Thompson, L., & Cheng, Y. C. (2007). The effects of
individually personalized computer-based instructional program on solving mathematics
problems. Computers in Human Behavior, 23(3), 1195–1210.
Kulik, J. A. (1983). Effects of computer-based teaching on learners. Research Report.
Kulik, J. A., Bangert, R. L., & Williams, G. W. (1983). Effects of computer-based teaching on
secondary school students. Journal of Educational Psychology, 75(1), 19.
Kusmaryono, I., & Kusumaningsih, W. (2021). Construction of students’ mathematical
knowledge in the zone of proximal development and zone of potential construction.
European Journal of Educational Research, 10(1), 341–351.
Ladson-Billings, G. (2006). From the achievement gap to the education debt: Understanding
achievement in US schools. Educational Researcher, 35(7), 3–12.
Langone, J. (1998). The effects of technology-enhanced anchored instruction and situated
learning on preservice teachers in a special education methods course: An exploratory
study. Journal of Developmental and Physical Disabilities, 10, 35–54.
*Lehrer, R., & Randle, L. (1987). Problem solving, metacognition and composition: The effects
of interactive software for first-grade children. Journal of Educational Computing
Research, 3(4), 409–427.
Lee, M., Collins, J., Harwood, S., Mendenhall, R., & Huntt, M. (2020). “If you aren’t White,
Asian, or Indian, you aren’t an engineer”: Racial microaggressions in STEM education.
International Journal of STEM Education, 7(48). https://doi.org/10.1186/s40594-020-
00241-4
Leong, C. L. (1981). Meta-analysis of research on the adjunctive use of computers in secondary
mathematics [Doctoral dissertation, University of Toronto].
*Lester, M. L. (1996). The effects of The Geometer’s Sketchpad software on achievement of
geometric knowledge of high school geometry students. Doctoral Dissertation, University
of San Francisco.
Lewis, S. K. (2004). The relationship of full-time laptop computer access to student achievement
and student attitudes in middle school. Doctoral Dissertation, Florida Atlantic University.
Li, Q. (2004). Technology and mathematics education: Any impact. In The Eleventh
International Literacy and Education Research Network Conference on Learning, La
Havana.
Li, Q., & Ma, X. (2010). A meta-analysis of the effects of computer technology on school
students’ mathematics learning. Educational Psychology Review, 22, 215–243.
Ling, S. (2004). Enhancing the learning of conics with technology. Doctoral Dissertation,
California State University, Dominguez Hills.
Liu, G. Z. (2008). Innovating research topics in learning technology: Where are the new blue
oceans? British Journal of Educational Technology, 39(4), 738–747.
Lombardi, M. M., & Oblinger, D. G. (2007). Authentic learning for the 21st century: An
overview. Educause Learning Initiative, 1(2007), 1–12.
Long, L., III, & Mejia, J. A. (2016). Conversations about diversity: Institutional barriers for
underrepresented engineering students. Journal of Engineering, 105(2), 211.
Lou, Y., Abrami, P. C., & d’Apollonia, S. (2001). Small group and individual learning with
technology: A meta-analysis. Review of Educational Research, 71(3), 449–521.
*Mahoney, S. J. (1990). A study of the effects of EXCEL Math on mathematics achievement of
second and fourth-grade students. Doctoral Dissertation, Northern Arizona University.
*Malouf, D. B. (1990). Evaluation of instructional model applied to functional math. Project on
Effective Computer Instruction for Effective Special Education, Prince George’s County
Public Schools.
Martindale, T., Pearson, C., Curda, L. K., & Pilcher, J. (2005). Effects of an online instructional
application on reading and mathematics standardized test scores. Journal of Research on
Technology in Education, 37(4), 349–360.
Martin-Hansen, L. (2018). Examining ways to meaningfully support students in STEM.
International Journal of STEM Education, 5(1), 53.
Maton, K. I., Beason, T. S., Godsay, S., Sto. Domingo, M. R., Bailey, T. C., Sun, S., &
Hrabowski, F. A., III. (2016). Outcomes and processes in the Meyerhoff Scholars
Program: STEM PhD completion, sense of community, perceived program benefit,
science identity, and research self-efficacy. CBE—Life Sciences Education, 15(3), ar48.
McKibbin, J. (2010). Video-based instruction and its role in today’s classroom: A study of
Square One TV and The Adventures of Jasper Woodbury. Research Report.
Means, B. (1994). The technology and education reform: The reality behind the promise. Jossey-Bass.
Means, B. (2010). Technology and education change: Focus on student learning. Journal of
Research on Technology in Education, 42(3), 285–307.
Milgrom-Elcott, T. (2020). Students of color are missing out on STEM opportunities, so the
planet is missing out on their brilliance. Here’s how we finally achieve equity in high
school STEM. Forbes.
https://www.forbes.com/sites/taliamilgromelcott/2020/09/24/students-of-color-aremissing-out-on-stem-opportunities-so-the-planet-is-missing-out-on-their-brilliance-hereshow-we-finally-achieve-equity-in-high-school-stem/?sh=149878745148
Mraz-Craig, J. A., Daniel, K. L., Bucklin, C. J., Mishra, C., Ali, L., & Clase, K. L. (2018).
Student identities in authentic course-based undergraduate research experience. Journal
of College Science Teaching, 48(1), 68–75.
Mulcahy, C. A., Maccini, P., Wright, K., & Miller, J. (2014). An examination of intervention
research with secondary students with EBD in light of common core state standards for
mathematics. Behavioral Disorders, 39(3), 146–164.
National Council of Teachers of Mathematics. (2000). Standards for school mathematics. The
National Council of Teachers of Mathematics.
National Research Council, Division of Behavioral, Board on Testing, Board on Science
Education, & Committee on Highly Successful Schools or Programs for K-12 STEM
Education. (2011). Successful K-12 STEM education: Identifying effective approaches in
science, technology, engineering, and mathematics. National Academies Press.
National Science Board. (2016). Science and Engineering Indicators Digest 2016. Research
Report.
Niemiec, R., & Walberg, H. (1985). Computers and achievement in elementary schools. Journal
of Educational Computing Research, 1, 435–440.
*Ogbuehi, P. I., & Fraser, B. J. (2007). Learning environment, attitudes and conceptual
development associated with innovative strategies in middle-school mathematics.
Learning Environments Research, 10, 101–114.
Olkun, S. (2003). Comparing computer versus concrete manipulatives in learning 2D geometry.
Journal of Computers in Mathematics and Science Teaching, 22(1), 43–56.
Oztok, M., Wilton, L., Lee, K., Zingaro, D., Mackinnon, K., Makos, A., Phirangee, K., Brett, C.,
& Hewitt, J. (2014). Polysynchronous: Dialogic construction of time in online learning.
E-learning and Digital Media, 11(2), 154–161.
Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent
developments. Educational Psychologist, 38(1), 1–4.
*Page, M. S. (2002). Technology-enriched classrooms: Effects on students of low
socioeconomic status. Journal of Research on Technology in Education, 34(4), 389–409.
Papert, S. (1980). Teaching children to be mathematicians vs. teaching about mathematics. Research Report.
Research Report.
Parkhurst, J., Skinner, C. H., Yaw, J., Poncy, B., Adcock, W., & Luna, E. (2010). Efficient classwide remediation: Using technology to identify idiosyncratic math facts for additional
automaticity drills. International Journal of Behavioral Consultation and Therapy, 6(2),
111.
*Patterson, D. (2005). The effects of Classworks in the classroom. Doctoral Dissertation,
Tarleton State University.
Perez, T., Cromley, J. G., & Kaplan, A. (2014). The role of identity development, values, and
costs in college STEM retention. Journal of Educational Psychology, 106(1), 315.
Phillips, C. K. (2001). The effects of an integrated computer program on math and reading
improvement in grades three through five. Doctoral Dissertation, The University of
Tennessee.
*Phillips, J., & Soule, H. (1992). A comparison of fourth graders’ achievement: Classroom
computers versus no computers. Research Report.
Piaget, J. (2005). The psychology of intelligence. Routledge.
*Pike, D. C. (1992). Computer-assisted instruction and student achievement in Chapter I
schools. Doctoral Dissertation, University of Georgia.
*Portis, L. B. (1991). The effect of computer-managed instruction on Algebra I achievement. Doctoral Dissertation, University of North Carolina at Chapel Hill.
Pustejovsky, J. E. (2019). Procedural sensitivities of effect sizes for single-case designs with
directly observed behavioral outcome measures. Psychological Methods, 24(2), 217.
Pustejovsky, J. E., & Tipton, E. (2022). Meta-analysis with robust variance estimation:
Expanding the range of working models. Prevention Science, 23(3), 425–438.
Quinn, D. W., & Quinn, N. W. (2001). PLATO evaluation series. Grades 1–8, Apache Junction
Unified School District 43, Apache Junction, Arizona. PLATO Learning.
Rainey, K., Dancy, M., Mickelson, R., Stearns, E., & Moller, S. (2018). Race and gender
differences in how sense of belonging influences decisions to major in STEM.
International Journal of STEM Education, 5, 1–14.
Reeves, T. C., & Reeves, P. M. (1997). Effective dimensions of interactive learning on the
World Wide Web. Web-Based Instruction, 59–66.
Reimer, K., & Moyer, P. S. (2005). Third-graders learn about fractions using virtual
manipulatives: A classroom study. Journal of Computers in Mathematics and Science
Teaching, 24(1), 5–25.
Resnick, M., Maloney, J., Monroy-Hernández, A., Rusk, N., Eastmond, E., Brennan, K., Millner,
A., Rosenbaum, E., Silver, J., Silverman, B., & Kafai, Y. (2009). Scratch: Programming
for all. Communications of the ACM, 52(11), 60–67.
Rivers, E. (2017). Women, minorities, and persons with disabilities in science and engineering.
National Science Foundation.
Roberts, T., Jackson, C., Mohr-Schroeder, M. J., Bush, S. B., Maiorca, C., Cavalcanti, M., &
Cremeans, C. (2018). Students’ perceptions of STEM learning after participating in a
summer informal learning experience. International Journal of STEM Education, 5, 1–
14.
Roschelle, J. M., Pea, R. D., Hoadley, C. M., Gordin, D. N., & Means, B. M. (2000). Changing
how and what children learn in school with computer-based technologies. The Future of
Children, 10(2), 76–101.
Roschelle, J., Shechtman, N., Tatar, D., Hegedus, S., Hopkins, B., Empson, S., & Gallagher, L.
P. (2010). Integration of technology, curriculum, and professional development for
advancing middle school mathematics: Three large-scale studies. American Educational
Research Journal, 47(4), 833–878.
*Ross, S. M., & Nunnery, J. A. (2005). The effect of school renaissance on student achievement
in two Mississippi school districts. Center for Research in Education Policy and
Education (ED500028). ERIC. https://files.eric.ed.gov/fulltext/ED500028.pdf
*Rutherford, T., Kibrick, M., Burchinal, M., Richland, L., Conley, A., Osborne, K., & Martinez,
M. E. (2010). Spatial temporal mathematics at scale: An innovative and fully developed
paradigm to boost math achievement among all learners. Available from
https://www.researchgate.net/publication/234743706_Spatial_Temporal_Mathematics_at
_Scale_An_Innovative_and_Fully_Developed_Paradigm_to_Boost_Math_Achievement_
among_All_Learners
Schpilberg, B., & Hubschman, B. (2003). Face-to-face and computer mediated tutoring: A
comparative exploration on high school students’ math achievement. Research Report.
Seely Brown, J. (1999). Learning, working and playing in the Digital Age. In Talk presented at
the 1999 Conference on Higher Education of the American Association for Higher
Education (Vol. 18, p. 2007).
Sharkness, J., Eagan Jr, M. K., Hurtado, S., Figueroa, T., & Chang, M. J. (2010). Academic
achievement among STEM aspirants: Why do Black and Latino students earn lower
grades than their White and Asian counterparts? In Annual Meeting of the Association for
Institutional Research, Toronto, CA.
Shyu, H. Y. (2000). Using video-based anchored instruction to enhance learning: Taiwan’s
experience. British Journal of Educational Technology, 31, 57–69.
Simpson, A., & Bouhafa, Y. (2020). Youths’ and adults’ identity in STEM: A systematic
literature review. Journal for STEM Education Research, 3, 167–194
Singer, A., Montgomery, G., & Schmoll, S. (2020). How to foster the formation of STEM
identity: studying diversity in an authentic learning environment. International Journal of
STEM Education, 7(1), 1–12.
Slavin, R. E. (2000). Educational psychology: Theory and practice (6th ed.). Allyn & Bacon.
*Slavin, R. E., & Lake, C. (2008). Effective programs in elementary mathematics: A best
evidence synthesis. Review of Educational Research, 78(3), 427−455.
Smith, B. D. (2002). The impact of the utilization of advantage learning systems’ technology on
students’ academic achievement. Doctoral Dissertation, Tennessee State University.
Smith, C., McLaughlin, M., et al. (1997). Conduct control on Usenet. Journal of Computer
Mediated Communication, 2(4), 1–11.
Smith, W., Butcher, E., Litvin, S. W., & Frash, R. (2015). Incorporating an instructional
scaffolding approach into the classroom: Teaching for authentic learning in hospitality
and tourism education. Journal of Teaching in Travel & Tourism, 15(3), 264–277.
Soeder, K. L. (2001). The effect of computer-aided instruction on mathematics achievement.
Doctoral Dissertation, Immaculata College.
Sokolowski, A., Li, Y., & Willson, V. (2015). The effects of using exploratory computerized
environments in grades 1 to 8 mathematics: A meta-analysis of research. International
Journal of STEM Education, 2, 1–17.
*Spivey, P. M. (1985). The effects of computer-assisted instruction on student achievement in
addition and subtraction at first grade level. Research Report.
*Stevens, J. W. (1991). Impact of an integrated learning system on third-, fourth-, and fifth-grade mathematics achievement. Doctoral Dissertation, Baylor University.
Suh, J., Moyer, P. S., & Heo, H. J. (2005). Examining technology uses in the classroom:
Developing fraction sense using virtual manipulative concept tutorials. Journal of
Interactive Online Learning, 3(4), 1–21.
Swanson, H. L. (2006). Cross-sectional and incremental changes in working memory and
mathematical problem solving. Journal of Educational Psychology, 98(2), 265–281.
Sweller, J. (1988). Cognitive load during problem-solving: Effects on learning. Cognitive
Science, 12(2), 257–285.
Sweller, J. (2002). Visualisation and instructional design. In Proceedings of the International
Workshop on Dynamic Visualizations and Learning (Vol. 18, pp. 1501–1510). Citeseer.
Taber, K. S., Garcia-Franco, A., Dawson, V. M., & Claxton, G. (2017). The role of interactive
simulations in the development of conceptual understanding in physics. Journal of
Research in Science Teaching, 54(1), 66–83.
Tharp, R. G., & Gallimore, R. (1991). Rousing minds to life: Teaching, learning, and schooling
in social context. Cambridge University Press.
Tipton, P. E., Bennett, C. K., & Bennett, J. A. (1996). Using technology in the classroom to
maximize the advantages of diversity. Technology and Teacher Education Annual, 24–
27.
Toh, Y. (2016). Leading sustainable pedagogical reform with technology for student-centered
learning: A complexity perspective. Journal of Educational Change, 17(2), 145–169.
Torrence, B. F., & Torrence, E. A. (2019). The student’s introduction to Mathematica and the
Wolfram Language. Cambridge University Press.
Tricco, A. C., Lillie, E., Zarin, W., O’Brien, K. K., Colquhoun, H., Levac, D., Moher, D., Peters,
M. D. J., Horsley, T., Weeks, L., Hempel, S., Akl, E. A., Chang, C., McGowan, J.,
Stewart, L., Hartling, L., Aldcroft, A., Wilson, M. G., Garritty, C., . . . Straus, S. E.
(2018). PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Annals of Internal Medicine, 169(7), 467–473.
*Turner, L. G. (1985). An evaluation of the effects of paired learning in a mathematics
computer-assisted-instruction program (cooperative, team, attitudes). Doctoral
Dissertation, Arizona State University.
U.S. Bureau of Labor Statistics. (2018). Occupational outlook handbook: Fastest growing
occupations. https://www.bls.gov/emp/tables/fastest-growing-occupations.htm.
Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of
Statistical Software, 36(3), 1–48.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes.
Harvard University Press.
*Waite, R. D. (2000). A study of the effects of Everyday Mathematics on student achievement of
third-, fourth-, and fifth-grade students in a large north Texas urban school district.
Doctoral Dissertation, University of North Texas.
Wallace, D. E., & Bodzin, A. M. (2017). Developing scientific citizenship identity using mobile
learning and authentic practice. Electronic Journal of Science Education, 21(6), 46–71.
Warrington, M. A., & Kamii, C. (1998). Multiplication with fractions: A Piagetian, constructivist
approach. Mathematics Teaching in the Middle School, 3, 339–343.
*Webster, A. H. (1990). The relationship of computer-assisted instruction to mathematics
achievement, student cognitive styles, and student and teacher attitudes. Doctoral
Dissertation, Delta State University.
Wertsch, J. V. (1991). Voices of the mind: Sociocultural approach to mediated action. Harvard
University Press.
Wiske, M. S., Franz, K. R., & Breit, L. (2005). Teaching for understanding with technology.
Jossey-Bass.
Wolfram, S. (2013). An elementary introduction to the Wolfram Language. Wolfram Media.
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of
Child Psychology and Psychiatry, 17(2), 89–100.
Woolf, B. P. (2010). Building intelligent interactive tutors: Student-centered strategies for
revolutionizing e-learning. Morgan Kaufmann.
Xie, C., Cheung, A. C., Lau, W. W., & Slavin, R. E. (2020). The effects of computer-assisted
instruction on mathematics achievement in mainland China: A meta-analysis.
International Journal of Educational Research, 102, 101565.
*Xin, J. F. (1999). Computer-assisted cooperative learning in integrated classrooms for students
with and without disabilities. Information Technology in Childhood Education Annual,
1999(1), 61–78.
*Ysseldyke, J., & Bolt, D. M. (2007). Effect of technology-enhanced continuous progress
monitoring on math achievement. School Psychology Review, 36(3), 453–467.
*Ysseldyke, J. E., & Bolt, D. (2006). Effect of technology-enhanced progress monitoring on
math achievement. Research Report, University of Minnesota.
Ysseldyke, J., Spicuzza, R., Kosciolek, S., Teelucksingh, E., Boys, C., & Lemkuil, A. (2003).
Using a curriculum-based instructional management system to enhance math
achievement in urban schools. Journal of Education for Students Placed at Risk, 8(2),
247–265.
Ysseldyke, J., & Tardrew, S. (2007). Use of a progress monitoring system to enable teachers to
differentiate mathematics instruction. Journal of Applied School Psychology, 24(1), 1–28.
*Yusuf, M. M. (1991). Logo based instruction in geometry. Doctoral Dissertation, University of
Cincinnati.
*Yusuf, M. M. (1995). The effects of Logo-based instruction. Journal of Educational Computing
Research, 12(4), 335–362.
Zumwalt, D. B. (2001). The effectiveness of computer-aided instruction in eighth-grade pre-algebra classrooms in Idaho. Doctoral Dissertation, Idaho State University.
Note: Studies marked with an asterisk (*) are included in the synthesis.
Tables
Table 1
Overall Effect of Technology Interventions
Outcome        NStud   NS    NES   g           95% CI (low/high)   τ²       I²      Q
Achievement    50      143   188   0.5230***   0.3933/0.6527       0.3016   97.86   2221.7***
Note. NStud = number of studies. NS = number of samples. NES = number of effect sizes. g = Hedges' g (average pooled effect). CI = confidence interval. Low = lower estimate. High = upper estimate. τ² = magnitude of the variation in true effects. I² = proportion of heterogeneity. Q = total variation.
*p < .05, **p < .01, ***p < .001.
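The pooled estimate in Table 1 comes from a random-effects synthesis of the 188 effect sizes; the reference list cites the metafor package in R (Viechtbauer, 2010). The sketch below shows how such a model could be fit when effect sizes are nested within studies. It is a minimal illustration, not the author's exact specification, and the data frame and column names (dat, yi, vi, study_id, es_id) are hypothetical.

# Minimal sketch of a multilevel random-effects model with metafor.
# Assumed columns (hypothetical names): yi = Hedges' g for each effect size,
# vi = its sampling variance, study_id and es_id = study and effect-size identifiers.
library(metafor)

overall <- rma.mv(yi, vi,
                  random = ~ 1 | study_id/es_id,  # effect sizes nested within studies
                  data   = dat,
                  method = "REML")
summary(overall)  # pooled estimate with 95% CI and the Q test of heterogeneity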
Table 2
Moderator Analysis, Publication Type
Moderator     NStud   NS   NES   b(SE)              g           95% CI (low/high)
Published     21      51   61    –                  0.3033***   0.1909/0.4158
Unpublished   25      81   115   0.629(0.1725)***   0.9327***   0.6043/1.2612
Note. NStud = number of studies. NS = number of samples. NES = number of effect sizes. g = Hedges' g (average pooled effect). b = slope. SE = standard error. CI = confidence interval. Low = lower estimate. High = upper estimate.
*p < .05, **p < .01, ***p < .001.
Table 3
Moderator Analysis, Percentage of Black and Latinx Students
Moderator   NStud   NS    NES   b(SE)               95% CI (low/high)    g
% Black     48      138   179   0.0092(0.0024)***   0.0045/0.014         0.0753
% Latinx    47      132   173   –0.0071(0.0022)*    –0.0116/–0.0026      0.6564***
Note. NStud = number of studies. NS = number of samples. NES = number of effect sizes. g = Hedges' g (intercept, when the percentage is zero). b = slope. SE = standard error. CI = confidence interval. Low = lower estimate. High = upper estimate.
*p < .05, **p < .01, ***p < .001.
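The slopes in Table 3 come from meta-regressions with the percentage of Black (or Latinx) students in a sample as a continuous moderator; the reported g is the model intercept, that is, the predicted effect when the percentage is zero. A minimal sketch under the same hypothetical column names as above (pct_black and pct_latinx are likewise illustrative):

# Meta-regression with a continuous moderator (hypothetical column names).
library(metafor)

mod_black <- rma.mv(yi, vi,
                    mods   = ~ pct_black,           # slope b reported in Table 3
                    random = ~ 1 | study_id/es_id,
                    data   = dat)
summary(mod_black)
# Intercept = expected g at pct_black = 0; slope = change in g per one
# percentage-point increase in Black enrollment. A parallel model with
# pct_latinx gives the second row of Table 3.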
Table 4
Moderator Analysis, Percentage of Special Education Students
Moderator             NStud   NS   NES   b(SE)            95% CI (low/high)   g
% Special education   8       17   19    0.0022(0.0024)   0.0030/0.0072       0.14340
Note. NStud = number of studies. NS = number of samples. NES = number of effect sizes. g = Hedges' g (intercept, when the percentage is zero). b = slope. SE = standard error. CI = confidence interval. Low = lower estimate. High = upper estimate.
*p < .05, **p < .01, ***p < .001.
Table 5
Moderator Analysis, Type of Technology
Moderator   NStud   NS    NES   b(SE)             g          95% CI (low/high)
Tutorial    36      100   117   –                 0.549***   0.3897/0.7085
Exp Env     6       9     16    –0.210(0.1766)    0.339      –0.0243/0.7026
SW Tools    5       29    35    –0.178(0.1716)    0.371*     0.0592/0.6826
Prog        5       6     20    0.350(0.2783)     0.899*     0.2141/1.5841

Exp Env     6       9     16    –                 0.339      –0.0243/0.7026
Tutorial    36      100   117   0.210(0.177)      0.549***   0.3897/0.7085
SW Tools    5       29    35    0.0318(0.219)     0.371*     0.0592/0.6826
Prog        5       6     20    0.5599(0.309)     0.899*     0.2141/1.5841

SW Tools    5       29    35    –                 0.371*     0.0592/0.6826
Tutorial    36      100   117   0.1782(0.172)     0.549***   0.3897/0.7085
Exp Env     6       9     16    –0.0318(0.219)    0.339      –0.0243/0.7026
Prog        5       6     20    0.5282(0.289)     0.899*     0.2141/1.5841

Prog        5       6     20    –                 0.899*     0.2141/1.5841
Tutorial    36      100   117   –0.350(0.278)     0.549***   0.3897/0.7085
Exp Env     6       9     16    –0.560(0.309)     0.339      –0.0243/0.7026
SW Tools    5       29    35    –0.528(0.289)     0.371*     0.0592/0.6826

Note. Each block treats the first listed category (the row without a slope) as the reference, so b values are pairwise differences from that reference. NStud = number of studies. NS = number of samples. NES = number of effect sizes. g = Hedges' g (average pooled effect). b = slope. SE = standard error. CI = confidence interval. Low = lower estimate. High = upper estimate. Exp Env = exploratory environments. SW Tools = software tools. Prog = programming.
*p < .05, **p < .01, ***p < .001.
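Each block in Table 5 compares the technology types against a different reference category, so the slopes change with the reference while the per-category g values do not; none of the pairwise slopes is statistically significant, matching the conclusion that the technology types did not differ in effectiveness. A minimal sketch of how that rotation could be done with a categorical moderator, again with hypothetical column names:

# Categorical moderator (type of technology) with reference-category rotation,
# reproducing the pairwise comparisons in Table 5 (hypothetical column names).
library(metafor)

dat$tech_type <- factor(dat$tech_type,
                        levels = c("tutorial", "exploratory_env",
                                   "software_tools", "programming"))

# Block 1: tutorial as the reference category
fit_tutorial_ref <- rma.mv(yi, vi, mods = ~ tech_type,
                           random = ~ 1 | study_id/es_id, data = dat)

# Later blocks: re-level so another category serves as the reference
dat$tech_type <- relevel(dat$tech_type, ref = "programming")
fit_prog_ref <- rma.mv(yi, vi, mods = ~ tech_type,
                       random = ~ 1 | study_id/es_id, data = dat)

# Dropping the intercept estimates each category's pooled g directly (the g column)
fit_means <- rma.mv(yi, vi, mods = ~ tech_type - 1,
                    random = ~ 1 | study_id/es_id, data = dat)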
Table 6
Table of Studies and Characteristics Included in This Meta-Analysis
Year   Author/title   Pub stat   % Blk   % Latinx   % SE   Tech type   g
1987 Dellario, T. E. (1987). The effects of computer-assisted
instruction in basic skills courses on high-risk ninth-grade
students. Western Michigan University.
Unpublished 35 1 – tutorial 0.4846
1991 Portis, L. B. (1991). The effect of computer-managed
instruction on Algebra I achievement. The University of
North Carolina at Chapel Hill.
Unpublished 50 0 – tutorial 1.236
1988 Emihovich, C., & Miller, G. E. (1988). Effects of Logo and
CAI on Black first graders’ achievement, reflectivity, and
self-esteem. The Elementary School Journal, 88(5), 473–
487.
Published 33.33 0 – programming 0.7407
1987 Gourgey, A. F. (1987). Coordination of instruction and
reinforcement as enhancers of the effectiveness of
computer-assisted instruction. Journal of Educational
Computing Research, 3(2), 219–230.
Published –99 (mostly) –99 (mostly) – tutorial 0.2859
1990 Malouf, D. B. (1990). Evaluation of instructional model
applied to functional math. Project on Effective Computer
Instruction for Effective Special Education, Prince
George’s County Public Schools.
Published 88 0 100 tutorial 0.6473
1992 Phillips, J., & Soule, H. (1992). A comparison of fourth
graders’ achievement: classroom computers versus no
computers.
Published 94.6 0 – tutorial 0.6797
1985 Spivey, P. M. (1985). The effects of computer-assisted instruction on student achievement in addition and subtraction at first grade level. Published 83 0 – exploratory environments 0.0397
1991 Stevens, J. W. (1991). Impact of an integrated learning
system on third-, fourth-, and fifth-grade mathematics
achievement. Baylor University.
Published 6.5 10.5 – tutorial 0.1983
1991 Yusuf, M. M. (1991). Logo based instruction in geometry. Published 76.1 0 – programming 3.2157
1995 Yusuf, M. M. (1995). The effects of Logo-based instruction.
Journal of Educational Computing Research, 12(4), 335–
362.
Published 76.1 0 – programming 1.3318
2009 Barrow, L., Markman, L., & Rouse, C. E. (2009). Technology’s edge: The educational benefits of computer-aided instruction. American Economic Journal: Economic Policy, 1(1), 52–74. Published 83 13 9 tutorial 0.128
2009 Campuzano, L., Dynarski, M., Agodini, R., & Rall, K.
(2009). Effectiveness of reading and mathematics
software products: Findings from two student cohorts
(NCEE 2009-4041) National Center for Education
Evaluation and Regional Assistance.
Published 33 42 – tutorial –0.0881
2007 Dynarski, M., Agodini, R., Heaviside, S., Novak, T., Carey,
N., Campuzano, L., ... & Sussex, W. (2007). Effectiveness
of Reading and Mathematics Software Products: Findings
from the First Student Cohort. Report to Congress.
National Center for Education Evaluation and Regional
Assistance.
Published 33 42 10.61 tutorial 0.4999
2010 Gatti, G. G., & Petrochenkov, K. (2010). Pearson
SuccessMaker math efficacy study: 2009–10 final report.
Gatti Evaluation.
NA 11 47 – tutorial 0.5447
1996 Howell, C. A. (1996). A comparison of Chapter One middle
school students who received Jostens Integrated Learning
instruction and those who received Chapter One services
only. University of Georgia.
Unpublished 52.5 0 – tutorial –0.2977
1994 Hunter, C. T. (1994). A study of the effect of instructional
method on the reading and mathematics achievement of
Chapter One students in rural Georgia. South Carolina
State University.
Unpublished 76.7 0 – tutorial 4.2828
2006 Johnson-Scott, P. L. (2006). The impact of Accelerated Math
on student achievement. Mississippi State University.
Unpublished 100 0 – tutorial 0.0654
1992 Pike, D. C. (1992). Computer-assisted instruction and
student achievement in Chapter I schools.
Unpublished 94.85 0 – tutorial 0.2843
2005 Ross, S. M., & Nunnery, J. A. (2005). The effect of school
renaissance on student achievement in two Mississippi
school districts. Center for Research in Education Policy
and Education Innovations.
Published 40 0 – tutorial 0.2292
2010 Rutherford, T., Kibrick, M., Burchinal, M., Richland, L.,
Conley, A., Osborne, K., & Martinez, M. E. (2010).
Spatial temporal mathematics at scale: An innovative and
fully developed paradigm to boost math achievement
among all learners.
Published 2.3 81.8 – exploratory
environments
–0.3142
1985 Turner, L. G. (1985). An evaluation of the effects of paired
learning in a mathematics computer-assisted-instruction
program (cooperative, team, attitudes). Arizona State
University.
Unpublished 6.5 43.8 – tutorial 0.0488
1990 Webster, A. H. (1990). The relationship of computer-assisted instruction to mathematics achievement, student cognitive styles, and student and teacher attitudes. Delta State University. Unpublished 100 0 – tutorial –0.2108
2007 Ysseldyke, J., & Bolt, D. M. (2007). Effect of technology-enhanced continuous progress monitoring on math achievement. School Psychology Review, 36(3), 453–467. Published 35.27 25.53 – tutorial 0.0215
2003 Ysseldyke, J., Spicuzza, R., Kosciolek, S., Teelucksingh, E., Boys, C., & Lemkuil, A. (2003). Using a curriculum-based instructional management system to enhance math achievement in urban schools. Journal of Education for Students Placed at Risk, 8(2), 247–265. Published –99 (mostly) –99 (mostly) 10.78 tutorial 0.1374
1986 Berlin, D., & White, A. (1986). Computer simulations and
the transition from concrete manipulation of objects to
abstract thinking in elementary school mathematics.
School Science and Mathematics, 86(6), 468–479.
Published 47.37 0 – software tools –0.673
1986 Billings Jr, L. J. (1986). Development of mathematical task
persistence and problem-solving ability in fifth and sixth
grade students through the use of Logo and heuristic
methodologies. Northern Arizona University.
Unpublished 5 42 – programming 1.7083
1987 Lehrer, R., & Randle, L. (1987). Problem solving,
metacognition and composition: The effects of interactive
software for first-grade children. Journal of Educational
Computing Research, 3(4), 409–427.
Published 82.05 0 – programming 1.2575
1987 Anand, P. G., & Ross, S. M. (1987). Using computer-assisted instruction to personalize arithmetic materials for elementary school children. Journal of Educational Psychology, 79(1), 72. Published 52.08 0 – software tools 0.6951
2012 Bai, H., Pan, W., Hirumi, A., & Kebritchi, M. (2012). Assessing the effectiveness of a 3D instructional game on improving mathematics achievement and motivation of middle school students. British Journal of Educational Technology, 43(6), 993–1003. Published 25.9 22.2 – exploratory environments 0.1838
2007 Ku, H. Y., Harter, C. A., Liu, P. L., Thompson, L., & Cheng,
Y. C. (2007). The effects of individually personalized
computer-based instructional program on solving
mathematics problems. Computers in Human Behavior,
23(3), 1195–1210.
Published 0 55 – software tools 0.4037
2007 Ogbuehi, P. I., & Fraser, B. J. (2007). Learning environment,
attitudes and conceptual development associated with
innovative strategies in middle-school mathematics.
Learning Environments Research, 10, 101–114.
Published 46 51 – exploratory
environments
0.5376
2000 Din, F. S., & Caleo, J. (2000). Playing computer games
versus better learning.
Published 100 0 – tutorial 0.1283
2003 Forde, T. B. (2003). The effects of technology-supported
cognitive mathematical instruction on African-American
students characterized as at-risk for school failure.
Vanderbilt University.
Unpublished 72 0 – exploratory
environments
1.0371
1996 Lester, M. L. (1996). The effects of The Geometer’s
Sketchpad software on achievement of geometric
knowledge of high school geometry students. University of
San Francisco.
Unpublished 40 9 – exploratory
environments
0.151
2002 Page, M. S. (2002). Technology-enriched classrooms:
Effects on students of low socioeconomic status. Journal
of Research on Technology in Education, 34(4), 389–409.
Published 74.57 0 – software tools 0.3317
1999 Xin, J. F. (1999). Computer-assisted cooperative learning in integrated classrooms for students with and without disabilities. Information Technology in Childhood Education Annual, 1999(1), 61–78. Published 40 25 0 tutorial 0.9462
2006 Johnson-Scott, P. L. (2006). The impact of Accelerated Math
on student achievement. Mississippi State University.
Unpublished 100 0 – tutorial 0.0654
1990 Mahoney, S. J. (1990). A study of the effects of EXCEL Math
on mathematics achievement of second and fourth-grade
students. Northern Arizona University.
Unpublished 7.02 31.9 – tutorial 0.1154
2005 Patterson, D. (2005). The effects of Classworks in the
classroom. Tarleton State University.
– 6 37 – tutorial 0.7755
2000 Waite, R. D. (2000). A study of the effects of Everyday
Mathematics on student achievement of third-, fourth-,
and fifth-grade students in a large north Texas urban
school district. University of North Texas.
Unpublished 54.83 20.78 – tutorial 0.2746
1990 Webster, A. H. (1990). The relationship of computer-assisted
instruction to mathematics achievement, student cognitive
styles, and student and teacher attitudes. Delta State
University.
Unpublished 100 0 – tutorial –0.211
2006 Ysseldyke, J. E., & Bolt, D. (2006). Effect of technology-enhanced progress monitoring on math achievement. University of Minnesota. Published 35.27 25.53 – tutorial 0.0249
2000 Brown, F. (2000). Computer assisted instruction in
mathematics can improve students’ test scores: A study.
NA 43.92 1.73 – tutorial –0.0248
Note. Pub Stat = publication status. % Blk = percentage of Black student population. % Latinx = percentage of Latinx student population. % SE = percentage of special education student population. Tech Type = type of technology intervention. g = Hedges' g (average pooled effect).
Appendix A: Coding Guide
Coding guide: effect of technology on mathematical achievement
Report characteristics
R-1 report ID __ __ __ (pre-assigned)
R-2 author’s last name ________________
R-3 year __ __ __ __
R-4 title ________________
R-5 APA reference ________________
R-6 publication type 1. Journal article
2. Book or book chapter
3. Dissertation
4. Master’s thesis
5. Policy report
6. Government report
7. Conference paper
8. Other
–99. Can’t tell
R-7 data sources 1. Independent study
2. Regional/national data set
3. Other
–99. Can’t tell
Setting characteristics
S-1 study number 0. Single study
1. Study 1
2. Study 2
3. Study 3
etc.
S-2 school level 1. Preschool
2. Elementary school: K–5
3. Middle school: 6–8
4. High school: 9–12
5. Undergraduate
6. Graduate school
7. Other (specify)
–99. Can’t tell
S-3 location [State]
Participant and sample characteristics
P-1 sample 0. Overall sample
1. Subgroup
P-2 subgroup overlap
0. No
1. Yes
–99. N/A
P-3 sample size (at start) ___
P-4 sample
characteristics
1. Sample at start
2. Analysis sample
3. Both, but they are the same
4. Both, and they are not the same
5. Neither
–99. Can’t tell/not applicable
P-5 % White ___
P-6 % Black ___
P-7 % Latinx ___
P-8 % Asian or Pacific
Islander
___
P-9 % Native American
or American Indian
___
P-10 % other ___
P-11 grade level -1. Preschool 8. Grade 8
0. Kindergarten 9. Grade 9
1. Grade 1 10. Grade 10
2. Grade 2 11. Grade 11
3. Grade 3 12. Grade 12
4. Grade 4 13. Undergraduate
5. Grade 5 14. Graduate
6. Grade 6 15. Other (Specify)
7. Grade 7 –99. Can’t tell
P-12 % female ___
P-13 % low income/
economically
disadvantaged
___
P-14 % special education ___
P-15 % English learner ___
Predictor influence
I-1 report’s name for
influence
________________
I-2 technology influence type
1. Tutorial
2. Communication media
3. Exploratory environments
4. Software tools
5. Programming
I-3 influence definition ________________
I-4 How is the influence
measured?
________________
I-5 reliability 0. No
1. Yes
–99. Unsure, N/A
I-6 alpha coefficient ___
I-7 How is the influence
manipulated by the
researcher?
________________
Outcome measures
O-1 outcome type 1. State standardized tests (state-wide
testing)
2. National standardized tests
(SAT/ACT/NAEP/PISA/TIMSS)
3. GPA
4. Knowledge diagnostic test (e.g.,
researcher/instructor developed test)
5. Other achievement
O-2 outcome name ________________
O-3 outcome description ________________
O-4 domain of outcome 1. Mathematics
2. English language arts
3. Science
4. Social science
5. General academics
6. Other (specify)
O-5 What is the unit of
analysis?
1. Student
2. Teacher
3. Classroom
4. School
5. Other (specify)
–99. Unsure/not applicable
O-6 timing of influence
and outcome measure
collection
1. Simultaneously
2. Longitudinally
–99. Unsure
Research design and effect sizes
E-1 sample size (for
relationship/effect)
___
E-3 direction of
relationship between
influence and outcome
0. Null/no relationship
1. Positive
2. Negative
3. Mixed
–99. Unclear
E-7 type of research
design
1. Descriptive
2. Correlational
3. One-group/single-group
4. Quasi-experimental
5. Randomized control trial/true
experiments
E-8 Is there a treatment
group and control
group?
1. Yes
2. No
E-11 What is the level of
assignment?
__
Treatment, control, classroom, schools?
E-13 matching
characteristics
___
1. Descriptive
2. Correlational
3. One-group/single group
–99
EE-1 What is Nt? How many participants
were in the treatment
group?
___
EE-4 What is Nc? How many participants
were in the control
group?
___
EE-7 What is Mₜ? What is the mean of the
outcome variable for the
treatment group?
___
EE-10 What is SDₜ? What is the standard
deviation of the outcome
variable for the treatment
group?
___
EE-13 What is Mc? What is the mean of the
outcome variable for the
control group?
__
EE-16 What is SDc? What is the standard
deviation of the outcome
variable for the control
group?
___
EE-19 What is the effect
size (d)?
___
EE-20 What is the
variance (v)?
EE-22 What is the SEt? What is the standard
error of the outcome
variable for the treatment
group?
EE-25 What is the SEc? What is the standard
error of the outcome
variable for the control
group?
EE-37 What is the t-statistic?
EE-37 What is the p-value of the t-test?
EE-43 How many groups are compared in the F-test?
EE-44 What is the F-statistic of the F-test?
EE-47 What is the effect
size (d)?
EE-48 What is the
variance?
EE-59 d-index
calculated? 0. No
1. Yes
–99. N/A
EE-60 effect size from
original meta-analysis
text entry
–99 missing/can’t tell/not applicable
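Where a primary study reports group means, standard deviations, and sample sizes (items EE-1 through EE-16), those coded values are sufficient to produce the effect size and variance requested in EE-19 and EE-20. A minimal sketch with hypothetical column names; metafor's escalc() with measure = "SMD" applies the small-sample correction, so the resulting yi is Hedges' g.

# Computing Hedges' g from the coded descriptive statistics (hypothetical column names).
library(metafor)

es <- escalc(measure = "SMD",                                      # bias-corrected standardized mean difference
             m1i = m_treat,   sd1i = sd_treat,   n1i = n_treat,    # treatment group (EE-7, EE-10, EE-1)
             m2i = m_control, sd2i = sd_control, n2i = n_control,  # control group (EE-13, EE-16, EE-4)
             data = coded)
head(es[, c("yi", "vi")])  # yi = Hedges' g (EE-19), vi = its variance (EE-20)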