NAVIGATING THE FUTURE OF TEACHER PROFESSIONAL DEVELOPMENT:
THREE ESSAYS ON TEACHER LEARNING IN INFORMAL AND DIGITAL CONTEXTS
by
Jingxian Li
A Dissertation Presented to the
FACULTY OF THE USC GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
(EDUCATION)
August 2024
Copyright 2024 Jingxian Li
TEACHER LEARNING IN INFORMAL AND DIGITAL CONTEXTS
ii
Acknowledgements
The Ph.D. journey is inherently long and challenging, especially for me as an
international student studying abroad. This dissertation would not have been possible without the
support of many individuals. First and foremost, I must express my deepest gratitude to my advisor, Dr.
Yasemin Copur-Gencturk. She provided invaluable, constructive feedback and revised my
manuscript word by word many times throughout my dissertation writing process. I am lucky
enough to have Yasemin as my mentor and a role model, who has supported me through every
small step I have taken toward becoming an independent researcher. Her unwavering guidance,
patience, and belief in my abilities have been instrumental in helping me reach this milestone.
I also want to extend my sincere gratitude to the rest of my dissertation committee, all of
whom have supported me from the qualifying exam to the dissertation proposal, and now the final
dissertation. Dr. Stephen Aguilar introduced me to the fields of self-regulated learning and
educational technology in the first year of my Ph.D. program, which inspired me to pursue
interdisciplinary work between teacher education and psychology. I have been collaborating with
Dr. Allan S. Cohen on various research projects for the past four years. Al has consistently been
responsive and helpful whenever I have sought his advice about statistical analyses. I’m grateful
to have Al on my committee as a mentor who has witnessed my growth as a scholar. Dr. Han Du has
provided extensive, invaluable feedback on the methodology of my dissertation. My
understanding of multilevel modeling has improved greatly through my consultation with her
regarding the analytical approaches I used in this work. Big thanks to the best program directors
Alex and Laura, for keeping me on track through the Ph.D. program.
As a first-generation college student, I am deeply grateful to my parents, Bao Jianghong
and Li Bin, who have prioritized my education for as long as I can recall. I’ve never been the
best at anything, but they have always unreservedly loved me, believed in me, and encouraged me as I
have pursued my dreams, big and small, over the past thirty years. My ability to continue my education
to this point is a direct result of the continuous support of my best friend, therapist, life partner,
Yan Haoruo. He has shouldered the majority of responsibilities in our family, allowing me to
fully dedicate myself to my academic career while staying physically healthy and emotionally happy.
Also, I thank Dobby and Winky for bringing me so much joy. To my old friends thousands of
miles away in China, thank you for being there for me and supporting me through ups and
downs, despite the distance and the 15-hour time difference.
Finally, I would like to thank the National Science Foundation (NSF, Grant Number
1751309) and the Institute of Education Sciences (IES, Grant Number R305A180392) for supporting
this work. Any opinions, findings, conclusions, or recommendations expressed in this material
are those of the author(s) and do not necessarily reflect the views of NSF and IES.
TABLE OF CONTENTS
Acknowledgements......................................................................................................................... ii
List of Tables ................................................................................................................................. vi
List of Figures............................................................................................................................... vii
Abstract........................................................................................................................................ viii
Introduction..................................................................................................................................... 1
Conceptualizing Content-Specific Knowledge for Teaching Mathematics................................ 5
Chapter 1 : Learning Through Teaching: The Development of Pedagogical Content
Knowledge Among Novice Mathematics Teachers ....................................................................... 8
Abstract....................................................................................................................................... 8
Introduction................................................................................................................................. 9
Conceptualizing Teacher Learning Through Teaching ............................................................ 12
Literature Review...................................................................................................................... 14
Present Study ............................................................................................................................ 20
Methods..................................................................................................................................... 21
Results....................................................................................................................................... 29
Discussion................................................................................................................................. 32
Conclusions............................................................................................................................... 35
Chapter 2 : The Impact of An Interactive, Personalized Computer-Based Teacher
Professional Development Program on Student Performance: A Randomized Controlled
Trial............................................................................................................................................... 36
Abstract..................................................................................................................................... 36
Introduction............................................................................................................................... 37
Conceptual Framework............................................................................................................. 40
Literature Review...................................................................................................................... 43
The Current Study..................................................................................................................... 48
Method ...................................................................................................................................... 53
Results....................................................................................................................................... 62
Discussion................................................................................................................................. 63
Conclusion ................................................................................................................................ 66
Chapter 3 : Perceptions versus Performance: Assessing Teacher Learning in Asynchronous
Online Professional Development ................................................................................................ 68
Abstract..................................................................................................................................... 68
Introduction............................................................................................................................... 69
Conceptualizing Self-Regulated Learning................................................................................ 71
Literature Review...................................................................................................................... 73
Method ...................................................................................................................................... 77
Results....................................................................................................................................... 87
Discussion................................................................................................................................. 94
Conclusion ................................................................................................................................ 99
Conclusion .................................................................................................................................. 101
References................................................................................................................................... 102
Appendix..................................................................................................................................... 124
Appendix A............................................................................................................................. 124
Appendix B............................................................................................................................. 128
Appendix C............................................................................................................................. 138
List of Tables
Table 1.1 Background Characteristics of Teachers in the Present Study .................................... 22
Table 1.2 Descriptive Statistics for the Variables Used in the Reported Analyses..................... 27
Table 1.3 Estimates of the Growth Trajectory for Teachers’ PCK.............................................. 31
Table 2.1 Content of the Program................................................................................................ 49
Table 2.2 Background Characteristics of Teachers in the Study Compared with a
Nationwide Sample of U.S. Secondary Public School Teachers.......................................... 54
Table 2.3 Descriptive Statistics and Balance for the Analytic Sample........................................ 55
Table 2.4 Results for the Estimated Effects of the Program on Students' Mathematics
Performance .......................................................................................................................... 63
Table 3.1 Teacher and School Characteristics in the Analytic sample and National
Representative Sample.......................................................................................................... 78
Table 3.2 Descriptive Statistics for the Measures Used in the Analyses..................................... 83
Table 3.3 Self-Regulated Learning and Teachers’ Knowledge Gains in CK .............................. 89
Table 3.4 Self-Regulated Learning and Teachers’ Knowledge Gains in PCK............................ 90
Table 3.5 Relationship Between Teachers’ Learning of CK and Instructional Practice ............. 92
Table 3.6 Relationship Between Teachers’ Learning of PCK and Instructional Practice ........... 93
List of Figures
Figure 1.1 A Visual Conceptual Framework ................................................................................ 14
Figure 1.2 Growth of PCK Among Teachers with Less and More Robust Mathematics
Content Knowledge at the Beginning of the Study .............................................................. 31
Figure 2.1 Illustration of the Mechanism of An Interactive, Automated Feedback
Cycle through the Expectation-Misconceptions Framework................................................ 43
Figure 2.2 An Example of CK Activity ....................................................................................... 50
Figure 2.3 Sample PCK Activity Focusing on Analyzing the Teaching and Reflecting
on the Teaching Practice....................................................................................................... 51
Abstract
Teachers’ content-specific knowledge for teaching mathematics is crucial for quality
mathematics teaching. In the teaching profession, teachers have many learning opportunities to
improve this knowledge. In this dissertation, I present three papers that explore the development
of teachers’ content-specific knowledge for teaching within two relatively underexplored
learning contexts: their own teaching practice and an asynchronous online professional
development program (OPD). In the first paper, I examined the development of teachers’
pedagogical content knowledge (PCK) through their own teaching practice over three years. Drawing
on longitudinal data from over 200 novice mathematics teachers, the study found that teachers
were able to learn PCK through their own teaching practice. Teachers with strong content
knowledge of school mathematics showed faster yearly growth in their PCK through
teaching. In the second paper, my coauthors and I described the design of an asynchronous OPD
program implemented in an intelligent tutoring system. We measured its impact on the
mathematics performance of students whose teachers completed the program by conducting a
randomized experiment. Our findings revealed that the program significantly enhanced students’
mathematics performance. In the third paper, I investigated the alignment between teachers’ self-reported learning and directly assessed learning from an asynchronous OPD program and
explored how teachers’ use of self-regulated learning strategies related to teacher learning from
the program. The study found that what teachers believed they had learned from the OPD program did
not align with what the direct assessments captured. Teachers who regularly
monitored and evaluated their learning progress demonstrated greater improvement in content
knowledge, as measured by direct assessments.
Introduction
Content-specific knowledge for teaching, including content knowledge (CK) and
pedagogical content knowledge (PCK), is widely recognized as a crucial component of teacher
competence in policy and professional standards documents (Association of Mathematics
Teacher Educators, 2017; Council for the Accreditation of Educator Preparation, 2018; National
Board for Professional Teaching Standards, 1989) and by scholars in teacher education (Ball et
al., 2008; Blömeke et al., 2022; Copur-Gencturk & Tolar, 2022; Li & Kaiser, 2011; Shulman,
1986, 1987; Tatto et al., 2008). The crucial role teachers’ content-specific knowledge plays in
mathematics instruction has been established in earlier qualitative studies by showing the
affordances and limitations of teacher knowledge for teachers’ decisions and actions in the
classroom (e.g., repertoire of instructional strategies) (Ball, 1990; Borko et al., 1992; Lloyd &
Wilson, 1998; Ma, 1999; Putnam et al., 1992; Stein et al., 1990). Recent research, supported by
the empirically validated measures of teacher knowledge (e.g., Hill et al., 2004; Kersting, 2008;
Krauss et al., 2008), has identified a positive relationship between teachers’ content-specific
knowledge and the quality of their mathematics instruction (Baumert et al., 2010; Copur-Gencturk, 2015; Hill et al., 2015; Hill & Chin, 2018; Kersting et al., 2012; Krauss et al., 2020;
Lee & Santagata, 2020).
Within their profession, teachers have access to a range of learning opportunities to
develop this knowledge, either through structured formal support, such as traditional professional
development programs (PD), or through informal learning opportunities, such as peer
collaboration on teaching. In this three-paper dissertation, I seek to understand the development
of teachers’ content-specific knowledge for teaching within two relatively underexplored
learning contexts: the practice of teaching itself, conceptualized as a source for informal teacher
learning, and asynchronous OPD implemented in a dialogue-based intelligent tutoring system
(ITS), as an innovative form of formal teacher learning opportunity. Throughout the dissertation,
I paid special attention to the measures used to assess teachers’ content-specific knowledge
in different contexts. Emphasizing empirically validated measures is important, as it will help us
better identify effective learning opportunities that develop teachers’ content-specific knowledge.
Further, it allows us to better explore the influential factors in teachers’ acquisition of this
knowledge across various learning contexts.
The first paper of my dissertation investigated novice mathematics teachers’ learning of
PCK through their own teaching practice. It also explored the role of a few teacher-level factors,
such as teachers’ mathematical content knowledge, in supporting teachers’ PCK development
through teaching. In the past, teacher learning through teaching practice has often been assumed,
yet what exactly is being learned has not been well understood (Leikin & Zazkis, 2010). Most of the prior
work on teacher learning through teaching concentrated primarily on teachers’ continual inquiry
into curriculum and students’ thinking (Davis & Krajcik, 2005; Lloyd, 2008; Marcus & Chazan,
2010; Remillard & Bryans, 2004). While these case studies have been essential in identifying
teacher learning opportunities within teachers’ own practices, they have not been clear about the
specific changes in teacher knowledge that result from teaching. It is also unclear what teacher-level
factors (e.g., cognitive abilities and professional background) are necessary for supporting
teacher learning through teaching. The first paper addressed these questions, in which teacher
learning through teaching is defined as changes in teachers’ PCK through engagement in daily
teaching activities without systematic external support for that learning, such as PD programs
and structured coaching (Kyndt et al., 2016). Drawing on three years of longitudinal data from
207 novice elementary and middle school teachers across multiple states and using linear growth
modeling, this study found that teachers increased their PCK through their own teaching practice
during the early years of their careers, albeit at different rates. The result was robust
when adjustments were made for the external support teachers received during the study period,
such as PD and peer collaboration. Furthermore, teachers’ content knowledge about school
mathematics significantly predicted the pace at which teachers developed PCK through teaching.
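The linear growth model described above can be sketched as a standard two-level specification. The notation below is my own illustration; the exact covariates and random-effect structure used in the study may differ:

```latex
% Level 1 (within teacher): PCK for teacher i measured in year t
\text{PCK}_{ti} = \pi_{0i} + \pi_{1i}\,\text{Year}_{ti} + e_{ti}

% Level 2 (between teachers): baseline CK predicting initial status and growth rate
\pi_{0i} = \beta_{00} + \beta_{01}\,\text{CK}_{i} + r_{0i}
\pi_{1i} = \beta_{10} + \beta_{11}\,\text{CK}_{i} + r_{1i}
```

In this sketch, \(\beta_{10}\) captures the average yearly growth in PCK through teaching, and a positive \(\beta_{11}\) corresponds to the reported finding that teachers with stronger content knowledge developed PCK at a faster pace.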
In the second paper, my coauthors and I explored how an asynchronous OPD program
operationalized in a dialogue-based ITS, designed to enhance teachers’ content-specific
knowledge, influenced students’ mathematics performance. The unique contribution of this paper
is that it provides a working prototype to scale up high-quality teacher PD by integrating
evidence of effective face-to-face PD with the advantages of dialogue-based ITSs that interact
with teachers and deliver timely, individualized feedback. In addition, compared to the
substantial body of research on the impact of in-person PD programs, the field presents
weak evidence, based on objective measures such as direct assessments, regarding whether
participating in asynchronous OPD programs is related to changes in teacher knowledge and skills,
as well as in their students’ learning outcomes (Copur-Gencturk, Li, Cohen, et al., 2024; Lay et
al., 2020). In this paper, we provided detailed descriptions of OPD content and learning
activities, as well as the design of this intelligent teacher OPD system. We also measured the
impact of this teacher PD program on students’ mathematics performance by
conducting a randomized experiment. Based on data collected from 1,727 middle school students
in an experiment in which teachers of these students were randomly assigned to the PD program
(N = 29) or the business-as-usual condition (N = 24), we found the program had a statistically
significant impact on students’ mathematics performance, adjusted for teachers’ baseline
content-specific knowledge for teaching mathematics.
In the third paper, I analyzed teacher learning of content-specific knowledge, as measured
by self-reported instruments and direct assessments, through participating in an asynchronous
OPD program (i.e., the same program as described in paper 2). Asynchronous OPD has been
increasingly popular as a solution to scale up traditional face-to-face PD due to its flexibility in
time and location (Bragg et al., 2021; Dede et al., 2009). Given their large-scale nature, teacher
learning, defined as changes in teacher knowledge and skills, in these programs is typically
measured using self-reported instruments, despite doubts about the accuracy of teachers’ self-assessments of their knowledge improvement. Additionally, research on teacher learning in
asynchronous online learning contexts has predominantly focused on understanding how program-level factors such as design and implementation features influence teachers’ learning experience
(Bragg et al., 2021; Brennan et al., 2018; Littenberg-Tobias & Slama, 2022; Reeves & Pedulla,
2011). However, inadequate attention has been paid to how teachers regulate their learning by
using different learning strategies in these contexts. To address these limitations, in the third
paper, I compared teachers’ self-reported learning of their content-specific knowledge and their
observed learning, as measured by direct assessments. I also explored how each outcome
measure was related to the quality of teachers’ instructional practice, as measured by teaching
artifacts along with student work samples, after participating in the program. Additionally, I
investigated how teachers’ use of self-regulated learning (SRL) strategies (i.e., organization,
elaboration, and metacognitive monitoring) was related to their self-reported and observed
learning in the program, respectively. I also explored how the use of SRL strategies was related
to the accuracy of teachers’ self-assessments of their learning. Using data from 57 middle school
mathematics teachers who completed the asynchronous OPD program, I did not find a
statistically significant correlation between teachers’ self-reported learning of content-specific
knowledge for teaching and the learning measured by direct assessments. Teachers who frequently
engaged in monitoring and evaluating their understanding of the learning content in the PD
demonstrated greater learning gains in CK, as measured by direct assessments. However, it seems
that none of the SRL strategies helped teachers be more accurate in their self-assessment of
knowledge improvement from the PD. Lastly, direct assessments of CK seemed to be a better
approach for capturing the knowledge and skills required for high-quality mathematics instruction
overall, as measured by teaching artifacts along with student work samples.
Conceptualizing Content-Specific Knowledge for Teaching Mathematics
As emphasized in the introduction, this dissertation focuses on teachers’ learning of
content-specific knowledge for teaching mathematics, including CK and PCK as two key
components. Since the three papers in this dissertation focus on either PCK (paper 1) or CK and
PCK together (papers 2 and 3), it is helpful to first outline a common framework, shared by all
three papers, that illustrates the conceptualization of CK and PCK.
The components of teachers’ content-specific knowledge required for effective
mathematics teaching and learning remain a subject of ongoing debate in teacher education
research (Ball et al., 2008; Copur-Gencturk & Tolar, 2022; Charalambous et al., 2020; Depaepe
et al., 2013). Despite differences in conceptualizations, teachers’ CK and PCK are two
theoretically and empirically distinct constructs that have been recognized as essential elements of
content-specific knowledge across national and cross-cultural studies (e.g., MKT, see Ball et al.,
2008; Mathematics Teaching Expertise, see Copur-Gencturk & Tolar, 2022; COACTIV, see
Krauss, Baumert, et al., 2008; MT-21, see Schmidt et al., 2007; TEDS-M, see Tatto et al., 2008).
In this dissertation, I define content-specific knowledge for teaching mathematics as including
CK and PCK, informed by existing conceptualizations of teacher knowledge and research in
mathematics (e.g., Ball et al., 2008; Copur-Gencturk & Tolar, 2022; Shulman, 1986; Tatto et al.,
2008).
In this dissertation, teachers’ CK is conceptualized as a conceptual understanding of the
mathematical content (e.g., rules and definition of math concepts; National Research Council,
2001) being taught at school (Copur-Gencturk, 2021; Tatto et al., 2008) and the ability to solve
mathematics problems in the school curriculum by reasoning and evaluating different
mathematical situations (Blömeke et al., 2014; NRC, 2001; Copur-Gencturk & Doleck, 2021;
Copur-Gencturk & Tolar, 2022; Tatto et al., 2008). Conceptual understanding, in contrast to
procedural knowledge, involves knowledge of the meaning behind rules and definitions. For
example, knowing why the invert-and-multiply algorithm can be employed to
solve fraction division problems reflects teachers’ conceptual understanding of the
measurement meaning of division and of reference units (Copur-Gencturk, 2021). Problem
solving encompasses the skills needed to solve a word problem by making sense of the problem
and then translating it into a mathematical expression (Copur-Gencturk & Doleck, 2021). For
example, setting up a proportion to solve for an unknown quantity is a manifestation of teachers’
word-problem solving proficiency. This is because teachers need to decide which units need to be
coordinated in order to determine the setup of the proportion based on their sense-making of
a word problem (Copur-Gencturk & Doleck, 2021). Mathematical reasoning and evaluating
refers to one’s ability to “think logically about the relationships between concepts and situations”
(NRC, 2001, p. 129), such as justifying one’s solution to a mathematics problem with appropriate
reasoning (NRC, 2001).
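As a concrete, self-contained illustration of the fraction-division point above (a worked instance of my own, not drawn from the cited studies), under the measurement meaning of division, three fourths divided by one half asks how many halves fit into three fourths, which is exactly what inverting and multiplying computes:

```latex
\frac{3}{4} \div \frac{1}{2}
  \;=\; \frac{3}{4} \times \frac{2}{1}
  \;=\; \frac{6}{4}
  \;=\; 1\tfrac{1}{2}
\quad \text{(one and a half halves fit into three fourths)}
```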
The conceptualization of PCK was grounded in Shulman’s (1986) definition of PCK and
included the mathematics-specific notion of PCK as elaborated across various models (e.g., Ball
et al., 2008; Krauss, Baumert, et al., 2008; Copur-Gencturk & Tolar, 2022; Schmidt et al., 2007;
Tatto et al.,2008). PCK, defined as the knowledge that makes mathematics content
comprehensible for students (Ball et al., 2008; Shulman, 1986), includes the knowledge about
students’ mathematical thinking and instructional strategies for effective mathematics teaching
(Ball et al., 2008; Copur-Gencturk & Tolar, 2022; Krauss, Baumert, et al., 2008; Tatto et al.,
2008). In terms of the knowledge of students’ mathematical thinking, a teacher should know the
common conceptions, misconceptions, and patterns students exhibit as they are
learning certain mathematical concepts (Ball et al., 2008; Copur-Gencturk & Tolar, 2022). For
example, teachers should know that some students might think increasing one quantity will
always increase the other quantity by the same amount (additive thinking) instead of
understanding that in proportional relationships, quantities increase at the same rate
(multiplicative thinking). In terms of the knowledge of instructional strategies for effective
mathematics teaching, a teacher should know how instructional tools and practices can promote
or hinder students' understanding of the mathematics content as well as how to select or adapt
activities and respond to students' mathematical needs based on the students' level of
understanding (e.g., Ball et al., 2008; Copur-Gencturk & Tolar, 2022). For example, to help
students overcome the confusion between additive and multiplicative thinking, a teacher could
use interactive simulations that allow students to manipulate variables and observe how changes
in one quantity affect another in both proportional and additive scenarios.
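To make the contrast between additive and multiplicative thinking concrete, consider a hypothetical recipe problem of my own construction: a recipe mixes 2 cups of flour with 3 cups of milk, and a student must scale it up to 6 cups of flour:

```latex
\text{Additive (incorrect):}\quad 6 = 2 + 4 \;\Rightarrow\; 3 + 4 = 7 \text{ cups of milk}
\text{Multiplicative (correct):}\quad 6 = 2 \times 3 \;\Rightarrow\; 3 \times 3 = 9 \text{ cups of milk}
```

The additive reasoner preserves the difference between the two quantities, whereas the proportional reasoner correctly preserves their ratio.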
Chapter 1 : Learning Through Teaching: The Development of Pedagogical Content
Knowledge Among Novice Mathematics Teachers
Abstract
Pedagogical content knowledge (PCK) has been widely recognized as an important
aspect of teaching expertise. However, the extent to which teachers’ own teaching
practice can be a learning resource for them to develop PCK has not been systematically
explored. This empirical study aimed to explore the contribution of teachers’ own teaching
practice to the development of their PCK while taking into account the influence of other external
on-the-job learning opportunities teachers may have received during the same period. Using
longitudinal data from 207 elementary and middle school teachers in the United States, we found
that teachers increased their PCK through teaching on their own, albeit at different rates. Our
findings were robust when other formal and informal support teachers received was taken into
account. Our findings underscored the importance of teachers’ robust knowledge of school
mathematics in the development of their PCK through teaching.
Keywords: pedagogical content knowledge, learning through teaching, mathematical
knowledge for teaching, novice teachers, longitudinal study
Introduction
For decades, policy makers and school districts in the United States have striven to
improve student learning outcomes by improving teacher effectiveness. Each year, the federal
and state governments spend billions of dollars on teacher professional development (PD;
(Birman et al., 2007; Picus & Odden, 2011). Yet most PD programs do not appear to have
positive impacts on teacher and student outcomes (Jacob et al., 2017). Prior work has suggested
that PD programs such as coaching, which provide teachers with content-focused support in the
context of their own classrooms, are more likely to lead to changes in teachers’ knowledge and
instruction. However, such support is not available for most teachers in the United States (Kraft
& Blazar, 2018). Given that teaching in the United States has historically been isolating work
and that high-quality PD at scale is limited, seeking out and leveraging learning opportunities in
one’s own classroom can be crucial to teachers’ continuous growth, especially for novice
teachers, who have limited experience with students and teaching. Thus, understanding the
extent to which teachers are able to learn in their own classrooms and what factors influence
such learning has great implications for teacher education.
To date, researchers have taken different approaches to understanding how the experience
of teaching may lead to teachers’ professional growth. Some have attempted to identify the
mechanism through which teachers learn from their teaching practice, such as studying
curriculum materials, inquiring about students’ thinking during interactions with students in
class, reflecting on students’ responses and their own instructional practice, and trying out new
teaching strategies (Collopy, 2003; Hoekstra et al., 2007; Lloyd, 2008; Lohman & Woolf, 2001;
Remillard & Bryans, 2004). Others have taken a holistic approach, estimating the effect of teachers'
years of teaching experience in the profession on student achievement as an indicator of teacher
effectiveness (Blazar, 2015; Harris & Sass, 2011; Papay & Kraft, 2015).
Although it seems self-evident that teachers learn from their teaching experience,
limitations in three areas have prevented scholars from better understanding how teaching can
support teachers’ professional growth. First, prior work on teachers’ learning opportunities in
their own practice has often involved researchers serving as instructors to promote certain curriculum reforms, or it has been undertaken in the context of student teaching prior to graduation.
Thus, we do not know whether changes in teachers’ PCK can be attributed to their practice of
teaching or other components of the interventions they received. In addition, changes in teachers’
PCK have mostly been based on observations or interviews after one or a few teaching episodes.
It is unclear whether such changes can be sustained over time. Second, some economic research
has examined the long-term effect of teachers’ years of teaching experience on student learning
outcomes. However, such a holistic approach provides limited information on which learning
experiences teachers had that led to changes in their effectiveness (i.e., formal PD or the teaching
itself). Additionally, using student achievement as a proxy measure does not allow researchers to
know what specific knowledge the teachers obtained from their teaching experience that
contributed to the student learning outcomes. In sum, to understand whether teachers can
continuously learn through their own practice, there is a need to pay attention to other learning
opportunities that teachers receive.
Third, current knowledge about what teachers learn in the classroom is fragmented.
Teachers can develop a wide range of knowledge and skills on the job. Some can be generic,
such as classroom management strategies, whereas others can be specific to the subject matter
being taught, such as learning to select tasks that are cognitively aligned with students’ level of
mathematical understanding or learning which specific tools or representations to use to teach a
particular concept. Although both generic and subject-specific learning are important for the
overall quality of students’ learning experiences, developing the knowledge and skills specific to
teaching the subject matter is the key to quality mathematics instruction and students’
mathematics learning (Baumert et al., 2010; Copur-Gencturk, 2015; Kersting et al., 2010).
However, when examining the effect of teachers’ years of teaching experience, researchers often
use student achievement as a measure of teacher effectiveness without measuring the changes in
teachers’ knowledge directly. Such an approach is problematic because it limits the implications
of those findings for research and practice in teacher education.
In this study, we aimed to explore whether and to what extent elementary and middle
school teachers developed PCK of mathematics through teaching on their own. Here, PCK refers
to teachers' knowledge of how students learn mathematics concepts as well as their knowledge of the
instructional practices and representations that make the content accessible to students (Shulman,
1986). We specifically focused on teachers’ PCK of fractions and ratios because these two
mathematics domains are major topics in K-8 students’ mathematics education and are
fundamental to students’ advanced mathematics learning (Lobato et al., 2010; Siegler et al.,
2011). However, students continue to struggle with understanding fraction and ratio concepts
(Ayan & Isiksal-Bostan, 2019; Namkung et al., 2018). Thus, teachers’ knowledge of student
thinking and teaching fractions and ratios is crucial for students’ learning of these two
mathematics domains. In addition, because on-the-job learning is most pronounced among teachers in the early years of their teaching careers (Kini & Podolsky, 2016), particularly in terms of influencing their students' achievement in specific subject areas, this study focused on novice teachers' learning by collecting data from more than 200 teachers who had fewer than three years
of teaching experience at the time of study enrollment. In the following section, we lay out our
conceptual framework for this study and summarize the prior literature on this issue. We then
present our study design and findings. We end the article with a discussion of our findings and
the implications for research, teacher education, and policy.
Conceptualizing Teacher Learning Through Teaching
Within the teaching profession, different on-the-job learning opportunities are available,
including formal learning opportunities, such as professional development, and informal learning
opportunities, such as learning with colleagues and learning on one’s own. We focused on
learning through teaching, which we define as the change in teachers’ PCK through interactions
with students and the curriculum materials around the content without systematic external
support for that learning, such as PD programs or structured mentoring (Kyndt et al., 2016).
Our focus on teachers’ development of PCK rests on both theoretical and empirical grounds. Scholars
have underscored the importance of PCK as a necessary domain of knowledge for teaching (Ball
et al., 2008; Copur-Gencturk & Tolar, 2022; Shulman, 1986). Empirical work has supported its
importance by documenting its instrumental role in quality teaching and student learning
(Baumert et al., 2010; Copur-Gencturk, 2015). Our conceptualization of PCK focuses on the
components that are recognized across national and cross-cultural studies (Copur-Gencturk &
Tolar, 2022; Schmidt et al., 2007; Tatto et al., 2008) and other subjects (e.g., Jordan et al., 2018).
Pedagogical content knowledge includes teachers’ understanding of content-related issues
around students’ learning, such as knowing students’ common understanding of certain
mathematical concepts and being able to gauge students’ understanding based on their responses
(Ball et al., 2008; Copur-Gencturk & Tolar, 2022; Tatto et al., 2008). It also encompasses the
knowledge and understanding of the affordances and limitations of different representations and
tools in fostering students’ learning of a particular concept (Copur-Gencturk & Tolar, 2022;
Tatto et al., 2008). Figure 1.1 is a visual framework describing our conceptualization of teachers’
PCK development and informing our study design, with the solid line indicating the focus of this
study.
We anticipated that teaching would foster PCK development because interactions between a teacher and students around the content theoretically create opportunities
for teachers to learn. Specifically, during lesson planning, teachers may learn from curricular
materials about instructional strategies and about using different representations to solve mathematics
problems (Davis & Krajcik, 2005). While implementing a lesson, teachers could gain insights
into students’ mathematical understanding as well as the affordances and limitations of different
representations for facilitating students’ learning (Remillard & Bryans, 2004). Reflection after
teaching could enable teachers to evaluate their teaching practice by comparing their expected
teaching outcomes with students’ real learning outcomes. Any teacher can experience a
discrepancy between expectations and reality, but prior studies have indicated that novices are
more likely to experience this (Flores, 2006; Mintz et al., 2020).
Figure 1.1
A Visual Conceptual Framework
Literature Review
Prior Work on Teachers’ Development of PCK Through Teaching
Over the past few decades, scholars have devoted their attention to teachers’ learning on
the job (Hoekstra & Korthagen, 2011; Ladd & Sorensen, 2017; Leikin & Zazkis, 2010; Lohman
& Woolf, 2001; Papay & Kraft, 2015). Two distinct approaches have been employed to explore
teacher learning through teaching. One line of research uses an ethnographic or case-study
approach with a small group of teachers (Chazan et al., 2008; Collopy, 2003; Hoekstra et al.,
2007), and the other focuses on the cumulative impact of the teaching experience on student
achievement (Blazar, 2015; Harris & Sass, 2011; Ladd & Sorensen, 2017; Papay & Kraft, 2015),
which includes not only teachers’ learning through their own teaching but also learning in
interventions such as by participating in PD programs.
In particular, the first line of research has focused on what and how teachers learn from
teaching activities based on observations and interviews, primarily using qualitative methods
with a small number of in-service or prospective teachers (Collopy, 2003; Hoekstra et al., 2007;
Lloyd, 2008; Remillard & Bryans, 2004). Results of these studies suggest that teachers seemed
to enhance their understanding of students’ thinking and develop their knowledge of
mathematics teaching by engaging in planning and enacting the curriculum, reflecting on
students’ responses and their own instruction, and making adaptations to the curriculum design and
instruction (Lloyd, 2008; Lohman & Woolf, 2001; Remillard & Bryans, 2004). For example, in
an analysis of classroom observations and interviews with eight mathematics teachers in a public
elementary school, Remillard and Bryans (2004) found that three novice teachers increased their
use of students’ explanations and their tendency to make sense of students’ ideas by following
the suggestions in the curriculum material (the teachers’ guide), which emphasized eliciting
students’ explanations of their work. Gaining such insights into students’ thinking further
motivated one teacher (Miss Larson in their study) to implement new practices in response to
diverse and unexpected student ideas, which created further opportunities for the teacher to learn
about mathematics teaching. Similarly, Lloyd (2008) reported that a student teacher named Anne
began emphasizing key mathematical ideas more explicitly in her curriculum design and
instructional practice based on shortcomings she identified after teaching the same lesson four
times (to different groups of students). These studies indicate that novice or prospective teachers
seem to learn through teaching on their own when they engage in teaching activities. However,
prospective or novice teachers might also have been involved in other learning activities during the
study period. For example, prospective teachers might take other teaching methods courses, or
in-service teachers might be involved in local reforms on teaching. The changes observed in
these studies might be attributed to both teachers’ learning through their own teaching practice
and any other external learning support the teachers received during the same period. Moreover,
these studies have been conducted with only a small number of teachers, and further research is
required to test the generalizability of these findings.
The other line of research has measured the impact of teaching experience on student
achievement gains (i.e., an indicator of teacher effectiveness), using longitudinal data with
teacher/student fixed effects (Kini & Podolsky, 2016). In Kini and Podolsky’s (2016) review of
this line of research, studies that analyzed longitudinal data with teacher fixed effects showed
consistent evidence that teaching experience is positively related to teacher effectiveness, as
measured by student achievement gains. In particular, some of these studies found that teacher
effectiveness increased the most during the first few years of teaching (Harris & Sass, 2011;
Papay & Kraft, 2015). For example, using 2001 to 2008 data from teachers and students in
Grades 4 through 8 in a large urban school district in the southern United States, Papay and Kraft
(2015) found significant improvements in teachers’ effectiveness during their early careers
(Years 1–5). In addition, when teachers accumulated experience at the same grade level or in the
same subject area, increases in teacher effectiveness became greater (Blazar, 2015; Ost, 2014),
which suggests that teachers might gain some form of subject-specific expertise by teaching the same content.
As another example, using teacher–student data from Grades 3 through 8 in North Carolina
between 1995 and 2012, Ost (2014) found that elementary school mathematics teachers with
grade-specific experience outperformed teachers with no experience at that grade level in terms
of improving student achievement. This finding implies that teaching the same subject and
content over time may have a greater impact on teachers’ effectiveness in increasing student
achievement. However, what knowledge teachers gain through teaching that leads to students’
achievement growth has not been adequately investigated. In the literature mentioned previously,
researchers have often used student achievement as a measure of teacher effectiveness based on
the teachers’ years of teaching experience, which, to some extent, prioritizes teachers’
accountability but provides limited implications for research and practice in teacher education
communities. Knowing what specific knowledge teachers are able to learn through teaching and
which learning opportunities contribute to such knowledge attainment will help in optimizing
different support provided for in-service teachers on the job.
Knowledge and Skills Related to the Development of PCK
In addition to the question of whether teachers develop PCK through teaching on their
own, it remains an open question how teachers’ professional background prior to teaching
(i.e., their paths to entering the profession and the credential types they hold) or other
components of their content-specific expertise (e.g., their subject matter knowledge, their skill in
noticing mathematics teaching and student learning issues in the work of teaching) might
contribute to their PCK development. One widely accepted condition for developing PCK is
teachers’ content knowledge (Kahan et al., 2003; Krauss, Brunner, et al., 2008; Tröbst et al.,
2018). As Shulman (1986) noted, PCK is about transforming the subject matter from teachers’
content knowledge into forms of knowledge that are accessible to students, which implies that
teachers’ content knowledge is fundamental to the development of PCK. Several prior studies
have noted a positive correlation between teachers’ content knowledge and their PCK (Copur-Gencturk et al., 2019; Kleickmann et al., 2013; Krauss, Brunner, et al., 2008; Tröbst et al., 2018).
For example, drawing from a nationally representative sample of middle school physical science
teachers, Sadler (2013) found that teachers who did not know the correct answers to science
questions were also unaware of the most common student misconceptions related to the concepts
involved in the items. Using a randomized controlled trial design, Tröbst and colleagues (2018)
found a significant increase in teachers’ PCK among a group of mathematics teachers who
received an intervention focused on developing only their content knowledge. Thus, it is possible
that teachers’ own understanding of mathematics could help them gain PCK from their teaching
by allowing them to follow the mathematical arguments in students’ responses and link their
choices of representations and instructional strategies to students’ comprehension of the
mathematical ideas. Still, current evidence is limited and mostly correlational in nature. Further investigation is warranted to understand the extent to which having strong content knowledge will help teachers continuously develop PCK in the long term. Answering this question is
important because it will help teacher educators understand what content-specific expertise
contributes to teachers’ learning through teaching over a long period of time.
Teachers’ noticing, the act of attending to and interpreting classroom events (Sherin &
van Es, 2009), may also influence what teachers can learn through teaching. A wide range of
events happening in class can catch teachers’ attention, some of which are content specific (e.g.,
students’ explanations of work) and some of which are not (e.g., the classroom climate). Thus,
depending on the extent to which content-specific events are noticed and how they are perceived
by teachers, teachers might have different opportunities to gain knowledge and skills specific to
teaching mathematics. Prior work has documented a positive correlation between teachers’
noticing of content-specific events and their PCK (Carpenter et al., 1989; Copur-Gencturk &
Tolar, 2022; Franke et al., 2001; Meschede et al., 2017). Research has shown that when teachers
continue to attend to their students’ thinking and make productive use of what they notice in their
teaching, they continue to develop their understanding of students’ mathematical thinking, one
form of PCK, even years after they have completed professional development (Franke et al., 2001).
These findings indicate that content-specific noticing can create opportunities for teachers to
improve their PCK, but whether teachers can take up these opportunities in a naturally occurring
teaching setting (i.e., without external prompts to notice specific events) to gain PCK requires
further investigation.
Finally, teachers’ professional background prior to teaching might influence the
development of their PCK. The research literature provides some evidence that teachers’
credentials and their certification pathways may relate to their PCK (Baumert et al., 2010;
Kleickmann et al., 2013; Krauss, Brunner, et al., 2008). For example, Hiebert and colleagues
(2017) tracked the PCK growth of 53 graduates from a traditional teacher education program in
the United States designed to equip prospective teachers with the skills to learn from their
teaching. They found that graduates of this program continued to develop PCK even years after
graduation. This evidence highlights the possibility that well-designed traditional preparation
programs can prepare their students with a knowledge base that allows them to learn effectively
through their own teaching. Credential type is another factor that may be related to teachers’ learning through teaching. For instance, a specialized teaching credential in mathematics often indicates that teachers have completed additional mathematics coursework and are thereby more likely to
have stronger mathematics backgrounds, which might contribute to the formation of their PCK.
Prior work with U.S. teachers has documented a significant positive, albeit small, correlation
between teachers’ participation in additional mathematics coursework and their mathematics
knowledge for teaching. International studies also have shown that teachers with a credential in
mathematics display higher proficiency in both content knowledge and PCK than do teachers
with a general teaching credential (Baumert et al., 2010; Kleickmann et al., 2013).
Present Study
As summarized above, considerable research has been done on teachers’ learning on the
job. However, comparatively little has focused on the content-specific teaching expertise that
teachers develop through day-to-day practice on their own. The present study aimed to address
this gap in the current literature by focusing on the development of PCK, which seems to play a
key role in the quality of teachers’ mathematics teaching and their students’ mathematics
learning (e.g., Baumert et al., 2010; Copur-Gencturk, 2015; Kersting et al., 2012). Further,
existing studies have not directly measured what teachers learned, and we overcame this
limitation by using a direct measure of teachers’ PCK. This direct measure captured teachers’
knowledge by closely simulating the way teachers used PCK in teaching, such as analyzing and
interpreting students’ responses in relation to their understanding of the targeted mathematics
content and reflecting on instructional practices they observed in video clips from actual
mathematics classrooms. Indeed, PCK as measured by teachers’ observations of authentic
classroom videos has been associated with students’ mathematics outcomes (Kersting et al.,
2012). We also considered other potential sources of teacher learning such as the formal
professional support teachers received from their schools and districts, along with other forms of
informal support they might have received, such as grade-level meetings with fellow teachers in
their schools. By doing so, we were able to estimate the contribution of teachers’ own teaching practice to their knowledge development more accurately. Finally, we
investigated the extent to which the growth of teachers’ PCK varied by other elements of teacher
proficiency that have been associated with teachers’ PCK, such as their content knowledge and
noticing skills. Using data collected from more than 200 teachers over 3 consecutive years, we
explored the following research questions in this study:
(1) To what extent do teachers gain PCK through teaching on their own over time?
(2) To what extent are teachers’ content knowledge (CK) and content-specific noticing skills related to the growth in
their PCK?
(3) To what extent are teachers’ professional backgrounds (certification path and credential type)
related to the development of their PCK?
Methods
Sample
The data used in this study were collected for a multisite research project designed to
investigate teachers’ content-specific learning through teaching mathematics (Copur-Gencturk &
Li, 2023; Woods & Copur-Gencturk, 2024). Teachers were asked to complete a set of surveys
annually for 3 academic years, with monetary compensation provided. The study design and data
collection procedures were reviewed and approved by the authors’ Institutional Review Board
before the study. To increase the generalizability of the study findings, teachers across the United
States were invited via email1 to take part in this study. The email included a brief description of
the study and a link to the initial survey. This survey included a consent form that detailed the
purpose of the study, the activities participants would be expected to complete, and the confidential nature, benefits, and potential risks of the study. Data were collected through online
surveys only from those who were eligible for the study (i.e., who were teaching mathematics
and had less than 3 years of teaching experience at the beginning of the study) and who gave
consent. Participation in the study was entirely voluntary, and participants could withdraw from
1Teachers were contacted either by the research team, through email addresses we obtained from an education research company, or by partner districts and educational organizations on our behalf.
the study at any time with no penalty. Table 1.1 shows that most teachers in the analytic sample identified as White (70.1%) and female (84.1%), proportions close to national teacher demographics (National Center for Education Statistics, 2022). No statistically significant differences were found between teachers who completed the 3-year study (N = 155) and those who did not (N = 52) in demographic characteristics, including race, χ²(3, N = 207) = 1.03, p = .80, and gender, χ²(2, N = 207) = 3.16, p = .21, based on chi-square tests of independence. The two groups also did not differ in their PCK level at the beginning of the study, t(207) = 0.51, p = .61, based on an independent-samples t-test.
Table 1.1
Background Characteristics of Teachers in the Present Study

Teacher Characteristic                                    Analytic Sample (%)   U.S. public elementary and
                                                                                secondary school teachersa (%)
Sex
  Female                                                  84.1                  76.8
  Male                                                    15.5                  23.2
Race/ethnicity
  White                                                   70.1                  79.9
  Black                                                    8.2                   6.1
  Hispanic                                                 9.7                   9.4
  Other (e.g., Asian, Pacific Islander)                   12.1                   4.6
Credential types
  Credential in mathematics                               18.4                  NA
  Credential in multiple subjects                         69.6                  NA
  Credential in other subjects (e.g., special education)  12.1                  NA
Route entering the profession
  Traditional certificationb                              72.0                  76.8
  Alternative certificationc                              28.0                  23.2

Note. N = 207.
a. From Digest of Education Statistics,
https://nces.ed.gov/programs/digest/d22/tables/dt22_209.10.asp.
b. To obtain traditional certification, an individual must first earn a bachelor’s degree and
complete a teacher preparation program before they can begin teaching.
c. Alternative certification allows individuals with a bachelor’s degree to teach without
necessarily having completed a formal teacher preparation program prior to teaching.
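The attrition checks described above (chi-square tests of independence on demographics and a t-test on baseline PCK) can be sketched as follows. This is an illustrative sketch only: the teacher-level data here are synthetic stand-ins generated for demonstration, and the scipy-based workflow is one plausible way to run these tests, not the authors' actual analysis code.

```python
# Illustrative attrition/baseline-equivalence checks with hypothetical data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical sample: 155 completers and 52 non-completers (N = 207).
completed = np.array([1] * 155 + [0] * 52)
race = rng.integers(0, 4, size=207)          # four hypothetical race categories
pck_year1 = rng.normal(loc=16, scale=4, size=207)  # hypothetical baseline PCK

# Chi-square test of independence: completion status vs. race.
contingency = np.zeros((2, 4))
for c, r in zip(completed, race):
    contingency[c, r] += 1
chi2, p_race, dof, _ = stats.chi2_contingency(contingency)

# Independent-samples t-test: baseline PCK of completers vs. non-completers.
t_stat, p_pck = stats.ttest_ind(pck_year1[completed == 1],
                                pck_year1[completed == 0])

print(f"chi2({dof}) = {chi2:.2f}, p = {p_race:.2f}")
print(f"t = {t_stat:.2f}, p = {p_pck:.2f}")
```

A nonsignificant p-value in each test is what supports the claim that completers and non-completers did not differ systematically at baseline.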
Measures and Procedures
PCK Measure
Teachers’ PCK was measured by having them watch eight video clips of authentic
mathematics instruction and respond to open-ended questions about the videos (Kersting, 2008). These clips, each lasting 2–3 minutes, focused on student-teacher interactions around
fraction or ratio concepts in Grades 3–7 (i.e., the grade levels teachers were teaching during the
study period). Teachers were given context for each video to help them understand the instructional content shown. They were then asked to analyze the students’ mathematical understanding and to suggest how the teaching practices shown in the videos could be improved to increase students’ mathematical understanding. We used the same survey each year to capture
changes in teachers’ PCK. Given the long interval between administrations of the measure (one academic year) and the absence of answer keys, we believe changes in teachers’ responses were not due to opportunities to practice the tasks. Additionally, a related study (Copur-Gencturk & Orrill, 2023) using a repeated measure with items similar to those in our study showed that retaking the same items did not inflate teachers’ scores.
Teachers’ responses were evaluated using a 4-point rubric (see Appendix Tables A.1 and A.2) that captured teachers’ ability to analyze students’ mathematical thinking (1 = no/incorrect analysis; 4 = accurate analysis with evidence) and to suggest ways to improve the mathematics teaching practices shown (1 = instructional strategy irrelevant to the mathematical issues; 4 = at least one correct instructional strategy with a rationale). To reduce scoring bias, responses were coded by two raters who were unaware of the year the responses were from. Strong agreement was reached between the two raters (Cohen’s kappa = 0.92). The measure demonstrated high internal consistency (i.e., reliability), with Cronbach’s alpha ranging from 0.79 to
0.84 across the three years. Teachers’ total scores on the items for each year of administration
indicated their PCK for that year and were used in further data analyses (see Table 1.2 for
descriptive statistics).
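The internal-consistency statistic reported above can be computed from the item-level rubric scores using the standard Cronbach's alpha formula. The sketch below uses hypothetical scores (a shared "ability" component plus noise, so the items correlate); it illustrates the computation, not the study's actual data or code.

```python
# Minimal Cronbach's alpha sketch for an 8-item, 4-point rubric measure.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: (n_teachers, n_items) matrix of rubric scores."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)          # per-item variances
    total_var = item_scores.sum(axis=1).var(ddof=1)      # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
# Hypothetical 1-4 rubric scores for 207 teachers on 8 video-based items.
ability = rng.normal(size=(207, 1))
scores = np.clip(np.round(2.5 + ability + 0.8 * rng.normal(size=(207, 8))), 1, 4)

print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Alpha rises as the items share more common variance; values in the 0.79–0.84 range, as reported here, are conventionally treated as acceptable-to-good reliability.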
Time
To determine if and how much teachers’ PCK changed during the study period, we
created a time variable to denote each data collection point during the study. The variable takes a
value of 0 for Year 1 (i.e., the beginning of the study), a value of 1 for Year 2, and a value of 2
for Year 3. Thus, the variable reflected the number of academic years elapsed since the study
began.
External Professional Support on the Job
Formal Learning Opportunities. To account for the influence of formal learning
opportunities on teachers’ PCK development during the study period, we asked teachers to report
the number of hours of formal support on mathematics teaching and learning they received,
using items modified from prior studies (Copur-Gencturk et al., 2019; Garet et al., 2016). The
formal support included professional development provided by schools or districts, induction,
mentoring, coaching, and structured courses (e.g., college courses). We administered the survey
at the end of Year 2 and Year 3, respectively, to capture support teachers received during that
year (i.e., between the previous and current administration of the survey). We calculated the
average hours of formal support teachers received over the two years by summing the hours of
support received during Year 2 and Year 3, then dividing this total by 2. Then, we created a new
variable to indicate the intensity of support teachers received during our study period. Teachers
who received fewer than 14 hours of professional development were categorized as receiving low-level formal support. Those who received between 14 and 49 hours were classified as
having a medium level of support, while teachers who received 49 hours or more were
considered to have substantial support (see Table 1.2 for descriptive statistics). The three
categories are based on previous research indicating that professional development programs
lasting over 49 hours, on average, positively affected student learning, whereas programs lasting fewer than 14 hours did not (Yoon et al., 2007).
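The construction of the formal-support intensity variable (average annual hours over Years 2 and 3, binned into three levels) could be sketched as below. The function name and the handling of the 49-hour boundary are assumptions for illustration, since the text places 49 hours in both the medium and substantial descriptions; here it is assigned to substantial.

```python
# Sketch of the formal-support intensity variable (cutoffs per Yoon et al., 2007).
def support_level(hours_year2: float, hours_year3: float) -> str:
    """Average annual PD hours over Years 2-3, binned into three levels."""
    avg = (hours_year2 + hours_year3) / 2
    if avg < 14:
        return "low"
    elif avg < 49:
        return "medium"
    else:
        return "substantial"

print(support_level(10, 12))   # averages 11 hours -> "low"
print(support_level(30, 40))   # averages 35 hours -> "medium"
print(support_level(60, 50))   # averages 55 hours -> "substantial"
```

The peer-support variable described next follows the same average-then-bin logic with different cutoffs (7 and 27 hours).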
Informal Learning Opportunities (Peer Support). Beyond the formal support teachers received and their learning through their own teaching practice, teachers could have learned from or with their colleagues. Therefore, we asked teachers to report the number of hours they spent in activities involving both structured and unstructured discussions
with colleagues around mathematics teaching and learning. We administered the survey at the
end of Year 2 and Year 3, respectively, to capture peer support teachers received during that year
(i.e., between the previous and current administration of the survey), using items derived from
earlier research (Garet et al., 1999, 2016). These activities included regular meetings with grade-level teachers, study groups, or professional learning communities. Similar to the approach for creating the formal support variable described above, we first calculated the average hours of peer
support teachers received over the two years. We then created a new variable to indicate the
intensity of the peer support teachers received during the study period on a 3-point scale. Those
who spent 7 or fewer hours annually in peer learning activities were categorized as receiving low-level peer support, those who spent between 8 and 27 hours were classified as receiving a medium level of peer support, and those who spent 27 hours or more were considered to have substantial peer support. The way we created this
scale was based on prior nationally representative data (National Council on Teacher Quality [NCTQ], 2022; Teaching and Learning International Survey [TALIS], 2018).2
Teacher-Level Factors
Content Knowledge. To measure teachers’ CK, we administered a survey with fourteen
constructed-response items on fractions, ratios, and proportional relationships at the beginning of
the study. All the items were adapted from prior literature and teacher knowledge assessments
(e.g., Izsák et al., 2019; Siegler & Lortie-Forgues, 2015; Van de Walle et al., 2022) and were
aligned with our conceptualization of CK in this study. Two raters coded the responses to capture
the correctness of teachers’ final answers as well as the accuracy of the reasoning they provided
or the validity of the methods they used to solve these problems (for detailed item descriptions
and the scoring rubric, see Copur-Gencturk et al., 2022; Copur-Gencturk & Doleck, 2021;
Copur-Gencturk & Ölmez, 2022). The Cronbach’s alpha, indicating the internal consistency of
this scale, was 0.74. We used the standardized scores around the mean to indicate teachers’
overall CK (see Table 1.2 for descriptive statistics of the variable).
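Standardizing scores around the mean, as described above, amounts to computing z-scores. A minimal sketch of this step (our own illustration, not the authors' code):

```python
import numpy as np

def standardize(scores):
    """Return z-scores: each score centered at the sample mean and
    scaled by the sample standard deviation."""
    scores = np.asarray(scores, dtype=float)
    return (scores - scores.mean()) / scores.std(ddof=1)
```

The resulting variable has a mean of 0 and a standard deviation of 1, which is why the CK and noticing descriptives in Table 1.2 center near zero.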
Content-Specific Noticing. At the beginning of the study (i.e., Year 1), we captured
teachers’ content-specific noticing skills by having them watch four video clips of math
instruction and identify the most notable aspect they observed concerning students’ math
learning and the teachers’ instruction of the specific math concept. We evaluated teachers’
responses by using a 4-point rubric designed to capture what teachers noticed (the teacher’s
pedagogy, the students’ mathematical thinking, the general classroom climate, or other) and how
they interpreted it (purely descriptive, or analytical and interpretive) (see Copur-Gencturk &
Rodrigues, 2021, for the detailed rubric). Each video was coded independently, with at least two
raters coding twelve percent of the data. A high Cohen’s kappa statistic of .81 demonstrated
strong agreement between raters. The Cronbach’s alpha of the scale was .66. We used the
standardized scores around the mean to indicate teachers’ overall noticing skills.

² The NCTQ data include information from 148 school districts on the collaboration time allotted to teachers. The
TALIS data reflect the frequency of collaborative activities as reported by representative teacher samples in the
U.S.
Teachers’ Professional Background Characteristics. We created two binary variables
to represent teachers’ certification pathways (i.e., alternative certification = 1; traditional
certification = 0) and credentials (i.e., having a math credential = 1; holding a credential in other
subjects = 0).
Table 1.2
Descriptive Statistics for the Variables Used in the Reported Analyses
Variable n M SD Min Max
Pedagogical Content Knowledge 1 207 24.69 5.28 16 44
Pedagogical Content Knowledge 2 170 26.34 5.36 16 43
Pedagogical Content Knowledge 3 166 29.29 6.92 16 50
Formal Support 155 0.88 0.73 0 2
Peer Support 155 1.15 0.81 0 2
Content Knowledgeᵃ 207 0.03 0.99 -3 2
Content-Specific Noticing Skillsᵃ 207 0.07 0.99 -2 2
Note.
a. Content knowledge and noticing skills were based on standardized scores around the mean. The
minimum and maximum values observed in the table were rounded to the nearest whole number.
Analytic Plan
We employed a linear growth modeling approach to explore growth patterns of
teachers’ PCK. The linear growth model is appropriate for our longitudinal design and can
efficiently manage missing data on the outcome variable. It accounts for within-teacher
dependence and between-teacher heterogeneity by introducing time-specific and teacher-specific
effects (Rabe-Hesketh & Skrondal, 2008). We applied a two-level model. Specifically, teacher
i’s PCK in year t is modeled at level 1, in which each teacher’s PCK pattern (PCK_it) is modeled
as a trajectory defined by the teacher’s initial PCK level (β_0i) and a growth trajectory
parameter (β_1i) in relation to time; r_it is the level 1 error term. Level 2 parameters identified
how teacher-level factors might account for variations in teachers’ initial PCK and PCK growth.
A random slope for the time variable was included to allow the growth patterns to vary across
individual teachers, and a random intercept was included to allow teachers’ initial levels of PCK
to vary. The unconditional model is specified as follows:

Level 1: PCK_it = β_0i + β_1i * Time_it + r_it
Level 2: β_0i = γ_00 + μ_0i
         β_1i = γ_10 + μ_1i

where:
PCK_it is the PCK score for teacher i at time t.
Time_it takes a value of 0 when the study began (Year 1), 1 for Year 2, and 2 for Year 3.
β_0i denotes the initial level of PCK for teacher i when the study began (i.e., at time 0).
β_1i is the yearly rate of change in PCK for teacher i.
γ_00 is the average initial PCK when the study began.
γ_10 is the average yearly rate of change.
μ_0i and μ_1i are random effects.
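A model of this form can be fit with standard mixed-model software. The sketch below is our own illustration with simulated data, not the study's analysis code; it uses statsmodels' MixedLM with a random intercept and a random slope for time, and the simulated parameter values are loosely inspired by the estimates reported later:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate a balanced three-wave panel matching the model's structure.
rng = np.random.default_rng(42)
n_teachers, waves = 200, 3
teacher = np.repeat(np.arange(n_teachers), waves)
time = np.tile(np.arange(waves), n_teachers)
b0 = 24.5 + rng.normal(0, 4.3, n_teachers)  # teacher-specific intercepts
b1 = 2.2 + rng.normal(0, 1.7, n_teachers)   # teacher-specific slopes
pck = b0[teacher] + b1[teacher] * time + rng.normal(0, 2.9, teacher.size)
df = pd.DataFrame({"teacher": teacher, "Time": time, "PCK": pck})

# Random intercept (default) plus a random slope for Time via re_formula.
model = smf.mixedlm("PCK ~ Time", df, groups=df["teacher"], re_formula="~Time")
result = model.fit()
print(result.params["Time"])  # estimate of the average yearly growth (gamma_10)
```

The fixed-effect coefficient on Time corresponds to γ_10, and the estimated random-effect covariance matrix corresponds to the variances of μ_0i and μ_1i.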
Before using the proposed model above, several steps were taken to evaluate the
appropriateness of using a multilevel model for the data. We first estimated an unconditional
random intercept model to check the proportion of variance in the outcome (PCK) that lies
between level 2 units (i.e., teachers). The estimated intraclass correlation was 0.56, which
indicates that 56% of the variance in PCK was between teachers. Thus, a two-level model is
appropriate. We also conducted a likelihood ratio test to check whether a random slope is needed
in addition to the random intercept. The result showed that the random coefficient model fit the
data significantly better than the random intercept model (LR = 28.84, p < 0.001).
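The intraclass correlation used in this check is simply the share of total variance attributable to the teacher level. A one-line sketch (our own illustration, with hypothetical variance components):

```python
def icc(var_between: float, var_within: float) -> float:
    """Intraclass correlation: proportion of total outcome variance
    that lies between level-2 units (here, teachers)."""
    return var_between / (var_between + var_within)
```

For instance, hypothetical between- and within-teacher variance components of 5.6 and 4.4 would yield icc(5.6, 4.4) = 0.56, the proportion reported above.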
In the formal analyses, we estimated a few models in sequence. To consider the impact of
formal support and peer support teachers received on average during the study period on
teachers’ initial PCK and PCK growth rate, we added them as time-invariant covariates in the
level 2 model.
The model specification is as follows:

Level 1: PCK_it = β_0i + β_1i * Time_it + r_it
Level 2: β_0i = γ_00 + γ_01 * Formal_i + γ_02 * Peer_i + μ_0i
         β_1i = γ_10 + γ_11 * Formal_i + γ_12 * Peer_i + μ_1i
Next, to explore how teachers’ CK, noticing skills, and their professional background (i.e., the
credential type and certification path) influence their PCK, we then added these time-invariant
covariates to the level 2 model. The model specification is as follows:

Level 1: PCK_it = β_0i + β_1i * Time_it + r_it
Level 2: β_0i = γ_00 + γ_01 * Formal_i + γ_02 * Peer_i + γ_03 * CK_i + γ_04 * Noticing_i + γ_05 * Credential_i + γ_06 * Certification_i + μ_0i
         β_1i = γ_10 + γ_11 * Formal_i + γ_12 * Peer_i + γ_13 * CK_i + γ_14 * Noticing_i + γ_15 * Credential_i + γ_16 * Certification_i + μ_1i
Results
The result of Model 1 in Table 1.3 showed that teachers significantly improved their PCK
over the three years. Teachers’ PCK scores increased by an average of 2.21 points (p < 0.001).
The annual growth rate differed across teachers (p < 0.01). To better understand the
magnitude of the relationship, we calculated the effect size,³ which is 0.63 SD in this case.
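Following the effect-size formula in footnote 3 (coefficient times the SD of the predictor, divided by the square root of the level-1 residual variance), the calculation can be reproduced approximately from the reported Model 1 estimates. This is our own sketch; the small discrepancy from the reported 0.63 likely reflects rounding in the published coefficients:

```python
import math

coef_time = 2.21   # average yearly PCK gain (Model 1, Table 1.3)
resid_var = 8.50   # level-1 residual variance (Model 1, Table 1.3)
# Time takes the values 0, 1, 2, so its (population) SD is sqrt(2/3).
sd_time = math.sqrt(((0 - 1) ** 2 + (1 - 1) ** 2 + (2 - 1) ** 2) / 3)
effect_size = coef_time * sd_time / math.sqrt(resid_var)
print(round(effect_size, 2))  # approximately 0.62, close to the reported 0.63
```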
The growth in PCK seemed to derive from teachers’ learning through teaching the
subject matter on their own, given that formal and informal support teachers received were not
significantly related to their PCK growth (see the results from Model 2). Indeed, the yearly
change in teachers’ PCK was not associated with the intensity level of formal professional or
peer support teachers received overall in the study period (p = 0.40 for formal support and p =
0.61 for peer support). The final model (i.e., Model 4), which included all teacher-level
covariates, indicated that teachers’ CK and noticing skills were statistically significantly related to
teachers’ PCK at the beginning of the study (an effect size⁴ of 0.84 SD, p < .001 for CK; an
effect size of 0.32 SD, p < .001 for noticing skills). In particular, our findings underscored the
importance of CK in the development of PCK. Having a robust understanding of the
mathematics being taught was associated with a faster yearly growth rate in teachers’ PCK
(an effect size of 0.24 SD; p = .03). As shown in Figure 1.2, the difference in the growth rate
between teachers with robust mathematical knowledge (i.e., 90th percentile) and those with less
robust mathematical knowledge (i.e., 10th percentile) is 0.60 SD. However, teachers’ noticing
skills were not related to the growth in their PCK. Lastly, having a credential in math and
entering the profession through traditional certification programs were not related to teachers’
baseline PCK (p = 0.43 for credential type; p = 0.81 for certification path) or to the growth in
their PCK (p = 0.73 for credential type; p = 0.59 for certification path).
³ We calculated the effect size for level 1 predictors by multiplying the coefficient for the variable by the ratio of the
standard deviation of the variable to the square root of the level 1 residual variance.
⁴ We calculated the effect size on the intercept by multiplying the coefficient for the content knowledge variable by
the ratio of the standard deviation of the content knowledge variable to the square root of the intercept variance. For
the slope, we multiplied the coefficient for the content knowledge variable by its standard deviation and divided by
the square root of the slope variance.
Figure 1.2
Growth of PCK Among Teachers with Less and More Robust Mathematics Content Knowledge
at the Beginning of the Study
Note. The error bars indicate one standard deviation below and above the average.
Table 1.3
Estimates of the Growth Trajectory for Teachers’ PCK
Model 1 Model 2 Model 3 Model 4
Fixed effect
Intercept 24.51*** 24.86*** 24.35 *** 24.32***
(0.35) (0.82) (0.65) (0.65)
Time 2.21*** 2.32*** 2.24*** 2.27***
(0.21) (0.46) (0.46) (0.48)
Effect on intercept
Formal support -0.15 -0.36 -0.40
(0.60) (0.46) (0.52)
Peer support -0.11 0.18 0.15
(0.50) (0.41) (0.42)
Content knowledge 2.72 *** 2.59***
(0.31) (0.34)
Noticing 0.99*** 1.03***
(0.32) (0.33)
Traditional certificate -0.21
(0.88)
Credential in math 0.81
(1.02)
Effect on slope
Formal support 0.27 0.23 0.26
(0.29) (0.28) (0.30)
Peer support -0.22 -0.13 -0.14
(0.28) (0.28) (0.28)
Content knowledge 0.46 * 0.43*
(0.20) (0.20)
Noticing -0.28 -0.26
(0.20) (0.21)
Traditional certificate -0.32
(0.59)
Credential in math 0.22
(0.62)
Random effect
Var (time) 3.07** 3.39 *** 3.14 ** 3.12 **
(1.09) (1.17) (1.17) (1.18)
Var (intercept) 18.59 *** 19.07 *** 9.88 *** 9.80***
(2.69) (3.21) (2.40) (2.37)
Cov (time,intercept) 1.13 0.99 0.03 -0.00
(1.39) (1.55) (1.38) (1.35)
Var (Residual) 8.50 *** 8.72 *** 8.72 *** 8.72 ***
(1.06) (1.12) (1.12) (1.12)
Note. *p < .05. **p < .01. ***p < .001.
Discussion
This study examined the extent to which teachers developed PCK of mathematics
through teaching, a type of content-specific knowledge that has been significantly linked to the
quality of instruction and students’ learning of mathematics (Baumert et al., 2010; Copur-Gencturk, 2015). Before we discuss the study findings, we acknowledge the limitations of our
study. First, we collected data from a national sample of novice mathematics teachers, but this
sample was not nationally representative. Related to this issue, our sample of teachers, all of
whom volunteered to participate in the study, might be different from typical novices. Prior work
has shown that novice teachers often focus more on classroom management than on teaching and
student learning (Berliner, 1988), especially in the challenging first year, which often involves
many ‘reality shocks’ (Mintz et al., 2020). Yet the teachers in our study increased their PCK of
mathematics, possibly due to the pressure from state-mandated tests that forced them to pay
more attention to teaching and learning mathematics. Future studies with novices teaching
different subjects and grade levels would provide more insight related to this issue. Second, we
did not capture qualitative differences in the professional development opportunities provided to
the teachers. Further research is needed to investigate which features of professional support are
more effective than others in enhancing novice teachers’ continuous learning through teaching.
Finally, our study explored only the role of teacher-level factors (e.g., CK, noticing) in teachers’
PCK development, leaving many contextual factors unexamined. Future studies could investigate
how contextual factors, such as the school environment and administrator support, might
facilitate or hinder teachers’ learning through their teaching practice.
Our findings indicated that novice mathematics teachers were able to develop PCK of
mathematics from teaching, which was robust even after accounting for other learning supports
available to teachers during the same period. Prior research has demonstrated that teachers’
teaching practice offers rich learning resources for their professional growth (Leikin & Zazkis,
2010; Lloyd, 2008). Our study provides additional empirical evidence based on large-scale
longitudinal data. The results have implications for research and practice on teachers’ learning
through teaching. First, given that teaching is teachers’ daily work task and that teachers seem to
learn from teaching on their own, teacher preparation and professional development programs
should shape the curricula around how they might utilize the task of teaching to enhance
teachers’ knowledge and skills. For instance, a promising way for professional development
programs to help teachers learn from their own teaching is by using video clips of teachers’ own
teaching (e.g., Sherin & Van Es, 2009). Additionally, school leaders and policy makers might
provide teachers with more time to explore essential aspects of teaching (e.g., reflecting on their
own teaching practice), either alone or with peers, ensuring they have sufficient time to enhance
knowledge through teaching practice.
We also found that teachers’ CK was crucial in their PCK development. Teachers with a
strong initial level of CK developed PCK through teaching at a faster pace than did their peers
with limited CK. This result is not surprising, given that teachers with a robust understanding of
school mathematics would be able to analyze their students’ mathematical thinking and thus
learn from them. Similarly, teachers with a strong understanding of the content being taught
could analyze instructional practices and their choice of resources, reflect on the appropriateness
of those practices and resources in making the content accessible to their students, and learn from
this experience. Teacher education and professional development programs should devote more
time to unpacking the mathematics taught in school so that teachers could develop an
understanding of the foundational ideas behind the mathematics taught across grade levels and
the conceptual underpinning of the rules and procedures (Copur-Gencturk & Tolar, 2022).
Curricular materials, such as the teachers’ guide, could provide more conceptual explanations of
the content in addition to the pedagogical content to facilitate teachers’ understanding of the
concepts they need to teach.
In line with prior literature, teachers’ noticing skills were related to their PCK (Copur-Gencturk & Tolar, 2022). However, our findings also indicated that noticing was not associated
with the development of teachers’ PCK. Prior work by Franke et al. (2001) showed that noticing
played a role in the development of teachers’ PCK, but only when teachers consciously viewed
the noticing of students’ mathematical thinking and their own instruction as learning resources
and leveraged what they noticed in the classroom. Thus, noticing alone may not lead to gains in
teachers’ PCK unless they also consciously reflect on what they notice in class and transform
those fleeting moments into action in their practice.
Conclusions
Teaching is a major component of teachers’ daily activities; therefore, understanding
whether and under which conditions learning occurs through teaching is vital for identifying a
mechanism for teachers to continuously improve their capacity. This is particularly crucial for
novice teachers, who often need more opportunities to enhance their knowledge and skills. We
have documented that novice mathematics teachers generally improved their PCK of
mathematics through their teaching practice and that teachers’ CK was important for the
development of their PCK. Our findings imply that cultivating a school environment that
provides time and support for teachers to focus on often overlooked components of teaching,
such as reflecting, could be a cost-effective way for novice teachers to grow professionally.
Additionally, teacher preparation and professional development programs should provide more
opportunities for teachers to enhance their understanding of the mathematical concepts taught in
school and to equip teachers with the skill to learn from the work of teaching on their own.
Chapter 2: The Impact of an Interactive, Personalized Computer-Based Teacher
Professional Development Program on Student Performance: A Randomized Controlled
Trial
Abstract
Scholars and practitioners have called for personalized and widely accessible professional
development (PD) for teachers. Yet, a long-standing tension between customizing support and
increasing access to such support has hindered the scale-up of high-quality PD for individual
teachers. This study addresses this challenge by developing an asynchronous online professional
development (OPD) program for middle school mathematics teachers that provides frequent
opportunities for teachers to interact with and obtain personalized and real-time feedback from a
virtual facilitator in an intelligent tutoring system. Based on the data collected from 1727 middle
school students in an experiment in which the teachers of these students were randomly assigned
to the OPD program or the business-as-usual condition (i.e., the control group), we found that the
program had a statistically significant impact on students’ mathematics performance. These
results demonstrate the potential of incorporating an automated, interactive feedback tool
supported by artificial intelligence to create effective, scalable teacher PD.
Keywords: teacher professional development; distance learning and online learning;
teaching/learning strategies; adult learning; architectures for educational technology systems
Introduction
Providing content-focused PD for mathematics teachers has been a federal policy
pathway intended to improve teacher effectiveness and ultimately boost student mathematics
performance in the United States (e.g., Every Student Succeeds Act, 2015). Prior empirical
evidence also suggests that high-quality PD contributes to mathematics teachers’ content
knowledge for teaching (Copur-Gencturk et al., 2019; Franke et al., 2001; Garet et al., 2016;
Jacob et al., 2017), quality of instructional practice (Carpenter et al., 1989; Garet et al., 2016;
Jacobs et al., 2007; Kraft & Blazar, 2017), and student learning outcomes (Campbell & Malkus,
2011; Carpenter et al., 1989; Jacobs et al., 2007). Teachers typically attend PD programs at
specific locations, outside of their regular instructional hours. Despite efforts to design and
implement high-quality PD programs, the reality is that only a limited number of teachers have
access to them due to barriers including time conflicts, geographic locations, and financial and
human resources (Elliott, 2017; Yoon et al., 2007; Zhang et al., 2020). These barriers have
resulted in an access gap to high-quality PD programs, particularly for teachers from rural areas
and under-resourced school districts.
With expanding access to the internet and advances in technologies, asynchronous OPD
has gained significant attention as a potential solution to scale up high-quality PD by increasing
teachers’ access to learning opportunities (Bragg et al., 2021; Dede et al., 2009; Fisher et al.,
2010). Indeed, there has been a substantial surge in the number of asynchronous OPD programs
incorporating technology-supported components such as web-based learning platforms (Fisher et
al., 2010; Hollebrands & Lee, 2020; Kraft & Blazar, 2017; Ramsdell & Rose, 2006), digital
libraries (Allen et al., 2011), and online teacher communities (Lantz-Andersson et al., 2018).
Yet, as one form of OPD, fully asynchronous OPD presents the challenge of teachers engaging
with the program at different times. A common concern for OPD designers revolves around
providing frequent opportunities for teachers to interact with the PD facilitators (i.e., instructors
of PD whose role may vary across programs) and receive timely feedback during the learning
process, considering the limitations of human facilitators’ ability to support all learners in real
time. Many existing asynchronous OPD programs struggle to offer individualized, real-time,
interactive feedback to participating teachers (Ginsburg et al., 2004). However, research has
demonstrated that high-quality feedback is a critical element for successful PD, whether
delivered online or face-to-face (Darling-Hammond et al., 2017; Desimone & Pak, 2017; Kraft et
al., 2018). The absence of interaction and high-quality feedback can compromise teachers’
learning experience (Powell & Bodur, 2019; Reeves & Pedulla, 2011) and may lead to high
dropout rates observed in prior asynchronous OPD for teachers (e.g., Dash et al., 2012; Masters
et al., 2010). Addressing this challenge is not exclusive to asynchronous OPD but applies
broadly to online learning environments in general (Bağrıacık Yılmaz & Karataş, 2022; Lee &
Choi, 2011). Researchers across subject areas have used intelligent tutoring systems (ITS) to
generate automatic feedback in order to scaffold students’ online learning (Cavalcanti et al.,
2021). However, the application of such technologies, specifically for mathematics teachers’
learning in asynchronous OPD programs, remains limited.
As the interest in designing and delivering both synchronous and asynchronous teacher
OPD has grown, a limited amount of data has become available regarding the actual impact of
these programs on students’ overall academic outcomes — the ultimate goal of PD in general. In
a comprehensive review of more than 40 OPD programs and sites for mathematics teachers,
Ginsburg et al. (2004) found that none of the OPD programs had undergone external evaluation
using rigorous research designs, such as randomized controlled trials (RCTs) or quasi-experimental designs. A recent systematic review by Lay and colleagues (2020) indicated that
research in OPD is advancing with increasingly rigorous empirical methods and theoretically
grounded approaches to design, implementation, and evaluation. However, despite this progress,
a significant challenge persists: the rapid proliferation and popularity of OPD programs
continues to outpace the rate of rigorous program evaluations in the field. This disparity raises
concerns about the ability to effectively measure the quality of many OPD initiatives.
This study aims to address the aforementioned gaps by examining the impact on student
learning of an interactive, personalized asynchronous OPD program designed for teachers. This
asynchronous OPD provides teachers with individualized, real-time feedback on their
performance through a virtual facilitator. The program is designed to enhance teachers’ content
knowledge (CK, i.e., conceptual understanding of the mathematics content being taught) and
pedagogical content knowledge (PCK, i.e., knowledge of students’ mathematical thinking and
knowledge of mathematics teaching), which are vital contributors to the quality of mathematics
instruction (Baumert et al., 2010; Copur-Gencturk, 2015; Hill & Chin, 2018; Kersting et al.,
2012). While teachers who completed the program increased their CK and PCK (Copur-Gencturk & Orrill, 2023), the extent to which the impact of the program could extend to their
students’ learning had not been explored. In this study, we aimed to seek an answer to this
important research question. We randomly assigned teachers who were recruited to the study to a
treatment or control condition and collected pre- and posttest data from their students after
teachers in the treatment group had completed the program.
In the following sections, we begin by laying the theoretical foundation for the design of
this program, followed by a review of existing technology-based systems for teacher learning.
Subsequently, we provide detailed descriptions of the design and implementation of our
program. In the Methods section, we present the research design for our RCT, the measures used,
and our analytic approach. The findings are then presented and discussed, and their implications
for both research and practice in the field of teacher online learning are highlighted.
Conceptual Framework
Effective Teacher Professional Development
Teachers need professional support to enhance their knowledge and skills in order to
keep up with the latest educational research and standards to elevate student learning (Zhang et
al., 2020). Among different learning opportunities in the teaching profession, formal PD is an
important learning opportunity for teachers to develop CK and PCK for quality teaching
(Darling-Hammond et al., 2017). Traditional formal teacher PD is characterized by specific
learning goals, structured learning activities and curriculum, and a unified schedule, which
typically occurs during a specific time period (e.g., summer break) and at a specific location.
Given that PD programs vary in their formats, content, and enactment strategies, researchers
have attempted to identify key features that characterize effective teacher PD programs (Blank &
de las Alas, 2009; Garet et al., 2001; Kennedy, 1998; Scher & O’Reilly, 2009; Timperley et al.,
2007; Yoon et al., 2007). Existing systematic reviews on rigorous evaluations of traditional PD
have generally indicated that effective programs are subject-specific, address both content
knowledge and pedagogical content knowledge, have longer duration, and involve active,
collaborative learning among teachers (Darling‐Hammond et al., 2009; Scher & O’Reilly, 2009;
Yoon et al., 2007). However, large-scale programs incorporating these supposedly effective
elements have shown mixed results in improving teacher knowledge, instructional, and student
outcomes (Garet et al., 2010, 2016; Jacob et al., 2017; Jacobs et al., 2007). Recent reviews have
also challenged some of these conclusions regarding effective PD features. In Kennedy’s most
recent review of PD programs (2019), she compared programs with common design features
(e.g., duration, content focus) and found that programs that engaged teachers constructively with
the training content, by using classroom artifacts such as videotapes of classroom events and
student work samples, generated larger program effects on student learning outcomes.
Dialogue-Based Intelligent Tutoring System
Intelligent tutoring systems generally refer to computer systems that model learners’
psychological states (e.g., knowledge level, learning strategies) to provide individualized
feedback and instruction (Ma et al., 2014). Our proposed intelligent OPD system is adapted from
a dialogue-based ITS called AutoTutor (see Ma et al., 2014 for a review of various types of
ITSs). The AutoTutor system employs animated pedagogical agents to tutor learners through
natural language conversations, simulating human tutoring strategies (Graesser, 2011; Graesser
et al., 2004; see Nye et al., 2014, for a review of AutoTutor-related systems). These systems are
grounded in constructive learning theories (Aleven & Koedinger, 2002; Chi et al., 1994; Griffin
et al., 2008) and effective human tutoring that promote reasoning through interactive dialogues
(Chi et al., 2001; Shah et al., 2002; VanLehn et al., 2007). AutoTutor systems are built on a
design theory featuring two key components: a student model that captures learner’s cognitive
and non-cognitive states, inferred from performance data collected during the learning process,
and a tutor model that uses this information to provide timely, personalized instruction to
facilitate better learning outcomes (Ma et al., 2014; Shute & Psotka, 1996; Sottilare et al., 2013).
The cognitive diagnosis and adaptive feedback are achieved through a series of interactions
between learners and a virtual agent, guided by the Expectation and Misconception Tailoring
(EMT) dialogue framework. Specifically, as shown in Figure 2.1, the interaction sequence
begins when a learner responds to an activity prompted by the virtual facilitator in the program.
For each activity, a set of expectations (i.e., learning goals), anticipated misconceptions, and
corresponding hints and prompts are stored as scripts within the system. As learners respond to
questions, the system matches their input against the preset expectations and anticipated
misconceptions. Based on the matching results, the system guides the dialogue by providing
targeted prompts and hints, encouraging learners to either complete additional activities or
elaborate on specific expectation-misconception pairs. The system continuously analyzes new
responses to identify fulfilled expectations and provides further hints or prompts for unmet
expectations. Throughout these interactions, the virtual facilitator also gives short feedback,
reviews relevant content, and summarizes key concepts covered in the activity (Graesser et al.,
2004; Nye et al., 2014).
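To make the mechanism concrete, one EMT turn can be sketched as a minimal matcher. This is our own simplified illustration, not the AutoTutor implementation; real systems use semantic matching (e.g., latent semantic analysis) rather than the surface string similarity used here, and all names and thresholds below are our assumptions:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Surface-level similarity as a stand-in for a semantic matcher."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def emt_turn(response, expectations, misconceptions, hints, threshold=0.7):
    """One Expectation and Misconception Tailoring (EMT) turn:
    match the learner's response against scripted expectations and
    misconceptions, then pick a hint for the first unmet expectation."""
    met = [e for e in expectations if similarity(response, e) >= threshold]
    flagged = [m for m in misconceptions if similarity(response, m) >= threshold]
    unmet = [e for e in expectations if e not in met]
    next_hint = hints.get(unmet[0]) if unmet else None
    return met, flagged, next_hint

# Hypothetical script content for a single activity.
expectations = ["a ratio compares two quantities multiplicatively"]
misconceptions = ["a ratio is the same as a difference"]
hints = {expectations[0]: "What operation relates the two quantities in a ratio?"}

# A response that restates the expectation fulfills it; no hint is needed.
met, flagged, hint = emt_turn(
    "a ratio compares two quantities multiplicatively",
    expectations, misconceptions, hints)
print(met, hint)
```

An off-target response (e.g., "no idea") would leave the expectation unmet, so the turn would instead return the scripted hint, mirroring the hint-and-prompt cycle described above.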
Figure 2.1
Illustration of the Mechanism of an Interactive, Automated Feedback Cycle through the
Expectation-Misconception Framework
Literature Review
Supporting Teacher Learning with Technologies
Advances in educational technologies have transcended time and location constraints in
traditional teacher PD, providing teachers with new learning opportunities (Borko et al., 2010;
Lay et al., 2020). Asynchronous OPD constitutes the primary form of formal teacher learning
opportunities supported by educational technologies. The distinction between asynchronous OPD
and traditional PD primarily manifests in their modes of delivery. Asynchronous OPD involves
the deployment of pre-structured curricula and learning activities on open-source learning
platforms (e.g., edX, Moodle; Griffin et al., 2018; Sheridan & Wen, 2021; Taranto & Arzarello,
2020; Tzovla et al., 2021) or within self-contained learning systems (e.g., Dash et al., 2012;
Simpson et al., 2022). Asynchronous OPD affords teachers the flexibility to access instructional
content at their convenience, irrespective of time or location. Yet, this intrinsic attribute of
asynchronous learning also complicates real-time interaction between teachers-as-learners and
PD facilitators, making the provision of timely, constructive feedback a
significant challenge (Ginsburg et al., 2004; Powell & Bodur, 2019).
Many existing asynchronous OPD programs rely on training human facilitators, who
primarily guide the overall online learning experience, such as introducing teachers to the
learning platforms and monitoring course completion (Dash et al., 2012; Goldenberg et al., 2014;
Griffin et al., 2018; Masters et al., 2010; Taranto & Arzarello, 2020). Typically, these facilitators
engage in asynchronous interactions in online discussion forums, giving collective feedback
rather than personalized feedback (Goldenberg et al., 2014; Griffin et al., 2018; Ramsdell &
Rose, 2006). For example, in a large-scale asynchronous online course entitled the Seeing Math
project with nearly 30,000 enrollments, facilitators were trained to build a supportive learning
environment and encourage collaboration among teachers by posing questions and comments in
an online discussion forum, but they did not provide individualized feedback on teachers’
performance in online courses (Ramsdell & Rose, 2006). Some other asynchronous OPD
programs have addressed the lack of interaction and feedback by encouraging peer interactions
and feedback in online discussion forums (Hollebrands & Lee, 2020; Kellogg et al., 2014;
Taranto & Arzarello, 2020). While teachers appreciate the flexibility offered by asynchronous
OPD, they have also expressed a desire for increased interaction with peers and PD facilitators,
as well as receiving feedback within these courses (Powell & Bodur, 2019). Research indicates
that providing timely and personalized feedback to teachers can improve their satisfaction with
asynchronous online courses (Reeves & Pedulla, 2011) and assist in the practical implementation
of course content into teachers’ own teaching practices (Ramsdell & Rose, 2006). Evidence from
decades of research on teacher PD also emphasizes the importance of personalized and
constructive feedback in scaffolding teacher learning (Darling-Hammond et al., 2017; Kraft &
Blazar, 2017). However, given the scale of asynchronous OPD programs, offering real-time interaction and personalized feedback through human facilitators is nearly infeasible from both staffing and financial perspectives.
Over the past few decades, AI technologies such as Natural Language Processing (NLP),
Deep Learning (DL), and Machine Learning (ML) have been widely used in educational settings
(Ouyang et al., 2022). AI is generally defined as computational systems that simulate human
intelligence in machines to reason, learn, and act on complex tasks (Chiu et al., 2022; Popenici &
Kerr, 2017; Salas-Pilco & Hu, 2022). Although many AI-based education (AIEd) technologies
have been incorporated in learning management systems to simulate human tutors, analyzing and
predicting student behaviors and performance and offering personalized instruction to students,
relatively few have been designed specifically for teachers (Chiu et al., 2022; Salas-Pilco & Hu,
2022). Current AIEd tools for teachers generally concentrate on offering visual displays or
dashboards of student data within learning systems in classroom settings, with a small subset
providing suggestions for how teachers can respond to their students (Bywater et al., 2019;
Casamayor et al., 2009; d’Anjou et al., 2019; Gerard et al., 2020; Leony et al., 2012; Martinez-Maldonado et al., 2013; Mavrikis et al., 2016; for a review, see Chiu et al., 2022). However, the
development of these tools does not take advantage of the rich body of work that examines what
makes teacher learning opportunities effective. Providing teachers with digested data, without further engaging them in understanding the student conceptions or misconceptions underlying those data or explaining the rationale for system-generated recommendations, constitutes a form of passive learning and offers only minimally effective learning opportunities for teachers (Darling-Hammond et al., 2017; Kennedy, 2016; Yoon et al., 2007).
Studies indicate that most teachers have difficulty analyzing and turning data into meaningful information, owing to limited knowledge about students, subject content, and pedagogy, and a lack of formal training in ways to inquire into and reflect on available data (Mandinach & Gummer, 2013; Schifter et al., 2014; Wayman, 2005). Based on our review of technology-based tools for teacher learning, we have not encountered any formal teacher PD program that leverages intelligent tutoring systems to support mathematics teachers’ learning of CK and PCK by interacting with teachers and automatically providing personalized, timely feedback through a virtual agent based on teachers’ input.
Challenges in Practice and Research on Technology-Enhanced Online Teacher Learning
Our review of current asynchronous OPD programs and AIEd technologies for teacher
learning reveals limitations in both practice and research. In terms of practice, despite
rapid advancements in AIEd technologies, their applications in formal teacher training remain
limited. Asynchronous OPD programs offer flexibility to accommodate teachers’ schedules and
locations. Yet OPD programs struggle to replicate key features of high-quality face-to-face PD, particularly
in providing intensive interactions and personalized feedback (Bragg et al., 2021; Dede et al.,
2009; Taranto & Arzarello, 2020). Moreover, AIEd technologies for teacher learning often fail to
incorporate theoretical perspectives and empirical evidence from traditional teacher PD research
in their design. We posit that an intelligent tutoring system specifically designed for teachers
could address the lack of interaction and feedback in asynchronous online learning environments
while leveraging relevant findings from teaching and teacher education, thereby better
facilitating on-the-job teacher learning.
In terms of research, the existing literature lacks comprehensive evidence of the overall
effectiveness of technology-enhanced systems for teacher learning and the measures used to
evaluate them. A significant disparity exists between the number of technology-enhanced
learning opportunities for teachers and the efforts to evaluate these opportunities using
experimental designs (Bragg et al., 2021; Lay et al., 2020). Many evaluations of technology-enhanced learning opportunities for teachers have primarily focused on teachers’ ratings of system
utility (Collins & Liang, 2015; Mavrikis et al., 2016; Reeves & Pedulla, 2011; Wong et al., 2022)
or teachers’ self-reported changes in knowledge and practice (Dash et al., 2012; Meyer et al.,
2023; Sheridan & Wen, 2021) as indicators of success. However, research has shown a weak
correlation between self-assessed learning and directly assessed learning in workplace training
(Copur-Gencturk & Thacker, 2021; Sitzmann et al., 2010), which raises concerns about the
validity of self-reported instruments in measuring teacher learning in PD. Insufficient attention
has been paid to changes in student performance resulting from teachers’ participation in OPD
programs or exposure to other technology-enhanced learning opportunities (Borko et al., 2010;
Dede, 2006; Ginsburg et al., 2004). When such evaluations were conducted, they generally
revealed no positive impact on student performance (e.g., Dash et al., 2012; Griffin et al., 2018).
The Current Study
The present study contributes to current understanding of asynchronous OPD design and
implementation by (1) creating an interactive, personalized OPD program that provides
opportunities for teachers to interact with and receive personalized real-time feedback from a
virtual facilitator in a dialogue-based intelligent tutoring system (ITS) and (2) measuring the impact of this
asynchronous OPD program with a randomized experiment. By randomly assigning teachers to
the program, we aimed to answer one overarching research question: What is the impact of this
scalable and accessible OPD program on the mathematics performance of the students of
participating teachers?
The Scalable and Accessible OPD Program: A Virtual, Interactive Program with Just-in-Time Feedback
The mathematics performance of many students starts to decline in middle schools
(National Assessment of Educational Progress [NAEP], 2022), yet students’ mathematics
performance in middle school is crucial to determining their future academic success (e.g., Astin
& Astin, 1992; Marsh et al., 2005; National Mathematics Advisory Panel, 2008). This is why we
developed a program specifically for middle school mathematics teachers. To address the
challenges that arise in conveying the content of ratios and proportional relationships to students
and the need for teachers to receive additional support in this area (e.g., Copur-Gencturk et al.,
2022; Izsák & Jacobson, 2017), the program was designed to enhance middle school teachers’
content and pedagogical content knowledge of proportional reasoning. By enhancing teachers’
CK and PCK, we aimed to improve the quality of instruction related to ratios and proportional
relationships and to bolster students’ understanding of these topics. Our theory of change was
supported by prior work that has highlighted the influential role of teachers’ CK and PCK in
shaping the quality of learning opportunities available to their students, as well as their learning
outcomes (Baumert et al., 2010; Copur-Gencturk, 2015). Further, reviews of PD programs have
shown that programs targeting both CK and PCK are more successful in improving student
outcomes than those focusing solely on one of these aspects (Kennedy, 1998; Scher & O’Reilly,
2009).
Content of the Program
The program was developed over a three-year iterative cycle and included two main
modules targeting CK and PCK, respectively (see Table 2.1 for the content and learning
objectives of each module). The CK module was organized into five submodules with a total of
35 activities to enhance teachers’ understanding of the key concepts related to ratios and
proportional relationships (see Figure 2.2). The key concepts were identified based on a review
of the literature (Cramer et al., 1993; Fisher, 1988; Lamon, 2012; Lim, 2009; Lobato et al.,
2010).
Table 2.1
Content of the Program

Content knowledge module
Submodule 1: Proportional Reasoning. The goal of this submodule is to reason multiplicatively and apply the concept of a ratio to solve ratio problems.
Submodule 2: Solving Ratio Problems Using Different Representations. The goal of this submodule is to use proportional reasoning to solve real-world and mathematical problems with multiple representations (e.g., ratio tables, tape diagrams, double number line diagrams, or equations).
Submodule 3: Relationship Between Fractions and Ratios. The goal of this submodule is to understand the relationship between ratios and fractions.
Submodule 4: Proportionality. The goal of this submodule is to understand covariance and invariance and distinguish proportional situations from nonproportional ones.
Submodule 5: Putting It All Together. The goal of this submodule is to apply the concepts of ratios and proportionality to solve real-world math problems.

Pedagogical content knowledge module
Submodule 1: Planning. The goal of this submodule is to understand how to select and adapt math problems to target the key ideas behind learning goals and standards.
Submodule 2: Implementing. The goal of this submodule is to analyze students’ thinking and connect different student strategies, then pose appropriate questions and use appropriate representations to help students develop proportional reasoning.
Submodule 3: Assessing and Reflecting. The goal of this submodule is to examine and reflect on teaching strategies to identify what to improve in instruction to promote students’ understanding of ratios and proportionality.
Figure 2.2
An Example of CK Activity
The PCK module was organized into three submodules corresponding to three stages of
teaching (i.e., planning, implementing, and assessing and reflecting on a lesson), with a total of
18 multiple-stage activities. Teachers were provided with opportunities to enhance elements of
PCK, such as selecting and adapting tasks that were aligned with learning goals, unpacking
students’ work and connecting the mathematical ideas in students’ responses, and reflecting on
their teaching (e.g., Ball et al., 2008; see Figure 2.3 for an example).
Figure 2.3
Sample PCK Activity Focusing on Analyzing the Teaching and Reflecting on the Teaching
Practice
Design of the Program
Two key enactment strategies were utilized in the study: active learning and just-in-time
feedback through interaction with a virtual facilitator. Engaging teachers in active learning is an
important indicator of the effectiveness of professional development programs (Desimone, 2009;
Garet et al., 2001; Haug & Mork, 2021). Instead of passively watching or listening to someone
lecturing, teachers were actively engaged with the materials by solving mathematics problems in
the content knowledge module as well as by watching and analyzing a video clip of mathematics
instruction in the pedagogical content knowledge module (Desimone et al., 2009).
Given that receiving just-in-time feedback is critical for teachers’ learning, particularly
for tailoring the learning process to teachers’ specific needs and existing knowledge (Philipsen et
al., 2019), we took advantage of technologies derived from AutoTutor (Graesser et al., 2004) to
provide real-time feedback to teachers through their conversational interactions with a virtual
facilitator. AutoTutor is a discourse-based intelligent tutoring system that uses pedagogical
virtual agents and dialog-based intelligent tutoring in learning activities. AutoTutor
communicates with learners through a virtual agent that uses natural language (voice or text) and
allows for users’ natural language responses. In the program, we relied on a set of key discourse
moves in the AutoTutor system that were driven by an expectation–misconception tailored
(EMT) dialogue framework (see Nye et al., 2014 for a systematic review of AutoTutor Family
and natural language processing). For each activity, the system stores a set of anticipated correct
answers (expectations), and incorrect/incomplete answers (misconceptions) frequently provided
by teachers, as well as corresponding prompts and hints for each expectation and misconception.
The potential ways teachers would respond to each activity (i.e., anticipated expectations and
misconceptions) were identified by our research team based on prior assessments of mathematics
teacher knowledge, as well as teachers’ input during the pilot study phase. We also developed
corresponding hints and prompts for each expectation or misconception to facilitate teachers’
learning until they mastered all the targeted learning goals. For each activity, the virtual facilitator usually started by asking teachers an open-ended question associated with several expectations and misconceptions. As a teacher answered, the dialogue-based virtual facilitator compared the teacher’s input with the pre-stored expectations and misconceptions using semantic analysis tools (i.e., semantic pattern-matching algorithms) to identify which expectation or misconception the response reflected, and then provided prompts and hints to help the teacher address each of the remaining expectations until all the expectations for that question were met. It is
often the case that teachers’ responses are spread out over many conversational turns as the
virtual facilitator provides hints and prompts to elicit teachers’ thinking and help them move on
(see Table B.1 in the Appendix as an example of a participating teacher’s interactions with the
virtual facilitator for the activity featured in Figure 2.3).
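The matching step described above can be illustrated with a simplified sketch. This is not AutoTutor's actual implementation (which relies on far richer semantic analysis); the expectation, misconception, and hint texts below are hypothetical examples, and a bag-of-words cosine similarity stands in for the system's pattern-matching algorithms.

```python
# Illustrative sketch of an expectation-misconception tailored (EMT) matching
# step. NOT AutoTutor's actual code; the stored texts are hypothetical, and
# cosine similarity over bags of words stands in for its semantic matching.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two texts as bag-of-words vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = math.sqrt(sum(v * v for v in va.values())) * \
           math.sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

EXPECTATIONS = {
    "multiplicative": "the ratio stays constant because both quantities "
                      "are multiplied by the same factor",
}
MISCONCEPTIONS = {
    "additive": "you add the same amount to both quantities to keep the ratio",
}
HINTS = {"additive": "What happens to the ratio if you double both quantities?"}

def classify(response, threshold=0.3):
    """Match a free-text response to the closest expectation/misconception."""
    scored = [(label, cosine(response, text), kind)
              for kind, bank in (("expectation", EXPECTATIONS),
                                 ("misconception", MISCONCEPTIONS))
              for label, text in bank.items()]
    label, score, kind = max(scored, key=lambda t: t[1])
    return (label, kind) if score >= threshold else (None, "unmatched")
```

In the dialogue cycle sketched here, a matched misconception would trigger its stored hint (e.g., `HINTS["additive"]`), and the exchange would continue until every expectation for the question is covered.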
Method
Participants
Given that our goal was to develop a scalable program that would be accessible to any
teacher at any time, we examined the program’s effectiveness by recruiting teachers across the
nation. We acquired publicly available teacher email addresses through an education data company
and distributed the study invitation via email (N = 7,306). Of the initial invitation emails, 485
bounced (6.6%). Our records indicate that 2,639 teachers opened the invitation email, and 427 of them clicked on the survey link. Of these teachers, 187 completed the sign-up survey and expressed interest in the study (43.8%). In addition to disseminating invitations through email, we posted an
anonymous link to the invitation on social media. Fifty-one teachers completed the survey through
the anonymous link. Of all the teachers who completed the signup survey, 165 were eligible for
the study (i.e., full-time teachers who would be teaching mathematics in Grades 6 or 7 in the 2021–
2022 academic year). Because the content targeted in the program is taught in sixth and seventh
grades, we invited those who were teaching at these grade levels and were interested in the study
to attend online meetings with the research team. During the meetings we provided further
information regarding the study and the data collection procedures. Seventy teachers agreed to
take part in the study. We randomly assigned these teachers to the treatment condition (N = 38) or
the business-as-usual condition (N = 32). Teachers in the treatment group were given access to the
program and asked to complete it during the summer of 2021. Teachers in the control group did
not receive any professional support. We then collected data from their students during the
following academic year (i.e., 2021–2022). Table 2.2 details the background and school
characteristics of the teachers who agreed to participate in the study (i.e., the full sample) and those
who completed the study (i.e., the analytic sample).
Table 2.2
Background Characteristics of Teachers in the Study Compared with a Nationwide Sample of U.S. Secondary Public-School Teachers

Teacher or school characteristic    Full sample (%)   Analytic sample (%)   Nationwide sample (%)
Sex^a
  Female                            87.1              88.7                  76.5
  Male                              11.4              9.4                   23.5
  Prefer not to say                 1.4               1.9                   NA
Race/ethnicity^a
  White                             75.7              75.5                  79.3
  Black                             7.1               3.8                   6.7
  Hispanic                          2.9               3.8                   9.3
  Other                             14.3              17.0                  4.6
Highest degree earned
  Bachelor’s                        20.0              18.9                  40.7
  Master’s                          77.1              79.3                  56.8
  Doctorate                         2.9               1.9                   1.3
School region^b
  Midwest                           15.7              17.0                  21.4
  Northeast                         12.9              15.1                  19.5
  South                             37.1              30.2                  40.3
  West                              34.3              37.7                  18.8
School locale
  City                              27.1              24.5                  29.1
  Suburb                            38.6              39.6                  38.7
  Town                              4.3               3.8                   11.9
  Rural area                        30.0              32.1                  20.5
Percentage of students eligible for free or reduced-price lunch
  0 to 25                           24.3              26.4                  18.8
  26 to 50                          25.7              28.3                  26.7
  51 to 75                          21.4              22.6                  23.0
  76 to 100                         28.6              22.6                  29.0

Note. N = 70 for the full sample, and N = 53 for the analytic sample. NA = not applicable. “Nationwide sample (%)” refers to U.S. secondary public-school teachers.
^a From Digest of Education Statistics, https://nces.ed.gov/programs/digest/d20/tables/dt20_209.22.asp. ^b From Common Core of Data: America’s Public Schools, https://nces.ed.gov/ccd/tables/201920_summary_2.asp.
To check whether our randomization was implemented with fidelity, we evaluated whether the demographics of teachers in the treatment and control groups differed significantly. We also compared teachers’ baseline CK and PCK and their students’ baseline mathematics performance prior to the intervention. As Table 2.3 shows, the treatment and control groups of teachers in the analytic sample were not statistically different from each other in terms of gender, race, certification type, highest degree earned, or their baseline CK and PCK. In addition, the
average baseline mathematics scores of students whose teachers were in the treatment group
compared with those in the control group were not statistically different (𝛽 = -0.02, SE = 0.03, p
= 0.63). This result validates our randomization and suggests that any differences we observe in
students’ post-test performance between treatment and control groups are likely due to the
effects of the PD program.
Table 2.3
Descriptive Statistics and Balance for the Analytic Sample

Teacher and student characteristics   Treatment (N = 29)   Control (N = 24)   Difference (T−C)   p-value
Teacher characteristics
Sex
  Female                              .90                  .88                .02 (.09)          .81
  Male                                .07                  .13                −.06 (.08)         .51
Race/ethnicity
  White                               .79                  .71                .08 (.12)          .50
  Black                               .03                  .04                −.01 (.05)         .89
  Hispanic                            .03                  .04                −.01 (.05)         .89
  Other                               .14                  .21                −.07 (.10)         .51
Alternative certification             .17                  .21                −.04 (.11)         .74
Master’s degree                       .83                  .75                .08 (.12)          .50
Baseline CK                           .72                  .69                .03 (.03)          .31
Baseline PCK                          .36                  .36                .00 (.03)          .97
Student characteristic
Pretest score                         .42                  .44                −.02 (.03)         .63

Note. Analyses used t-tests. T−C = treatment–control difference. Statistics for baseline CK and PCK are means of these continuous variables; values for categorical variables represent proportions. The numbers in parentheses are standard errors.
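The balance comparisons summarized in Table 2.3 can be sketched as two-sample t-tests. The sketch below uses simulated scores whose means and SDs merely echo the table's baseline CK row, and it assumes a Welch-style unequal-variance statistic; the study's exact test specification may differ.

```python
# Minimal sketch of a randomization balance check on a baseline measure.
# The teacher scores are SIMULATED for illustration (means/SDs echo
# Table 2.3's baseline CK row); they are not the study's data.
import math
import random
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / math.sqrt(va / len(a) + vb / len(b))

random.seed(1)
treatment_ck = [random.gauss(0.72, 0.12) for _ in range(29)]
control_ck = [random.gauss(0.69, 0.12) for _ in range(24)]
t_stat = welch_t(treatment_ck, control_ck)  # small |t| is consistent with balance
```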
Attrition
During the recruitment, 70 teachers were selected to participate in the study. Of these, we
were not able to collect the student data from 17 teachers. The overall attrition was 24% and
differential attrition was 1.32% (treatment group attrition = 23.68%, N = 9; control group
attrition = 25%, N = 8). Teachers left the study for a variety of reasons, most of which were
unrelated to the study itself: teaching schedule changed (e.g., teachers moving to a grade level at
which students were not learning the concepts targeted in the program), COVID-related health
problems, additional stress and burdens participants reported having because of COVID (i.e., the
district’s administration of additional assessments, which did not leave time for the teachers to
administer our tests to their students). We also experienced attrition at the student level. Of the
1944 students who took the pretest (N = 1078 for teachers in the treatment group, and N = 866
for teachers in the control group), 217 did not take the posttest. The overall attrition rate at the student level was 11% and the differential attrition rate was 1.53% (treatment group attrition =
10.48%, N = 113; control group attrition = 12.01%, N = 104).
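The attrition figures reported above follow directly from the stated counts; as a quick arithmetic check:

```python
# Reproducing the reported attrition figures from the stated counts
# (simple arithmetic; all counts are taken directly from the text).
def attrition_pct(dropped, started):
    return dropped / started * 100

teacher_overall = attrition_pct(17, 70)      # ~24%
treat_arm = attrition_pct(9, 38)             # ~23.68%
control_arm = attrition_pct(8, 32)           # 25.0%
teacher_differential = abs(treat_arm - control_arm)    # ~1.32 points

student_overall = attrition_pct(217, 1944)             # ~11%
student_differential = abs(attrition_pct(113, 1078)
                           - attrition_pct(104, 866))  # ~1.53 points
```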
Attrition can lead to biased estimates of program impacts. To explore the possibility of such bias, we examined whether the teachers in the analytic sample (N = 53) differed statistically from those who dropped out of the study (N = 17) in terms of demographics, professional background, and baseline CK and PCK. Results indicated that these two groups were not statistically different in terms of race (χ²[3, N = 70] = 5.22, p = .16) or gender (χ²[2, N = 70] = 1.14, p = .57). They were not statistically different in terms of
certification (χ²[1, N = 70] = 5.22, p = .45) or highest degree earned (χ²[1, N = 70] = 5.22, p = .22). Moreover, these two groups did not differ in their baseline CK and PCK (M diff = 0.01, t[70] = 0.27, p = .82 for CK; M diff = 0.04, t[70] = 1.30, p = .21 for PCK). At the
student level, the baseline mathematics performance of students who missed the posttest did not differ by the condition to which their teacher was assigned (𝛽 = 0.03,
SE = 0.05, p = .56).
Measures
Student Test
To capture the impact of our program on students’ learning of the concepts targeted in the
program (i.e., ratios and proportional relationships), we developed a measure from items released
from the NAEP, Trends in International Mathematics and Science Study (TIMSS), and the two
Common Core-aligned national assessments, Partnership for Assessment of Readiness for
College and Careers (PARCC) and Smarter Balanced, as well as assessments adapted from prior
research (Fisher, 1988; Van Dooren et al., 2005). We selected items from these large-scale
assessments because these items’ psychometric properties have been validated across a large
national sample of students. The test included 11 multiple-choice and multiple-selection
problems (see Appendix B.3 for the student assessment) in ratios and proportional relationships.
Students’ scores on this measure were computed as the proportion of questions answered correctly (M = 0.43, SD = 0.21 for the pretest, and M = 0.55, SD = 0.22 for the posttest). Although the content of the student test was restricted to the content
targeted in the program, items on the student measures were determined based on the content
standards in ratios and proportional relationships (Common Core State Standards Initiative,
2010). Further, teachers in the treatment group were not given any materials to use in their
teaching.
Students were tested before and after instruction to measure their understanding of ratios
and proportional relationships. The student knowledge measure demonstrated good internal
consistency (reliability), with a Cronbach’s alpha coefficient of .71 for the pretest and .73 for the
posttest. The students’ scores after instruction (i.e., post-test score) were used as the dependent
variable, whereas class-mean centered student pretest scores and the class mean student pretest
scores were used as covariates in the analyses.
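The internal-consistency index reported here, Cronbach's alpha, can be computed from an items-by-students score matrix. A minimal sketch with a small hypothetical set of dichotomous responses (not the study's data):

```python
# Minimal sketch of Cronbach's alpha, the internal-consistency index
# reported above. The dichotomous responses are HYPOTHETICAL
# (3 items x 5 students), not the study's data.
def cronbach_alpha(items):
    """items: one inner list of student scores per test item."""
    k = len(items)
    def svar(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col) for col in zip(*items)]  # per-student total scores
    return k / (k - 1) * (1 - sum(svar(it) for it in items) / svar(totals))

items = [[1, 0, 1, 1, 0],
         [1, 0, 1, 0, 0],
         [1, 1, 1, 1, 0]]
alpha = cronbach_alpha(items)
```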
Treatment
We created a variable indicating whether teachers were assigned to the treatment condition (treatment group = 1; control group = 0).
Teacher’s Content Knowledge Measure
Teachers’ CK was measured by using a subset of items developed by the Learning
Mathematics for Teaching (LMT) Project (Hill et al., 2004). The measure included 25 items,
with a reliability of 0.70 (i.e., Cronbach’s alpha). Before providing teachers in the treatment
group with access to the program, we measured teachers’ CK in both the treatment and control
groups. Teachers’ scores on this measure were also calculated as the proportion of questions answered correctly (M = .71, SD = 0.12 for the
analytic sample). As can be seen in Table 2.3, teachers in the treatment and control groups were
similar in terms of content knowledge.
Teachers’ Pedagogical Content Knowledge Measure
Teachers’ PCK was measured by items developed by the LMT Project (Hill et al., 2004)
and Kersting (2008). Specifically, two multiple-choice items were taken from the LMT Project,
and eight constructed-response questions came from Kersting (2008) (Cronbach’s alpha of the
scale was 0.57). For the constructed-response items, teachers were asked to watch a two- to three-minute-long video featuring authentic mathematics instruction centered on the content targeted
in the program (i.e., ratios and proportional relationships). To provide the necessary context, a
brief description accompanied each video, enabling teachers to understand the instruction
presented. They were then asked to analyze the mathematical understanding of students in the
video clips (i.e., knowledge of students’ mathematical thinking, an element of PCK; Ball et al.,
2008; Copur-Gencturk & Li, 2023; Copur-Gencturk & Tolar, 2022; Shulman, 1986) or provide
suggestions to improve the teaching practices shown in the videos to increase students’
mathematical understanding (i.e., knowledge of mathematics teaching). Teachers’ responses were coded by two independent raters using a four-point rubric based on prior research capturing
teachers’ PCK (see Copur-Gencturk & Li, 2023; see Table B.2 for the scoring rubrics and
sample responses). The data were blinded before coding so that raters were unaware of the
teachers’ group assignments or whether the responses were from the pre- or post-assessment. A
strong agreement between the two raters was reached, as indicated by a kappa statistic of 0.81.
Teachers’ PCK scores used in the analyses were calculated by dividing the total number of
points teachers received by the maximum number of points available (M = 0.36, SD = 0.12 for the analytic sample).
Data Collection Procedure
The randomization of teachers to the treatment or control condition occurred during the
summer of 2021. Teachers in both conditions were asked to complete a survey to measure their
baseline CK and PCK of ratios and proportional relationships. Teachers in the treatment
condition were granted access to the program beginning in the second week of July 2021.
Teachers in the control group were not given any PD and continued with their usual practices
(i.e., the business-as-usual condition). Teachers in the treatment group completed the activities at
their own locations and at any time they preferred. They spent on average 11.1 hours completing
the program (SD = 9.44). Our team was available by phone and email to resolve any technical issues teachers encountered during this period.
In the following academic year (i.e., 2021–2022), we collected data from students of the
participating teachers. To measure students’ baseline understanding of the targeted concepts, a
pretest was administered to students in both conditions before they received instruction on these
concepts. The survey was administered either through Qualtrics or in a paper-and-pencil format.
The same survey was administered after students received instruction on the targeted concepts.
Because of the diverse range of districts involved in the study and the specific requirements
imposed by some districts to submit individual institutional review board (IRB) forms if
students’ demographic information needed to be collected, we obtained only students’ responses
to the mathematics problems, as well as students’ identification and teachers’ names to link
students’ posttests to their pretest data. This approach was adopted to ensure compliance with the
respective district regulations.
Analytic Approach
To estimate the impact of the PD program on student mathematics performance, we used
a two-level hierarchical linear model with students nested within teachers, using STATA 17. We first estimated an unconditional model with no predictors at the student or teacher level to determine the intraclass correlation coefficient (ICC): 34.9% of the variance in students’ post-test scores was associated with teachers (level 2), indicating that a multilevel model was an appropriate approach. We then estimated a two-level model in which students’ posttest
scores were predicted by the condition to which their teacher was assigned, while allowing for
teacher- and student-level covariates in the model. Specifically, in level 1 model, student i’s
post-test score was predicted by students’ class-mean centered pretest scores (𝑝𝑟𝑒𝑡𝑒𝑠𝑡𝑖𝑗 −
𝑝𝑟𝑒𝑡𝑒𝑠𝑡𝑗
̅̅̅̅̅̅̅̅̅̅̅ ) and the grade level (i.e., Grade 6 = 0; Grade 7 = 1) students were in (𝐺𝑟𝑎𝑑𝑒𝑖𝑗). 𝑟𝑖𝑗
represents residuals at level 1 that remain unexplained after taking into account students’ classmean centered pre-test scores and grade level. In level 2 model, we included the following
covariates: the treatment indicator (𝑇𝑗
), defined as either the treatment or the business as usual
condition for that student’s teacher, the class mean pre-test score (𝑝𝑟𝑒𝑡𝑒𝑠𝑡𝑗
̅̅̅̅̅̅̅̅̅̅̅), and teachers’
baseline CK (𝐶𝐾𝑗
) and PCK (𝑃𝐶𝐾𝑗
). We also investigated whether the PD program was equally
effective across grade levels and students’ pre-test performance (i.e., defined as students’
deviation from the class-mean pre-test score). To do so, we incorporated two interactions terms:
𝑇𝑗 ∗ (𝑝𝑟𝑒𝑡𝑒𝑠𝑡𝑖𝑗 − 𝑝𝑟𝑒𝑡𝑒𝑠𝑡𝑗
̅̅̅̅̅̅̅̅̅̅̅), and 𝑇𝑗 ∗ 𝐺𝑟𝑎𝑑𝑒𝑖𝑗 in the model. We initially allowed for random
effects in students’ class-mean centered pretest score and grade level across teachers. However,
the model that allows for two random slopes did not converge5
. We then simplified the model by
removing random effects in 𝛽1𝑗 and 𝛽2𝑗 and assumed that the slope for grade level and students’
pre-test performance were the same across teachers. Below is the final model specification:
Level 1 model (student level):
Y_ij = β_0j + β_1j (pretest_ij − \overline{pretest}_j) + β_2j Grade_ij + r_ij
Level 2 model (teacher level):
β_0j = γ_00 + γ_01 T_j + γ_02 \overline{pretest}_j + γ_03 CK_j + γ_04 PCK_j + μ_0j
β_1j = γ_10 + γ_11 T_j
⁵ The software failed to generate parameter estimates for the random effects.
β_2j = γ_20 + γ_21 T_j
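Although the analysis was run in Stata 17, the modeling steps described above can be sketched in Python with statsmodels. This is an illustrative sketch only: the data below are simulated (the study data are not reproduced here), the variable names are assumptions, and teachers' baseline CK and PCK are omitted for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data sized like the study (53 teachers, ~33 students each);
# every value below is synthetic and illustrative, not the study data.
rng = np.random.default_rng(0)
n_teachers, n_students = 53, 33
teacher = np.repeat(np.arange(n_teachers), n_students)
treat = np.repeat(rng.integers(0, 2, n_teachers), n_students)   # T_j
u0 = np.repeat(rng.normal(0, 0.14, n_teachers), n_students)     # teacher random effect
df = pd.DataFrame({
    "teacher": teacher,
    "treat": treat,
    "pretest": rng.uniform(0.2, 0.9, n_teachers * n_students),
    "grade": rng.integers(0, 2, n_teachers * n_students),       # Grade 6 = 0, Grade 7 = 1
})
# Class-mean centering of the pretest, as in the Level 1 model.
df["pretest_cm"] = df.groupby("teacher")["pretest"].transform("mean")
df["pretest_cwc"] = df["pretest"] - df["pretest_cm"]
df["posttest"] = (0.1 + 0.04 * df["treat"] + 0.58 * df["pretest_cwc"]
                  + 0.9 * df["pretest_cm"] + u0
                  + rng.normal(0, 0.14, len(df)))

# Step 1: unconditional model (no predictors) -> intraclass correlation.
m0 = smf.mixedlm("posttest ~ 1", df, groups=df["teacher"]).fit()
tau00 = m0.cov_re.iloc[0, 0]          # between-teacher variance
sigma2 = m0.scale                     # within-teacher residual variance
icc = tau00 / (tau00 + sigma2)

# Step 2: conditional model with the treatment indicator, covariates, and
# the two cross-level interactions; random slopes are omitted, mirroring
# the final specification adopted after the convergence failure.
m1 = smf.mixedlm("posttest ~ treat * pretest_cwc + treat * grade + pretest_cm",
                 df, groups=df["teacher"]).fit()
print(f"ICC = {icc:.3f}")
print(m1.params.round(3))
```

The same two-step logic (unconditional model for the ICC, then the conditional model) applies regardless of the software used.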
Results
We found that, on average, students taught by teachers in the treatment group scored 0.56
points on the post-test, whereas students taught by teachers in the control group scored an
average of 0.53 points, after we adjusted for students' grade level, their class-mean centered
pre-test performance, their class-mean pre-test performance, and teachers' baseline CK and PCK.
The average marginal effect of the treatment was 0.04 (p = 0.046). To give the treatment effect
practical meaning, we calculated effect sizes using two different approaches. The impact of
the program on students' mathematics scores was 0.18 SD⁶ when the standard deviation of the
control group students' pre-test scores was used. Because the treatment was delivered at the teacher level,
we also estimated the impact of the program by dividing the coefficient by the square root
of the Level 2 variance from the model without predictors, which yielded an effect of 0.28 SD.
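Both effect-size calculations reduce to simple divisions. In the sketch below, the denominators (the control group's pre-test SD and the unconditional Level 2 variance) are hypothetical values back-solved to be roughly consistent with the reported effect sizes; they are not quantities reported in this chapter.

```python
import math

def es_student_units(marginal_effect, sd_control_pretest):
    """Effect size in student SD units (the first approach described above)."""
    return marginal_effect / sd_control_pretest

def es_teacher_units(marginal_effect, tau00_unconditional):
    """Effect size relative to the between-teacher SD, i.e., the square
    root of the Level 2 variance from the unconditional model."""
    return marginal_effect / math.sqrt(tau00_unconditional)

ame = 0.04       # reported average marginal effect of the treatment
sd_pre = 0.22    # assumed control-group pre-test SD (back-solved, illustrative)
tau00 = 0.0204   # assumed unconditional Level 2 variance (back-solved, illustrative)

print(round(es_student_units(ame, sd_pre), 2))   # 0.18
print(round(es_teacher_units(ame, tau00), 2))    # 0.28
```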
As described above, we also explored whether the program's impact differed
depending on students' deviation from their class-mean pre-test score and the grade level they
were in. We did not find significant interaction effects between the treatment and class-mean
centered pretest scores or between the treatment and grade level. However, we may have lacked the
statistical power⁷ to detect whether the coefficients for different student populations were
statistically different from each other. The findings also suggested positive effects of students'
deviation from their class-mean pre-test scores and of their class-mean pre-test scores on students'
⁶ This effect size is calculated by dividing the average marginal effect of the treatment by the standard deviation of the
control group students' pre-test scores.
⁷ The statistical power was 0.8 for detecting a 0.22 SD treatment effect in a model without cross-level
interaction terms and teachers' baseline CK and PCK, based on the sample size, a significance level of 0.05, and two-tailed
hypothesis tests. The power would be less than 0.8 if the four additional covariates were included in the model (i.e.,
the final model).
post-test scores (p < 0.001 for average marginal effects). However, teachers’ baseline CK and
PCK were not significantly related to students’ post-test performance.
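The power figure referenced in footnote 7 can be approximated with the standard minimum detectable effect size (MDES) formula for two-level cluster-randomized designs. This is a rough sketch: the covariate R-squared values below are assumptions chosen for illustration, not reported quantities.

```python
import math
from scipy import stats

def mdes_cluster(J, n, rho, R2_L2=0.0, R2_L1=0.0, P=0.5, alpha=0.05,
                 power=0.80, g=1):
    """Approximate MDES for a two-level cluster RCT with treatment at
    level 2; g is the number of level 2 covariates."""
    dof = J - g - 2
    multiplier = stats.t.ppf(1 - alpha / 2, dof) + stats.t.ppf(power, dof)
    variance = (rho * (1 - R2_L2) / (P * (1 - P) * J)
                + (1 - rho) * (1 - R2_L1) / (P * (1 - P) * J * n))
    return multiplier * math.sqrt(variance)

# 53 teachers, ~33 students per teacher, ICC = 0.349 (reported);
# the covariate R-squared values are assumptions chosen for illustration.
mdes = mdes_cluster(J=53, n=33, rho=0.349, R2_L2=0.81, R2_L1=0.36)
print(round(mdes, 2))  # ~0.22 under these assumed R-squared values
```

Under these assumptions, the formula roughly reproduces the 0.22 SD figure reported in the footnote; different R-squared assumptions would shift the result.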
Table 2.4
Results for the Estimated Effects of the Program on Students' Mathematics Performance

Predictor                                                   Coefficient (SE)
Fixed effects
  Intercept                                                 0.09 (0.06)
  Treatment effect                                          0.02 (0.02)
  Class-mean centered pretest score                         0.58*** (0.03)
  Treatment effect × class-mean centered pretest score      −0.03 (0.04)
  Class-mean pretest score                                  0.91*** (0.08)
  Grade 7 (Grade 6 is the reference category)               0.01 (0.03)
  Treatment effect × Grade                                  0.04 (0.03)
  Baseline content knowledge                                0.06 (0.09)
  Baseline pedagogical content knowledge                    0.02 (0.09)
Random effects
  Variance (intercept)                                      0.003*** (0.001)
  Variance (residual)                                       0.020*** (0.001)

Note. N = 1,727 students and N = 53 teachers. Numbers in parentheses are standard errors.
~p < .10. *p < .05. ***p < .001.
Discussion
Federal policymakers and scholars have recognized that teachers are key to student
learning and that students’ academic outcomes can be improved by enhancing their teachers’
capacity through teacher PD (e.g., Darling-Hammond et al., 2017; Every Student Succeeds Act,
2015). Yet despite the substantial investment of billions of federal and state dollars in teacher PD
(Birman et al., 2007; Picus & Odden, 2011), only a handful of PD programs have led to
statistically significant positive student outcomes (e.g., Jacobs et al., 2007; Lewis & Perry, 2017;
Roschelle et al., 2010). More important, even PD programs that are successful locally fail to
produce similar positive outcomes when they are scaled up (e.g., Garet et al., 2011; Jacob et al.,
2017). This could be due to the challenges of meeting the specific needs of individual teachers
(Desimone & Pak, 2017). To tackle the problem of scaling up high-quality PD programs, we
designed an asynchronous OPD program that could analyze teachers’ responses to learning
activities and provide personalized, timely feedback through a virtual facilitator supported by
NLP. Having previously found that this program was successful in improving teachers'
knowledge and instruction (Copur-Gencturk & Orrill, 2023), in the present study we found
through a randomized experiment that the program also had a significant positive impact on students'
performance.
One major implication of this study is that it provides evidence of a more affordable
solution for scaled PD for teachers. We were able to create a program without a human instructor
by incorporating ITS technologies so that teachers could learn by interacting with a virtual
facilitator and receiving real-time feedback. Although the study was not designed to investigate
which elements of the program mattered most for producing positive outcomes at the student
level, we can speculate on the potential reasons behind this success. We attribute the program's
success to the integration of evidence-based insights about effective teacher learning
opportunities into a conversational intelligent tutoring system. In particular, we spent a
substantial amount of time ensuring that the content of the program (i.e., what teachers learned
in the program) as well as the enactment strategies used (i.e., how teachers learned the targeted
content, such as by analyzing authentic teaching videos and
student work) were informed by research. We investigated prior literature as well as our own
work to identify the nuances in teachers’ CK and PCK so that the learning activities, hints, and
prompts were on point in addressing their needs. As mentioned above, the majority of existing
AI-Ed supported learning systems are not specifically tailored for teachers, thereby failing to
enhance teacher learning effectively (Chiu et al., 2022). The findings of the study build on
evidence that PD programs targeting both key elements of CK and PCK that matter for teaching
and using enactment strategies, such as analysis of student work and classroom instruction, could
lead to improvements in students’ academic performance (Darling-Hammond et al., 2017;
Kennedy, 2019; Scher & O’Reilly, 2009). Additionally, the PD system in the study scaffolds
teachers’ learning process (e.g., self-explanation, reflection) by using an interactive and
constructive tutoring style in natural language. While prior work has shown that AutoTutorrelated systems applied in various mathematics and science domains have facilitated learning
(Graesser et al., 2012), our work contributes to the literature by providing a prototype for using
AI-Ed technologies to develop effective and individualized PD opportunities for teachers. Given
that our study was not designed to investigate which aspects of the program (e.g., the design of
activities, the automated feedback) could have contributed to the positive impact of the program,
future work is needed to investigate the relative contribution of the different design elements to
teachers’ learning.
The goal of this study was to develop and test the feasibility of scaled teacher PD,
given the limited evidence in the current literature. For this reason, the study was
conducted on a sample of mathematics teachers across the nation. However, our study sample
was small, which may have limited the statistical power to detect heterogeneity in treatment effects
across different student populations. Replication studies with larger sample sizes are needed to
provide a more comprehensive understanding of the impact of the program on both teachers and
students. Additionally, collecting data from teachers across the nation created unanticipated
challenges, such as limitations on collecting student demographic information. Requiring student
demographic data would have created additional barriers for our research team in administering
the mathematics test to the students of some teachers. As a result, we could not investigate
whether and to what extent students from different backgrounds benefited from their teachers’
participation in this program. Future work is needed to answer this important question.
It is also important to point out that we encountered sample attrition at both the teacher
and student levels. We did not find significant differences between those who dropped out of the
program and those who remained in the program in terms of key variables, such as teacher
background characteristics or baseline knowledge level. Similarly, students of teachers in both
the treatment and control groups in the analytic sample had similar baseline test scores. While
these checks increase our confidence in the study findings, replication studies are needed to
gauge the impact of the program on students.
We also wish to note that the program studied here is only a prototype. Participating
teachers reported that, in some cases, the system could not recognize the correctness of their
answers and prompted them to explain their responses. This was partly due to the intelligent
tutoring system we used in our program, which was limited in its ability to evaluate the
accuracy of teachers’ responses in relation to the expectations set. With the current advances in
AI, we believe that more accurate analyses of teachers’ responses in relation to the expectations
set for activities are possible.
Conclusion
In sum, the positive impact of this intervention provides a proof of concept for the use of
intelligent virtual agents to analyze teachers’ responses and provide personalized feedback. This
program overcomes the lack of interaction and personalized support that limits current
OPD programs, while remaining accessible to any teacher with Internet access at any time and in
any location. Additionally, the teachers’ responses and the virtual facilitator’s feedback are kept
confidential, which may encourage the participation of teachers who are hesitant to share their
ideas. Finally, the success of our OPD program underscores the importance of cross-disciplinary
research. The development of impactful programs requires collaboration among content experts
who can identify the key concepts for teaching and learning, teacher educators who can lend
their expertise in how teachers learn from PD, and computer scientists who can help leverage the
power of AI in educational settings.
Chapter 3: Perceptions versus Performance: Assessing Teacher Learning in Asynchronous
Online Professional Development
Abstract
Teacher learning in asynchronous online professional development (OPD) is often measured by
self-reported instruments, despite uncertainties regarding teachers' accuracy in their self-assessment of learning. This study explored whether teachers' self-reported gains in content-specific knowledge for teaching mathematics aligned with those measured by direct assessments.
Through quantitative analyses of data collected from 57 middle school mathematics teachers
who participated in a fully asynchronous OPD program, we found no significant correlation
between teachers’ self-reported knowledge gains and those measured by direct assessments.
Additionally, we examined the role of teachers’ use of self-regulated learning (SRL) strategies,
namely organization, elaboration, and monitoring, in their learning from the OPD program by
conducting linear regression analyses. Our results indicated that teachers who frequently
monitored their understanding of the program content demonstrated greater gains in their content
knowledge as measured by direct assessments. However, none of the SRL strategies predicted
the accuracy of teachers’ self-assessments of knowledge gains from the program. These findings
raise concerns about solely relying on self-reported instruments to capture teacher learning in
asynchronous OPD programs and emphasize the need for more robust and accurate outcome
measures in the asynchronous online learning context.
Keywords: Online teacher professional development; Self-reported instruments; Self-regulated
learning; Mathematics teacher knowledge
Introduction
The effectiveness of content-focused⁸ PD is usually determined based on changes in
teacher knowledge and practice measured by empirically validated instruments (Kennedy, 2016).
Asynchronous OPD, a rapidly expanding format of PD, commonly relies on teachers' self-reported data as a measure of program effectiveness (e.g., Meyer et al., 2023; Sheridan & Wen,
2021). While insights into teachers’ perceptions of OPD have implications for its design and
implementation, the validity of self-reported instruments in capturing changes in teachers’
cognitive abilities remains a critical issue. Theoretically, self-reported knowledge reflects an
individual’s confidence or perception of his/her ability while observed knowledge is based on
external evaluations using relatively consistent criteria (Sangster et al., 2013; Sitzmann et al.,
2010). If teachers’ self-reports are not measuring the knowledge that we expect teachers to learn
from the OPD, it will hinder our efforts to understand the OPD impact and why it works or why
it doesn’t work as expected. Empirical research on the validity of teachers’ self-reported
knowledge (e.g., Fütterer et al., 2023; Schmid et al., 2021; Von Kotzebue, 2023) and practice
(Cheng et al., 2023; Kaufman et al., 2016) has revealed small to moderate correlations between
self-reported and observed data. Addressing this discrepancy issue is important for identifying
effective online learning opportunities for teachers and understanding the factors that facilitate or
hinder teacher learning in this context. However, OPD researchers have seldom compared
teachers’ self-reported and directly assessed learning within one setting to understand how
different types of data influence conclusions about OPD impact.
In addition to the validity issue of outcome measures, it is also seldom the case that
research attempts to explore factors related to the discrepancy between teachers’ self-reported
⁸ Content-focused PD programs focus on enhancing teachers' knowledge and practice for teaching the subject
matter.
learning and the learning measured by cognitive assessments, especially in the asynchronous
OPD context. Self-regulated learning (SRL), defined as the ability to manage the learning
process through monitoring and controlling cognition and metacognition (Panadero, 2017;
Winne, 2001), has been identified as an important factor influencing individuals’ learning
outcome in online contexts (Wong et al., 2019; Xu et al., 2023). The underlying mechanism of
how SRL influences learning outcomes is that SRL allows individuals to become more aware of
their learning progress by observing and comparing the achieved outcomes against their desired
outcomes. This enables individuals to take more control of their own learning and make
necessary adjustments to their learning strategies. Within the mechanism, self-assessment can be
understood as a process that teachers carry out to self-regulate their learning (Panadero &
Alonso-Tapia, 2017). Therefore, competent self-regulated learners may have more accurate self-assessments of their learning. However, research has primarily concentrated on the role of
teachers as facilitators of students’ SRL, with much less attention given to in-service teachers as
self-regulated learners (Karlen et al., 2023; Kramarski & Heaysman, 2021).
To address the limitations above, this study aimed to (1) examine the alignment between
middle school mathematics teachers’ self-reported and directly assessed knowledge gains after
participating in an asynchronous OPD program and (2) explore the role of teachers’ SRL in
teachers’ knowledge gains from the OPD program and the accuracy of their self-assessments.
Unlike prior work that focused on teachers’ current level of knowledge to understand the
alignment between self-reported and observed learning from OPD (e.g., Gardner-Neblett et al.,
2020), our study emphasizes knowledge gains. This subtle difference in the focus of the outcome
measure matters because most OPD evaluation studies that rely on teachers’ self-reports
typically ask teachers to report how OPD improved their knowledge and instruction (e.g., Meyer
et al., 2023; Sheridan & Wen, 2021). Therefore, it is important to focus on knowledge gains
when comparing teachers’ self-reports to their performance in cognitive assessments. In addition
to the two main purposes of the study, we also conducted an exploratory analysis to test how
different types of outcome measures relate to the quality of teachers’ mathematics instruction
after the OPD program. If the magnitude of the relationship between teachers’ instructional
quality and the knowledge gains measured by direct assessments is larger than the relationship
with teachers’ self-reported knowledge gains, it may indicate empirically validated cognitive
assessments could better reflect the knowledge that matters for quality instruction. Specifically,
this study aimed to answer:
(1) To what extent do teachers' self-reported gains in content-specific knowledge for
teaching from the asynchronous OPD program align with the gains measured by direct
assessments?
(2) To what extent is teachers’ use of SRL strategies (i.e., organization, elaboration, and
metacognitive monitoring) associated with their self-reported and directly assessed
knowledge gains, as well as the accuracy of their self-assessments?
(3) How is teachers’ instructional quality after the OPD program, as measured by teaching
artifacts along with student work samples, associated with teachers’ self-reported and
directly assessed knowledge gains?
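Under assumed variable names and simulated data, the analyses implied by the first two questions (a bivariate correlation for RQ1 and a linear regression for RQ2) could be sketched as follows. The discrepancy-based accuracy score shown is one plausible operationalization, not necessarily the one used in the study.

```python
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 57  # teachers in the analytic sample

# All variables are simulated stand-ins with assumed scales.
df = pd.DataFrame({
    "self_gain": rng.normal(0.5, 0.15, n),     # self-reported knowledge gain
    "direct_gain": rng.normal(0.08, 0.10, n),  # post minus pre on the direct assessment
    "organization": rng.uniform(1, 5, n),      # SRL strategy-use scores
    "elaboration": rng.uniform(1, 5, n),
    "monitoring": rng.uniform(1, 5, n),
})

# RQ1: alignment between self-reported and directly assessed gains.
r, p = stats.pearsonr(df["self_gain"], df["direct_gain"])

# RQ2: do SRL strategies predict directly assessed gains?
model = smf.ols("direct_gain ~ organization + elaboration + monitoring", df).fit()

# One plausible self-assessment accuracy score: the (negated) absolute gap
# between standardized self-reported and standardized observed gains.
z = lambda s: (s - s.mean()) / s.std()
df["accuracy"] = -(z(df["self_gain"]) - z(df["direct_gain"])).abs()

print(f"RQ1: r = {r:.2f}, p = {p:.3f}")
print(model.params.round(3))
```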
Conceptualizing Self-Regulated Learning
SRL refers to a process of regulating one’s motivation, cognition, and metacognition by
employing different learning strategies (Pintrich, 2000; Winne, 2001). While there are variations
in specifics of SRL theories (Panadero, 2017), research generally recognizes that SRL is
characterized by cognitive and metacognitive activities, which have a positive impact on
knowledge acquisition, retention, and application (Green & Azevedo, 2010; Winne, 2017). Thus,
in this study, we focus on teachers’ employment of cognitive and metacognitive activities to
regulate their learning in an asynchronous OPD program. According to Pintrich (1993, 2004),
cognitive strategies involve one’s use of basic (e.g., rehearsal) and complex strategies (e.g.,
elaboration) for the processing of information from learning materials. In this study, we narrow
our focus on cognitive strategies to more advanced strategies including elaboration strategies
(e.g., paraphrasing, creating analogies) and organization strategies (e.g., notetaking, outlining).
The selection of these two advanced cognitive strategies aligns with the complex nature of the
knowledge (i.e., teachers’ CK and PCK) teachers are expected to learn from the OPD program,
as well as the OPD learning activities designed for teachers' active engagement in solving real-world teaching problems. Specifically, organization strategies involve arranging and integrating
information so that it is easier to process and recall in the long term, which helps learners choose
the appropriate information and establish connections among information to be learned (Pintrich
& Groot, 1990). For example, mapping out mathematical concepts in the content area of ratio
and proportional relationships enables teachers to see the big picture of this content area and
understand the relationships between different mathematical concepts within this content area.
Elaboration strategies involve adding, constructing, and generating meaning to information for
better understanding and memorization (Levin, 1988; Pintrich & Groot, 1990). For example,
relating new teaching strategies (e.g., a tool) introduced in the OPD sessions to the strategies a
teacher already knows will help the teacher integrate the new content with existing knowledge.
Metacognitive strategies refer to the awareness and regulation of one’s cognitive processes
through monitoring and evaluating one’s understanding of the learning content (Pintrich, 2000;
Winne, 2017). Metacognitive strategies enable individuals to take control of their learning
process and make adjustments as needed to improve understanding. For example, during the
learning process, a teacher may regularly identify a mathematical concept in the area of
proportional relationships that is challenging to understand and requires further practice. After
completing a learning activity, a teacher may self-assess and evaluate his/her level of
understanding of the mathematical concept.
Literature Review
Distinction Between Teachers’ Self-Reported and Observed Learning
The skepticism surrounding the use of self-reported instruments to measure an
individual’s cognitive ability has been present across disciplines (Hammond, 2015; Krauskopf &
Forssell, 2018; Sitzmann et al., 2010). Regarding the assessments of teacher knowledge,
researchers generally reported small correlations between teachers’ perceived and observed
knowledge (Fütterer et al., 2023; Hammond, 2015; Krauskopf & Forssell, 2018). For example,
Drummond and Sweeney (2017) compared pre-service teachers’ perceptions about their TPACK
based on a self-reported instrument and teachers' observed TPACK assessed by a well-established knowledge test. The findings showed that the self-reported TPACK measure only had
a small correlation with the performance-based TPACK measure. Other studies have examined
the alignment between teachers’ self-reported and observed instructional practice (e.g., Cheng et
al., 2023; Kaufman et al., 2016). These studies revealed that while teachers are generally more
accurate in reporting the frequency of specific instructional practices (Mayer, 1999; Ross et al.,
2003), their accuracy diminishes when reporting the quality of their instructional practices
(Kaufman et al., 2016; Stecher et al., 2006).
Recently, scholars have started to pay attention to the alignment between teachers' self-reported learning and observed learning captured by direct assessments
during teacher PD (Copur-Gencturk & Thacker, 2021; Gardner-Neblett et al., 2020). Copur-Gencturk and Thacker (2021) reported a nonsignificant, near-zero correlation between
self-reported and direct assessments of mathematical content knowledge gains among 545
teachers from 24 traditional yearlong face-to-face PD programs. In a study on a two-hour
asynchronous OPD program for improving early childhood teachers’ knowledge of dual
language learners, Gardner-Neblett et al. (2020) found that teachers’ self-reported and observed
knowledge showed no significant correlation before the OPD intervention and a weak positive
correlation after the OPD intervention. The discrepancy between these two measures indicates
that what early childhood teachers know about young dual language learners may differ from
what they believe they know. However, the comparison between self-reported and observed
learning in Gardner-Neblett et al. (2020) was focused on the current level of teacher knowledge
rather than the gains in the teacher knowledge. Clarifying the focus of knowledge measures is
important as the accuracy of self-assessments could vary depending on whether the measure is
capturing the current state of knowledge or changes in knowledge. Self-assessments of
knowledge gains, which require a comparison between one’s current and previous level of
knowledge, are inherently more complex than merely assessing current knowledge. In Sitzmann
et al.’s (2010) meta-analysis, when the self-assessment measure was focused on the current level
of knowledge, a moderate correlation was found between self-assessments and observed
cognitive knowledge. However, this correlation vanished when the self-assessment measure focused
on knowledge gains. In many studies measuring the impact of OPD or face-to-face PD programs
based on teachers’ self-reports, teachers are commonly asked to evaluate the changes in their
knowledge or practice attributable to the program (e.g., Garet et al., 2016; Griffin et al., 2018;
Sheridan & Wen, 2021), rather than rating their current level of knowledge or practice. Thus, it is
more practically meaningful to focus on knowledge gains when comparing self-assessments and
direct assessments.
Overall, attention to the measures used to assess teacher learning during OPD is lacking.
Examining the relationship between teachers’ self-reported and observed learning across various
PD modalities is important, as prior literature indicates the accuracy of self-assessed knowledge
can vary based on instructional delivery methods (Sitzmann et al., 2010). Sitzmann et al.'s (2010)
study found that the correlation between self-assessed learning and observed learning was much
stronger in face-to-face instruction, compared to the correlation in web-based instruction in
education and workplace training.
Self-Regulated Learning in Teacher PD
Self-regulated learning has been extensively studied and recognized as an important
factor influencing students’ learning outcomes in online K-12 and higher education (Broadbent
& Poon, 2015; Wong et al., 2019; Xu et al., 2023). In the context of work-related training, a
meta-analysis revealed that SRL constructs collectively explained 17% of the variance in
learning outcomes, when controlling for cognitive ability and pretraining knowledge (Sitzmann
& Ely, 2011). However, the exploration of how teachers, as working professionals, manage their
learning in online environments (both synchronous and asynchronous) remains limited. Among
studies that investigate teachers’ SRL, only a few have attempted to explore the relationship
between teachers’ SRL and their learning outcomes in online contexts (e.g., Fan et al., 2021;
Huang et al., 2021; Tasar & Imer Cetin, 2021). For example, Huang et al. (2021) explored how
68 pre-service teachers’ metacognitive monitoring (i.e., one aspect of SRL) related to their
development of technological pedagogical content knowledge (TPACK) in a computer-based
learning system. They found that teachers with higher competence in metacognitive monitoring
reported more improvement in their TPACK than teachers with less competence in
metacognitive monitoring. Fan et al. (2021) examined how teachers with different course
performance differed in their choice of learning tactics across lesson units in a Chinese MOOC
course on flipped classroom teaching. In the study, Fan and colleagues used teachers’ total scores
after the course to represent teacher learning in the MOOC course without controlling for
teachers’ baseline knowledge level. Their study found teachers in the high-performing group
demonstrated higher level of SRL by adjusting their learning tactics according to the needs of
different course units, while low-performing teachers showed no significant adjustments in their
tactic usage.
Despite preliminary positive findings about the role of teachers’ SRL, research is limited
in a few aspects. First, current studies have been conducted in controlled laboratory
environments, such as computer labs, where teachers engaged with learning materials on
computers synchronously, typically for less than an hour (e.g., Huang et al., 2021; Huang &
Lajoie, 2021; Kramarski & Michalsky, 2009). This context raises concerns, as learner behavior
in a supervised, short-duration setting may differ significantly from that in unsupervised
conditions. Second, few studies have attempted to understand the role of teachers' SRL in their
development of content knowledge for teaching, such as CK and PCK. The traditional content-focused OPD literature predominantly focuses on identifying program-level factors, such as the
utility of technologies and the relevance of training materials, that might influence teachers’
learning (Bragg et al., 2021; Powell & Bodur, 2019). However, SRL, a concept stemming from
the field of psychology, is seldom investigated in traditional PD literature. Given the highly
autonomous nature of fully asynchronous OPD programs, teachers’ ability to self-direct their
learning may matter more in such programs than in traditional face-to-face PD programs. Thus,
understanding teachers’ SRL in their learning of content knowledge for teaching in fully
asynchronous OPD is necessary and could inform program designers on how to scaffold teacher
learning in these programs. Additionally, from a theoretical perspective, SRL is driven by self-assessments through monitoring and evaluation (Panadero, 2017; Winne, 2017). Thus, teachers
who consistently monitor and evaluate their comprehension of program content might provide
more accurate self-assessments of learning from the program. However, we have not found any
empirical studies testing this hypothesis, as most studies have relied on either self-reported
measures or direct assessments to capture teacher learning, rather than using a combination of
both.
Method
Context of the Study and Participants
This study is part of a broader research project aiming to develop an asynchronous OPD
program for enhancing middle school mathematics teachers’ CK and PCK for teaching
proportional relationships. The OPD program was implemented within a dialogue-based ITS that
provides teachers with opportunities to interact with and receive timely personalized feedback
from a virtual facilitator (see Chapter 2, page 49, for a detailed description of the content and
design of this OPD program). The data used in this study were collected from middle school
mathematics teachers teaching grade 6 or 7 across the United States. Before recruiting
participants, the study design and data collection procedures were reviewed and approved by the
authors’ Institutional Review Board (IRB). With IRB approval, we recruited teachers across the
United States by contacting them via school emails obtained from an education research
company for a fee or through an anonymized link posted on social media. We invited qualified
teachers (i.e., those able to complete our OPD program in the summer of 2021 and teaching grade 6 or 7
in the academic year of 2021-2022) to participate in this study. These teachers were asked to
complete an initial survey that included a consent form outlining the purpose of the study,
research activities participants would complete, the compensation, the confidential nature, and
benefits and potential risks of the study. Participation was entirely voluntary, and participants
could withdraw at any time without penalty. Data were only collected from teachers who gave
consent. These teachers were then given access to the OPD program in the summer of 2021. The
analytical sample included 57 teachers who provided self-assessments of their learning
improvement from the program and completed both pre-and post-assessments on their
knowledge. Most of the teachers in the analytical sample were identified as female (82.46%) and
White (70.18%), which was similar to the demographic characteristic of secondary public-school
teachers in the United States. Over half of the teachers have a specialized teaching credential in
math (57.89%) and had experience taking OPD programs (61. 40%) (see Table 3.1 for teacher
characteristics in our sample and the national-wide sample in the U.S.).
Table 3.1
Teacher and School Characteristics in the Analytic Sample and Nationally Representative Sample

Teacher characteristics         N    Analytic sample (%)   Nationwide sample of U.S. secondary
                                                           public-school teachers (%)
Gender^a
  Male                          10   17.54                 23.50
  Female                        47   82.46                 76.50
Race^a
  White                         40   70.18                 79.30
  Black                          5    8.77                  6.70
  Hispanic                       5    8.77                  9.30
  Other                          7   12.28                  4.60
Highest degree earned^a
  Bachelor's                    14   24.56                 40.70
  Master's                      42   73.68                 56.80
  Doctorate                      1    1.75                  1.30
Years of teaching experience
  Less than 3                   21   36.84                  7.30
  3 to 9                         8   14.04                 29.10
  10 to 20                      18   31.58                 37.30
  Over 20                       10   17.54                 26.30
Math teaching credential                                   NA
  No math credential            24   42.11
  Having math credential        33   57.89
Prior experience taking OPD                                NA
  No                            22   36.84
  Yes                           35   61.40

Note.
^a From Digest of Education Statistics,
https://nces.ed.gov/programs/digest/d20/tables/dt20_209.22.asp.
Measures
Direct Assessments of Teachers’ Content Knowledge for Teaching
Teachers’ CK and PCK were measured using identical items and coding rubrics, as
detailed in Paper 2 (refer to pages 59-60). Teachers’ CK and PCK scores were calculated by
dividing the total number of points a teacher received by the maximum number of points
available (i.e., on a 0-1 scale). Teachers took the CK and PCK test prior to and after
completing the OPD program (repeated measures). Changes in scores from pre-test to post-test
indicated teachers’ observed gains in CK and PCK (see Table 3.2 for descriptive statistics).
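As an illustration of this scoring scheme, the short sketch below computes a proportion-correct score and the pre-to-post gain; the item point values are invented, and this is a sketch of the described procedure rather than the study's actual scoring code.

```python
def proportion_score(points_earned, points_available):
    """Return a 0-1 score: total points earned over total points available."""
    return sum(points_earned) / sum(points_available)

def observed_gain(pre_items, post_items, max_points):
    """Post-test score minus pre-test score on the same item set."""
    return (proportion_score(post_items, max_points)
            - proportion_score(pre_items, max_points))

# Hypothetical 4-item assessment, each item worth 2 points.
max_points = [2, 2, 2, 2]
pre = [1, 2, 0, 1]   # pre-test score = 4/8 = 0.50
post = [2, 2, 1, 1]  # post-test score = 6/8 = 0.75
gain = observed_gain(pre, post, max_points)  # 0.25
```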
Self-Reported Gains in Content Knowledge for Teaching
To ensure content alignment between the knowledge measured by the self-reported
instrument and direct assessments, we asked teachers to report the types of knowledge they
elicited when answering the items in direct assessments. Specifically, we asked teachers to rate
the extent to which this OPD program improved their knowledge of (a) ratio and proportional
relationships (i.e., CK), (b) understanding of how children think and learn about ratios and
proportional relationships (i.e., one dimension of PCK), and (c) knowledge of instructional
representations and strategies to teach ratios and proportional relationship (i.e., the other
dimension of PCK). Items were based on a 5-point scale ranging from 1 (Not at all) to 5 (Great
extent). Teachers’ scores on the CK survey item indicated their self-reported gains in CK.
Teachers’ average scores on the two PCK survey items indicated their self-reported gains in PCK
(Cronbach’s alpha = 0.80 for PCK survey items).
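As an illustration, the following sketch computes a per-teacher self-reported PCK gain (the mean of the two survey items) and Cronbach's alpha for a two-item scale; the ratings are hypothetical, and this is not the study's analysis code.

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """Cronbach's alpha for a list of items, each a list of ratings
    from the same respondents in the same order."""
    k = len(items)
    totals = [sum(ratings) for ratings in zip(*items)]
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# Hypothetical 1-5 ratings on the two PCK items from four teachers.
pck_item1 = [4, 3, 5, 2]
pck_item2 = [4, 2, 5, 3]
# Each teacher's self-reported PCK gain is the mean of the two items.
self_reported_pck = [(a + b) / 2 for a, b in zip(pck_item1, pck_item2)]
alpha = cronbach_alpha([pck_item1, pck_item2])
```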
Test-Survey Alignment Measure
We first standardized teachers’ scores for their CK and PCK gains in both the self-reports and
the direct assessments. We then calculated the test-survey alignment by taking the absolute value
of the difference between teachers’ standardized test scores and their standardized survey scores.
Thus, an alignment score close to zero indicates greater consistency between teachers’ self-reported
knowledge gains and directly assessed knowledge gains (see Table 3.2 for descriptive
data).
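The alignment computation described above can be sketched as follows (the gain values are invented for illustration):

```python
def standardize(xs):
    """Convert scores to z-scores using the sample standard deviation."""
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5
    return [(x - m) / sd for x in xs]

def alignment(test_gains, survey_gains):
    """Absolute difference between standardized test and survey gains;
    values near zero indicate closer test-survey agreement."""
    return [abs(t - s)
            for t, s in zip(standardize(test_gains), standardize(survey_gains))]

# If the survey gains are just a linear rescaling of the test gains, the
# z-scores coincide and every alignment score is zero.
scores = alignment([0.1, 0.2, 0.3], [2.0, 3.0, 4.0])
```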
Quality of Mathematics Instruction
To evaluate the quality of teachers’ instructional practice on ratio and proportional
relationships after they completed the OPD program, we collected four sets of classroom artifacts
that teachers used in their instruction. Each set of artifacts included the task(s) teachers used in
teaching, six student work samples chosen by teachers to represent different levels of student
understanding, and a cover sheet (see Appendix C.1) with a set of questions asking teachers
about their rationale for selecting the task(s), the key concepts targeted in the task(s), and their
analyses of students’ levels of understanding. Prior studies have shown that classroom artifacts
closely align with observed teaching practices and can serve as a stable and reliable measure
of instructional quality (Baumert et al., 2010; Borko et al., 2005; Boston, 2012; Clare &
Aschbacher, 2001; Joyce et al., 2018; Matsumura et al., 2002). For example, Matsumura and her
colleagues (2002, 2008) at the National Center for Research on Evaluation, Standards, and
Student Testing (CRESST) investigated the validity of assessing instructional quality
through classroom artifacts. They found that the correlation between overall ratings of observed
instruction and classroom artifacts was 0.68 (p < 0.01) in mathematics. The number of
classroom artifact sets we collected was based on previous research indicating that four
teaching tasks, along with three sets of student work for each task, can provide a reliable
estimate of instructional quality when evaluated by two raters (Matsumura et al., 2008).
The artifacts were coded on three dimensions of high-quality mathematics
instruction: the selection of cognitively demanding tasks, the maintenance of the cognitive
demands of the task during teaching, and the provision of a coherent mathematics learning
experience for conceptual understanding. The cognitive demand of the task was evaluated based
on the potential of the tasks teachers selected to help students develop a rich mathematical
understanding, using a 4-point rubric (for the scoring rubric, see Table C.2 in the Appendix) from
the Instructional Quality Assessment (IQA) protocol (Boston, 2012). The maintenance of the
cognitive demands of the task during teaching was evaluated based on six samples of students’
written work for each task and coded using another 4-point rubric from the IQA protocol,
capturing the extent to which the tasks provided opportunities for students to engage in
mathematically rich understanding (for the rubric, see Table C.3 in the Appendix). To evaluate
whether teachers provided a coherent learning experience for students’ conceptual understanding,
we coded the cover sheets teachers provided for each task and their analyses of the associated
student work, using a 4-point rubric developed by the research team to capture the consistency
among teachers’ planning (i.e., the task, rationales for task selection, and key concepts targeted),
implementation (i.e., student work samples), and expectations (i.e., teachers’ analysis of
students’ work to evaluate their understanding) (for the rubric, see Table C.4 in the Appendix).
The data were blind coded by two independent raters, and any differences in coding were
discussed until agreement was reached. The Cohen’s kappa statistics for the three dimensions
ranged from 0.72 to 0.75, indicating substantial agreement between the two raters. The
reliability (i.e., Cronbach’s alpha) for the three dimensions ranged from 0.74 to 0.88. For each
teacher, we calculated the average score for each dimension of instruction across the four sets of
artifacts; these averages were then used in further analyses to indicate teachers’ performance on
the three dimensions of their teaching practice.
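For reference, the inter-rater agreement statistic reported here (Cohen's kappa) can be computed from two raters' categorical codes as in the sketch below; the rubric scores shown are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters
    assigning categorical codes to the same items."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Expected agreement under independence, from the raters' marginals.
    expected = sum(c1[c] * c2[c] for c in c1.keys() | c2.keys()) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical 4-point rubric scores from two raters on eight artifacts.
r1 = [1, 2, 3, 4, 2, 3, 1, 4]
r2 = [1, 2, 3, 4, 2, 2, 1, 4]
kappa = cohens_kappa(r1, r2)
```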
Self-Regulated Learning Strategies
We measured three aspects of SRL strategies (organization, elaboration, and
metacognitive regulation) through a self-reported survey administered after teachers completed
the OPD program. The survey included ten Likert-type items adapted from Pintrich’s (1990)
Motivated Strategies for Learning Questionnaire (see Table C.5 in the Appendix for all items).
Each item was rated on a 1-4 scale (1 = strongly disagree; 4 = strongly agree), asking teachers to
indicate their level of involvement in different SRL activities. Specifically, teachers’ use of
organization strategies was measured by three items with a reliability of 0.72 (e.g., “I review my
notes and try to find the most important content in this online PD”). Three items captured
teachers’ use of elaboration strategies, with a reliability of 0.65 (e.g., “When studying for this
online PD, I try to relate the content in the online PD to what I already know”). Metacognitive
regulation was measured by four items with a reliability of 0.78 (e.g., “I monitor and evaluate
what I understand by pausing at a regular interval or whenever needed while studying for this
online PD”). A confirmatory factor analysis was conducted to test the hypothesized latent
structure of the SRL scales. The analysis showed that the hypothesized model fit the data
reasonably well (comparative fit index [CFI] = 0.96; root mean square error of approximation
[RMSEA] = 0.06), based on the rules of thumb for CFI and RMSEA recommended by Hu and
Bentler (1999) (CFI > 0.95; RMSEA ≤ 0.06). Teachers’ average scores
across items on each SRL strategy were used in the analyses to indicate their level of proficiency
in that aspect of SRL.
Table 3.2
Descriptive Statistics for the Measures Used in the Analyses

Measures                                               Mean   Std. dev.   Min     Max
Teacher content knowledge for teaching
  Pre-CK^a                                             0.72   0.13         0.28   0.92
  Post-CK^a                                            0.76   0.13         0.28   0.96
  Pre-PCK^a                                            0.35   0.11         0.12   0.62
  Post-PCK^a                                           0.39   0.14         0.08   0.73
  Observed gains in CK^a                               0.04   0.10        -0.28   0.28
  Observed gains in PCK^a                              0.03   0.10        -0.19   0.23
  Self-reported gains in CK^a                          3.39   0.75         2.00   4.00
  Self-reported gains in PCK^a                         3.32   0.71         2.00   4.00
Test-survey alignment
  CK^a                                                 1.18   0.97         0.04   4.39
  PCK^a                                                0.96   0.81         0.07   3.43
Teachers' self-regulated learning strategies
  Organization^a                                       2.78   0.74         1.00   4.00
  Elaboration^a                                        3.32   0.52         2.00   4.00
  Metacognitive regulation^a                           3.17   0.55         1.75   4.00
Teachers' instructional practice
  Potential cognitive demand of the tasks^b            3.00   0.49         2.25   4.00
  Cognitive demand of the tasks during instruction^b   2.52   0.61         1.50   3.75
  Coherent mathematics for conceptual understanding^c  2.31   0.87         1.00   4.00

Note. CK = content knowledge; PCK = pedagogical content knowledge.
^a N = 57; ^b N = 28; ^c N = 27.
Analytical Approach
To examine the alignment between teachers’ self-reported and directly assessed CK
gains, I first fit a linear regression model in which teachers’ observed CK gains were
predicted by their self-reported CK gains. To explore the relationship between teachers’
SRL strategies and their observed CK gains, I then added the three SRL strategies to the
regression model. The full model⁹ is specified as:
CK_observedgains_i = β₀ + β₁·CK_selfreportgains_i + β₂·Organization_i + β₃·Elaboration_i + β₄·Metacognition_i + ε_i   [1]

Below is the model specification for exploring the relationship between teachers’ self-reported
CK gains and SRL strategies:

CK_selfreportgains_i = β₀ + β₁·CK_observedgains_i + β₂·Organization_i + β₃·Elaboration_i + β₄·Metacognition_i + ε_i   [2]
We followed the same procedure to examine the alignment between teachers’ self-reported and
directly assessed PCK and the relationship between SRL strategies and PCK gains measured by
self-reports and direct assessments. Below are the full model specifications:
PCK_observedgains_i = β₀ + β₁·PCK_selfreportgains_i + β₂·Organization_i + β₃·Elaboration_i + β₄·Metacognition_i + ε_i   [3]

PCK_selfreportgains_i = β₀ + β₁·PCK_observedgains_i + β₂·Organization_i + β₃·Elaboration_i + β₄·Metacognition_i + ε_i   [4]
To examine whether teachers’ use of SRL strategies is related to the accuracy of
teachers’ self-reported knowledge gains, we conducted linear regression models in which the
test-survey alignment (i.e., the absolute value of the difference between standardized self-reported
and observed knowledge gains) is predicted by teachers’ use of the three SRL strategies, for CK
and PCK respectively. Below are the model specifications for the estimation of the test-survey
alignment of CK and PCK:

AlignmentCK_i = β₀ + β₁·Organization_i + β₂·Elaboration_i + β₃·Metacognition_i + ε_i   [5]
9 We estimated robust standard errors across all linear regression models in the study.
AlignmentPCK_i = β₀ + β₁·Organization_i + β₂·Elaboration_i + β₃·Metacognition_i + ε_i   [6]
To explore how teachers’ self-reported and directly assessed knowledge gains each relate
to the quality of teachers’ instruction, we then conducted six sets of ordinary least squares
regression analyses (i.e., three dimensions of instructional quality, each regressed on the CK and
then on the PCK measures). In each model, a dimension of instructional quality is predicted by
teachers’ self-reported knowledge gains and observed knowledge gains, while controlling for
teachers’ baseline knowledge. By adjusting for teachers’ baseline knowledge, we could estimate
the relationship between knowledge gains and instructional quality among teachers with the
same starting point. Below are the model specifications
for examining the relationship between teachers’ learning gains in CK and the quality of
instructional practice:
Taskpotential_i = β₀ + β₁·Selfreported_CKgains_i + β₂·Observed_CKgains_i + β₃·BaselineCK_i + ε_i   [7]

Cognitive_demands_in_teaching_i = β₀ + β₁·Selfreported_CKgains_i + β₂·Observed_CKgains_i + β₃·BaselineCK_i + ε_i   [8]

Coherent_math_teaching_i = β₀ + β₁·Selfreported_CKgains_i + β₂·Observed_CKgains_i + β₃·BaselineCK_i + ε_i   [9]
Below are model specifications for examining the relationship between teachers’ learning gains
in PCK and the quality of instructional practice:
Taskpotential_i = β₀ + β₁·Selfreported_PCKgains_i + β₂·Observed_PCKgains_i + β₃·BaselinePCK_i + ε_i   [10]

Cognitive_demands_in_teaching_i = β₀ + β₁·Selfreported_PCKgains_i + β₂·Observed_PCKgains_i + β₃·BaselinePCK_i + ε_i   [11]
Coherent_math_teaching_i = β₀ + β₁·Selfreported_PCKgains_i + β₂·Observed_PCKgains_i + β₃·BaselinePCK_i + ε_i   [12]
It is worth noting that the sample size in the above analyses is small, which limited the
statistical power¹⁰ to reject the null hypothesis at the significance level of 0.05 if the null is
false. A recommended sample size for a power of 0.8 with three or four predictors (i.e., the
number of predictors in the above analyses) in multiple regression is 218 and 311, respectively
(Maxwell, 2000), which is much larger than the sample size in this study (N = 57 for the models
specified in equations 1-6; N < 30 for the models specified in equations 7-12). Thus, the
significance level of the p-value was not adjusted to correct for an increased risk of Type I error,
even though multiple null hypothesis significance tests were conducted in the study (Armstrong,
2014; Bender & Lange, 2001; Feise, 2002). Considering that the p-value is affected by sample
size and is limited in conveying practical meaning, reporting “effect magnitude” (i.e., a measure
of practical significance; Kirk, 1996, p. 748) along with p-values has been recommended by
many researchers to provide a more informative and meaningful explanation of research findings
(Hojat & Xu, 2004; Kirk, 1996; Pogrow, 2019; Sullivan & Feinn, 2012). Two advantages of
effect size measures are that they (1) reflect the magnitude of the association and (2) are not
influenced by sample size (Pogrow, 2019). In this study, the proportion of variance uniquely
accounted for by each predictor¹¹, over and above that of all other predictors (i.e., Cohen’s f²;
Cohen, 1988; Selya et al., 2012),
¹⁰ A post-hoc power analysis in G*Power 3.1 (Faul et al., 2009), based on a significance level of 0.05, the sample
size, and the number of predictors, revealed that the power was 0.59 for detecting a medium effect size (Cohen’s
f² = 0.15) and 0.11 for detecting a small effect size (Cohen’s f² = 0.02) for the models specified in equations 1 to 4.
Similar power analyses were conducted for the models specified in equations 5-12.
¹¹ It is calculated as f² = (R²_AB − R²_A) / (1 − R²_AB), where B is the predictor of interest (e.g., metacognition),
A is the set of all other variables, R²_AB is the proportion of variance explained by A and B together, and R²_A is
the proportion of variance explained by A alone.
was reported as a measure of effect size, along with p-values, to convey both the practical and
statistical significance of the findings.
Results
Tables 3.3 and 3.4 display the results of the association between self-reported and
observed knowledge gains, as well as the relationships between these two types of outcome
measures and SRL strategies. According to the results of Model 1 in Table 3.3, teachers’
self-reported CK gains were not statistically significantly associated with their observed CK gains
on the direct assessments (p = 0.21). Model 1 in Table 3.4 shows the relationship between
teachers’ self-reported PCK gains and observed PCK gains; similarly, the association was not
statistically significant (p = 0.14). These results indicate that teachers’ perceptions of their
learning gains in the OPD program were not well aligned with the learning gains captured by the
direct assessments.
In terms of the relationship between SRL strategies and CK gains, the results of Model 2
in Table 3.3 show a statistically significant positive association between monitoring and
teachers’ CK gains as measured by direct assessments (p = 0.002). Additionally, 17% of the
variance in observed CK gains (a medium effect) was uniquely accounted for by teachers’ use of
monitoring strategies. Together, these results indicate that teachers who engaged more in
monitoring and evaluating their understanding of the OPD program content (i.e., metacognitive
regulation) demonstrated greater CK gains, as measured by direct assessments. Organization
and elaboration strategies were not statistically significant predictors of teachers’ observed CK
gains and had small effect sizes. Additionally, the results of Model 4 in Table 3.3 indicate that
none of the SRL strategies significantly predicted teachers’ self-reported CK gains, and each
explained a very small amount of variance in the outcome.
Similarly, none of the SRL strategies significantly facilitated teachers’ learning of PCK,
regardless of how it was measured, and their effects had limited practical meaning based on the
variance explained (see Model 2 and Model 4 in Table 3.4). Lastly, we did not find any evidence
supporting a role of SRL strategies in the accuracy of teachers’ self-assessments of their gains in
either CK or PCK.
Table 3.3
Self-Regulated Learning and Teachers’ Knowledge Gains in CK

                      Observed Gains                   Self-Reported Gains              Alignment
Measures              Model 1   Model 2   Cohen's f²   Model 3   Model 4   Cohen's f²   Model 5   Cohen's f²
Self-reported gains   -0.02     -0.02     0.03
                      (0.02)    (0.02)
Observed gains                                         -1.49     -1.50     0.03
                                                       (1.08)    (1.22)
Organization                    -0.01     0.01                   0.17      0.02         0.03      0.00
                                (0.02)                           (0.18)                 (0.21)
Elaboration                     -0.05     0.06                   0.15      0.01         -0.44     0.04
                                (0.03)                           (0.22)                 (0.37)
Monitoring                      0.09**    0.17                   -0.00     0.00         -0.35     0.03
                                (0.03)                           (0.25)                 (0.27)
Intercept             0.12~     0.02                   3.44***   2.48**                 3.67**
                      (0.07)    (0.10)                 (0.10)    (0.78)                 (1.17)
Observations          57        57                     57        57                     57
R-squared             0.04      0.21                   0.04      0.09                   0.13

Note. Coefficient estimates are reported with corresponding robust standard errors in parentheses;
Cohen's f² values are reported for Model 2, Model 4, and Model 5.
~ p < .10. * p < .05. ** p < .01. *** p < .001.
Table 3.4
Self-Regulated Learning and Teachers’ Knowledge Gains in PCK

                      Observed Gains                   Self-Reported Gains              Alignment
Measures              Model 1   Model 2   Cohen's f²   Model 3   Model 4   Cohen's f²   Model 5   Cohen's f²
Self-reported gains   0.03      0.03      0.03
                      (0.02)    (0.02)
Observed gains                                         1.45      1.24      0.03
                                                       (1.01)    (0.99)
Organization                    0.01      0.00                   0.00      0.00         0.06      0.00
                                (0.02)                           (0.18)                 (0.14)
Elaboration                     0.02      0.01                   0.15      0.01         0.11      0.00
                                (0.03)                           (0.21)                 (0.20)
Monitoring                      -0.01     0.00                   0.26      0.03         -0.03     0.00
                                (0.03)                           (0.22)                 (0.28)
Intercept             -0.07     -0.11                  3.27***   1.97**                 0.53
                      (0.07)    (0.12)                 (0.10)    (0.70)                 (1.01)
Observations          57        57                     57        57                     57
R-squared             0.04      0.06                   0.04      0.11                   0.01

Note. Coefficient estimates are reported with corresponding robust standard errors in parentheses;
Cohen's f² values are reported for Model 2, Model 4, and Model 5.
~ p < .10. * p < .05. ** p < .01. *** p < .001.
Results presented in Tables 3.5 and 3.6 indicate how teachers’ self-reported and observed gains
in CK and PCK were related to each dimension of teachers’ instructional quality after
participating in the OPD program. As the results of Model 1 in Table 3.5 show, both teachers’
self-reported and observed CK gains were significantly associated with teachers’ selection of
cognitively demanding tasks in lesson planning (p = 0.02 for self-reported CK gains; p = 0.01 for
observed CK gains). In terms of practical significance, observed CK gains explained about 10%
more variance in the outcome than teachers’ self-reported CK gains did. In contrast, only
teachers’ observed CK gains were statistically significantly related to the cognitive demands of
the tasks during enactment (Cohen’s f² = 0.23, p = 0.008) and to coherent mathematics for
conceptual understanding (Cohen’s f² = 0.15, p = 0.041), with moderate magnitudes. Taken
together, the results in Table 3.5 suggest that observed CK gains better reflected the knowledge
required for high-quality mathematics instruction overall, based on their statistical and practical
significance. As displayed in Table 3.6, I did not find any positive relationships at the
significance level of 0.05 between teachers’ self-reported or directly assessed PCK gains and the
three dimensions of instructional quality. The small sample size (N < 30) could be one reason
for the absence of statistically significant results. Moreover, teachers’ observed and self-reported
PCK gains explained a small amount of variance in the quality of instructional practice, which
may suggest teachers did not gain enough PCK from the OPD program to improve their
instruction.
Table 3.5
Relationship Between Teachers’ Learning of CK and Instructional Practice

                      Potential cognitive       Cognitive demands of      Coherent mathematics
                      demands of the tasks      the tasks during          for conceptual
                                                instruction               understanding
Measures              Model 1    Cohen's f²     Model 2    Cohen's f²     Model 3    Cohen's f²
Self-reported gains   0.26*      0.14           0.30       0.14           0.18       0.02
                      (0.10)                    (0.18)                    (0.23)
Observed gains        2.42**     0.22           2.96**     0.23           3.40*      0.15
                      (0.86)                    (1.02)                    (1.58)
Baseline CK           2.38*      0.24           3.65**     0.33           4.48**     0.26
                      (1.02)                    (1.13)                    (1.25)
Intercept             0.28                      -1.31                     -1.71
                      (1.02)                    (1.27)                    (1.39)
Observations          28                        28                        27
R-squared             0.30                      0.36                      0.29

Note. Coefficient estimates are reported with corresponding robust standard errors in parentheses.
~ p < .10. * p < .05. ** p < .01. *** p < .001.
Table 3.6
Relationship Between Teachers’ Learning of PCK and Instructional Practice

                      Potential cognitive       Cognitive demands of      Coherent mathematics
                      demands of the tasks      the tasks during          for conceptual
                                                instruction               understanding
Measures              Model 1    Cohen's f²     Model 2    Cohen's f²     Model 3    Cohen's f²
Self-reported gains   0.16       0.04           0.08       0.00           0.05       0.00
                      (0.14)                    (0.20)                    (0.26)
Observed gains        0.28       0.00           1.94~      0.11           2.30       0.08
                      (0.85)                    (1.08)                    (1.64)
Baseline PCK          0.07       0.00           1.38       0.05           2.21       0.06
                      (0.82)                    (1.15)                    (1.92)
Intercept             2.42***                   1.67~                     1.26
                      (0.56)                    (0.86)                    (1.23)
Observations          28                        28                        27
R-squared             0.06                      0.15                      0.13

Note. Coefficient estimates are reported with corresponding robust standard errors in parentheses.
~ p < .10. * p < .05. ** p < .01. *** p < .001.
Discussion
Self-reported Learning, Observed Learning, and Teachers’ SRL
Researchers have called for external evaluations of OPD by using multiple and rigorous
outcome measures (Dede et al., 2009; Lay et al., 2020). However, self-reported instruments
remain the most widely used and often the only measure for understanding the impact of
asynchronous OPD on teachers. This study addresses the validity issue of self-reported
instruments by comparing mathematics teachers’ self-reported and observed knowledge gains
through participating in a fully asynchronous OPD program. Additionally, it extends prior
research by examining in-service teachers’ competence as self-regulated learners and its relation
to teacher learning in online settings and the accuracy of their self-assessments of knowledge
gains.
The first research question aimed to test the convergent validity of self-reported
instruments for measuring teachers’ knowledge gains by examining how closely they are related
to cognitive assessments that measure the same knowledge. The results showed a lack of
statistically significant correlation between teachers’ self-reported and observed gains for both
CK and PCK. The findings of this study diverge from some prior studies that compared
self-reported and direct assessments of teachers’ current level of knowledge (Drummond &
Sweeney, 2017; Gardner-Neblett et al., 2020; Krauskopf & Forssell, 2018). These studies
generally reported statistically significant small to medium correlations between teachers’
self-perceived and directly assessed knowledge. However, when the focus shifted to knowledge
gains, our findings align with a relevant study by Copur-Gencturk and Thacker (2021), which
reported a nonsignificant, near-zero correlation between teachers’ self-reported learning and
the learning measured by direct assessments in the context of face-to-face PD. One potential
explanation is that the study did not have the statistical power to detect statistically significant
results. Another potential explanation is that evaluating changes in knowledge is inherently more
challenging for individuals than assessing their current knowledge. This resonates with findings
in Sitzmann et al. (2010), who reported that the magnitude of the correlation between
self-assessments and cognitive assessments of knowledge decreased when the focus of the
measure was on gains. Taken together, the lack of significant correlation between the two
outcome measures suggests that what teachers believe they learned from the asynchronous OPD
program does not align with what direct assessments indicate they learned. This discrepancy
suggests that self-reported instruments and direct assessments may actually capture different
latent constructs. Our exploratory analyses found that knowledge gains measured by direct
assessments better predicted the overall quality of teachers’ instructional practice in terms of
both statistical and practical significance. This provides additional evidence that direct
assessments may better capture the content-specific knowledge necessary for quality teaching.
In the second research question, I expanded previous work on potential factors associated
with teacher learning in asynchronous OPD by examining teachers’ competence as self-regulated
learners. For teachers’ learning of CK, I found that teachers who regularly monitored and
evaluated their understanding of the program content demonstrated greater CK gains, as
measured by direct assessments, than their peers who engaged less in these monitoring
behaviors. However, these teachers did not report higher perceived CK gains. This finding
aligns with theoretical frameworks of SRL, which emphasize that monitoring plays a central role
in learners’ cognitive processes (Panadero, 2017; Pintrich, 2004; Winne, 2001). It is also
consistent with results reported by Huang et al. (2021), who found that teachers’ competence in
metacognitive SRL was significantly associated with TPACK performance based on evaluations
of teachers’ lesson plans, rather than their self-reports. In terms of teachers’ learning of PCK,
monitoring did not lead to either higher perceived gains or greater directly assessed gains. This
result suggests that monitoring may play a different role in teacher learning depending on the
cognitive demands of the type of knowledge being learned. Content knowledge is often
considered a prerequisite of PCK (Agathangelou & Charalambous, 2021) because the latter
requires teachers’ conceptual understanding of mathematics concepts as well as knowledge
about students’ mathematical thinking and instructional strategies. Given the high cognitive
demands of learning PCK, teachers may not be aware of their knowledge gaps when monitoring
and evaluating their learning of this type of knowledge during PD, and may thus miss
opportunities to adjust for more efficient learning. Lastly, I explored how teachers’ use of SRL
strategies was related to the alignment between their self-reported and observed gains in CK and
PCK. Contrary to our hypothesis, none of the SRL strategies seemed to help teachers provide
more accurate self-assessments of their knowledge gains.
Limitations
I want to note a few limitations of the study. First, because the OPD program targeted
specific content, all participating teachers were mathematics teachers teaching grade 6 or 7, and
the study used a convenience sample. Although we recruited teachers nationwide, the sample
may not be nationally representative. Teachers of other subjects or grades might provide more
accurate evaluations of their learning or demonstrate different levels of SRL. Thus, the findings
might not generalize to other teacher populations. Second, the sample size in this study is small,
which limited the statistical power to detect significant results; findings based on p-values should
therefore be interpreted with caution. Following recommendations from researchers (Hojat &
Xu, 2004; Kirk, 1996; Pogrow, 2019; Sullivan & Feinn, 2012), I reported effect size estimates along
with p-values to provide more informative and meaningful explanations of the research findings.
Third, the study investigated only a few SRL strategies, leaving many other aspects of SRL (e.g.,
SRL of motivation) unexplored. Also, it is important to consider that teachers might use or adapt
specific SRL strategies in response to the distinct demands of learning sessions within PD
programs (Fan et al., 2022; Huang & Lajoie, 2021). However, our method of measuring SRL was
unable to capture these dynamic, real-time processes. This limitation stems from the initial
design of the PD system, which was not specifically developed to understand or support
teachers’ SRL. We are also aware that teachers’ self-reports about their use of learning strategies
may be inaccurate.
Implications for Research and Practice
The results of the study emphasize the importance of carefully considering the benefits
and shortcomings of different methods for evaluating the impact of asynchronous OPD
programs. An asynchronous OPD program that is perceived as effective based on teachers’
self-reports may not demonstrate the same level of effectiveness when evaluated using cognitive
assessments. Researchers and practitioners need to be cautious when interpreting and reporting
the impacts of OPD programs, particularly if conclusions are derived solely from teachers’
self-reports. By comparing different methods for measuring the same outcome in one program
setting, we are not trying to prove that one method is superior to the other. Given that the
outcome itself (e.g., teachers’ PCK or instructional quality) is typically a multidimensional
construct, it is challenging to capture it with just one type of measure (Cheng et al., 2023).
Therefore, we encourage researchers and practitioners to consider the nature of the specific
outcome they expect to achieve through professional development programs and then
choose multiple measures to gain a comprehensive understanding of teacher learning from the
program (Copur-Gencturk & Thacker, 2021).
Paying attention to the measures used to evaluate OPD impact can also help establish a
shared understanding of what effective and ineffective asynchronous OPD programs look like.
For decades, researchers have attempted to identify the characteristics of effective face-to-face
teacher PD (Copur-Gencturk et al., 2019; Garet et al., 2001; Kennedy, 2019), with these findings
often informing the development of new PD programs. Neglecting differences in outcome
measures when determining OPD impact could have serious consequences for future
program design and implementation. We therefore urge researchers to distinguish between
asynchronous OPD programs that use different types of outcome measures to determine
program impact, particularly when identifying design elements and implementation features
that may contribute to the success of various programs.
Our examination of teachers’ competence as self-regulated learners in an asynchronous
online learning context highlights an aspect that is largely ignored in the content-focused teacher PD
literature but is potentially instrumental in systematically enhancing teacher learning in
asynchronous OPD. Previous work has focused on teachers’ application of SRL in the development
of technological knowledge for teaching (e.g., TPACK, see Huang et al., 2023; Kramarski &
Michalsky, 2010) or general pedagogical approaches (e.g., flipped classroom, see Fan et al.,
2022). Our work provides preliminary evidence about the positive role of teachers’
metacognitive SRL in the development of their mathematics content knowledge. The finding
suggests a need to support teachers in developing their own SRL skills in asynchronous online
learning contexts, as it will likely facilitate their learning in the absence of frequent interactions
with and timely feedback from peers and instructors.
Future Work
Based on our findings and the limitations of the current study design, we propose a few
directions for future work to further our understanding of teacher learning in asynchronous OPD
programs and to inform the design and implementation of future programs. Future work could
investigate how to improve the validity of self-reported instruments to capture teachers’ content
knowledge for teaching. Researchers could expand self-report items into more nuanced
questions, such as teachers’ conceptual understanding of a mathematics concept in a
mathematical area (e.g., covariance and invariance as key features of proportional relationships)
and their knowledge of a specific instructional strategy for teaching a certain mathematics concept
(e.g., using double number lines to facilitate students’ learning of ratio problems). Future work can
also explore how to adjust for bias in teachers’ self-reported learning by applying anchoring
vignettes correction (Kaufman et al., 2019) and overclaiming techniques (Vonkova et al., 2021).
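To illustrate the anchoring-vignette idea, the following minimal Python sketch (the function name and scale values are our own, not drawn from Kaufman et al., 2019, or from any instrument in this study) recodes a teacher’s self-rating relative to that same teacher’s ratings of hypothetical vignettes, so that respondents who interpret response scales differently become more comparable:

```python
def anchored_score(self_rating, vignette_ratings):
    """Nonparametrically recode a self-rating relative to the respondent's
    own ratings of anchoring vignettes (ordered from lowest to highest
    intended level). Returns a value on a 2k+1 category scale, where k is
    the number of vignettes: odd values fall between vignettes, even
    values indicate a tie with a vignette."""
    score = 1
    for v in vignette_ratings:
        if self_rating < v:
            return score  # self-rating falls below this vignette
        score += 1
        if self_rating == v:
            return score  # self-rating ties with this vignette
        score += 1
    return score  # self-rating exceeds the highest vignette

# A teacher rates a low-skill vignette 2 and a high-skill vignette 4 on a
# 5-point scale, then rates themselves 3: the corrected score places them
# between the two anchors.
print(anchored_score(3, [2, 4]))  # -> 3
print(anchored_score(5, [2, 4]))  # -> 5 (above both anchors)
```

Because the correction is computed within each respondent, it absorbs idiosyncratic scale use (e.g., a teacher who rates everything generously) before self-reports are compared across teachers.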
In terms of understanding teachers’ SRL, future AI-supported PD systems could be enhanced by
integrating features that record teachers’ real-time learning behaviors so that researchers could
capture the dynamic, real-time SRL activities in which teachers engage. Such innovation would
not only deepen our understanding of teachers’ learning processes but also shed light on the
types of scaffolding teachers need in complex, technology-enhanced learning environments. It
would also be worthwhile to investigate how teachers’ competence as self-regulated learners
contributes to their promotion of students’ SRL in instruction and, in turn, facilitates students’
academic performance (Karlen et al., 2023; Kramarski & Heaysman, 2021).
Conclusion
With the increasing demand for flexible learning opportunities for teachers over the past
decades, it is important to identify effective asynchronous OPD by empirically evaluating
program impact using valid and reliable measures. As we found in this study, teachers’ perceived
learning in an asynchronous OPD program did not align well with their performance in direct
assessments. This finding urges practitioners and researchers to pay attention to the affordances
and limitations of different measurement methods and their potential influence on our
conclusions about program impact. Our investigation into teachers’ SRL in online programs
highlights the role of teachers’ cognitive and metacognitive activities in their learning. It also
encourages researchers to further explore how different SRL strategies contribute to the
development of different types of knowledge.
Conclusion
This three-paper dissertation explored teachers’ learning of content-specific knowledge
necessary for high-quality mathematics instruction, through their own teaching practice and a
fully asynchronous OPD program implemented within a dialogue-based ITS. I also attempted to
identify teacher-level factors, such as professional background and competence as self-regulated
learners, that potentially influence their attainment of knowledge within informal and digital
learning contexts. Findings highlight that teaching itself provides numerous learning opportunities
for teachers. As teachers engage in their daily teaching practice, being knowledgeable about the
subject matter likely accelerates their development of the knowledge necessary for
effective teaching. The preliminary positive evidence regarding the impact of an interactive,
personalized asynchronous OPD program designed for teachers on their students’ achievement
underscores the potential of interdisciplinary collaboration across the fields of teacher education,
educational technology, and computer science. With widespread interest in students’ SRL, this
work emphasizes the importance of teachers’ competence as self-regulated learners. The positive
role that metacognitive monitoring strategies play in teachers’ learning of various types of
knowledge encourages future work into the role of SRL in teacher professional development.
Lastly, to identify effective learning opportunities for teachers, we need to be aware of the
affordances and limitations of the measures used to capture teacher learning. Using diverse
outcome measures would provide a more comprehensive understanding of the effectiveness of
different teacher learning opportunities.
References
Agathangelou, S. A., & Charalambous, C. Y. (2021). Is content knowledge pre-requisite of
pedagogical content knowledge? An empirical investigation. Journal of Mathematics
Teacher Education, 24(4). https://doi.org/10.1007/s10857-020-09466-0
Aleven, V. A. W. M. M., & Koedinger, K. R. (2002). An effective metacognitive strategy:
Learning by doing and explaining with a computer-based Cognitive Tutor. Cognitive
Science, 26(2), 147–179. https://doi.org/10.1016/S0364-0213(02)00061-7
Allen, J. P., Pianta, R. C., Gregory, A., Mikami, A. Y., & Lun, J. (2011). An interaction-based
approach to enhancing secondary school instruction and student achievement. Science,
333(6045), 1034–1037. https://doi.org/10.1126/science.1207998
Armstrong, R. A. (2014). When to use the Bonferroni correction. Ophthalmic & Physiological
Optics: The Journal of the British College of Ophthalmic Opticians (Optometrists),
34(5), 502–508. https://doi.org/10.1111/opo.12131
Association of Mathematics Teacher Educators. (2017). Standards for preparing teachers of
mathematics. https://amte.net/standards
Astin, A. W., & Astin, H. S. (1992). Undergraduate science education: The impact of different
college environments on the educational pipeline in the sciences. The Higher Education
Research Institute, UCLA. https://eric.ed.gov/?id=ED362404
Ayan, R., & Isiksal-Bostan, M. (2019). Middle school students’ proportional reasoning in real
life contexts in the domain of geometry and measurement. International Journal of
Mathematical Education in Science and Technology, 50(1), 65–81.
https://doi.org/10.1080/0020739X.2018.1468042
Bağrıacık Yılmaz, A., & Karataş, S. (2022). Why do open and distance education students drop
out? Views from various stakeholders. International Journal of Educational Technology
in Higher Education, 19(1), 28. https://doi.org/10.1186/s41239-022-00333-x
Ball, D. L. (1990). Prospective elementary and secondary teachers’ understanding of division.
Journal for Research in Mathematics Education, 21(2), 132–144.
https://doi.org/10.2307/749140
Ball, D. L., Thames, M. H., & Phelps, G. (2008). Content knowledge for teaching. Journal of
Teacher Education, 59(5), 389–407. https://doi.org/10.1177/0022487108324554
Baumert, J., Kunter, M., Blum, W., Brunner, M., Voss, T., Jordan, A., Klusmann, U., Krauss, S.,
Neubrand, M., & Tsai, Y.-M. (2010). Teachers’ mathematical knowledge, cognitive
activation in the classroom, and student progress. American Educational Research
Journal, 47(1), 133–180. https://doi.org/10.3102/0002831209345157
Bender, R., & Lange, S. (2001). Adjusting for multiple testing—When and how? Journal of
Clinical Epidemiology, 54(4), 343–349. https://doi.org/10.1016/S0895-4356(00)00314-0
Berliner, D. C. (1988). The development of expertise in pedagogy. American Association of
Colleges for Teacher Education.
Birman, B. F., Le Floch, K. C., Klekotka, A., Ludwig, M., Taylor, J., Walters, K., Wayne, A., &
Yoon, K.-S. (2007). State and local implementation of the “No Child Left Behind Act.”
Volume II--Teacher quality under “NCLB”: Interim report. US Department of Education.
https://eric.ed.gov/?id=ED497970
Blank, R. K., & de las Alas, N. (2009). The effects of teacher professional development on gains
in student achievement: How meta-analysis provides scientific evidence useful to
education leaders. Council of Chief State School Officers.
https://eric.ed.gov/?id=ED544700
Blazar, D. (2015). Grade assignments and the teacher pipeline: A low-cost lever to improve
student achievement? Educational Researcher, 44(4), 213–227.
https://doi.org/10.3102/0013189X15580944
Blömeke, S., Houang, R. T., & Suhl, U. (2014). Diagnosing teacher knowledge by applying
multidimensional item response theory and multiple-group models. In S. Blömeke, F.-J.
Hsieh, G. Kaiser, & W. Schmidt (Eds.), International perspectives on teacher knowledge,
beliefs and opportunities to learn (Advances in Mathematics Education). Springer.
https://doi.org/10.1007/978-94-007-6437-8_22
Blömeke, S., Jentsch, A., Ross, N., Kaiser, G., & König, J. (2022). Opening up the black box:
Teacher competence, instructional quality, and students’ learning progress. Learning and
Instruction, 79, 101600. https://doi.org/10.1016/j.learninstruc.2022.101600
Borko, H., Eisenhart, M., Brown, C. A., Underhill, R. G., Jones, D., & Agard, P. C. (1992).
Learning to teach hard mathematics: Do novice teachers and their instructors give up too
easily? Journal for Research in Mathematics Education, 23(3), 194.
https://doi.org/10.2307/749118
Borko, H., Jacobs, J., & Koellner, K. (2010). Contemporary approaches to teacher professional
development. In P. Peterson, E. Baker, & B. McGaw (Eds.), International encyclopedia
of education (pp. 548–556). Elsevier.
https://doi.org/10.1016/B978-0-08-044894-7.00654-0
Borko, H., Stecher, B. M., Alonzo, A. C., Moncure, S., & McClam, S. (2005). Artifact packages
for characterizing classroom practice: A pilot study. Educational Assessment, 10(2), 73–
104. https://doi.org/10.1207/s15326977ea1002_1
Boston, M. (2012). Assessing instructional quality in mathematics. The Elementary School
Journal, 113(1), 76–104. https://doi.org/10.1086/666387
Bragg, L. A., Walsh, C., & Heyeres, M. (2021). Successful design and delivery of online
professional development for teachers: A systematic review of the literature. Computers
& Education, 166, 104158. https://doi.org/10.1016/j.compedu.2021.104158
Brennan, K., Blum-Smith, S., & Yurkofsky, M. M. (2018). From checklists to heuristics:
Designing MOOCs to support teacher learning. Teachers College Record, 120(9), 1–48.
https://doi.org/10.1177/016146811812000904
Broadbent, J., & Poon, W. L. (2015). Self-regulated learning strategies & academic achievement
in online higher education learning environments: A systematic review. The Internet and
Higher Education, 27, 1–13. https://doi.org/10.1016/j.iheduc.2015.04.007
Bywater, J. P., Chiu, J. L., Hong, J., & Sankaranarayanan, V. (2019). The teacher responding
tool: Scaffolding the teacher practice of responding to student ideas in mathematics
classrooms. Computers & Education, 139, 16–30.
https://doi.org/10.1016/j.compedu.2019.05.004
Campbell, P. F., & Malkus, N. N. (2011). The impact of elementary mathematics coaches on
student achievement. Elementary School Journal, 111(3), 430–454.
https://doi.org/10.1086/657654
Carpenter, T. P., Fennema, E., Peterson, P. L., Chiang, C.-P., & Loef, M. (1989). Using
knowledge of children’s mathematics thinking in classroom teaching: An experimental
study. American Educational Research Journal, 26(4), 499–531.
https://doi.org/10.2307/1162862
Casamayor, A., Amandi, A., & Campo, M. (2009). Intelligent assistance for teachers in
collaborative e-learning environments. Computers & Education, 53(4), 1147–1154.
https://doi.org/10.1016/j.compedu.2009.05.025
Chazan, D., Yerushalmy, M., & Leikin, R. (2008). An analytic conception of equation and
teachers’ views of school algebra. Journal of Mathematical Behavior, 27(2), 87–100.
https://doi.org/10.1016/j.jmathb.2008.07.003
Cheng, Q., Shen, J., & Zhang, S. (2023). Comparing perceived and observed instructional
practices and their predictive power for student mathematics achievement: An analysis of
Shanghai data from OECD Global Teaching InSights. Asian Journal for Mathematics
Education, 2(4), 445–468. https://doi.org/10.1177/27527263231210322
Chi, M. T. H., De Leeuw, N., Chiu, M.-H., & Lavancher, C. (1994). Eliciting self-explanations
improves understanding. Cognitive Science, 18(3), 439–477.
https://doi.org/10.1207/s15516709cog1803_3
Chi, M. T. H., Siler, S. A., Jeong, H., Yamauchi, T., & Hausmann, R. G. (2001). Learning from
human tutoring. Cognitive Science, 25(4), 471–533. https://doi.org/10.1016/S0364-
0213(01)00044-1
Charalambous, C. Y., Hill, H. C., Chin, M. J., & McGinn, D. (2020). Mathematical content
knowledge and knowledge for teaching: Exploring their distinguishability and
contribution to student learning. Journal of Mathematics Teacher Education, 23(6), 579–
613. https://doi.org/10.1007/s10857-019-09443-2
Chiu, J. L., Bywater, J. P., & Lilly, S. (2022). The role of AI to support teacher learning and
practice: A review and future directions. In Artificial intelligence in STEM education.
CRC Press.
Clare, L., & Aschbacher, P. R. (2001). Exploring the technical quality of using assignments and
student work as indicators of classroom practice. Educational Assessment, 7(1), 39–59.
https://doi.org/10.1207/S15326977EA0701_5
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Routledge.
https://doi.org/10.4324/9780203771587
Collins, L. J., & Liang, X. (2015). Examining high quality online teacher professional
development: Teachers’ voices. International Journal of Teacher Leadership, 6(1), 18–
34.
Collopy, R. (2003). Curriculum materials as a professional development tool: How a
mathematics textbook affected two teachers’ learning. The Elementary School Journal,
103(3), 287–311. https://doi.org/10.1086/499727
Copur-Gencturk, Y. (2015). The effects of changes in mathematical knowledge on teaching: A
longitudinal study of teachers’ knowledge and instruction. Journal for Research in
Mathematics Education, 46(3), 280–330.
https://doi.org/10.5951/jresematheduc.46.3.0280
Copur-Gencturk, Y. (2021). Teachers’ conceptual understanding of fraction operations: Results
from a national sample of elementary school teachers. Educational Studies in
Mathematics, 107(3), 525–545. https://doi.org/10.1007/s10649-021-10033-4
Copur-Gencturk, Y., Baek, C., & Doleck, T. (2022). A closer look at teachers’ proportional
reasoning. International Journal of Science and Mathematics Education, 21, 113–129.
https://doi.org/10.1007/s10763-022-10249-7
Copur-Gencturk, Y., & Doleck, T. (2021). Linking teachers’ solution strategies to their
performance on fraction word problems. Teaching and Teacher Education, 101, 103314.
https://doi.org/10.1016/j.tate.2021.103314
Copur-Gencturk, Y., & Li, J. (2023). Teaching matters: A longitudinal study of mathematics
teachers’ knowledge growth. Teaching and Teacher Education, 121, 103949.
https://doi.org/10.1016/j.tate.2022.103949
Copur-Gencturk, Y., Li, J., & Atabas, S. (2024). Improving teaching at scale: Can AI be
incorporated into professional development to create interactive, personalized learning
for teachers? American Educational Research Journal, 00028312241248514.
https://doi.org/10.3102/00028312241248514
Copur-Gencturk, Y., Li, J., Cohen, A. S., & Orrill, C. H. (2024). The impact of an interactive,
personalized computer-based teacher professional development program on student
performance: A randomized controlled trial. Computers & Education, 210, 104963.
https://doi.org/10.1016/j.compedu.2023.104963
Copur-Gencturk, Y., & Ölmez, İ. B. (2022). Teachers’ attention to and flexibility with referent
units. International Journal of Science and Mathematics Education, 20(6), 1123–1139.
https://doi.org/10.1007/s10763-021-10186-x
Copur-Gencturk, Y., & Orrill, C. H. (2023). A promising approach to scaling up professional
development: Intelligent, interactive, virtual professional development with just-in-time
feedback. Journal of Mathematics Teacher Education. https://doi.org/10.1007/s10857-
023-09615-1
Copur-Gencturk, Y., Plowman, D., & Bai, H. (2019). Mathematics teachers’ learning:
Identifying key learning opportunities linked to teachers’ knowledge growth. American
Educational Research Journal, 56(5), 1590–1628.
https://doi.org/10.3102/0002831218820033
Copur-Gencturk, Y., & Rodrigues, J. (2021). Content-specific noticing: A large-scale survey of
mathematics teachers’ noticing. Teaching and Teacher Education, 101, 103320.
https://doi.org/10.1016/j.tate.2021.103320
Copur-Gencturk, Y., & Thacker, I. (2021). A comparison of perceived and observed learning
from professional development: Relationships among self-reports, direct assessments,
and teacher characteristics. Journal of Teacher Education, 72(2), 138–151.
https://doi.org/10.1177/0022487119899101
Copur-Gencturk, Y., & Tolar, T. (2022). Mathematics teaching expertise: A study of the
dimensionality of content knowledge, pedagogical content knowledge, and content-specific
noticing skills. Teaching and Teacher Education, 114, 103696.
https://doi.org/10.1016/j.tate.2022.103696
Council for the Accreditation of Educator Preparation (CAEP). (2018). K-6 elementary teacher
preparation standards [Initial licensure programs].
https://caepnet.org/~/media/Files/caep/standards/2018-caep-k-6-elementary-teacherprepara.pdf?la=en
Cramer, K. A., Post, T., & Currier, S. (1993). Learning and teaching ratio and proportion:
Research implications. In Douglas. T. Owens (Ed.), Research ideas for the classroom:
Middle grades mathematics (pp. 159–178). National Council of Teachers of
Mathematics.
d’Anjou, B., Bakker, S., An, P., & Bekker, T. (2019). How peripheral data visualisation systems
support secondary school teachers during VLE-supported lessons. Proceedings of the
2019 on Designing Interactive Systems Conference, 859–870.
https://doi.org/10.1145/3322276.3322365
Darling-Hammond, L., Hyler, M. E., & Gardner, M. (2017). Effective teacher professional
development. Learning Policy Institute.
Darling-Hammond, L., Wei, R. C., Andree, A., Richardson, N., & Orphanos, S. (2009).
Professional learning in the learning profession: A status report on teacher development
in the United States and abroad. National Staff Development Council.
Dash, S., Magidin de Kramer, R., O’Dwyer, L. M., Masters, J., & Russell, M. (2012). Impact of
online professional development on teacher quality and student achievement in fifth grade
mathematics. Journal of Research on Technology in Education, 45(1), 1–26.
https://doi.org/10.1080/15391523.2012.10782595
Davis, E. A., & Krajcik, J. S. (2005). Designing educative curriculum materials to promote
teacher learning. Educational Researcher, 34(3), 3–14.
https://doi.org/10.3102/0013189X0340030
Dede, C., Jass Ketelhut, D., Whitehouse, P., Breit, L., & McCloskey, E. M. (2009). A research
agenda for online teacher professional development. Journal of Teacher Education,
60(1), 8–19. https://doi.org/10.1177/0022487108327554
Desimone, L. M. (2009). Improving impact studies of teachers’ professional development:
Toward better conceptualizations and measures. Educational Researcher, 38(3), 181–
199. https://doi.org/10.3102/0013189X08331140
Desimone, L. M., & Pak, K. (2017). Instructional coaching as high-quality professional
development. Theory into Practice, 56(1), 3–12.
Drummond, A., & Sweeney, T. (2017). Can an objective measure of technological pedagogical
content knowledge (TPACK) supplement existing TPACK measures? British Journal of
Educational Technology, 48(4), 928–939. https://doi.org/10.1111/bjet.12473
Elliott, J. C. (2017). The evolution from traditional to online professional development: A
review. Journal of Digital Learning in Teacher Education, 33(3), 114–125.
https://doi.org/10.1080/21532974.2017.1305304
Every Student Succeeds Act, 20 U.S.C. § 6301 (2015). https://www.congress.gov/bill/114th-congress/senate-bill/1177
Fan, Y., Jovanović, J., Saint, J., Jiang, Y., Wang, Q., & Gašević, D. (2022). Revealing the
regulation of learning strategies of MOOC retakers: A learning analytic study. Computers
& Education, 178. https://doi.org/10.1016/j.compedu.2021.104404
Fan, Y., Matcha, W., Uzir, N. A., Wang, Q., & Gašević, D. (2021). Learning analytics to reveal
links between learning design and self-regulated learning. International Journal of
Artificial Intelligence in Education, 31(4), 980–1021. https://doi.org/10.1007/s40593-
021-00249-z
Faul, F., Erdfelder, E., Buchner, A., & Lang, A.-G. (2009). Statistical power analyses using
G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods,
41(4), 1149–1160. https://doi.org/10.3758/BRM.41.4.1149
Feise, R. J. (2002). Do multiple outcome measures require p-value adjustment? BMC Medical
Research Methodology, 2(1), 8. https://doi.org/10.1186/1471-2288-2-8
Fisher, J. B., Schumaker, J. B., Culbertson, J., & Deshler, D. D. (2010). Effects of a
computerized professional development program on teacher and student outcomes.
Journal of Teacher Education, 61(4), 302–312.
https://doi.org/10.1177/0022487110369556
Fisher, L. C. (1988). Strategies used by secondary mathematics teachers to solve proportion
problems. Journal for Research in Mathematics Education, 19(2), 157–168.
https://doi.org/10.2307/749409
Franke, M. L., Carpenter, T. P., Levi, L., & Fennema, E. (2001). Capturing teachers’ generative
change: A follow-up study of professional development in mathematics. American
Educational Research Journal, 38(3), 653–689.
https://doi.org/10.3102/00028312038003653
Fütterer, T., Steinhauser, R., Zitzmann, S., Scheiter, K., Lachner, A., & Stürmer, K. (2023).
Development and validation of a test to assess teachers’ knowledge of how to operate
technology. Computers and Education Open, 5, 100152.
https://doi.org/10.1016/j.caeo.2023.100152
Gardner-Neblett, N., Franco, X., Mincemoyer, C., & Morgan-Lopez, A. A. (2020). Web-based
professional development for improving early childhood professionals’ actual and
perceived knowledge of dual language learners. Journal of Early Childhood Teacher
Education, 41(4), 403–432. https://doi.org/10.1080/10901027.2020.1718805
Garet, M. S., Birman, B. F., Porter, A. C., Desimone, L., & Herman, R. (1999). Designing
effective professional development: Lessons from the Eisenhower program and technical
appendices. American Institutes for Research. https://eric.ed.gov/?id=ED442634
Garet, M. S., Heppen, J. B., Walters, K., Parkinson, J., Smith, T. M., Song, M., Garrett, R.,
Yang, R., Borman, G. D., & Wei, T. E. (2016). Focusing on mathematical knowledge:
The impact of content-intensive teacher professional development. National Center for
Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S.
Department of Education.
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes
professional development effective? Results from a national sample of teachers.
American Educational Research Journal, 38(4), 915–945.
https://doi.org/10.3102/00028312038004915
Garet, M. S., Wayne, A. J., Stancavage, F., Taylor, J., Walters, K., Song, M., Brown, S.,
Hurlburt, S., Zhu, P., Sepanik, S., Doolittle, F., & Warner, E. (2010). Middle school
mathematics professional development impact study: Findings after the first year of
implementation (NCEE 2010-4009). National Center for Education Evaluation and
Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Gerard, L., Wiley, K., Bradford, A., King-Chen, J., Lim-Breitbart, J., & Linn, M. C. (2020).
Impact of a teacher action planner that captures student ideas on teacher customization
decisions. Proceedings of the 14th International Society for Learning Sciences
Conference, 2077–2084.
Ginsburg, A., Gray, T., & Levin, D. (2004). Online professional development for mathematics
teachers: A strategic analysis. American Institutes for Research.
Goldenberg, L., Culp, K., Clements, M., Anderson, A., & Pasquale, M. (2014). Online
professional development for high-school biology teachers: Effects on teachers’ and
students’ knowledge. Journal of Technology and Teacher Education, 22(3), 287–309.
Graesser, A. C. (2011). Learning, thinking, and emoting with discourse technologies. American
Psychologist, 66(8), 746–757. https://doi.org/10.1037/a0024974
Graesser, A. C., Lu, S., Jackson, G. T., Mitchell, H. H., Ventura, M., Olney, A., & Louwerse, M.
M. (2004). AutoTutor: A tutor with dialogue in natural language. Behavior Research
Methods, Instruments, & Computers, 36(2), 180–192.
https://doi.org/10.3758/BF03195563
Greene, J. A., & Azevedo, R. (2010). The measurement of learners’ self-regulated cognitive and
metacognitive processes while using computer-based learning environments. Educational
Psychologist, 45(4), 203–209. https://doi.org/10.1080/00461520.2010.515935
Griffin, C. C., Dana, N. F., Pape, S. J., Algina, J., Bae, J., Prosser, S. K., & League, M. B.
(2018). Prime online: Exploring teacher professional development for creating inclusive
elementary mathematics classrooms. Teacher Education and Special Education, 41(2),
121–139. https://doi.org/10.1177/0888406417740702
Griffin, T. D., Wiley, J., & Thiede, K. W. (2008). Individual differences, rereading, and
self-explanation: Concurrent processing and cue validity as constraints on
metacomprehension accuracy. Memory & Cognition, 36(1), 93–103.
https://doi.org/10.3758/MC.36.1.93
Hammond, L. (2015). Early childhood educators’ perceived and actual metalinguistic
knowledge, beliefs and enacted practice about teaching early reading. Australian Journal
of Learning Difficulties, 20(2), 113–128. https://doi.org/10.1080/19404158.2015.1023208
Harris, D. N., & Sass, T. R. (2011). Teacher training, teacher quality and student achievement.
Journal of Public Economics, 95(7–8), 798–812.
https://doi.org/10.1016/j.jpubeco.2010.11.009
Haug, B. S., & Mork, S. M. (2021). Taking 21st century skills from vision to classroom: What
teachers highlight as supportive professional development in the light of new demands
from educational reforms. Teaching and Teacher Education, 100, 103286.
https://doi.org/10.1016/j.tate.2021.103286
Hill, H. C., Blazar, D., & Lynch, K. (2015). Resources for teaching: Examining personal and
institutional predictors of high-quality instruction. AERA Open, 1(4),
2332858415617703. https://doi.org/10.1177/2332858415617703
Hill, H. C., & Chin, M. (2018). Connections between teachers’ knowledge of students,
instruction, and achievement outcomes. American Educational Research Journal, 55(5),
1076–1112. https://doi.org/10.3102/0002831218769614
Hill, H. C., Schilling, S. G., & Ball, D. L. (2004). Developing measures of teachers’ mathematics
knowledge for teaching. Elementary School Journal, 105(1), 11–30.
https://doi.org/10.1086/428763
Hoekstra, A., Beijaard, D., Brekelmans, M., & Korthagen, F. (2007). Experienced teachers’
informal learning from classroom teaching. Teachers and Teaching: Theory and
Practice, 13(2), 191–208. https://doi.org/10.1080/13540600601152546
Hoekstra, A., & Korthagen, F. (2011). Teacher learning in a context of educational change:
Informal learning versus systematically supported learning. Journal of Teacher
Education, 62(1), 76–92. https://doi.org/10.1177/0022487110382917
Hollebrands, K. F., & Lee, H. S. (2020). Effective design of massive open online courses for
mathematics teachers to support their professional learning. ZDM, 52(5), 859–875.
https://doi.org/10.1007/s11858-020-01142-0
Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis:
Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55.
https://doi.org/10.1080/10705519909540118
Huang, L., Doleck, T., Chen, B., Huang, X., Tan, C., Lajoie, S. P., & Wang, M. (2023).
Multimodal learning analytics for assessing teachers’ self-regulated learning in planning
technology-integrated lessons in a computer-based environment. Education and
Information Technologies. https://doi.org/10.1007/s10639-023-11804-7
Huang, L., & Lajoie, S. P. (2021). Process analysis of teachers’ self-regulated learning patterns
in technological pedagogical content knowledge development. Computers & Education,
166, 104169. https://doi.org/10.1016/j.compedu.2021.104169
Huang, L., Li, S., Poitras, E. G., & Lajoie, S. P. (2021). Latent profiles of self‐regulated learning
and their impacts on teachers’ technology integration. British Journal of Educational
Technology, 52(2), 695–713. https://doi.org/10.1111/bjet.13050
Izsák, A., & Jacobson, E. (2017). Preservice teachers’ reasoning about relationships that are and
are not proportional: A knowledge-in-pieces account. Journal for Research in
Mathematics Education, 48(3), 300–339.
https://doi.org/10.5951/jresematheduc.48.3.0300
Izsák, A., Jacobson, E., & Bradshaw, L. (2019). Surveying middle-grades teachers’ reasoning
about fraction arithmetic in terms of measured quantities. Journal for Research in
Mathematics Education, 50(2), 156–209.
https://doi.org/10.5951/jresematheduc.50.2.0156
Jacob, R., Hill, H., & Corey, D. (2017). The impact of a professional development program on
teachers’ mathematical knowledge for teaching, instruction, and student achievement.
Journal of Research on Educational Effectiveness, 10(2), 379–407.
https://doi.org/10.1080/19345747.2016.1273411
Jacobs, V. R., Franke, M. L., Carpenter, T. P., Levi, L., & Battey, D. (2007). Professional
development focused on children’s algebraic reasoning in elementary school. Journal for
Research in Mathematics Education, 38(3), 258–288.
Joyce, J., Gitomer, D. H., & Iaconangelo, C. J. (2018). Classroom assignments as measures of
teaching quality. Learning and Instruction, 54, 48–61.
https://doi.org/10.1016/j.learninstruc.2017.08.001
Kahan, J. A., Cooper, D. A., & Bethea, K. A. (2003). The role of mathematics teachers’ content
knowledge in their teaching: A framework for research applied to a study of student
teachers. Journal of Mathematics Teacher Education, 6(3), 223–252.
https://doi.org/10.1023/A:1025175812582
Karlen, Y., Hirt, C. N., Jud, J., Rosenthal, A., & Eberli, T. D. (2023). Teachers as learners and
agents of self-regulated learning: The importance of different teachers’ competence
aspects for promoting metacognition. Teaching and Teacher Education, 125, 104055.
https://doi.org/10.1016/j.tate.2023.104055
Kaufman, J. H., Engberg, J., Hamilton, L. S., Yuan, K., & Hill, H. C. (2019). Validity evidence
supporting use of anchoring vignettes to measure teaching practice. Educational
Assessment, 24(3), 155–188. https://doi.org/10.1080/10627197.2019.1615374
Kaufman, J. H., Stein, M. K., & Junker, B. (2016). Factors associated with alignment between
teacher survey reports and classroom observation ratings of mathematics instruction. The
Elementary School Journal, 116(3), 339–364. https://doi.org/10.1086/684942
Kellogg, S., Booth, S., & Oliver, K. (2014). A social network perspective on peer supported
learning in MOOCs for educators. The International Review of Research in Open and
Distributed Learning, 15(5). https://doi.org/10.19173/irrodl.v15i5.1852
Kennedy, M. (1998). Form and substance in inservice teacher education. Research monograph.
National Institute for Science Education, University of Wisconsin-Madison.
https://eric.ed.gov/?id=ED472719
Kennedy, M. (2016). How does professional development improve teaching? Review of
Educational Research, 86(4), 945–980. https://doi.org/10.3102/0034654315626800
Kennedy, M. (2019). How we learn about teacher learning. Review of Research in Education,
43(1), 138–162. https://doi.org/10.3102/0091732X19838970
Kersting, N. (2008). Using video clips of mathematics classroom instruction as item prompts to
measure teachers’ knowledge of teaching mathematics. Educational and Psychological
Measurement, 68(5), 845–861. https://doi.org/10.1177/0013164407313369
Kersting, N., Givvin, K. B., Thompson, B. J., Santagata, R., & Stigler, J. W. (2012). Measuring
usable knowledge: Teachers’ analyses of mathematics classroom videos predict teaching
quality and student learning. American Educational Research Journal, 49(3), 568–589.
https://doi.org/10.3102/0002831212437853
Kersting, N., Sotelo, F. L., & Stigler, J. W. (2010). Teachers’ analyses of classroom video
predict student learning of mathematics: Further explorations of a novel measure of
teacher knowledge. Journal of Teacher Education, 61(1–2), 172–181.
https://doi.org/10.1177/0022487109347875
Kini, T., & Podolsky, A. (2016). Does teaching experience increase teacher effectiveness? A
review of US research. Learning Policy Institute. https://learningpolicyinstitute.org/our-work/publications-resources/does-teaching-experience-increase-teacher-effectiveness-review-research
Kleickmann, T., Richter, D., Kunter, M., Elsner, J., Besser, M., Krauss, S., & Baumert, J. (2013).
Teachers’ content knowledge and pedagogical content knowledge: The role of structural
differences in teacher education. Journal of Teacher Education, 64(1), 90–106.
https://doi.org/10.1177/0022487112460398
Kraft, M. A., & Blazar, D. (2017). Individualized coaching to improve teacher practice across
grades and subjects: New experimental evidence. Educational Policy, 31(7), 1033–1068.
https://doi.org/10.1177/0895904816631099
Kraft, M. A., & Blazar, D. (2018). Taking teacher coaching to scale. Education Next, 69–74.
Kraft, M. A., Blazar, D., & Hogan, D. (2018). The effect of teacher coaching on instruction and
achievement: A meta-analysis of the causal evidence. Review of Educational Research,
88(4), 547–588. https://doi.org/10.3102/0034654318759268
Kramarski, B., & Heaysman, O. (2021). A conceptual framework and a professional
development model for supporting teachers’ “triple SRL–SRT processes” and promoting
students’ academic outcomes. Educational Psychologist, 56(4), 298–311.
https://doi.org/10.1080/00461520.2021.1985502
Kramarski, B., & Michalsky, T. (2009). Investigating preservice teachers’ professional growth in
self-regulated learning environments. Journal of Educational Psychology, 101(1), 161–
175. https://doi.org/10.1037/a0013101
Kramarski, B., & Michalsky, T. (2010). Preparing preservice teachers for self-regulated learning
in the context of technological pedagogical content knowledge. Learning and Instruction,
20(5), 434–447. https://doi.org/10.1016/j.learninstruc.2009.05.003
Krauskopf, K., & Forssell, K. (2018). When knowing is believing: A multi-trait analysis of self-reported TPCK. Journal of Computer Assisted Learning, 34(5), 482–491.
https://doi.org/10.1111/jcal.12253
Krauss, S., Baumert, J., & Blum, W. (2008). Secondary mathematics teachers’ pedagogical
content knowledge and content knowledge: Validation of the COACTIV constructs.
ZDM, 40(5), 873–892. https://doi.org/10.1007/s11858-008-0141-9
Krauss, S., Bruckmaier, G., Lindl, A., Hilbert, S., Binder, K., Steib, N., & Blum, W. (2020).
Competence as a continuum in the COACTIV study: The “cascade model.” ZDM, 52(2),
311–327. https://doi.org/10.1007/s11858-020-01151-z
Krauss, S., Brunner, M., Kunter, M., Baumert, J., Blum, W., Neubrand, M., & Jordan, A. (2008).
Pedagogical content knowledge and content knowledge of secondary mathematics
teachers. Journal of Educational Psychology, 100(3), 716–725.
https://doi.org/10.1037/0022-0663.100.3.716
Kyndt, E., Gijbels, D., Grosemans, I., & Donche, V. (2016). Teachers’ everyday professional
development: Mapping informal learning activities, antecedents, and learning outcomes.
Review of Educational Research, 86(4), 1111–1150.
https://doi.org/10.3102/0034654315627864
Ladd, H. F., & Sorensen, L. C. (2017). Returns to teacher experience: Student achievement and
motivation in middle school. Education Finance and Policy, 12(2), 241–279.
https://doi.org/10.1162/EDFP_A_00194
Lamon, S. J. (2012). Teaching fractions and ratios for understanding: Essential content
knowledge and instructional strategies for teachers. Routledge.
Lantz-Andersson, A., Lundin, M., & Selwyn, N. (2018). Twenty years of online teacher
communities: A systematic review of formally-organized and informally-developed
professional learning groups. Teaching and Teacher Education, 75, 302–315.
https://doi.org/10.1016/j.tate.2018.07.008
Lay, C. D., Allman, B., Cutri, R. M., & Kimmons, R. (2020). Examining a decade of research in
online teacher professional development. Frontiers in Education, 5.
https://www.frontiersin.org/articles/10.3389/feduc.2020.573129
Lee, J., & Santagata, R. (2020). A longitudinal study of novice primary school teachers’
knowledge and quality of mathematics instruction. ZDM - Mathematics Education, 52(2),
295–309. https://doi.org/10.1007/s11858-019-01123-y
Lee, Y., & Choi, J. (2011). A review of online course dropout research: Implications for practice
and future research. Educational Technology Research and Development, 59(5), 593–
618. https://doi.org/10.1007/s11423-010-9177-y
Leikin, R., & Zazkis, R. (2010). Teachers’ opportunities to learn mathematics through teaching.
In Learning through teaching mathematics: Development of teachers’ knowledge and
expertise in practice (pp. 3–21). Springer Netherlands. https://doi.org/10.1007/978-90-
481-3990-3_1
Leony, D., Pardo, A., De La Fuente Valentín, L., De Castro, D. S., & Kloos, C. D. (2012).
GLASS: A learning analytics visualization tool. Proceedings of the 2nd International
Conference on Learning Analytics and Knowledge, 162–163.
https://doi.org/10.1145/2330601.2330642
Levin, J. R. (1988). Elaboration-based learning strategies: Powerful theory = powerful
application. Contemporary Educational Psychology, 13(3), 191–205.
https://doi.org/10.1016/0361-476X(88)90020-3
Lewis, C., & Perry, R. (2017). Lesson study to scale up research-based knowledge: A
randomized, controlled trial of fractions learning. Journal for Research in Mathematics
Education, 48(3), 261–299. https://doi.org/10.5951/jresematheduc.48.3.0261
Li, Y., & Kaiser, G. (2011). Expertise in mathematics instruction: Advancing research and
practice from an international perspective. In Y. Li & G. Kaiser (Eds.), Expertise in
Mathematics Instruction: An International Perspective (pp. 3–15). Springer US.
https://doi.org/10.1007/978-1-4419-7707-6_1
Lim, K. H. (2009). Burning the candle at just one end: Using nonproportional examples helps
students determine when proportional strategies apply. Mathematics Teaching in the
Middle School, 14(8), 492–500. https://doi.org/10.5951/MTMS.14.8.0492
Littenberg-Tobias, J., & Slama, R. (2022). Large-scale learning for local change: The challenge
of Massive Open Online Courses as educator professional learning. Frontiers in
Education, 7, 899535. https://doi.org/10.3389/feduc.2022.899535
Lloyd, G. M. (2008). Curriculum use while learning to teach: One student teacher’s
appropriation of mathematics curriculum materials. Journal for Research in Mathematics
Education, 39(1), 63–94. https://doi.org/10.2307/30034888
Lloyd, G. M., & Wilson, M. (1998). Supporting innovation: The impact of a teacher’s
conceptions of functions on his implementation of a reform curriculum. Journal for
Research in Mathematics Education, 29(3), 248–274. https://doi.org/10.2307/749790
Lobato, J., Ellis, A., & Zbiek, R. M. (2010). Developing essential understanding of ratios,
proportions, and proportional reasoning for teaching mathematics: Grades 6-8. National
Council of Teachers of Mathematics.
Lohman, M. C., & Woolf, N. H. (2001). Self-initiated learning activities of experienced public school teachers: Methods, sources, and relevant organizational influences. Teachers and
Teaching: Theory and Practice, 7(1), 59–74. https://doi.org/10.1080/13540600123835
Ma, L. (1999). Knowing and teaching elementary mathematics: Teacher’s understanding of
fundamental mathematics in China and the United States. Lawrence Erlbaum Associates,
Inc.
Ma, W., Adesope, O. O., Nesbit, J. C., & Liu, Q. (2014). Intelligent tutoring systems and
learning outcomes: A meta-analysis. Journal of Educational Psychology, 106(4), 901–
918. https://doi.org/10.1037/a0037123
Mandinach, E. B., & Gummer, E. S. (2013). A systemic view of implementing data literacy in
educator preparation. Educational Researcher, 42(1), 30–37.
https://doi.org/10.3102/0013189X12459803
Marcus, R., & Chazan, D. (2010). What experienced teachers have learned from helping students
think about solving equations in the one-variable-first algebra curriculum. In R. Leikin &
R. Zazkis (Eds.), Learning through teaching mathematics: Development of teachers’
knowledge and expertise in practice (pp. 169–187). Springer Netherlands.
https://doi.org/10.1007/978-90-481-3990-3_9
Marsh, H. W., Trautwein, U., Lüdtke, O., Köller, O., & Baumert, J. (2005). Academic self-concept, interest, grades, and standardized test scores: Reciprocal effects models of
causal ordering. Child Development, 76(2), 397–416. https://doi.org/10.1111/j.1467-
8624.2005.00853.x
Martinez-Maldonado, R., Dimitriadis, Y., Kay, J., Yacef, K., & Edbauer, M.-T. (2013).
MTClassroom and MTDashboard: Supporting analysis of teacher attention in an
orchestrated multi-tabletop classroom. Proceedings of the Computer Supported
Collaborative Learning, 119–128.
Masters, J., De Kramer, R. M., O’Dwyer, L. M., Dash, S., & Russell, M. (2010). The effects of
online professional development on fourth grade English language arts teachers’
knowledge and instructional practices. Journal of Educational Computing Research,
43(3), 355–375. https://doi.org/10.2190/EC.43.3.e
Matsumura, L. C., Garnier, H. E., Slater, S. C., & Boston, M. D. (2008). Toward measuring
instructional interactions “At-Scale.” Educational Assessment, 13(4), 267–300.
https://doi.org/10.1080/10627190802602541
Matsumura, L. C., Garnier, H., Pascal, J., & Valdés, R. (2002). Measuring instructional quality
in accountability systems: Classroom assignments and student achievement. Educational
Assessment, 8(3), 207–229. https://doi.org/10.1207/S15326977EA0803_01
Mavrikis, M., Gutierrez-Santos, S., & Poulovassilis, A. (2016). Design and evaluation of teacher
assistance tools for exploratory learning environments. Proceedings of the Sixth
International Conference on Learning Analytics & Knowledge, 168–172.
https://doi.org/10.1145/2883851.2883909
Mayer, D. P. (1999). Measuring instructional practice: Can policymakers trust survey data?
Educational Evaluation and Policy Analysis, 21(1), 29–45.
https://doi.org/10.2307/1164545
Maxwell, S. E. (2000). Sample size and multiple regression analysis. Psychological Methods,
5(4), 434–458. https://doi.org/10.1037/1082-989X.5.4.434
Meschede, N., Fiebranz, A., Möller, K., & Steffensky, M. (2017). Teachers’ professional vision,
pedagogical content knowledge and beliefs: On its relation and differences between pre-service and in-service teachers. Teaching and Teacher Education, 66, 158–170.
https://doi.org/10.1016/j.tate.2017.04.010
Meyer, A., Kleinknecht, M., & Richter, D. (2023). What makes online professional development
effective? The effect of quality characteristics on teachers’ satisfaction and changes in
their professional practices. Computers & Education, 200, 104805.
https://doi.org/10.1016/j.compedu.2023.104805
Mintz, J., Hick, P., Solomon, Y., Matziari, A., Ó’Murchú, F., Hall, K., Cahill, K., Curtin, C.,
Anders, J., & Margariti, D. (2020). The reality of reality shock for inclusion: How does
teacher attitude, perceived knowledge and self-efficacy in relation to effective inclusion
in the classroom change from the pre-service to novice teacher year? Teaching and
Teacher Education, 91, 103042. https://doi.org/10.1016/j.tate.2020.103042
Namkung, J. M., Fuchs, L. S., & Koziol, N. (2018). Does initial learning about the meaning of
fractions present similar challenges for students with and without adequate whole-number
skill? Learning and Individual Differences, 61, 151–157.
https://doi.org/10.1016/j.lindif.2017.11.018
National Board for Professional Teaching Standards. (1989). What teachers should know and
be able to do. https://www.nbpts.org/wp-content/uploads/2017/07/what_teachers_should_know.pdf
National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the
National Mathematics Advisory Panel. U.S. Department of Education, Washington, DC.
National Research Council. (2001). Adding it up: Helping children learn mathematics (J.
Kilpatrick, J. Swafford, & B. Findell, Eds.). National Academies Press.
https://doi.org/10.17226/9822
Nye, B. D., Graesser, A. C., & Hu, X. (2014). AutoTutor and family: A review of 17 years of
natural language tutoring. International Journal of Artificial Intelligence in Education,
24(4), 427–469. https://doi.org/10.1007/s40593-014-0029-5
Ost, B. (2014). How do teachers improve? The relative importance of specific and general
human capital. American Economic Journal: Applied Economics, 6(2), 127–151.
https://doi.org/10.1257/app.6.2.127
Ouyang, F., Jiao, P., McLaren, B. M., & Alavi, A. H. (Eds.). (2022). Artificial intelligence in
STEM education: The paradigmatic shifts in research, education, and technology. CRC
Press.
Panadero, E. (2017). A review of self-regulated learning: Six models and four directions for
research. Frontiers in Psychology, 8.
https://www.frontiersin.org/articles/10.3389/fpsyg.2017.00422
Panadero, E., & Alonso-Tapia, J. (2017). Self-assessment: Theoretical and practical
connotations. When it happens, how is it acquired and what to do to develop it in our
students. Electronic Journal of Research in Education Psychology, 11(30), 551–576.
https://doi.org/10.14204/ejrep.30.12200
Papay, J. P., & Kraft, M. A. (2015). Productivity returns to experience in the teacher labor
market: Methodological challenges and new evidence on long-term career improvement.
Journal of Public Economics, 130, 105–119.
https://doi.org/10.1016/j.jpubeco.2015.02.008
Philipsen, B., Tondeur, J., Pareja Roblin, N., Vanslambrouck, S., & Zhu, C. (2019). Improving
teacher professional development for online and blended learning: A systematic meta-aggregative review. Educational Technology Research and Development, 67(5), 1145–
1174. https://doi.org/10.1007/s11423-019-09645-8
Picus, L. O., & Odden, A. R. (2011). Reinventing school finance: Falling forward. Peabody
Journal of Education, 86(3), 291–303.
Pintrich, P. R. (2000). The role of goal orientation in self-regulated learning. In Handbook of
Self-Regulation (pp. 451–502). Elsevier. https://doi.org/10.1016/B978-012109890-
2/50043-3
Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated
learning in college students. Educational Psychology Review, 16(4), 385–407.
https://doi.org/10.1007/s10648-004-0006-x
Pintrich, P. R., & Groot, E. V. D. (1990). Motivational and self-regulated learning components
of classroom academic performance. Journal of Educational Psychology, 82(1), 33–40.
Pintrich, P. R., Smith, D. A. F., Garcia, T., & Mckeachie, W. J. (1993). Reliability and predictive
validity of the motivated strategies for learning questionnaire (Mslq). Educational and
Psychological Measurement, 53(3), 801–813.
https://doi.org/10.1177/0013164493053003024
Popenici, S. A. D., & Kerr, S. (2017). Exploring the impact of artificial intelligence on teaching
and learning in higher education. Research and Practice in Technology Enhanced
Learning, 12(1), 22. https://doi.org/10.1186/s41039-017-0062-8
Powell, C. G., & Bodur, Y. (2019). Teachers’ perceptions of an online professional development
experience: Implications for a design and implementation framework. Teaching and
Teacher Education, 77, 19–30. https://doi.org/10.1016/j.tate.2018.09.004
Putnam, R. T., Heaton, R. M., Prawat, R. S., & Remillard, J. (1992). Teaching mathematics for
understanding: Discussing case studies of four fifth-grade teachers. The Elementary
School Journal, 93(2), 213–228. https://doi.org/10.1086/461723
Rabe-Hesketh, S., & Skrondal, A. (2008). Multilevel and longitudinal modeling using Stata.
Stata Press. https://www.routledge.com/Multilevel-and-Longitudinal-Modeling-Using-Stata-Volumes-I-and-II/Rabe-Hesketh-Skrondal/p/book/9781597181365
Ramsdell, R., & Rose, R. (2006). PBS TeacherLine and Concord Consortium’s Seeing Math
Secondary: Scaling up a national professional development program. In C. Dede (Ed.),
Online Professional Development for Teachers: Emerging Models and Methods (pp. 69–
88).
Reeves, T. D., & Pedulla, J. J. (2011). Predictors of teacher satisfaction with online professional
development: Evidence from the USA’s e‐Learning for Educators initiative. Professional
Development in Education, 37(4), 591–611.
https://doi.org/10.1080/19415257.2011.553824
Remillard, J. T., & Bryans, M. B. (2004). Teachers’ orientations toward mathematics curriculum
materials: Implications for teacher learning. Journal for Research in Mathematics
Education, 35(5), 352–388. https://doi.org/10.2307/30034820
Ross, J. A., McDougall, D., Hogaboam-Gray, A., & LeSage, A. (2003). A survey measuring
elementary teachers’ implementation of standards-based mathematics teaching. Journal
for Research in Mathematics Education, 34(4), 344–363.
https://doi.org/10.2307/30034787
Roschelle, J., Shechtman, N., Tatar, D., Hegedus, S., Hopkins, B., Empson, S., Knudsen, J., &
Gallagher, L. P. (2010). Integration of technology, curriculum, and professional
development for advancing middle school mathematics: Three large-scale studies.
American Educational Research Journal, 47(4), 833–878.
https://doi.org/10.3102/0002831210367426
Salas-Pilco, S. Z., & Hu, X. (2022). Artificial intelligence and learning analytics in teacher
education: A systematic review. Education Sciences, 12(8), 569.
https://doi.org/10.3390/educsci12080569
Sangster, P., Anderson, C., & O’Hara, P. (2013). Perceived and actual levels of knowledge about
language amongst primary and secondary student teachers: Do they know what they think
they know? Language Awareness, 22(4), 293–319.
https://doi.org/10.1080/09658416.2012.722643
Scher, L., & O’Reilly, F. (2009). Professional development for K–12 math and science teachers:
What do we really know? Journal of Research on Educational Effectiveness, 2(3), 209–
249. https://doi.org/10.1080/19345740802641527
Schifter, C., Natarajan, U., Ketelhut, D. J., & Kirchgessner, A. (2014). Data-driven decision-making: Facilitating teacher use of student data to inform classroom instruction.
Contemporary Issues in Technology and Teacher Education, 14(4), 419–432.
Schmid, M., Brianza, E., & Petko, D. (2021). Self-reported technological pedagogical content
knowledge (TPACK) of pre-service teachers in relation to digital technology use in
lesson plans. Computers in Human Behavior, 115, 106586.
https://doi.org/10.1016/j.chb.2020.106586
Schmidt, W., Tatto, M., Bankov, K., Blömeke, S., Cedillo, T., Cogan, L., Han, S., Houang, R.,
Hsieh, F., Paine, L., Santillán, M., & Schwille, J. (2007). The Preparation Gap: Teacher
Education for Middle School Mathematics in Six Countries. MSU Center for Research in
Mathematics and Science Education.
Selya, A. S., Rose, J. S., Dierker, L. C., Hedeker, D., & Mermelstein, R. J. (2012). A practical
guide to calculating Cohen’s f2, a measure of local effect size, from PROC MIXED.
Frontiers in Psychology, 3, 111. https://doi.org/10.3389/fpsyg.2012.00111
Shah, F., Evens, M., Michael, J., & Rovick, A. (2002). Classifying student initiatives and tutor
responses in human keyboard-to-keyboard tutoring sessions. Discourse Processes, 33(1),
23–52. https://doi.org/10.1207/S15326950DP3301_02
Sheridan, K. M., & Wen, X. (2021). Evaluation of an online early mathematics professional
development program for early childhood teachers. Early Education and Development,
32(1), 98–112. https://doi.org/10.1080/10409289.2020.1721402
Sherin, M., & Van Es, E. A. (2009). Effects of video club participation on teachers’ professional
vision. Journal of Teacher Education, 60(1), 20–37.
https://doi.org/10.1177/0022487108328155
Shulman, L. (1986). Those who understand: Knowledge growth in teaching. Educational
Researcher, 15(2), 4–14.
Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard
Educational Review, 57(1), 1–23. https://doi.org/10.17763/haer.57.1.j463w79r56455411
Shute, V. J., & Psotka, J. (1996). Intelligent tutoring systems: Past, present, and future. In D. H.
Jonassen (Ed.), Handbook of research for educational communications and technology
(pp. 570–600). Macmillan Library Reference USA.
Siegler, R., Carpenter, T., Fennell, F., Geary, D., Lewis, J., Okamoto, Y., Thompson, L., &
Wray, J. (2010). Developing effective fractions instruction for kindergarten
through 8th grade. IES Practice Guide (NCEE 2010-4039). What Works Clearinghouse.
Siegler, R. S., Thompson, C. A., & Schneider, M. (2011). An integrated theory of whole number
and fractions development. Cognitive Psychology, 62, 273–296.
https://doi.org/10.1016/j.cogpsych.2011.03.001
Simpson, A., Rosenberg, M., Ward, B., Thornton, A. L., Derbyshire, A., & Jackson, B. (2022).
Primary school teacher outcomes from online professional development for physical
literacy: A randomised controlled trial. Psychology of Sport and Exercise, 61, 102199.
https://doi.org/10.1016/j.psychsport.2022.102199
Sitzmann, T., & Ely, K. (2011). A meta-analysis of self-regulated learning in work-related
training and educational attainment: What we know and where we need to go.
Psychological Bulletin, 137(3), 421–442. https://doi.org/10.1037/a0022777
Sitzmann, T., Ely, K., Brown, K. G., & Bauer, K. N. (2010). Self-assessment of knowledge: A
cognitive learning or affective measure? Academy of Management Learning &
Education, 9(2), 169–191.
Sottilare, R. A., Graesser, A., Hu, X., & Holden, H. (Eds.). (2013). Design recommendations for
intelligent tutoring systems. U.S. Army Research Laboratory.
Stecher, B., Le, V.-N., Hamilton, L., Ryan, G., Robyn, A., & Lockwood, J. R. (2006). Using
structured classroom vignettes to measure instructional practices in mathematics.
Educational Evaluation and Policy Analysis, 28(2), 101–130.
https://doi.org/10.3102/01623737028002101
Stein, M. K., Baxter, J. A., & Leinhardt, G. (1990). Subject-matter knowledge and elementary
instruction: A case from functions and graphing. American Educational Research
Journal, 27(4), 639–663. https://doi.org/10.2307/1163104
Taranto, E., & Arzarello, F. (2020). Math MOOC UniTo: An Italian project on MOOCs for
mathematics teacher education, and the development of a new theoretical framework.
ZDM, 52(5), 843–858. https://doi.org/10.1007/s11858-019-01116-x
Tasar, M. F., & Imer Cetin, N. (2021). Scaffolding prompt questions and learners’ self-regulated
learning about the nature of science in hypermedia. International Journal of Curriculum
and Instruction, 13(2), 1802–1824.
Tatto, M. T., Schwille, J., Senk, S. L., Ingvarson, L., Peck, R., & Rowley, G. (2008). Teacher
education and development study in mathematics (TEDS-M): Policy, practice, and
readiness to teach primary and secondary mathematics. Conceptual framework. Teacher
Education and Development International Study Center, College of Education, Michigan
State University. https://files.eric.ed.gov/fulltext/ED542390.pdf
Thompson, P. W., & Saldanha, L. A. (2003). Fractions and multiplicative reasoning. In J.
Kilpatrick, W. G. Martin, & D. Schifter (Eds.), A research companion to principles and
standards for school mathematics (pp. 95–114). National Council of Teachers of
Mathematics.
Timperley, H., Wilson, A., Barrar, H., & Fung, I. (2007). Teacher professional learning and
development: Best evidence synthesis iteration. Ministry of Education.
http://educationcounts.edcentre.govt.nz/goto/BES
Tröbst, S., Kleickmann, T., Heinze, A., Bernholt, A., Rink, R., & Kunter, M. (2018). Teacher
knowledge experiment: Testing mechanisms underlying the formation of preservice
elementary school teachers’ pedagogical content knowledge concerning fractions and
fractional arithmetic. Journal of Educational Psychology, 110(8), 1049–1065.
http://dx.doi.org/10.1037/edu0000260
Tzovla, E., Kedraka, K., Karalis, T., Kougiourouki, M., & Lavidas, K. (2021). Effectiveness of
in-service elementary school teacher professional development MOOC: An experimental
research. Contemporary Educational Technology, 13(4), ep324.
https://doi.org/10.30935/cedtech/11144
Van de Walle, J. A., Karp, K. S., & Bay-Williams, J. M. (2022). Elementary and middle school
mathematics: Teaching developmentally (11th ed.). Pearson.
VanLehn, K., Graesser, A. C., Jackson, G. T., Jordan, P., Olney, A., & Rosé, C. P. (2007). When
are tutorial dialogues more effective than reading? Cognitive Science, 31(1), 3–62.
https://doi.org/10.1080/03640210709336984
Von Kotzebue, L. (2023). Two is better than one—Examining biology-specific TPACK and its
T-dimensions from two angles. Journal of Research on Technology in Education, 55(5),
765–782. https://doi.org/10.1080/15391523.2022.2030268
Vonkova, H., Papajoanu, O., Stipek, J., & Kralova, K. (2021). Identifying the accuracy of and
exaggeration in self-reports of ICT knowledge among different groups of students: The
use of the overclaiming technique. Computers & Education, 164, 104112.
https://doi.org/10.1016/j.compedu.2020.104112
Van Dooren, W., De Bock, D., Hessels, A., Janssens, D., & Verschaffel, L. (2005). Not
everything is proportional: Effects of age and problem type on propensities for
overgeneralization. Cognition and Instruction, 23(1), 57–86.
Wayman, J. C. (2005). Involving teachers in data-driven decision making: Using computer data
systems to support teacher inquiry and reflection. Journal of Education for Students
Placed at Risk (JESPAR), 10(3), 295–308. https://doi.org/10.1207/s15327671espr1003_5
Winne, P. H. (2001). Self-regulated learning viewed from models of information processing. In
B. J. Zimmerman & D. H. Schunk (Eds.), Self-regulated learning and academic
achievement: Theoretical perspectives (pp. 153–189). Lawrence Erlbaum Associates.
Winne, P. H. (2017). Cognition and Metacognition within Self-Regulated Learning. In D. H.
Schunk & J. A. Greene (Eds.), Handbook of Self-Regulation of Learning and
Performance (2nd ed., pp. 36–48). Routledge. https://doi.org/10.4324/9781315697048-3
Wong, J., Baars, M., Davis, D., Van Der Zee, T., Houben, G. J., & Paas, F. (2019). Supporting
self-regulated learning in online learning environments and MOOCs: A systematic
review. International Journal of Human-Computer Interaction, 35(4–5), 356–373.
https://doi.org/10.1080/10447318.2018.1543084
Wong, J. T., Bui, N. N., Fields, D. T., & Hughes, B. S. (2022). A learning experience design
approach to online professional development for teaching science through the arts:
Evaluation of teacher content knowledge, self-efficacy and STEAM perceptions. Journal
of Science Teacher Education, 0(0), 1–31.
https://doi.org/10.1080/1046560X.2022.2112552
Woods, P. J., & Copur-Gencturk, Y. (2024). Examining the role of student-centered versus
teacher-centered pedagogical approaches to self-directed learning through teaching.
Teaching and Teacher Education, 138, 104415.
https://doi.org/10.1016/j.tate.2023.104415
Xu, Z., Zhao, Y., Liew, J., Zhou, X., & Kogut, A. (2023). Synthesizing research evidence on
self-regulated learning and academic achievement in online and blended learning
environments: A scoping review. Educational Research Review, 39, 100510.
https://doi.org/10.1016/j.edurev.2023.100510
Yoon, K. S., Duncan, T., Lee, S. W.-Y., Scarloss, B., & Shapley, K. L. (2007).
Reviewing the evidence on how teacher professional development affects student
achievement (Issues & Answers Report, REL 2007–No. 033). U.S. Department of
Education, Institute of Education Sciences, National Center for Education Evaluation and
Regional Assistance, Regional Educational Laboratory Southwest.
http://ies.ed.gov/ncee/edlabs
Zhang, S., Shi, Q., & Lin, E. (2020). Professional development needs, support, and barriers:
TALIS US new and veteran teachers’ perspectives. Professional Development in
Education, 46(3), 440–453. https://doi.org/10.1080/19415257.2019.1614967
Appendix
Appendix A
Table A.1
Scoring Rubric for Teachers’ Knowledge of Students’ Mathematical Thinking
Score 1: No analysis or an incorrect analysis of students' mathematical thinking, or too generic a response.
● The teacher's response focuses on issues other than students' thinking around key mathematical issues in the video clip.
● The response includes an incorrect analysis of the mathematical thinking of students in the video.
● The response is too general (e.g., "Students are struggling").
Sample responses: "The teacher did not make it clear what a part-to-part or a part-to-whole ratio actually meant before asking the class what type of ratio she was describing." "I think the boy in the clip understood the lesson."

Score 2: At least one correct statement about students' mathematical thinking around the key mathematical issues, but no evidence for this observation.
● The teacher's responses include a description of the student(s)' mathematical thinking around key mathematical issues in the video, but no evidence is provided for why the student(s) are having a problem or how the participating teacher arrives at this observation.
● The response includes chronological descriptions with a focus on the mathematical content but does not provide an analysis or interpretation of the situation (i.e., the response describes apparent processes, such as the teacher's or student's comments in the video).
Sample response: "They don't appear to understand what a part-to-part and part-to-whole ratio actually means."
Score 3
One accurate description or analysis of the
students’ mathematical thinking around key
mathematical issues with a potential underlying
reason.
● The response provides at least one
correct explanation of how the
participating teacher arrives at this
conclusion or why the teacher thinks the
student in the video (mis)understands a
certain aspect of the mathematical topic.
● The response includes a diagnosis of the
mathematical thinking of a student even
though this issue is not obvious (e.g., the
student’s answer is correct but his or her
explanation indicates otherwise). Thus, it
goes beyond a chronological description
of what can be seen in the video by
including at least one accurate analysis of
the student’s mathematical thinking.
“The first student did not understand
the concept of part to part versus part
to whole, but the second student, Clay,
understood that it was an example of
part to part. However, his reasoning
was because they are each separate. I
don't think the class as a whole
understands the difference.”
Score 4
Entire response is correct in terms of the
analysis; identifies the source of confusion that
student(s) in the video experienced, together
with evidence.
● The response provides accurate insights,
identifies the source of confusion that
students experienced, and provides
evidence for how the teacher arrived at
this conclusion and why she or he
thought the student(s) were having the
problems.
● The teacher’s analysis includes an issue
that is not easily recognizable in the
video, and she or he provides an
explanation for her or his analysis.
“Some students are struggling to
understand when a ratio is part to
whole and part to part, and I think the
misunderstanding starts from not
understanding what it means to be part
or the whole. Jenna thinks that the
pizza is the whole, and even Clay,
who correctly thinks the relationship
is part to part, thinks so because the
pizza and the boys can be separated.
There isn't any evidence in this video
that the students understand that it's
part to part because it neglects the
other things at the party (girls,
presents, cake). Now they are able to
answer the teacher's questions about
it, but as far as fully understanding
why it is part to part, the students that
spoke did not use that logic in their
arguments.”
Interactive, Personalized Learning for Teachers
Table A.2
Scoring Rubric for Teachers’ Knowledge of Mathematics Teaching
Score Sample responses
Score 1
Offering a generic instructional strategy or an
instructional strategy that does not focus on key
mathematical issues in the video.
“Next, I would give students continued
practice to help them grasp the skill.
Hopefully, they would then come to
conclusions about how to identify the
two different types of ratios and solve
them.”
● The teacher’s response focuses on issues
other than mathematical pedagogy to
improve students’ understanding of the
key mathematical idea(s) in the video
clip.
● The response includes an instructional
strategy that has no potential to improve
students’ mathematical understanding of
the key ideas in the video (e.g., the
instruction in the video is problematic,
but the response indicates the teacher
should use the same strategy).
● The response is too general (e.g., “I
would use manipulatives”) or
mathematically incorrect.
Score 2
Offering at least one instructional strategy to
help students understand the key ideas without
providing an explicit justification for this
instructional decision.
● The teacher’s response includes at least
one instructional strategy that helps
students overcome the key mathematical
issues they are struggling with, but no
justification is provided for why or how
this strategy would improve the
students’ understanding.
“Have the students work out part-to-part
and part-to-whole ratios with similar
objects, for example, boys to total
people, or boys to girls.”
Score 3
Offering an instructional strategy to overcome
one of the key mathematical issues in the video
clip by providing a justification based on what
is happening in the video for this instructional
decision.
● The teacher’s response includes at least
one instructional strategy and a rationale
for this instructional decision.
“Because there was some confusion, I
would simplify the data set a little more
so there weren’t so many objects and that
the objects would be more similar. For
example, have the students look at a
basket of fruit with three different types
of fruit (apples, bananas, and oranges).
Explain how part to part would be
comparing one type of fruit to another
type, whereas part to whole would be
comparing one fruit to all the fruit in the
basket.”
Score 4
Providing an entire response that is correct,
offering at least one instructional strategy to
make the key mathematical issues explicit to
students in the video clip, and providing a
rationale based on students’ struggles around
the key mathematical issues.
I would probably label the whole party
set as the whole and label each thing in it
as a part. By doing this, students can
have a visual breakdown of what it
means when we talk about the whole
versus parts. I would double check their
understanding by asking if a ratio of cake
to boys and girls would be part to part or
part to whole. I think this example would
help me determine if they fully
understand the difference because it can
be easy for them to say that cake: boys
and girls would be part to whole because
it includes all of the people at the party.
However, they need to recognize that all
of the people don't make up the full party
set that we labeled as the whole. If
students demonstrate that they
understand, I would push them to think
about what a part-to-whole ratio would
look like in this context.
● The teacher’s response includes an
instructional strategy that aims to
improve students’ understanding of the
key ideas by noticing and addressing the
source of students’ confusion along with
a justification.
● The entire response is correct, the
overarching issue with regard to
students’ mathematical understanding is
identified, and a rationale for the
instructional decision is provided.
Appendix B
Table B.1
An Example of Teachers’ Interactions with the Virtual Facilitator in One Content Knowledge
Learning Activity
Virtual facilitator The dimensions of four rectangles are given in the figure. Which
rectangle looks most like a square? Explain your thinking using
complete sentences.
Teacher The rectangle that is 37 ft by 40 ft. To determine the answer, I divided
the shorter length by the longer length and selected the ratio which was
the closest to 1.
Virtual facilitator Correct! Let me help you a little. In the pop-up activity, you will see a
rectangle with a 3-unit difference between the length and width. By
dragging the corners, you can change the size of the rectangle, but
there will still be a 3-unit difference in the length and width. Take a
moment to manipulate the rectangle. As you make the dimensions
larger and smaller, think about how the same 3-unit difference between
length and width affects how much the rectangle does or does not look
like a square. Once you are finished, hit the dismiss button.
Teacher (The teacher plays with the rectangle on the screen.) The 3-unit
difference becomes a smaller proportion of the overall length or width
and therefore impacts the ratio less and less as the rectangle gets
larger.
Virtual facilitator Excellent! The ratio of length to width of a square is 1, and rectangles
that have a ratio of length to width closer to 1 will look more like a
square.
Virtual facilitator We purposely selected these four rectangles to show how rectangles
with the same 3-unit difference between length and width can have
different ratios that cause the rectangle to look more or less like a
square. As the dimensions get larger (for example, 37x40), the effect
of the 3-unit difference is less on the ratio and it will look more like a
square. However, as the dimensions get smaller (for example, 7x10),
the effect of the 3-unit difference is greater on the ratio and it will look
less like a square. It is important to note here that it is the quotient of the
width and length (i.e., the ratio), not the difference between the length and
width, that determines how much a rectangle looks like a square.
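The facilitator’s point — that the length-to-width ratio, rather than the constant 3-unit difference, determines how square-like a rectangle appears — can be illustrated numerically. The following is a minimal sketch of my own (it is not part of the PD software; the function name is mine):

```python
def width_to_length_ratio(width, difference=3):
    """Width-to-length ratio of a rectangle whose length
    exceeds its width by a fixed number of units."""
    return width / (width + difference)

# The same 3-unit difference matters less as the rectangle grows,
# so the ratio approaches 1 and the rectangle looks more like a square.
for width in (7, 17, 27, 37):
    ratio = width_to_length_ratio(width)
    print(f"{width} x {width + 3}: ratio = {ratio:.3f}")
```

Running the loop shows the ratio climbing from 0.700 (7 × 10) to 0.925 (37 × 40), mirroring the four rectangles chosen for the activity.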
Interactive, Personalized Learning for Teachers
129
Table B.2
Scoring Rubric for Teachers’ Pedagogical Content Knowledge
Rubric for the Items Capturing Teachers’ Knowledge of Students’ Mathematical Thinking
Score Sample responses
Score 1
No analysis or an incorrect analysis of students’
mathematical thinking or too generic a response.
“She is looking at the pizza as a
whole (whole to part). She is not
realizing the pizza is being broken
up into parts as well.”
● The teacher’s response focuses on issues
other than students’ thinking around key
mathematical issues in the video clip.
● The response includes an incorrect analysis
of the mathematical thinking of students in
the video.
● The response is too general (e.g., “Students
are struggling”).
Score 2
At least one correct statement about students’
mathematical thinking around the key mathematical
issues, but no evidence to support this observation.
● The teacher’s responses include a description
of the student(s)’ mathematical thinking
around key mathematical issues in the video,
but no evidence is provided for why the
student(s) are having a problem or how the
participating teacher arrives at this
observation.
● The response includes chronological
descriptions with a focus on the mathematical
content but does not provide an analysis or
interpretation of the situation (i.e., the
response describes apparent processes, such as
the teacher’s or student’s comments in the
video).
“Jenna simply restated what she was
asked. She doesn't understand the
difference between considering the
part versus the whole.”
Score 3
One accurate description or analysis of the students’
mathematical thinking around key mathematical
issues with a potential underlying reason.
● The response provides at least one correct
explanation of how the participating teacher
arrives at this conclusion or why the teacher
thinks the student in the video
(mis)understands a certain aspect of the
mathematical topic.
● The response includes a diagnosis of the
mathematical thinking of a student even
though this issue is not obvious (e.g., the
student’s answer is correct but his or her
explanation indicates otherwise). Thus, it goes
beyond a chronological description of what
can be seen in the video by including at least
one accurate analysis of the student’s
mathematical thinking.
“Jenna understands that there are 3
whole pizzas but is not
understanding that she has to take
the whole data set into consideration
for it to be part to whole.”
Score 4
Entire response is correct in terms of the analysis;
identifies the source of confusion that student(s) in
the video experienced, together with evidence.
● The response provides accurate insights,
identifies the source of confusion that
students experienced, and provides evidence
for how the teacher arrived at this conclusion
and why she or he thought the student(s)
were having the problems.
● The teacher’s analysis includes an issue that
is not easily recognizable in the video, and
she or he provides an explanation for her or
his analysis.
“She does not understand that the
birthday party is the whole and
everything at the party is the part. I
think she is assuming that pizzas
come in whole pizzas when you buy
them, then they have to be the
whole. Since there is more than 1
boy, she thinks that the boys are
parts.”
Rubric for the Items Capturing Teachers’ Knowledge of Mathematics Teaching
Score Sample responses
Score 1
Offering a generic instructional strategy or an
instructional strategy that does not focus on key
mathematical issues in the video.
“I would ask the students to make
real lemonade and actually begin
through tasting to see the lemonade's
different dilutions.”
● The teacher’s response focuses on issues
other than mathematical pedagogy to
improve students’ understanding of the key
mathematical idea(s) in the video clip.
● The response includes an instructional
strategy that has no potential to improve
students’ mathematical understanding of the
key ideas in the video (e.g., the instruction in
the video is problematic, but the response
indicates the teacher should use the same
strategy).
● The response is too general (e.g., “I would
use manipulatives”) or mathematically
incorrect.
Score 2
Offering at least one instructional strategy to help
students understand the key ideas without providing
an explicit justification for this instructional
decision.
• The teacher’s response includes at least one
instructional strategy that helps students
overcome the key mathematical issues they
are struggling with, but no justification is
provided for why or how this strategy would
improve the students’ understanding.
“A way to further improve the
understanding would be to write
both ratios as a fraction and compare
them numerically, perhaps with
simplifications.”
Score 3
Offering an instructional strategy to overcome one
of the key mathematical issues in the video clip by
providing a justification based on what is happening
in the video for this instructional decision.
● The teacher’s response includes at least one
instructional strategy and a rationale for this
instructional decision.
“If we're not sure about the lemons,
let's take a look at the water. How
many cups of water do we see in
example a? How many cups of water
do we see in example b? Do we
think the cups of water might impact
the taste of the lemonade? How so?
Let's look at each example
separately. I would then transition
into a discussion of comparing
numbers of cups of lemon to number
of cups of water. This would help
the student understand that there is a
relationship between water and
lemons prior to formally introducing
the definition and model of a ratio.”
Score 4
Providing an entire response that is correct, offering
at least one instructional strategy to make the key
mathematical issues explicit to students in the video
clip, and providing a rationale based on students’
struggles around the key mathematical issues.
“Since Jose was still set on the idea
that more lemons in a recipe make it
more lemony, I would pose the
question that what if the two recipes
have the same amount of lemons but
different amounts of water. For
example, 3 lemons to 2 cups of
water and 3 lemons to 1 cup of
water. This might help him see that
if one of the parts of the ratio are the
same, it is easier to determine the
answer.”
● The teacher’s response includes an
instructional strategy that aims to improve
students’ understanding of the key ideas by
noticing and addressing the source of
students’ confusion along with a
justification.
● The entire response is correct, the
overarching issue with regard to students’
mathematical understanding is identified,
and a rationale for the instructional decision
is provided.
B.3 Student Assessment
1. The table shows the number of pages a student reads and how many minutes it takes the
student to read those pages.
Which statement best describes the relationship in the table?
o The number of minutes is proportional to the number of pages, with a unit rate of 1.5
minutes per page.
o The number of minutes is proportional to the number of pages, with a unit rate of 2.25
minutes per page.
o The relationship in the table is not proportional.
o I don't know.
2. When playing basketball, Jan makes 4 out of every 10 shots she takes. Select ALL the
statements that describe Jan's situation.
▢ The ratio of the number of shots Jan makes to the number of shots she takes is 2:5.
▢ The ratio of the number of shots Jan makes to the number of shots she does not make is
2:3.
▢ The equation 4x = 10y shows the relationship between x, the number of shots Jan
makes, and y, the number of shots she takes.
▢ The equation 6x = 4z shows the relationship between x, the number of shots Jan
makes, and z, the number of shots she does not make.
▢ I don't know.
3. Which graphs represent a proportional relationship? Select ALL correct answers.
▢ [Six answer-choice graphs appear here in the original; the images are not reproduced in this transcript.]
▢ I don’t know.
4. Amelia works for 6 hours and earns $48. The graph shows the relationship between the
number of hours Amelia works, x, and the total amount she earns, y. Which point represents
the number of dollars Amelia makes per hour?
o (1, 6)
o (1, 8)
o (2, 16)
o (6, 48)
o I don't know.
5. There are 30 students in a class. The ratio of boys to girls is 2:3. How many boys are in the
class?
o 9
o 12
o 13
o 16
o I don't know.
6. The length of a photograph is 5 inches, and its width is 3 inches. The photograph is enlarged
proportionally. The length of the enlarged photograph is 10 inches. What is the width of the
enlarged photograph?
o 6 inches
o 7 inches
o 8 inches
o 15 inches
o 16 2/3 inches
o I don’t know.
7. Stacie rides her bike 3 miles in 12 minutes. At this rate, how long will it take her to ride her
bike 7 miles?
o 22 minutes
o 28 minutes
o 36 minutes
o 43 minutes
o 84 minutes
o I don’t know.
8. Which of the following ratios is equivalent to the ratio of 6 to 4?
o 12 to 18
o 12 to 8
o 8 to 6
o 4 to 6
o 2 to 3
o I don't know.
9. If 1 1/3 cups of flour are needed for a batch of cookies, how many cups of flour will be
needed for 3 batches?
o 4 1/3
o 4
o 3
o 2 2/3
o I don't know.
10. A group of 5 musicians take 20 minutes to play a song. If the group is increased to 35
musicians, how long will it take them to play the same song?
o 10 minutes
o 20 minutes
o 25 minutes
o 40 minutes
o I don't know.
11. Two teams of workers are making meatballs. Every worker works at the same rate. The
first team has 4 workers, and it takes them 12 minutes to make an order of meatballs. The
second team has 8 workers. How long will it take the second team to make the same number
of meatballs as the first team?
o 3 minutes
o 6 minutes
o 24 minutes
o 48 minutes
o I don't know.
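Most of the items above reduce to unit-rate or part-whole reasoning. As a brief sketch of that arithmetic (my own illustration; the function names are not part of the assessment materials), items 5 and 7 can be worked as follows:

```python
def part_from_ratio(total, part_share, whole_share):
    """Item 5: boys:girls = 2:3 means boys are 2 of every 5 students,
    so the number of boys is total * 2/5."""
    return total * part_share // whole_share

def time_for_distance(miles, base_miles=3, base_minutes=12):
    """Item 7: 3 miles in 12 minutes gives a unit rate of
    4 minutes per mile; scale that rate to the new distance."""
    minutes_per_mile = base_minutes / base_miles
    return miles * minutes_per_mile

print(part_from_ratio(30, 2, 5))   # boys in a class of 30 -> 12
print(time_for_distance(7))        # minutes to ride 7 miles -> 28.0
```

Note that items 10 and 11 deliberately break this pattern: the song length does not depend on the number of musicians, and the meatball item is inversely, not directly, proportional.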
Appendix C
C.1 Student Work Cover Sheet
Task # _____ on Day _____
1. Why did you choose this task? How will students’ work on this problem help you determine
whether they have mastered the learning goal?
2. What key concept(s) and/or standard(s) does the task cover?
3. Indicate if this assignment is typical of your daily instruction ☐. If not, please explain:
4. Describe any instructions or directions that were given to students:
5. How did students work on the task? What did you do? What did students do? Were students
working in groups or individually? Was this assignment done out-of-school?
6. How did you assess students’ work on the task? What did you expect to see in students’
work on the task? What products/processes were students held accountable for? Did you use
a rubric or scoring guide?
7. Explain your rationale for selecting the student work samples for the following categories:
a. High-quality work: How does these students’ work on this problem help you
determine they provided high-quality work?
b. Medium-quality work: How does these students’ work on this problem help you
determine they provided medium-quality work?
c. Your choosing: Why did you choose these students’ work? What does these students’
work tell you about what they understand or do not understand?
Table C.2
Classroom Artifact Rubrics for Instructional Quality - Potential of the Task
Score Brief description Sample task from the data set Rationale for scoring
1 The cognitive demand of the
task is limited to students’
reproducing answers by using
definitions, facts, or
memorizations.
The problems in this task
require students to select an
option by using their factual
knowledge and the
definition of a ratio.
2 The potential of the task
is limited to students’
following a given
procedure to solve a
problem rather than
developing conceptual
understanding.
The task explicitly calls for a
specific procedure for students
to follow (setting up a
proportion to solve the
problem). The task requires
students to solve several of the
same types of problems without
creating meaning for the
mathematical ideas or
processes.
3 The task has the
potential to build
students’ conceptual
understanding but does
not require students to
explain their thinking or
justify their answers.
This task has the potential to
build conceptual understanding
by engaging students in solving
a problem without providing a
prescribed procedure. However,
the task does not explicitly
prompt students to provide an
explanation.
4 The task has the potential to
develop students’ conceptual
understanding of the
mathematical concepts, the
meaning behind the
procedures, and the
relationships by solving
complex, nonalgorithmic
problems and by prompting
students to explain their
reasoning and thinking.
The task has the potential to
develop students’ conceptual
understanding of the
multiplicative nature of the
ratios by asking them to
compare two mixtures that
were made by keeping the
differences between the
amounts of juice and soda
constant. In addition, the task
includes explicit prompts for
students to explain their
reasoning.
Note. For further details on the scoring categories, please see Boston, 2012.
Table C.4.
Classroom Artifact Rubrics for Instructional Quality - Coherent Mathematics for Conceptual
Understanding
Score Description
1 The lesson goal, the selection of the tasks, the cognitive demand of the tasks,
and teachers’ expectations from and analysis of students’ work focus on a
superficial understanding of the concepts and mainly target isolated skills.
2 The overall focus across the lesson goal, task selection, and expectations from
student work is at the procedural level, but there are scattered opportunities
for students to develop a conceptual understanding.
3 Overall, the goal of the lesson, task choice, and analysis of students’ work
show evidence that the teacher planned to develop students’ conceptual
understanding by discussing connections between different representations,
solutions, or mathematical concepts or using complex and nonalgorithmic
thinking. The focus on conceptual understanding and connections is planned
by the teacher beforehand in a consistent and coherent way. However, the
teacher’s expectations and analysis of student work indicate some shifts in
emphasis from students’ making sense of the concepts to students’ mastery of
skills.
4 The teacher provides consistent opportunities for students to develop a
conceptual understanding of the targeted mathematical concepts, which are
evident in the goal of the lesson, task choice, and discussion of student work.
The goal of the lesson and task design provide evidence that the teacher plans
to create opportunities for students to develop a conceptual understanding of
mathematics, and the teacher analyzes students’ work with consistent and
coherent attention to developing a conceptual understanding (e.g., connections
between different representations, solutions, mathematical concepts, or using
complex and nonalgorithmic thinking).
Table C.5
The Self-Regulated Learning Scale
SRL Dimension
Elaboration
E1. When studying for this online PD, I try to relate the content in the online PD to
what I already know.
E2. When I study for this online PD, I pull together information from different
sources, such as instructions from the virtual facilitator.
E3. I try to relate the content in this online PD to those in other PD activities
whenever possible.
Organization
O1. I try to take notes for this online PD because notes are even more important for
learning online than in an in-person PD.
O2. I review my notes and try to find the most important content in this online PD.
O3. I make simple charts, diagrams, or tables to help me organize content in the
online PD.
Monitoring
M1. I summarize my learning in this online PD.
M2. I ask myself questions to make sure I know the content I have been studying in
this online PD.
M3. I monitor and evaluate what I understand by pausing at a regular interval or
whenever needed while studying for this online PD.
M4. I try to monitor closely the areas where I needed the most study and practice in
this online PD.
Abstract (if available)
Abstract
Teachers’ content knowledge for teaching is crucial for quality mathematics instruction. In this dissertation, I present three papers that explore the development of teachers’ content knowledge for teaching within two relatively underexplored learning contexts: teachers’ own teaching practice and an asynchronous online professional development (OPD) program. In the first paper, I examined teacher learning of pedagogical content knowledge (PCK) through teachers’ own teaching practice. Drawing on longitudinal data from over 200 novice mathematics teachers, the study found that teachers were able to learn PCK through their own teaching practice, and that teachers with robust content knowledge showed faster yearly growth in PCK. In the second paper, my coauthors and I described the design of an asynchronous OPD program implemented in an intelligent tutoring system and, through a randomized experiment, measured its impact on the mathematics performance of students whose teachers completed the program. Our findings revealed that the program significantly enhanced students’ mathematics performance. In the third paper, I investigated the alignment between teachers’ self-reported learning and directly assessed learning from an asynchronous OPD program and explored how teachers’ use of self-regulated learning strategies related to their learning from the program. The study found that what teachers believed they learned from the OPD program did not align with what direct assessments captured. Teachers who regularly monitored and evaluated their learning progress demonstrated greater improvement in content knowledge, as measured by direct assessments.
Asset Metadata
Creator: Li, Jingxian (author)
Core Title: Navigating the future of teacher professional development: Three essays on teacher learning in informal and digital context
School: Rossier School of Education
Degree: Doctor of Philosophy
Degree Program: Education
Degree Conferral Date: 2024-08
Defense Date: 08/23/2024
Publication Date: 08/30/2024
Publisher: University of Southern California (Los Angeles, California)
Language: English
Tags: Mathematics Education; teacher knowledge; teacher professional development
Advisor: Copur-Gencturk, Yasemin (committee chair); Aguilar, Stephen (committee member); Cohen, Allan (committee member); Du, Han (committee member)
Document Type: Dissertation
Rights: Li, Jingxian
Permanent Link (DOI): https://doi.org/10.25549/usctheses-oUC113999ZI8
Repository: USC Digital Library, University of Southern California