University of Southern California Dissertations and Theses
REFORMING DEVELOPMENTAL EDUCATION IN MATH: EXPLORING THE PROMISE OF
SELF-PLACEMENT AND ALTERNATIVE DELIVERY MODELS
by
Holly Irene Kosiewicz
A Dissertation Presented to the
FACULTY OF THE USC GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
(EDUCATION)
August 2015
Copyright 2015 Holly Kosiewicz
TABLE OF CONTENTS
Dedication
Acknowledgements
Chapter One: Introduction
Chapter Two: Injecting Innovation in Developmental Math: Issues of Use and Student Access in Resource-Constrained Community Colleges
Alternative Models of Delivery of Developmental Math
Providing Supplemental Instruction and Support
Reducing Time in Remediation
Redesigning Curriculum
Use of and Access to Alternative Models of Delivery: What We Don’t Know
Setting
Methods: Content and Descriptive Analyses
Content Analysis
Descriptive Analysis
A Revised Typology for Developmental Math MODs
The Extended Traditional Model
The Combined Alternative Delivery Model
Use of and Access to Alternative Models to Teach Developmental Math
Low Take-Up of Alternative MODs for Developmental Math
Students with the Greatest Remedial Needs are Not Receiving the Most Support
Students Do Not Have Equal Access to Different Types of Delivery Innovations
Trends in the Use of Alternative MODs
Increased Adoption of Alternative MODs over Time, Sharp Decrease in 2010
No Major Changes in the Use of Alternative MODs, Slight Uptick in Accelerated Models
Increased Use of Accelerated MODs in Lower Levels, Combined Approach in Highest Level
Discussion
Chapter Three: Giving Community College Students Voice: The Effects of Mathematics Self-Placement on Academic Achievement
A Brief Overview of Remediation Assessment and Placement Policies in Community College
The Efficacy of Increasing Voice on Improving Student Success: What Do We Know?
Natural Experiment: Exogenous Change of Placement Regime at College X
Setting
Data Sources
A Difference-in-Difference Estimation Strategy
Assumptions for Identification
Results
Differences in Math Assignment and Enrollment Patterns between Test- and Self-Placers
The Effects of a Self-Placement Regime on Student Success
Heterogeneous Effects
Sensitivity Checks
Seasonality Bias
Falsification Test
Permutation Tests
Discussion
Chapter Four: How Do Students Make Math Placement Decisions When Given Greater Autonomy
How Do Students Make Decisions? A Behavioral Perspective
Stages of Decision-Making
Simplification Techniques and Decision Biases
Moderators of Decision Bias: Decision Aids
Research Design and Methods: A Mixed Methods Approach
Study Sites
Sampling Strategy
The Hypothetical DSP Scenario
Quantitative Study
Qualitative Study
Placement Decision-Making under DSP
The Influence of Decision Aids on Placement Decisions
Profiles of Decision-Making under DSP
Further Evidence Supporting the Existence of Selective Decision-Making
Discussion
Chapter Five: Conclusions, Policy Implications, and Future Directions
References
Appendix A
Appendix B
Appendix C
Appendix D
DEDICATION
I dedicate this dissertation to all the students who seek a college degree but encounter
barriers that impede their success. It is you who motivate me to do the work that I do.
ACKNOWLEDGEMENTS
First and foremost, I want to express an enormous amount of gratitude to the administrators
and institutional researchers who gave me access to data and their community college students.
Without your assistance, support, and patience, my dissertation would have never been possible.
You help me think about problems and solutions in ways that no book or journal article ever could.
Thank you.
Second, I want to give miles y miles de gracias to Tatiana Melguizo, for her unwavering
support and guidance as my advisor for the past five years. You have not only helped me to become
a thoughtful and competent researcher, you have also taught me how to be a good mentor and
colleague along the way. Thank you for pushing me beyond what I thought was possible and for
recognizing my growth as an independent researcher at the same time. Without your
encouragement, I doubt I would have ever applied to the Spencer Dissertation Fellowship or USC’s
Provost Completion Fellowship, or even have believed I was worthy of such awards. Thank you. I
would also like to thank my dissertation committee members, Paco Martorell, Gary Painter, and
Robert Rueda. You three have been immensely helpful this past year, in not only guiding my work
but also helping me to interpret my findings in new ways, give a good job talk, and believe that
everything was going to be ok. Thanks also go to Diane Flores and Monica Raad of the Pullias
Center for helping me with the logistical side of my dissertation research. I would also be remiss if
I did not mention my Master’s thesis advisor, Ricardo Godoy, who inspired me to pursue a Ph.D.
and inculcated within me a love for doing research that matters. Thank you.
Third, I am also deeply indebted to the institutions that funded my dissertation research:
USC’s Provost’s Office and the Spencer Foundation. With your financial support, I was able to
dedicate my fifth year entirely to my dissertation, a luxury by any student’s standards. I am
particularly grateful to the National Academy of Education, who selected me as a 2014-2015
Spencer Dissertation Fellow. Not only did this fellowship financially support me, it also connected
me to a wide network of individuals conducting cutting edge, impactful education research. Special
thank yous go to Helen Ladd, Carol Lee, Cecilia Rouse, and Alon Pinto who provided helpful
suggestions on how to improve and think about my dissertation research as well as offered insights
into how to approach and navigate the job market, negotiate job offers, and above all else find
work-life balance as a researcher. I would also like to give a shout-out to Christopher Chambers-Ju
and Kelsey Mayo who made attending the fall and spring Spencer Fellowship retreats even more
enjoyable and worthwhile. Chris, Pomodoro changed my life (and helped me finish)!
Fourth, I am also extremely grateful for my graduate school peers who provided unceasing
support and encouragement as I plugged away on my dissertation. I would especially like to thank
Federick Ngo for reviewing seemingly countless dissertation drafts and for continuously reminding
me why I do research. To Kristen Fong, who expressed unflinching belief in my work, and acted as
a sounding board on multiple occasions. Thanks also go to Kathryn Struthers for being one of the
best listeners around and for constructively and thoughtfully critiquing my work without making
me feel inadequate. To Jenna, who was one of the best, and most supportive, office mates anyone
could ever have. And to Matt, whose quick wit always helped me to re-center and persevere even
when I was thinking about jumping ship. I am also indebted to Rudy Acosta and Keith Witham, as
well as the women of the Saturday Night Dinner crew (Caitlin Farrell, Ayesha Hashim, Alice
Huguet, Stephani Wrabel, & Tracey Weinstein) for letting me know that they would always be
there for me in times of need.
Fifth, I would also like to give many thanks to my friends from SCAQ, particularly Ginny
DeFrank and Cam Sanders. Swimming has been a refuge this past year, and I really have you to
thank for making me laugh and making sure I stayed distracted during practice. A special thank you
goes to Ginny, for editing sections of my dissertation and keeping me entertained in the locker
room.
Sixth, to my Mom, Dad, and my brother, Ash, who always reminded me to be kind to
myself. Thanks for checking in on me as I worked on my dissertation, and making sure that I was
taking care of myself (and taking my vitamins!). You made me realize that there was more to life
than this piece of work, and that I should enjoy, not detest, life. Also a shout-out to Kristin, my
sister from another mother, for sending me lots of cat videos. They kept my spirits high!
Seventh, I also want to express an enormous amount of thankfulness to my soon-to-be
husband, Trey Miller. Even when my dissertation occasionally turned me into a wretch, you were
still kind and gentle to me. Thank you so much for your generosity, your patience, and giving me
words of encouragement on a daily basis. I am truly blessed to be with such a supportive,
thoughtful, and loving partner.
CHAPTER ONE
INTRODUCTION
Over the past thirty years, the percentage of individuals seeking a college degree has grown
tremendously in response to the ever-changing needs and demands of the labor market. As higher
education is increasingly seen as a portal to a more financially stable life, the share of students who
enroll in college seemingly underprepared to succeed in college-level coursework has grown as
well (Strong Schools America, 2008). Each year, thousands of incoming students are tested
for and placed into developmental coursework, an area of education designed and principally used
to develop or reinforce weak English and math skills. Various estimates suggest that although over
half of all college students are placed into developmental education based on their scores on
placement tests or state standardized tests (Brown & Niemi, 2007; Melguizo, Kosiewicz, Prather, &
Bos, 2013; NCPPHE & SREB, 2010), less than half actually finish their developmental education
sequence, and far fewer pass an entry-level college course counting towards a four-year degree
(Bailey, Jeong, & Cho, 2010).
These statistics coupled with widespread evidence on the high costs of administering
developmental education have cast doubt on the effectiveness and the efficiency of developmental
education to bridge longstanding education inequities (Complete College America, 2012). In an era
when resources are consistently diminishing and the demands to provide quality higher education
are increasing, state and local policymakers are grappling with designing policies and practices that
not only can effectively diagnose students in need of developmental education but also can provide
the types of instruction and supports remediated students need to succeed in college.
This collection of three studies seeks to provide policymakers, practitioners, and researchers
much needed insight into the effectiveness and feasibility of two approaches currently gaining
popularity among efforts to reform developmental education: providing alternative models to
deliver developmental education instruction and giving students greater autonomy in decisions
determining placement into developmental education.
Each study in this collection focuses on and draws data from community colleges belonging
to a Large Urban Community College District in Southern California (LUCCD). The LUCCD is
one of the largest community college districts in the nation, places over 60 percent of its vastly
diverse student population into a developmental math course, and has significant autonomy in the
way it delivers and assigns students to developmental math. In this sense, studying LUCCD
colleges lends important insights into the ways reform efforts may play out in other districts and
colleges educating a highly diverse student population who, on the surface, seem underprepared for
college-level coursework.
The first study descriptively examines the ways in which LUCCD community colleges,
which have not received philanthropic support or been affected by legislative mandate to explicitly
alter developmental education instruction, have changed how they implement and deliver
developmental math. This paper contributes to current understandings of which types of alternative
modes of delivery (MOD) these community colleges employ, the extent to which their students, of
varying degrees of college-readiness, have access to alternative MODs, and which types of
alternative MODs have gained or lost traction over the past eight years.
Examining descriptions of approximately 9,000 course sections published across LUCCD
college catalogs between 2005 and 2013, my coauthors and I identified the number of course
sections offered for each developmental math course, whether the course section offered an
alternative MOD, and if so, which type of alternative MOD was offered. Results from this study
show that LUCCD colleges employed familiar alternative approaches, but also offered ones
which lengthened the amount of time in developmental math or combined multiple alternative
approaches. While just 30 percent of course sections offered an alternative MOD, half provided
supplemental instruction, and most were located in the upper remedial math levels. Accelerated
delivery models gained traction after 2010, yet models contextualizing learning in developmental
math never took root.
The second study sheds light on the effectiveness of Directed Self-Placement (DSP), an
alternative placement regime for assigning students to developmental education with the aim of
improving achievement. DSP requires a student to decide, with some support from faculty and
administrators, whether and to what extent he or she needs additional academic support to succeed in college-level
coursework. Within recent years, DSP has gained currency among policymakers and reformers
across the U.S. even though there has been no systematic evaluation of its potential to improve on
current test-based placement models. Florida and Connecticut have both passed legislation
incorporating elements of DSP into state-wide placement policies, and Jobs for the Future and
Achieving the Dream, two nation-wide reform initiatives dedicated to improving college readiness,
have acknowledged DSP as a potential substitute to the current test-based placement regime.
Drawing on roughly 30,000 administrative records from the LUCCD, I exploit a college’s
unexpected failure to renew its COMPASS test license to measure the effects of DSP, relative to a
test-based placement model, on achievement. Estimates from Difference-in-Difference analyses
imply that – in the aggregate – DSP adds value over test-based placement models across most
measures of achievement, with the exception of failing the first math class. However, results,
disaggregated by race and sex, suggest that White, Asian, and male students benefitted more from
DSP than African-Americans, Latinos, and females, raising concerns about whether DSP opens
opportunities for students to unintentionally incorporate stereotypes about themselves into
placement decisions. Because sensitivity tests could not isolate the effects of DSP on success from
normal fluctuations in achievement, further evidence is needed to assert that DSP is a superior
placement strategy to test-based placement models.
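The difference-in-differences comparison described above can be sketched with a few lines of arithmetic. The outcome values below are purely hypothetical placeholders, not estimates from this study:

```python
# Illustrative difference-in-differences (DiD) sketch.
# All outcome values are hypothetical, NOT estimates from this study.
# Outcome: share of students passing their first math course.

treated_pre, treated_post = 0.42, 0.51   # college that switched to DSP
control_pre, control_post = 0.44, 0.47   # comparison colleges (test-based placement)

# DiD nets out district-wide trends common to both groups:
# (change at the treated college) minus (change at the comparison colleges).
did = (treated_post - treated_pre) - (control_post - control_pre)
print(f"DiD estimate: {did:.2f}")  # prints "DiD estimate: 0.06"
```

The key identifying assumption, discussed in Chapter Three as the parallel-trends condition, is that absent the placement-regime change, outcomes at the treated college would have moved in step with the comparison colleges.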
The third and final paper adds to existing literature on DSP and decision-making by
distilling how students make autonomous placement decisions. While advocates of DSP purport
students can make better placement decisions than a test (Roger & Gilles, 1998), decision research
has consistently demonstrated that individuals frequently engage in behaviors that compromise
rational decision-making, leading them, for example, to draw on irrelevant information or over-rely
on recent past experiences (Payne, Bettman, & Johnson, 1992). As such, this study provides greater
clarity on the diagnostic and decision-making strategies community college students employ under
DSP, and provides perspectives on how to reform current DSP implementation practices moving
forward.
Using a Concurrent Mixed Methods Design, this study collected and analyzed qualitative
data from interviews and verbal protocols, as well as quantitative data from an online survey, from
students asked to hypothetically self-place into a math sequence given different informational aids.[1]
Overall, results suggest that community college students employ strategies that limit the search for
and the processing of information intended to help them navigate the difficulty of choosing a math
course on their own. Further, the way that students engage in placement decision-making appears to
be more a function of differences in psychological characteristics and life objectives (e.g.,
perceptions of math abilities, learning versus performance orientations, academic and career goals,
and previous math experiences), the way information is presented, and their susceptibility to
decision biases, and less a function of a student’s sex, race, or ethnicity. While the inclusion of
prerequisite math problems in self-placement guides induced a higher percentage of students to
change their initial course choice, most students ignored their performance on those math problems,
selecting a math course above their level of academic qualification.

[1] Participation in this study was restricted to students who had not taken a math placement test or a math course in
college.
To conclude, I briefly summarize the findings from all three essays, discuss their
implications for reform efforts currently underway in developmental education, and make
suggestions for future research avenues.
CHAPTER TWO
INJECTING INNOVATION IN DEVELOPMENTAL MATH: ISSUES OF USE AND STUDENT
ACCESS IN RESOURCE-CONSTRAINED COMMUNITY COLLEGES
Developmental education has been the principal means used to prepare higher education
students who are deemed underprepared for college-level coursework (Arendale, 2002). However,
with such a small percentage of remediated students reaching college-level courses (Bailey, Jeong,
& Cho, 2010; Complete College America, 2012), developmental education is viewed as an area in
need of significant reform to ensure that it works to fulfill its potential to improve student success in
college (Collins, 2009; Complete College America, 2012; MDRC, n.d.).[2] To this end, state
policymakers, foundations, researchers, and community colleges increasingly believe changing how
colleges deliver developmental education to be a key strategy in reversing historically low
performance among remediated students and reducing costs associated with remediation (Carnegie
Foundation for the Advancement of Teaching, n.d.; Edgecombe, Cormier, Bickerstaff, & Barragan,
2013; Jenkins, Zeidenberg, & Kienzl, 2009; National Conference of State Legislatures, n.d.;
Rutschow & Schneider, 2011).[3]
Within recent years, a number of foundations and states have translated this belief into
action. For example, The Ford Foundation, the Spencer Foundation, the Lumina Foundation for
Education, and other philanthropic foundations supported the Opening Doors Demonstration
project, a large scale initiative to identify innovations that improve academic achievement among
community college students, particularly those who were assigned to developmental education
courses across five states (MDRC, n.d.; Scrivener & Coghland, 2011). In concert with these efforts,
states have also passed legislation modifying the delivery of developmental education to better
[2] In this paper, we use developmental and remedial education interchangeably.
[3] Here, we define delivery as the way in which community colleges structure and teach developmental education.
serve students assigned to such coursework. For example, within the past two years, Florida (SB
1720) and Connecticut (SB 40) have each enacted statewide policies that give students who would
have been assigned to developmental education based on their placement score the option to enroll
in college-level math and English courses instead if they feel able to succeed in more rigorous
courses (Burdman, 2012). In both states, policymakers require mainstreamed students to enroll in
non-course based instruction where they receive targeted academic and non-academic supports to
help them succeed. In another effort, the Tennessee Board of Regents implemented the Tennessee
Redesign Initiative in 2006 to transform developmental education via the use of new technology
and instructional services that change how students learn in the classroom (Boatman, 2012). These
three examples are perhaps the most well-known among the many initiatives currently being
implemented by states and community colleges to change how academically underprepared
students receive developmental education.[4]
Despite heightened rhetoric about the importance of altering how students receive
developmental education, research examining the use and uptake of alternative delivery models in
response to this rhetoric has thus far focused almost exclusively on reform initiatives supported by
philanthropic or research funding, or mandated by state legislation (Bettinger, Boatman, & Long,
2013; Rutschow & Schneider, 2011; Visher, Weiss, Weissman, Rudd, & Wathington, 2012).[5,6]
[4] For a more comprehensive list of reforms currently underway to improve the delivery of developmental education,
please visit the Education Commission of the States’ comprehensive database on changes occurring with developmental
and remedial education across states:
http://www.ecs.org/ecs/ecscat.nsf/WebTopicPS?OpenView&count=1&RestrictToCategory=Postsecondary+Academic+Affairs--Developmental/Remediation/Placement
[5] We define alternative models of delivery as innovations that change the structure, curricula, or pedagogy of
developmental education with the intention to improve teaching and learning (Edgecombe, Cormier, Bickerstaff, &
Barragan, 2013).
[6] The Accelerated Learning Program at the Community College of Baltimore, the FastStart program at the Community
College of Denver, Washington State’s I-BEST Program, and initiatives stemming from the Tennessee Redesign
Initiative and the Opening Doors Demonstration Projects are examples of programs showcased in current literature.
While the emphasis on these programs, we suspect, is largely due to the fact that innovation is
championed by a small percentage of faculty and administrators successful at obtaining funding or
changing legislation, this emphasis has come at the expense of advancing our knowledge of how
community colleges without philanthropic money or support or the need to meet legislative
demands are shifting the ways in which developmental education is structured and taught. The
emphasis on philanthropically and state supported reform initiatives has also inadvertently centered
our attention on the implementation and effectiveness of specific delivery innovations (e.g.,
acceleration, contextualized learning, intensive advising). Lost within this myopic focus is an
understanding of how community colleges and their districts are employing the multiple structural
and instructional innovations available to them, and the extent to which they are granting their
students of varying levels of academic preparedness equal opportunities to access these new
approaches and techniques. These issues merit our attention because they can help to unearth the
degree to which resource constraints, institutional norms and culture, and statewide regulations may
impair the scaling up and reach of these initiatives.
To fill in these gaps in understanding, we draw our attention to a large urban community
college district in Southern California (henceforth called LUCCD) because it offers us a unique
opportunity to explore how resource-tied, locally-governed community colleges are using and
adopting alternative models of delivery to provide developmental math. We focus on math because
more students are assigned to developmental math than developmental reading or writing, and have
a harder time progressing through developmental math sequences (Bailey et al., 2010). We make
empirical contributions to research on innovations in developmental math delivery by answering
four research questions:
1) What types of alternative delivery models did LUCCD community colleges use to teach
developmental math?
2) To what extent did they align with current categorizations of alternative delivery models
(e.g., Rutschow & Schneider, 2011)?
3) To what extent did students of varying levels of academic underpreparedness have equal
access to selected alternative delivery models for developmental math?
4) To what extent did LUCCD colleges adopt alternative delivery models to teach
developmental math over time?
To answer these questions, we conducted a content analysis of publicly available course
schedules published by all nine district colleges between the fall semester of 2005 and the spring
semester of 2013 to take stock of the use and allocation of alternative delivery models for
developmental math, and to more broadly gain insight into whether the nationwide push for new
delivery models for developmental education gained traction across colleges within the LUCCD.
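As a rough illustration of what coding course-catalog descriptions for delivery models can look like, consider the keyword-flagging sketch below. The categories and keywords here are invented for the example; they are not the codebook actually used in this study:

```python
# Hypothetical keyword-based coding of catalog course-section descriptions
# into alternative-delivery-model (MOD) categories. Categories and keywords
# are illustrative assumptions, not this study's actual codebook.

MOD_KEYWORDS = {
    "supplemental": ["tutoring", "lab required", "supplemental instruction"],
    "accelerated": ["accelerated", "fast-track", "compressed"],
    "contextualized": ["contextualized", "applied math", "career-focused"],
}

def code_section(description: str) -> list[str]:
    """Return the MOD categories whose keywords appear in a description."""
    text = description.lower()
    matches = [mod for mod, words in MOD_KEYWORDS.items()
               if any(w in text for w in words)]
    return matches or ["traditional"]

sections = [
    "Elementary Algebra. Lecture, 5 hours.",
    "Pre-Algebra, fast-track section with required tutoring lab.",
]
print([code_section(s) for s in sections])
# prints [['traditional'], ['supplemental', 'accelerated']]
```

In practice each flagged section would still be reviewed by hand, since catalog language varies across colleges and years.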
We divide this paper in the following manner: First, we provide a brief description of the
traditional model used to deliver developmental math, and identify the most salient reasons why
researchers think it fails to boost achievement among academically underprepared students.
Subsequently, we present current categorizations of alternative models of delivery based on scans
of existing literature, and describe each one in detail. Second, we profile the LUCCD, providing
descriptions of its colleges, students, and math sequence. Third, we outline the methods we used to
conduct this study. Fourth, we present our results, and conclude with several explanations why the
traditional model continued to prevail as the dominant model to teach developmental math in the
LUCCD.
Alternative Models of Delivery of Developmental Math
Researchers contend that the way in which remedial education has been traditionally
delivered – through a sequence of lecture-based, stand-alone courses – stacks the cards against
student success in college in two ways. First, placement into a developmental sequence
inherently increases the amount of time and money students must spend in college (Melguizo,
Hagedorn, & Cypers, 2008), and as a result reduces their chances of meeting minimum degree
requirements (Jenkins & Cho, 2012). Second, traditional models of delivery go hand-in-hand with
traditional curricula and models of instruction, which may not adequately respond to the academic
needs and behaviors of remediated students (Grubb, 1999). Teaching in remedial math courses
often relies on skill and drill methods, and emphasizes rote procedural learning that lacks relevance
to real-world contexts (Grubb, 1999). Taken together, these findings suggest that the traditional
model may be doing a poor job of providing remediated students the supports and environment
they need to benefit from enrolling in this type of coursework.
In response to the limitations of the traditional model, states, community colleges, and
foundations have introduced a number of alternative models to deliver developmental math. Here
we call these alternative models of delivery (MODs). These delivery models vary in how they
address the flaws of the traditional model, and fall into four distinct alternative categories
(Rutschow and Schneider, 2011): (1) strategies that help students shore up skills prior to college
and thus avoid developmental education; (2) supplemental instruction, which encompasses
programs that provide additional support alongside traditional remedial sequences, such as tutoring
and advising; (3) accelerated or compressed delivery, which includes programs whose main goal is
to reduce the amount of time students spend in remediation; and (4) contextualization, which entails
the integration of basic skills with occupational or academic content. We next discuss the rationale
for each model, and provide examples of implementation in colleges across the country. We do not
discuss the first category since those interventions consist of efforts made prior to the start of
college.
Providing Supplemental Instruction and Support
Perhaps the alternative MOD that aligns most with the traditional delivery model, in that it
does not radically restructure or redesign developmental education, is supplementing traditional
coursework with tutoring, advising, and other forms of academic support. Programs and courses
utilizing these approaches provide students with extra time to address academic needs, as well as
guidance that can help facilitate their transitions to college. For example, tutoring and non-course
based instruction can improve student achievement by providing additional academic assistance,
either through tutoring or a lab component. Some suggest that these types of support also aid
students in improving their study skills and developing positive attitudes towards math (Arendale,
2002; Gourgey, 1992).
Intensive advising and student success courses. Supplemental support also comes in the
form of intensive advising, which can help more students benefit from remedial coursework. This
approach is designed to actively engage students in their college environment and help them
overcome informational barriers that may hinder their success. Staff who provide intensive advising
regularly meet one-on-one with a student throughout the semester to provide guidance on how to
register for courses, how to choose courses appropriate for their academic and career goals, and
how and where to access various campus support staff. In a departure from the traditional academic
advising model, intensive advising takes a more proactive approach to helping students overcome
information constraints and academic disengagement. Advising can also be offered systematically
through student success courses, which can help introduce first-time students to the college
environment, provide help with academic and career planning, and support students in developing
study skills and basic life skills, such as financial literacy (Cho & Karp, 2013; O’Gara, Karp, &
Hughes, 2009).
There is suggestive evidence that these types of supplemental supports help remediated
students succeed in college. Descriptive studies suggest that academically based supplemental
instruction positively benefits persistence and achievement (Arendale, 2002; Gourgey,
1992), and research examining intensive advising finds that it is positively associated with
successful completion of remediation (Bahr, 2009), passing rates of more rigorous courses (Boylan,
Bliss, & Bonham, 1997), retention rates (Rudmann, 1992), and the amount of time a student spends
studying (Cartnal & Hagen, 1999; Pfleging, 2002). There are also signals that participation in
student success courses is positively linked with improvements in learning strategies fundamental
for academic success (Boylan, 2002) and increased persistence, completion, and transfer rates (Cho
& Karp, 2013; Zeidenberg, Jenkins, & Calcagno, 2007). However, since many of the students in
these studies self-select into receiving these supplemental supports, it is difficult to say with any
certainty whether, and to what degree, such supports promote school success.
Reducing Time in Remediation
Although supplemental instruction and support seems to help some students succeed in
remediation, this reform approach fails to address the consequences of having to spend a long
amount of time in developmental education coursework. The length of developmental math
sequences, which can range from one to four courses (Bailey et al., 2010), may actually discourage
students from pursuing their academic goals, since these courses typically do not count towards a
college degree. Setting aside the problem of time, the way in which these sequences are structured
also prevents students from seamlessly progressing from one course to the next. Studies examining
student attrition in developmental education sequences show that the task of enrolling in a course
poses a significantly higher barrier to student persistence than the task of passing each course in the
sequence (Bahr, 2009, 2012; Bailey et al., 2010).[7] Removing these "exit points" by shortening or
accelerating the trajectory may make it harder for students to slip through the cracks, or to feel
that progressing through remedial education is an insurmountable and expensive undertaking
(Hern, 2012). Here, we focus this section on models that accelerate and modularize developmental
math sequences to address concerns of attrition.[8]
Acceleration and compression. Students enrolled in acceleration or compression programs
have the possibility of moving more quickly through developmental education sequences than
students placed in traditional courses. Acceleration models typically combine developmental
courses in order to reduce the number of course requirements a student has to complete prior to
enrolling in college-level courses (Hern, 2012). In a similar vein, compressed or fast-track courses
maintain the same developmental math sequence while at the same time shortening the overall time
during which the content is taught (Hodara, 2013). Often, students are required to enroll in two
compressed courses the same semester, making the sequence more seamless (Hodara, 2013).
Accelerated and compressed developmental math programs can both decrease the amount of time
spent in remediation overall while also giving students access to higher-level math earlier in their
college careers than in the traditional approach.
Findings from evaluations of programs reducing time spent in remediation are mixed, but on
the whole are positive. Students participating in the California Acceleration Project were 2.3 times
more likely to complete college-level courses and 4.5 times more likely to complete transfer-level
courses than their peers in traditional remediation (Hayward & Willett, 2014; Hern, 2012). In
another California study, students in compressed eight- or twelve-week math courses in a large
community college were more likely to complete the course, and less likely to withdraw from their
courses, compared to their counterparts in traditional remedial courses (Sheldon & Durdella,
2010). Using more rigorous causal methods, however, Boatman (2012) found that while Tennessee
students in accelerated courses completed more units, first- to second-year persistence was
negatively affected. Compression also seems to impact long-term outcomes. Students in
compression programs in the City University of New York were more likely to pass college math,
and slightly more likely to complete their degrees, yet completed about the same number of credits
as students in non-compressed courses (Hodara & Jaggars, 2014). Based on this range of results
drawn from different colleges across the country, it remains unclear what the true effects of
acceleration and compression are, and under what conditions these approaches may be most
beneficial to students in developmental education.

[7] Consider, for example, a student who is assigned to pre-algebra. That student must first enroll in pre-algebra. Then she
must pass it. Next, she must enroll in elementary algebra if she is to successfully advance in the sequence.
[8] "Mainstreaming", which essentially removes developmental education requirements completely, would also
technically reduce time in remediation. We do not discuss this practice here since our study focuses on types of course
offerings described in course catalogs. Mainstreaming is currently being adopted as a statewide practice in Connecticut
and Florida (Hu et al., 2014). Boatman (2012) provides evidence on the effects of mainstreaming in Tennessee (e.g.,
mixed effects on persistence and completion).
Modularization. Another strategy that potentially reduces the amount of time that students
spend in developmental education is modularization, which breaks semester-long developmental
education classes down into smaller competency-based units, often incorporating computer-based
instruction (Twigg, 2005). The logic underlying this approach is that developmental math students
lack specific skills that can be remediated in a series of specific content-focused modules instead of
in semester-long courses (Rutschow & Schneider, 2011). Students in modularized programs
progress through developmental education by demonstrating competence in each assigned module
before reaching college credit bearing coursework. Because this delivery model recognizes that
most students have mastered at least some content taught in semester-long courses, it essentially
allows students to move more quickly through remediation and target the skills that are most
necessary for success in college-level coursework and the labor market.
There are relatively few evaluations of modularization as an alternative MOD in
developmental math courses. The reform that has received the most attention has been Jackson
State University’s modularization of the developmental math sequence, which split developmental
math courses into twelve modules that emphasized mastery and incorporated learning technologies
(Epper & Baker, 2009). Findings from an evaluation study of Jackson State’s efforts indicated that
modularization was associated with higher passing rates, improved learning outcomes, and
increased retention and completion of developmental math (Bassett & Frost, 2010; Epper & Baker,
2009). Students in modularized courses also reported being less anxious about math (Bassett &
Frost, 2010; Epper & Baker, 2009). Using a difference-in-difference methodological approach,
Boatman (2012) found that modularized developmental math increased the number of college
credits that students earned. However, it did appear to decrease semester-to-semester and year-to-
year persistence.
Redesigning Curriculum
While the above-mentioned time-oriented reforms may reduce time spent in remediation,
the core components of teaching and learning used in those reforms generally stay the same. We
group these next reforms together since they inherently change curricular content and students’
learning experiences in developmental education courses. One main curricular redesign in
developmental math focuses on contextualization of math content, while another groups students
into learning communities or cohorts that take a sequence of developmental education courses
together to help them build constructive relations with faculty and their peers.
Contextualization. Contextualized learning ties developmental education with the student's
academic and vocational interests (Perin, 2011; Rutschow & Schneider, 2011). Contextualized
approaches in developmental math are founded on the idea that pedagogical practices and curricula
promoting reasoning skills, conceptual understanding, and real-world applications, more so than
procedural knowledge, may be key to increasing student engagement, persistence, and success in
developmental education sequences, college, and the workforce (Hiebert & Grouws, 2007; Mesa,
2011).
Within the past ten years, several contextualized developmental programs have become high-
profile among current delivery reform efforts. Statway, a program developed by the Carnegie
Foundation for the Advancement of Teaching, is perhaps one of the best known, and founded on the
principle that mastery of statistical concepts instead of algebraic ones yields more academic benefits
for students pursuing academic and career interests outside of STEM. Statway students enroll in
developmental math combined with college-level statistics, and learn statistical and data analysis
and causal reasoning (Cullinane & Treisman, 2010). Other contextualized programs embed
mathematics within applied vocational contexts. For example, the Community College of Denver’s
Certified Nurse Assistant to Licensed Practical Nurse (CNA to LPN) program, and a similar
program in Southeastern Arkansas Community College both link math competencies to healthcare
competencies (Epper & Baker, 2009; Klein-Collins & Starr, 2007). Another nationally recognized
program, Washington’s I-BEST program (Integrated Basic Education Skills and Training),
integrates basic academic skills into vocational education, like automotive repair (Zeidenberg et al.,
2010).[9]

[9] I-BEST is funded by the Washington State Board of Community and Technical Colleges.
There is limited evidence on the effectiveness of contextualized programs, but an evaluation
of the I-BEST program in Washington indicated that students in colleges with the program were
more likely to earn credits and obtain a credential relative to students in colleges without the
program (Zeidenberg et al., 2010). Studies of contextualization programs have also found evidence
of improved learning outcomes. I-BEST students demonstrated point gains on basic skills tests
(Jenkins et al., 2009) and, in another study, students in developmental math courses that
incorporated examples from allied health fields scored higher than students in traditional courses
without this contextualization (Shore, Shore, & Boggs, 2004).[10]
Learning communities. A final example of redesigned curricula is learning communities,
where students enroll in developmental education courses as cohorts, and engage in collaborative
learning experiences around a particular curriculum with a core group of professors and peers. It is
thought that through learning communities students can build long-lasting beneficial relationships
with faculty and peers since this model, which centers around community building, can help break
down barriers that prevent faculty from having meaningful interactions and relationships with
students and vice versa (Tinto, 1997). Descriptive studies examining the effectiveness of learning
communities indicate that they yield improvements in GPA (Tinto, 1997), persistence (Engstrom &
Tinto, 2008), and overall satisfaction with academic and social engagement in college (Zhao &
Kuh, 2004). However, a recent study published by MDRC exploiting random assignment to
learning communities found only modest impacts on credit attainment (half a credit), and no
evidence of any positive effects on student persistence (Visher et al., 2012; Weissmann et al.,
2011).
[10] The development of the allied health curriculum for remedial math was funded by FIPSE. Site of study was not
specified.
Use of and Access to Alternative Models of Delivery: What We Don’t Know
While there have been an increasing number of states and community colleges that have
explored and advocated for the use of these alternative approaches to deliver developmental
education, there is relatively little empirical research examining the degree to which community
colleges without explicit support to change their delivery methods have adopted these reforms. This
is a critical research focus because we suspect that community colleges may not be adopting and
sustaining these types of reforms absent state mandates or the additional financial and technical
support to do so. As this review highlights, nearly all of the evidence on the alternative models of
delivery comes from studies of “boutique” programs in single community colleges or community
college systems that have received external support. How are community colleges taking action to
reform developmental math in contexts where these supports are not readily available?
What is also missing from the literature is a description of the nature of student access to
these interventions. Rutschow and colleagues (2011) found that less than 10 percent of students
targeted by the expansive Achieving the Dream initiatives were actually affected by them. Also,
findings about how the effects of innovations vary by level of remediation (Boatman, 2012) raise
questions about the effectiveness of alternative MODs for students with different developmental
needs.[11]
Which sorts of developmental math students are more likely to have access to different
kinds of alternative models of delivery? This is a question of vertical and horizontal inequities in
access, which may exist as a consequence of community colleges’ decisions to locate alternative
MODs at various levels of the developmental math sequence. Looking at equity in access from a
horizontal perspective allows us to determine whether all developmental math students have equal
access to alternative delivery approaches, irrespective of the level of math in which they enroll
(Stone, 2001). From a vertical equity perspective, we can evaluate whether students with greater
math remediation needs have greater opportunities to access alternative delivery approaches
(Gerdtham, 1997; Alberts et al., 1997). Examining equity from both perspectives lends insight into
the relationship between the delivery of developmental math and student opportunities to improve
their success, and we do so in the context of the LUCCD, described next.

[11] Students enrolled in Developmental Algebra II at Austin Peay and Developmental Algebra I students at Cleveland
State benefitted more from redesign efforts relative to students in other levels of math.
Setting
The setting of this study is a large community college district in California, serving roughly
220,000 students each year (CCCCO Data Mart, n.d.). Here, we assigned the district the pseudonym
LUCCD. The LUCCD is comprised of nine colleges, and enrolls a diverse group of students, the
majority of whom identify as racial and ethnic minorities. Based on data from the entering fall 2012
cohort, about 53 percent of the district’s students were Latino, and about 15 percent classified as
African-American. Roughly two out of five LUCCD students reported that their native language
was not English (CCCCO Data Mart, n.d.).
Despite differences in their student populations, LUCCD colleges have similar developmental math
sequences. In eight out of nine colleges, the developmental math sequence starts with arithmetic
(four levels below transfer-level math), and is followed by pre-algebra (three levels below),
elementary algebra (two levels below), and intermediate algebra (one level below). In one college,
a course focused on basic numeracy (titled World Numbers) precedes arithmetic, and is located five
levels below the level of math that is required for transfer to a four-year university in the California
State University (CSU) or University of California (UC) systems.[12] Since the fall semester of
2009, California has required students to pass intermediate algebra to receive an associate's degree,
one math level higher than elementary algebra, the previous degree requirement. Based on our
calculations of 2012 assessment and placement data, LUCCD colleges place about 80 percent of
their students into a developmental math course upon enrollment, even though the majority of them
report having a high school degree.

[12] Examples of courses counting towards transfer-level math include statistics, math for liberal arts majors, and
pre-calculus or trigonometry.
For the most part, LUCCD colleges have received minimal or isolated support from
foundations and state-wide initiatives seeking to change the way they deliver developmental math
instruction.[13] While the LUCCD participates or has participated in initiatives seeking to improve
student success in developmental education, for example, the Achieving the Dream Initiative or
California’s Basic Skills Initiative, their efforts to improve student success within and across
colleges have been broad, and have not exclusively focused on how they deliver developmental
math. Even in the absence of pressure to change, two colleges within the district are participating in
the rollout of Statway, which was described in the previous section. Another college within the
district has developed a pilot compression model to teach combinations of algebra courses in a
single semester. In sum, LUCCD colleges – to the best of our knowledge – have participated in
very few external or foundation-supported initiatives that specifically seek to alter the way in which
they deliver developmental math. We believe this makes the LUCCD a unique context for
examining the adoption of alternative MODs and the nature of student access to these innovations.
Methods: Content and Descriptive Analyses
Content Analysis
To answer the first two research questions, we conducted a content analysis of course
schedules published by each LUCCD community college between 2005 and 2013, which were used
to inform students about the developmental math courses taught at their institution. Unlike other
qualitative data sources (e.g., interviews with administrators), course schedules can provide an
objective account of the utilization of instructional innovations since they offer brief but accurate
descriptions of each developmental math course (e.g., prerequisites, unit-worth, and MOD). Course
schedules also publish information on the number of sections offered for each developmental math
course, which can be used to determine the prevalence of different MODs, particularly over time if
examining course schedules across years. Because of these characteristics, we argue that course
schedules can be considered reliable sources of data for examining the use and adoption of
alternative models of delivery for developmental math in the LUCCD. In this study, we treated
course sections used to teach developmental math courses as our unit of analysis, and reviewed
8,909 developmental math course sections offered by LUCCD community colleges during our time
frame of analysis.

[13] Personal communication with research staff from the LUCCD.
To ensure that we captured and consistently categorized all the MODs used to teach
developmental math in the LUCCD, we adopted deductive and inductive approaches to code raw
data collected on each course section. According to Bradley, Curry, and Devers (2007), a hybrid
approach helps researchers to identify concepts already known in extant literature as well as new
ones. As a first step, we employed a deductive coding scheme by applying Rutschow and
Schneider's (2011) taxonomy of alternative MODs to guide our initial analysis of course sections
(Miles & Huberman, 1994). Drawing on a sample of roughly 500 course sections representing the
full cross-section of developmental math courses, we specifically examined whether any course
section employed an alternative MOD. To direct our categorization, we answered the following
questions:
Supplemental Instruction: Were students in this specific math course section required to
co-enroll in labs, tutoring sessions, or student success courses?
Acceleration/Compression: Were students in this specific math course section enrolled
in a program designed to accelerate student progress in the developmental math
sequence?
Contextualized Learning: Was enrollment in this specific math course section restricted
to students in particular majors or with certificate aspirations?
In our coding, we discovered that some course sections offered students MODs that did not
squarely fit under Rutschow and Schneider's (2011) taxonomy. To categorize those course sections,
we employed an inductive coding scheme that allowed us to capture new structural and
instructional innovations used to deliver developmental math in the LUCCD.
We employed this hybrid approach to code course sections in iterations to develop our final
code structure (Bradley et al., 2007). During the coding process, we met weekly to discuss our
classification of that week’s course section data to calculate and assess our inter-coder agreement.
In cases of coding disagreement, we allocated additional time to our discussion to reach consensus
over interpretation (Bradley et al., 2007). In those instances, we revised our coding scheme to
ensure future interpretations of delivery models were consistent.
After we finished our coding of developmental math course sections, we developed a final
coding structure that could identify the range of structural and instructional innovations used to
deliver developmental math in the LUCCD. To test whether our final coding scheme adequately
categorized and described the alternative delivery programs found in the LUCCD, we calculated
Cohen’s (1960) kappa statistic on a randomly selected and independently coded set of 212 course
sections. Our calculation produced a coefficient of 0.83, which suggests that we were in close
agreement in the way we coded the alternative MODs in the LUCCD and as a result achieved
sufficient coding consistency (Landis & Koch, 1977; Miles & Huberman, 1994). We applied our
final coding scheme to recode all 8,909 developmental math course sections.
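The kappa statistic reported above can be reproduced directly from two coders' independent labels. The sketch below is illustrative only: the category names and section labels are hypothetical, not our actual coding scheme or data.

```python
def cohens_kappa(coder_a, coder_b):
    """Cohen's (1960) kappa: agreement between two coders beyond chance."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: share of sections both coders labeled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    categories = set(coder_a) | set(coder_b)
    p_e = sum((coder_a.count(c) / n) * (coder_b.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical labels for six course sections coded by two raters.
a = ["traditional", "traditional", "supplemental", "accelerated", "traditional", "supplemental"]
b = ["traditional", "traditional", "supplemental", "accelerated", "supplemental", "supplemental"]
print(round(cohens_kappa(a, b), 2))  # → 0.74
```

A value above roughly 0.8, such as the 0.83 we obtained on the independently coded set, is conventionally read as "almost perfect" agreement (Landis & Koch, 1977).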
Based on our coding of course sections, we created a final dataset which contained one
record for each course section offered between 2005 and 2013; each record was linked to fields
identifying the semester-year the course was taught, the associated developmental math level, type
of delivery model (e.g., alternative versus traditional), type of alternative delivery model (e.g.,
accelerated, contextualized), and name of the alternative delivery model or the program through
which it was administered if there was one.
Descriptive Analysis
Our descriptive analysis focused on answering our last two research
questions. To examine student access to alternative MODs and determine whether vertical and
horizontal inequities in access exist, we first calculated the percent of course sections that offered
developmental math students an opportunity to enroll in an alternative course, and disaggregated
this statistic by math level. We then looked at the distribution of course sections offering alternative
MODs by math level to determine whether colleges focused their delivery reforms on
developmental education students who may need the most support to succeed in college.
Finally, we examined the distribution of course sections offering alternative MODs within each
level of the developmental math sequence to examine whether students enrolled within each
developmental level had equal access to alternative delivery approaches.
We also explored time trends to decipher whether students of different developmental math
levels experienced changes in access to alternative MODs. To investigate this question, we took
stock of general trend changes in the percent of course sections offering alternative approaches. We
also analyzed where in the developmental math sequences these changes occurred by charting
trends in the use of alternative delivery approaches by developmental math level.
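The disaggregations described in this section reduce to tabulating section counts by math level and delivery model. A minimal sketch, using made-up records (the levels, labels, and data are hypothetical):

```python
from collections import Counter

def percent_alternative_by_level(sections):
    """Percent of course sections offering an alternative MOD, by math level."""
    totals, alternative = Counter(), Counter()
    for level, model in sections:
        totals[level] += 1
        if model != "traditional":
            alternative[level] += 1
    return {lvl: round(100 * alternative[lvl] / totals[lvl], 1) for lvl in sorted(totals)}

# (math_level, delivery_model) pairs for a handful of hypothetical sections.
sections = [
    (1, "traditional"), (1, "extended"),
    (2, "traditional"), (2, "accelerated"), (2, "accelerated"), (2, "traditional"),
    (4, "traditional"), (4, "traditional"),
]
print(percent_alternative_by_level(sections))  # → {1: 50.0, 2: 50.0, 4: 0.0}
```

Charting these per-level percentages by semester-year yields the time trends discussed below.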
A Revised Typology for Developmental Math MODs
We found that most of the courses we observed largely fit into Rutschow and Schneider’s
(2011) taxonomy of developmental math models. For example, we found that community colleges
within the LUCCD often required students to concomitantly enroll in labs or student success
courses along with their developmental math courses. We also found that many colleges restricted
enrollment in certain course sections to students pursuing specific academic or career fields; for
example, one college limited enrollment in pre-algebra to students learning to be automotive technicians.
Finally, at one college, students had the opportunity to take elementary and intermediate algebra in
a single semester by enrolling in course sections offering the compression model. While these are a
few examples of alternative MODs used to provide developmental math in the district, they give a
sense of the breadth of the structural and instructional innovations that LUCCD colleges have
adopted without foundation support or a state impetus. Although mainstreaming is a nationally
recognized alternative also found within Rutschow and Schneider's (2011) taxonomy, we found
that the LUCCD employed placement policies that assigned underprepared students exclusively to
developmental math each year, suggesting that mainstreaming was not an intervention utilized in
any of the district's colleges during the period of the study.
While Rutschow and Schneider’s (2011) typology by and large captured the wide variation
in the types of alternative MODs used in the LUCCD, we also found that some LUCCD course
sections utilized alternative MODs missing from their categorization. Based on our analysis of
course section descriptions, we found that LUCCD colleges employed alternative delivery models
that fit into four broad categories. These models: 1) reduced the amount of time a student spends in
developmental math, 2) changed the curricular experiences of developmental math students, 3)
provided supplemental instructional services to students assigned to developmental math courses, or
4) lengthened the amount of time a student spent in developmental math. In Table 1 of Appendix A,
we identify these categories as: 1) Accelerated Developmental Math, 2) Curricular Redesign of
Developmental Math, 3) Supplemental Instruction for Developmental Math, and 4) Extended
Traditional Developmental Math.
Next, we identify the two types of alternative MODs that did not fit into the existing
taxonomy. These included MODs that extended traditional one-semester courses and those that
combined components of alternative MODs into a specific program.
The Extended Traditional Model
Contrary to alternative MODs that decrease the amount of time students spend in
developmental math education (e.g., acceleration and compression models), we found that seven of
the nine LUCCD colleges utilized a delivery model which actually increased the number of
semesters of developmental math a student takes. In our typology, we used the “Extended
Traditional” label to tag course sections which lengthened the traditional one-semester courses into
two- or three-semester-long courses. These courses required students to enroll in and pass two
back-to-back semester-long classes to successfully complete elementary algebra (two levels below
transfer-level math) or intermediate algebra (one level below transfer-level math). Based on
informal conversations with math faculty, math departments developed the Extended Traditional
model to ensure that less academically prepared students have sufficient time to master course
material. While the logic underlying the Extended Traditional model makes intuitive sense, its
failure to factor in student exit behaviors raises questions about its effectiveness to boost
achievement. As research shows, increasing the number of points where a student can exit a
developmental education sequence is correlated with the failure to progress through the sequence
(Bahr, 2012). Thus, while this alternative MOD benefits learners needing additional time, it also
may disadvantage students who have external commitments, resource constraints, or are unsure
about whether college is right for them.
The Combined Alternative Delivery Model
In contrast to previous literature, which discusses alternative delivery models as individual
and independently delivered (Bettinger et al., 2013; Rutschow & Schneider, 2011), we found that
LUCCD colleges offer developmental math students opportunities to enroll in course sections that
combined components of different alternative models of delivery. This alternative MOD is labeled
“Combined Delivery Model” in our typology. In Table 2 of Appendix A, we provide a snapshot of
what these programs look like, and the types of alternative learning opportunities they offer their
enrollees. Based on a review of these programs, we noticed that most grouped students into cohorts
and offered them chances to learn math in ways relevant to their career and academic pursuits;
almost all provided developmental math students supplemental instruction in the form of
tutoring/lab, advising or student success courses (e.g., ASAP, Adelante, PACE). However, few
combined alternative MOD programs decreased the amount of time needed to progress through the
traditional math sequence.[14] The use of combined models of delivery in the LUCCD is a
testament to the creativity and innovation of some college faculty and staff who are crafting
supportive programs to meet a range of student needs. The practice also highlights the difficulty of
disentangling the effects of specific interventions and the reality that no one developmental math
innovation is a silver bullet solution.

[14] One exception is Statway, which was offered at one campus beginning in the fall 2011 semester (and more recently
at another college within the LUCCD), and represented slightly over three percent of the total course sections offered
per semester at this campus. Because Statway has been one of the more popular alternative models of delivery across the
nation we include it in Table 2; however, these course sections were not included in our analysis since they required
students to also enroll in a transfer-level math course.
Use of and Access to Alternative Models to Teach Developmental Math
Low Take Up of Alternative MODs for Developmental Math
Our overall findings from the descriptive analysis suggest that LUCCD colleges by and
large used the traditional delivery model to teach developmental math, independent of math level.
Figure 1 of Appendix A shows that LUCCD colleges offered the majority of developmental math
course sections (69 percent) in the traditional format. Although less than a third of course sections
employed an alternative delivery approach, close to half of these course sections offered students
supplemental instruction (45 percent) – an innovation most aligned with the traditional delivery
model. Six percent accelerated a student’s progress in developmental math, and approximately one
percent contextualized math instruction (see Figure 1, Appendix A). These findings suggest that the
more creative innovations in delivery have not taken root in all LUCCD math departments.
Students with the Greatest Remedial Needs are Not Receiving the Most Support
Disaggregating the data by math level, we find that the traditional format prevailed as the
dominant MOD across all developmental math levels (Figure 2, Appendix A). However, we found
slight variations in the use of alternative models to teach different developmental math levels. For
example, less than ten percent of course sections used to teach World Numbers – the lowest level of
the math sequence – were offered via an alternative MOD; 22 percent of courses at four levels
below were alternative MODs. At the other end of the sequence, 20 percent of sections at one level
below were alternative MODs. In the LUCCD, alternative MODs were more likely to be used in the
middle levels of the math sequence (~40%). This finding suggests that students at either end of the
developmental sequence (i.e., four and five levels below and one level below college-level math)
experienced fewer built-in opportunities to encounter new ways of structuring and teaching
developmental math that may be more effective at addressing their needs. The reason for
concentrating innovations in developmental math in the middle levels of the math sequence might
stem from the fact that the majority of students place, and subsequently, enroll into these two levels.
Roughly 60 percent of all students assigned to developmental math enrolled into these two math
levels based on 2013 district enrollment data. Despite the ostensible logic of locating the bulk of
innovations in the middle levels of the developmental math sequence, what is clear is that students
deemed
the least college-ready had less access to more innovative, more customized structural and teaching
approaches relative to those deemed more college-ready.
Disaggregating this analysis by type of alternative MOD, we also found that no single
alternative MOD dominated in the lowest levels of the developmental math sequence (Table 3,
Appendix A). In fact, the majority of course sections offering the combined delivery model, the
extended delivery model, and the curricular redesign model were located in the upper two levels of
the developmental math sequence. Additionally, over 50 percent of course sections offering
supplemental instruction were located three levels below transfer-level math. These results show
that not only did the most academically underprepared students lack overall access to alternative
MODs, but they also lacked access to particular alternative MODs. These students, unlike those in
the upper levels of the math sequence, are more likely to face a wider array of unique academic
and non-academic challenges, in part because they are assigned to longer series of math courses,
some of which do not count towards a college degree. As a result, it may be more difficult for
colleges to retain these students without greater investment in interventions that can reduce the
influence of these challenges.
Students Do Not Have Equal Access to Different Types of Delivery Innovations
In addition to understanding where colleges located alternative approaches to delivering
developmental math, we were also interested in examining how LUCCD colleges distributed
alternative MODs within each developmental math level (Table 4, Appendix A). Doing so gave us
insight into whether students of differing levels of academic preparation have a variety of
alternative delivery options to choose from when they enroll in math. From our analysis, we found
that supplemental instruction prevailed as the main alternative delivery model for course sections
taught in the three lowest levels of the math sequence. By contrast, the extended traditional format
was the most prevalent alternative MOD used to teach the two highest levels of the math sequence
– 55 percent of course sections two levels below college-level math and 62 percent of course
sections one level below. Nearly a fifth of course sections one level below transfer-level math
allowed students to enroll in programs that combined multiple alternative delivery models
compared to just three percent of course sections offering the same type of support four levels
below transfer-level math. This analysis makes it clear that students enrolled at each level of the
developmental math sequence had limited opportunities to access the diverse array of alternative
MODs in use.
There are a number of hypotheses explaining why colleges locate specific alternative MODs
at different levels of the developmental math sequence. One is the belief that students in lower
levels of the sequence will benefit more from additional instruction while students in the higher
levels may benefit more from additional time to understand the content. Another is that students
placed into higher levels of the sequence are more likely to be enrolled full-time and thus able to
devote additional semesters of coursework to mastering content. As a last example, adjunct faculty
are more likely to teach lower levels of the math sequence and may not have the time or resources
to learn how to teach a developmental course section delivered as an alternative MOD.
Trends in the Use of Alternative MODs
Next, we examined whether adoption of alternative MODs gained or lost traction over time,
and whether adoption trends varied by type of alternative MOD. To analyze the usage and adoption
patterns of alternative MODs, we focused on the 2005 to 2012 fall-to-fall semesters in order to
account for variation in fall-to-spring course offerings.
Increased Adoption of Alternative MODs over Time, Sharp Decrease in 2010
In Figure 3a in Appendix A, we plot the percent of course sections LUCCD colleges offered
as alternative MODs over time. Our figure illustrates that while there was a decrease in the usage of
alternative MODs from 2010 to 2011, overall, the adoption of alternative MODs has increased over
time. The LUCCD may have become more responsive to mounting evidence of low success rates
among remediated students, and consequently more open to introducing innovations in
developmental math. The dip beginning in 2010 may reflect budget reductions that
affected course offerings state- and district-wide (PPIC, 2013). It also may be linked to the fact that
around this time the California Community College Chancellor’s Office started to require
community colleges across the state to disclose the number of students who were actually
participating in non-course based instruction (e.g. tutoring sessions and labs), since colleges
received additional funding for providing these services. Because some LUCCD colleges found the
reporting requirements onerous, they abandoned offering these services as supplemental instruction.
Many absorbed these services into classroom instruction, or provided them as entirely separate
non-credit courses (based on conversations with LUCCD institutional researchers).
No Major Changes in the Use of Alternative MODs, Slight Uptick in Accelerated Models
Figure 3b in Appendix A disaggregates adoption trends by alternative MOD to determine
whether certain types of alternative MODs gained or lost traction over time. Overall, we found that
supplemental instruction and the extended traditional MODs were the most frequently employed
over time. We also observed that from 2010 to 2011, the share of course sections offering
supplemental instruction decreased markedly (from 54 to 37 percent) and continued to decline
through 2012 (23 percent). This decline after 2010 coincides with an uptick in alternative MODs
that extend the amount of time spent in developmental math (40 to 47 percent), reduce that time
(2 to 13 percent), or combine multiple MODs (4 to 16 percent).
Increased Use of Accelerated MODs in Lower Levels, Combined Approach in Highest Level
Our final analysis aimed at identifying adoption trends of alternative MODs by
developmental math level to understand the extent to which students’ access to alternative MODs
varied over time. In the following figures, we only include the two most common alternative MODs
utilized within each developmental math level, and plot the percent of traditional course sections as
a reference point. We also only present figures for four, three, two, and one level below college-level
math, since close to 99 percent of course sections offered five levels below transfer-level math
were taught using the traditional model (the exception was spring 2007, when the college with a
five-level-below course introduced sections with a supplemental instruction component).
Analysis presented in Figure 4 in Appendix A shows that though the traditional model was
the most common MOD used over time and across developmental math levels, the adoption
patterns of alternative MODs varied by math level. Starting in 2005, LUCCD colleges began to
offer students enrolled in the two lowest levels of developmental math (i.e. arithmetic and pre-
algebra) an increasing percentage of course sections with supplemental instruction; yet, in 2009,
this trend started to reverse itself, and fewer students enrolled in these courses had access to
tutoring or student success courses. Aside from some variation in the percent of courses offered
with supplemental instruction, the most interesting trend is that in recent years, colleges introduced
accelerated models in the course sections at these levels. Acceleration models accounted for none of
the course sections at four levels below in fall 2010 but jumped to nearly 10 percent in fall 2012.
Similarly, at three levels below, more course sections were offered via accelerated models in recent
years (2 percent in fall 2011 to 10 percent in fall 2012). These upticks imply that students at these
lower math levels had more access to accelerated courses than students in previous years.
In contrast to the alternative MODs used at the lowest levels of the sequence, there has been a
consistent trend toward delivering the extended traditional model to students enrolled in the more
rigorous developmental math courses. However, the percent of intermediate algebra course sections
offering a combination of alternative delivery models increased by ten percentage points. It is possible that
these more innovative interventions that utilize a combination of strategies are aimed at helping
those most likely to persist and succeed in college.
Discussion
Evidence from this study suggests community colleges without external pressure or with
limited support may face significant barriers impeding the use and adoption of structural or
instructional innovations to deliver developmental math. We find that within the LUCCD the
traditional model continues to dominate among approaches used to teach developmental education,
and that nearly half of the innovations identified offered students supplemental instruction, an
alternative most closely aligned with the traditional MOD. We also find that developmental math
students have unequal access to new innovations. Compared with students enrolled in
the lower levels of the developmental math sequence, students enrolled in levels nearer to transfer-
level math were advantaged not only because they could access a greater number of course sections
offering an alternative approach but also because they could access a wider range of alternative
approaches. Looking over time, while we do find that there has been movement to adopt alternative
delivery approaches, this shift has been incremental at best.
There are several possible explanations for these findings, some of which have come from
conversations we had with LUCCD administrators and institutional researchers. For one,
implementing some of these MODs necessitates substantial resources and concerted efforts. For
example, launching a contextualization program such as one with an allied health focus would
require that math faculty work with industry partners to create relevant curriculum. Indeed, the
differences we observe by college in the offering of contextualized programs may be linked to each
college’s degree of partnership with local industries.
Second, regulations governing how community colleges deliver curriculum and course
content in developmental math are not flexible enough to allow innovation to happen quickly. In
California, faculty interested in re-engineering a course must enter a long and potentially costly
process to receive approval from several regulatory bodies located at the state, district, and
institutional levels. According to institutional researchers from the district, the time it takes
faculty to receive the go-ahead can run well over a year, which dissuades some faculty from even
considering changing how they structure and teach their developmental math courses. It also
encourages faculty to change their teaching without notifying officials. Further, because the State of
California requires community colleges to document in detail the number of students attending
non-course-based instruction, the provision of such services – at least as weekly supplemental
instruction tied to a developmental math course – appears to have stalled.
policies and requirements, while aimed at promoting efficiency, may also be inadvertently curbing
the introduction of reforms that may be beneficial to students. It is therefore important to examine
the ways in which governance processes can promote and help sustain innovative practices.
Third, organizational practices may deter innovation from occurring – particularly in the
lower levels of developmental math. One concern is the assignment of faculty to developmental
math courses. Gerlaugh, Thompson, Boylan, and Davis (2007) estimated that only about 20 percent
of developmental math courses nationally were taught by full-time faculty. In conversations with
LUCCD institutional researchers, we learned that tenure-track and tenured faculty often lay claim to
upper-level developmental or transfer math courses, leaving part-time, non-tenure-track faculty to
teach lower-level developmental education courses. The concern here is not necessarily that
adjunct faculty are teaching these courses, but that they may lack opportunities and resources to
implement reformed approaches that may benefit developmental education students.
Related to this is a fourth explanation that the California community college system or
colleges themselves have not cultivated an environment that supports faculty and administrators in
thinking differently about how they deliver developmental math (Bickerstaff, Lontz, Cormier, &
Xu, 2014). Research shows that organizational change is often driven by leadership invested in
innovation (Brewer & Tierney, 2011; Kezar & Eckel, 2002), and faculty who have a clear
understanding of how their decisions, actions, and practices affect the success of their students,
particularly ones who have been historically underserved (Bensimon, 2007). A clear example of the
influence of leadership on the uptake of innovation is two LUCCD colleges' decision to pioneer
Statway: their core faculty believe in the Carnegie Foundation for the Advancement of Teaching's
vision of combining algebraic and statistical concepts into a two-semester course that helps students
succeed in math and transfer to a four-year institution. Nevertheless, it seems these efforts are only
beginning to take root because the bulk of course sections at both institutions remain focused on
using traditional ways of structuring and teaching developmental education. Despite the fact that
LUCCD colleges are only inching toward using more creative innovations to help students succeed
in developmental math, it seems as though some district colleges do recognize that their students
need a modified version of the traditional model. At one institution, the majority of course sections
in developmental math offered supplemental instruction, while another offered the majority of its
course sections as extended courses. Whether these delivery models are effective is unclear, but we
believe that their implementation is done in good faith by faculty who are interested in the academic
welfare of their students.
Finally, researchers and decision makers are uncertain about the academic and career
benefits these innovations yield because research on the whole has been inconclusive about their
impacts on success. As Karp and Fletcher (2012) mention, enacting reforms requires colleges to
invest significant amounts of financial and staff resources, and in lean economic times, faculty and
administrators must carefully select those that can generate the biggest bang for the buck. The fact
that many innovations are actually often a host of reforms bundled up in one program only
compounds the problem of selecting the most efficacious program. To be able to disentangle
effective from ineffective programs and practices, researchers ought to continue to quantitatively
and qualitatively analyze alternative MODs and, importantly, distill research findings into policy-
and practice-relevant information that colleges can more readily consume and apply. Future
research is also needed to more clearly identify the advantages and disadvantages of alternative
models absent from current taxonomies but currently being used by community colleges – such as
the extended traditional model implemented in the LUCCD.
While there are no easy answers to solving the problem of low success rates in
developmental education, injecting innovation in the way community colleges deliver
developmental math has the potential to make developmental education more responsive to the
needs and goals of students deemed underprepared for college. Colleges – especially in the early
stages of adoption – will require a steady stream of funding and technical support to ensure that
innovations can be implemented. While money and technical assistance are important, if we are
invested in helping colleges in the long term draw on new and innovative ways of thinking to
improve delivery of developmental education, we need to create feedback loops that allow us to
determine what works and what does not, build institutional knowledge and capacity, and develop a
culture that is more open to risk-taking.
CHAPTER THREE
GIVING COMMUNITY COLLEGE STUDENTS VOICE: THE EFFECTS OF MATHEMATICS
SELF-PLACEMENT ON ACADEMIC ACHIEVEMENT
The economic crisis has dramatically heightened concerns among policymakers about the
capacity of community colleges to improve graduation rates among the increasing proportion of
students who enter their campuses underprepared for college-level coursework (The Century
Foundation, 2013).
At the heart of these concerns is developmental education, an area of education explicitly
designed and principally used by community colleges to develop or reinforce English, math, and
reading skills for incoming students with weak academic preparation. While estimates indicate that
over 60 percent of entering community college students are placed into at least one developmental
education course (NCPPHE & SREB, 2010), less than half actually finish their developmental
education sequence, and far fewer pass an entry-level college course that counts toward transfer to a
four-year institution within three years of initial enrollment (Bailey, Jeong, & Cho, 2010). Taken
together, these statistics underscore that the Obama administration may fall short of meeting the
goals set forth in the 2009 American Graduation Initiative if community colleges are unable to
solve the problems facing developmental education. (In 2009, the Obama administration launched
The American Graduation Initiative, which seeks to strengthen community colleges by creating a
center specifically dedicated to researching initiatives aimed at improving college success, and by
providing financial and technical assistance. For more information, visit:
http://www.whitehouse.gov/the_press_office/Excerpts-of-the-Presidents-remarks-in-Warren-Michigan-and-fact-sheet-on-the-American-Graduation-Initiative)
An area of developmental education that policymakers and researchers have identified as
needing significant overhaul is the placement regime that determines who is ready for college-level
coursework (Burdman, 2012; Scott-Clayton, 2012). While the vast majority of community colleges
use placement tests (e.g., ACCUPLACER and COMPASS) to assign students to developmental
courses (Primary Research Group, 2008), there is increasing evidence suggesting that these
placement tests misdiagnose a student's need for developmental education. Researchers from
Columbia University find that over 25 percent of community college students who take a placement
test are incorrectly assigned to developmental education (Scott-Clayton, 2012), and that
thousands of remediated students would have received an A or B in college-level courses based on
their high school performance (Hughes & Scott-Clayton, 2011; Scott-Clayton & Rodriguez, 2012).
To increase placement accuracy, some reformers advocate giving students greater voice in
determining their level of college readiness (Royer & Gilles, 1998). This approach – coined
directed self-placement (DSP) by Royer and Gilles (1998) – asks a student, with some assistance
from faculty and administrators, to decide whether, and to what extent, they need developmental
education to succeed in college. Support for DSP stems principally from elements of self-
determination and student engagement theories, which assert that granting students latitude over
their education promotes academic engagement, which in turn is positively related to academic
success (Astin, 1985; Ryan & Deci, 2000).
Although a handful of community colleges in California and Oregon currently use DSP to
assign students to developmental education (Burdman, 2012), this approach nevertheless is gaining
popularity among policymakers and reformers seeking to improve the effectiveness of
developmental education. In 2013, Florida and Connecticut passed legislation giving students who
would have been assigned to a developmental course based on their placement score the option to
mainstream into college-level courses if they feel they can succeed (Fain, 2013). DSP has also been
mentioned as an alternative to current placement regimes by Achieving the Dream and Jobs for the
Future, two nation-wide initiatives dedicated to improving college readiness and career
advancement among disadvantaged students (Burdman, 2012).
Despite the potential of DSP to improve placement for developmental education, we
currently lack causal evidence on the effectiveness of DSP in improving student success in college.
I make original empirical contributions to the field of DSP and developmental education by
answering this research question: What is the impact on scholastic achievement of DSP, relative to
a test-based placement regime, for assigning students to developmental math? To answer this
question, I exploit a community college's failure to renew its COMPASS license with the ACT as a
means of addressing selection bias in determining whether a self-placement regime adds value over
a test-based placement regime on student outcomes. For this study, I specifically focus on math because a
larger proportion of students place into developmental math than developmental English (Bailey,
Jeong & Cho, 2010), and DSP literature on math is scant. This empirical piece adds to the
burgeoning corpus of research dedicated to not only identifying more effective and efficient
interventions for improving success among underprepared college students, but also understanding
the conditions that stifle or promote the successful reform of developmental education (Bailey,
Jeong, & Cho, 2010; Fong, Melguizo, & Prather, 2013; Hodara & Jaggars, 2014; Jaggars, Hodara,
Cho, & Xu, 2014; Kosiewicz, Ngo, & Fong, 2014; Melguizo, Kosiewicz, Prather, & Bos, 2014;
Ngo & Melguizo, 2015; Rustchow & Schneider, 2010; Scott-Clayton, Crosta, & Belfield, 2014;
Scrivener, Weiss, Ratledge, Rudd, Sommo, & Fresques, 2015; Venezia, Bracco, & Nodine, 2010).
This study seeks to provide decision makers and researchers clearer insight into the
effectiveness and limitations of interventions that leverage student autonomy to
improve academic achievement. This study has particular salience for the state of California, which
is now in the process of reforming how community college students are assessed for developmental
education. In 2011, the state legislature passed Assembly Bill 743, which requires the California
Community College Chancellor's Office to develop a statewide placement instrument that redefines
the way community colleges identify students who require additional academic support to succeed
in college. Because of budget shortfalls, the State has stalled its design and implementation
in college. Because of budget shortfalls, the State has stalled its design and implementation
(Burdman, 2012), giving researchers and community colleges the unique opportunity to more
thoroughly inform state policy by testing alternative placement approaches aimed at improving
success in college. As a result, this study is well-positioned to inform if and how DSP for
developmental math should be incorporated in placement policies in California, a state with the
largest community college system in the nation.
A Brief Overview of Remediation Assessment and Placement Policies in Community Colleges
Although a number of assessment and placement instruments have been developed over the
years to gauge a student's level of college readiness, placement tests are the most commonly used.
A 2003 estimate showed that over 90 percent of community colleges tested students for remediation
this way (Parsad, Lewis, & Green, 2003), and that among these institutions roughly 62 percent used
the ACCUPLACER and 46 percent used the COMPASS (Primary Research Group, 2008),
computer-adaptive tests developed by the College Board and the ACT, respectively. Less frequently
used measures include diagnostic placement tests, such as the Mathematics Diagnostic Testing
Project (MDTP), jointly developed by the University of California and California State University
systems (NCPPHE, 2008), and DSP (Burdman, 2012).
There are a number of practical advantages to using commercially available placement
tests to assess college readiness. For one, they have been tested for validity and reliability,
which can help cash-strapped community colleges save the resources needed to prove the soundness
of their placement instruments to governing bodies. In addition, some locally developed instruments
are aligned with state standards, which helps ensure that students are tested on content covered in
high school. The MDTP, which can be administered on paper or by computer, measures a student's
academic strengths and weaknesses against the California Mathematics Standards from grade 4
through high school (MDTP, 2000). Finally, commercially available placement tests – notably the
ACCUPLACER and the COMPASS – allow community college administrators to test students
efficiently, since they can refer students to multiple subtests and produce a final placement within a
single testing session (Melguizo, Kosiewicz, Prather, & Bos, 2014).
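To make concrete how a test-based regime converts a score into a placement, the sketch below shows a generic cut-score rule of the kind such tests feed into. The cut scores and level labels here are invented for illustration; they are not the actual values used by any LUCCD college or by the ACCUPLACER or COMPASS.

```python
# Hypothetical cut-score placement rule: scores below a cut place into that
# level; a score at or above every cut places into transfer-level math.
CUTS = [
    (30, "arithmetic"),
    (45, "pre-algebra"),
    (60, "elementary algebra"),
    (75, "intermediate algebra"),
]

def place(score):
    """Map a placement-test score to a course level via ascending cut scores."""
    for cut, level in CUTS:
        if score < cut:
            return level
    return "transfer-level math"

print(place(20))  # -> arithmetic
print(place(50))  # -> elementary algebra
print(place(80))  # -> transfer-level math
```

Under a DSP regime, by contrast, no such deterministic mapping exists; the student's guided self-assessment replaces the cut-score lookup.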
Nevertheless, recent research casts doubt on the ability of placement tests to
accurately assign students to developmental and college-level math courses. Evidence
from work conducted by Hughes and Scott-Clayton (2011) suggests that some placement
tests may be more effective in predicting success for higher-performing students than for
lower-performing students. In subsequent work, Scott-Clayton, Crosta, and Belfield (2014) found that
nearly a quarter of students are being misplaced into developmental education as a consequence of
the use of placement tests.
There are a number of reasons why test-based placement policies may fail to discriminate
with precision which students will or will not succeed in a course. Boylan, Bonham, and White
(1999) state that placement tests typically do not consider affective or non-cognitive characteristics,
like motivation, when assigning students to developmental education. Research has long
demonstrated that a student’s success in college is determined by multiple factors besides a
student’s cognitive abilities (Kuh, Kinzie, Schuh, Whitt, & Associates, 2010). Recognizing this
evidence, California and Texas require community colleges to factor in measures besides a student’s
test score in making placement decisions. These measures can include past educational experience,
college plans, or student motivation (CCCCO, 2011). Other states, like Florida and Connecticut,
have incorporated elements of DSP – also known as Informed Student Self-Assessment or Informed
Self-Placement – into their statewide placement policy legislation. In both states, students have the
option to mainstream into college-level courses if they feel capable of succeeding, so long as they
receive additional supports (Fain, 2013).
In the following section, I summarize the evidence on the effectiveness of Directed Self-
Placement on improving student success to justify this study.
The Efficacy of Increasing Voice on Improving Student Success: What Do We Know?
While research investigating increasing student voice in educational decision-making in K-
12 institutions is growing (Cook-Sather, 2006; Levin, 2000; Mitra, 2003; Rudduck & Flutter, 2000),
parallel work in higher education is markedly underdeveloped (Seale, 2010). DSP research on
developmental math and community colleges is particularly scarce since the bulk of this work has
centered on developmental writing within the context of four-year institutions.
Support for the use of a self-placement strategy to assign students to developmental or
college-level courses stems principally from two psychological theories, self-determination theory
(Deci & Ryan, 1995) and student engagement theory (Astin, 1985), which together suggest that
engagement in learning is driven in part by a human's inherent need to feel autonomous (Deci &
Ryan, 1995). Thus, giving students latitude over important decisions that affect their educational
trajectory may empower them to take control over their success in college, persuading them to
become more engaged and invested in college.
Although the linkages between self-determination, engagement, and success make intuitive
sense, to date, few studies have rigorously evaluated the impacts and benefits of DSP on student
success in college. On the whole, this research has produced inconclusive, if not narrow, findings
(Chernekoff, 2003; Felder, Finney, & Kirst, 2007; Luna, 2003; Royer & Gilles, 1998, 2003;
Thompkins, 2003); the uncertainty about its benefits stems partly from several methodological
shortcomings in quantitative research. Quantitative analyses of DSP for developmental education
are descriptive, and do not systematically compare the outcomes of students exposed to DSP to
those placed under other assignment programs. These studies fail to control for the myriad
differences (e.g., academic backgrounds, engagement, motivation) that exist between students who
are and are not allowed to self-place into developmental education. Further, most of these analyses
use data from pilot DSP interventions, and thus lack statistical power to detect DSP’s true impacts
because of small sample sizes. Finally, these studies tend to focus on short-term achievement
outcomes (e.g., passing the placement or subsequent course), limiting what we know about DSP’s
effectiveness to increase a student’s chances of reaching critical academic milestones (e.g.,
satisfying degree or transfer requirements) considered crucial for graduation or upward transfer.
Taken together, these findings are difficult to interpret and may not provide an accurate depiction of
the causal impacts of DSP on a wide range of achievement measures. To address the evident
weaknesses of extant evaluations of DSP, this study employs an estimation approach that by
definition adjusts for unobservable and observable differences between students who did and did
not receive the opportunity to self-place into the developmental math sequence.
Natural Experiment: Exogenous Change of Placement Regime at College X
After the spring of 2008, a college (henceforth 'College X') in a large urban community
college district in Southern California (henceforth 'LUCCD') failed to renew its COMPASS test
license with the ACT in time, and thus was forced to allow students to determine their own math
placement level for the 2008 summer and fall semesters. Students at College X were told by
assessment staff during that time that they could take the math course that best matched their
abilities and skill set after reading through a self-placement guide and talking with a counselor.
Because of this abrupt change in placement regimes, the event can be considered a natural
experiment, capable of generating causal claims about the benefits of employing DSP relative to the
status quo on scholastic achievement (Angrist & Pischke, 2009).
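Conceptually, the simplest estimate this design supports is a comparison of mean outcomes between the self-placed cohort (summer/fall 2008) and test-placed cohorts. The sketch below is illustrative only: the pass/fail outcomes are made up, the variable names are hypothetical, and the naive two-sample standard error stands in for the dissertation's actual estimation strategy, which must also adjust for differences across cohorts.

```python
from math import sqrt
from statistics import mean, variance

def cohort_difference(dsp_outcomes, test_outcomes):
    """Difference in mean outcomes (DSP cohort minus test-placed cohort),
    with a naive unpooled standard error."""
    diff = mean(dsp_outcomes) - mean(test_outcomes)
    se = sqrt(variance(dsp_outcomes) / len(dsp_outcomes)
              + variance(test_outcomes) / len(test_outcomes))
    return diff, se

# Hypothetical indicators: 1 = passed the first math course, 0 = did not.
dsp_cohort = [1, 1, 0, 1, 0, 1, 1, 0]    # self-placed students
test_cohort = [1, 0, 0, 1, 0, 0, 1, 0]   # COMPASS-placed students

diff, se = cohort_difference(dsp_cohort, test_cohort)
print(f"estimated effect: {diff:+.3f} (SE {se:.3f})")
```

The key identifying assumption, as in any cohort-comparison natural experiment, is that students arriving just before and just after the license lapse are otherwise comparable.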
Before the placement policy change, College X employed COMPASS – a computer-adaptive
placement test – to assign students to math courses. Students who test-placed into the math
sequence experienced the assessment and placement process in a detached fashion (Melguizo,
Kosiewicz, Prather, & Bos, 2014). Typically – in this order – students entered the assessment
office, filled out requisite paperwork, took a computer-adaptive test, and almost instantaneously
received their placement results after finishing. Students who self-placed into math experienced the
assessment and placement process entirely differently. Self-placed students entered the assessment
office, and were directed to a room where they were given a self-placement guide, which contained
course descriptions, pre-requisite problems for each math level, and questions about their previous
math achievement and course-taking in high school. Self-placing students were required to read
through this guide and subsequently talk with a counselor before choosing a math class that they
thought best matched their cognitive and non-cognitive skill sets. Students interested in taking
transfer-level math courses above entry-level were told to provide evidence of passing the A.P.
Calculus or A.P. Statistics test with a score of three or higher. College X reverted to using
COMPASS to assign students to developmental math in the spring of 2009.
College X annually serves approximately 10,000 full-time equivalent students.[22] Although College X attracts an ethnically diverse student population, over 45 percent of students identify as Latino, based on 2008 data obtained from the LUCCD. The overwhelming majority of students who attended College X were high school graduates,[23] but approximately 60 percent of those assessed for the first time in math were assigned to a developmental math course during the 2008 school year.

[22] The estimate is based on the total number of students enrolled in College X during the 2009-10 academic year. A full-time equivalent student (FTES) is defined as a student enrolled in more than 12 credit hours. FTES counts combine full-time and part-time students: each full-time student counts as 1.0 FTES, and each part-time student counts as 0.5 FTES.
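The FTES arithmetic described in footnote 22 can be sketched as follows; the function name and the enrollment counts are illustrative assumptions, not College X's actual headcounts.

```python
def fte_students(n_full_time, n_part_time):
    """Full-time equivalent students (FTES): each full-time student counts
    as 1.0 FTES and each part-time student as 0.5 FTES, per footnote 22."""
    return 1.0 * n_full_time + 0.5 * n_part_time

# Hypothetical headcounts that yield roughly College X's scale:
print(fte_students(7000, 6000))  # 10000.0
```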
Setting
The LUCCD is one of the largest community college districts in California and in the country, serving roughly 220,000 students each year (CCCCO Data Mart, n.d.). It comprises
nine colleges that enroll a diverse group of students, the majority of whom identify as racial and
ethnic minorities. Based on data from the entering fall 2008 cohort, most of the district’s students identified as Latino, and approximately 15 percent were African-American.
Colleges belonging to the LUCCD have similar developmental math sequences. In eight out
of nine colleges, the developmental math sequence starts with arithmetic (four levels below
transfer-level math), and is followed by pre-algebra (three levels below), elementary algebra (two
levels below), and intermediate algebra (one level below). In one college, a course focused on
basic numeracy (named World Numbers) precedes arithmetic and is located five levels below the
level of math that is required for transfer to a four-year university in the California State University
(CSU) or University of California (UC) systems.[24] Before 2009, California required students to pass
elementary algebra to receive an associate’s degree. Now, they must pass intermediate algebra, one
math level higher than elementary algebra, to meet the minimum math requirement for an AA
degree. For the past 10 years, LUCCD colleges have placed about 80 percent of their students into a
developmental math course upon enrollment, even though most are high school graduates.
[23] California state policy does not require community college students to be high school graduates.
[24] Examples of courses counting towards transfer-level math include statistics, math for liberal arts majors, and pre-calculus or trigonometry.
For this study, College X is considered the treatment campus, and the remaining eight
community colleges within the district are considered the control colleges. Later in the paper, I present evidence showing that the colleges that did not receive treatment serve as proper controls.
Data Sources
This study employs a rich set of repeated cross-sectional data obtained from the LUCCD. I draw on roughly 30,000 administrative student-level records from all nine LUCCD community
colleges that link a student’s demographic data, course placement, and enrollment and performance
histories, and the college where they placed and enrolled. Because of the way the data are
structured, I am able to track a student’s enrollment and performance patterns across time for
cohorts assessed for and enrolled in math for each semester between the summer of 2005 and the
summer of 2009. Enrollment and performance data extend from the summer of 2005 to the summer
semester of 2013. For this study, I restrict the analysis to first-time college students because many
states require students to assess for developmental education before taking any course in college.
For example, California just recently passed the Student Success Act, state-wide legislation
obligating all community colleges to assess and place incoming community college students into
either developmental or college-level courses before they matriculate in college. In Texas, students
are also assigned to developmental education based on their pre-college test scores. Any shifts
away from using placement tests to assess college-readiness will likely affect only incoming
community college students, and this study aims to provide empirical evidence that speaks to how
this population would respond to a change that would give them greater latitude over decisions that
assess their degree of preparedness for college.
It should be noted that the data collected on students allowed to self-place are limited relative to the data on students who test-placed and enrolled in math.[25] Based on conversations with district
officials, College X did not collect data on the total number of students who were given the
opportunity to self-place into the math sequence. Instead, it collected data on self-placers who were
given the opportunity to self-place and actually enrolled in the math sequence. Therefore,
measurement of the impacts of self-placement versus test-placement on success is restricted to
students who assessed and enrolled in the math sequence. Of the approximately 760 students who self-placed and enrolled in the math sequence in the summer and fall of 2008 at College X, 386 were first-time community college students.[26] A total of 29,515 first-time community college students test-placed
and enrolled into math between the summer of 2005 and the summer of 2009 (see Table 1 for
sample size distributions across treated and control institutions in Appendix B). To minimize bias,
the sample is restricted to students who assessed and enrolled in math the same semester.
A Difference-in-Difference Estimation Strategy
This study employs a difference-in-difference estimation strategy to identify the average
treatment effects of a self-placement model relative to a test-based placement model on proximal
and distal achievement outcomes for students who assessed for and enrolled in math[27] between the
summer of 2005 and the summer of 2009. Here, the treatment effect estimate measures the
combined impacts of the opportunity to place into any class constituting the math sequence as well
as the self-determined course placement on academic outcomes[28] for first-time community college
[25] I observe where a self-placed student assigned themselves in the math sequence by examining where they initially enrolled in math. In contrast, I observe a test-placed student’s assignment from their placement score.
[26] Roughly 58% of students who test-placed into the math sequence enrolled in a math course.
[27] I restrict the analysis to students who assessed for and enrolled in math because I can only observe self-placement by examining where the student initially enrolled in the math sequence.
[28] Based on the identification strategy I propose, disentangling one treatment effect from the other is impossible.
math enrollees. I use a difference-in-difference estimation strategy to control for observable and
unobservable factors that jointly correlate with the opportunity to self-place in the math sequence
and academic success. Model 1 is the reduced-form difference-in-difference equation estimating this treatment effect:[29]

(1) y_itc = γ_0 + λ*TreatedSemesters_t + β*Self-placement_c + δ*(TreatedSemesters_t × Self-placement_c) + λ_t + λ_c + θ*X_itc + ε_itc
Here, y_itc is the academic outcome of interest for individual i in college c at time t; δ measures the treatment effect of the self-placement regime on academic outcomes; θ is the coefficient vector on individual covariates that are likely to predict academic outcomes, such as student sex and race; λ_t are semester-cohort fixed effects;[30] λ_c are time-invariant institution-level fixed effects;[31] and ε_itc is the individual-level error term. Institution-level fixed effects are included in the model
because colleges within the LUCCD attract different student populations and receive different
amounts of funding from local property taxes. Both of these factors may confound the overall
estimation of treatment. I also include semester-cohort fixed effects to control for cohort specific
linear time trends. Although the main estimation model specified above assumes a linear functional
form, I estimate the effects of DSP on binary academic outcomes using a probit model. Because
achievement may be influenced by the institution that the student attends, I cluster the standard
errors at the institution of enrollment.
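The pooled estimate δ in Model 1 follows the canonical difference-in-difference logic, which can be sketched as a two-by-two comparison of group means. This is a minimal sketch on made-up records; the actual specification additionally includes covariates, semester-cohort and institution fixed effects, probit links for binary outcomes, and clustered standard errors, none of which appear here.

```python
def did_estimate(records):
    """Difference-in-difference estimate of delta:
    (treated post - treated pre) - (control post - control pre).
    records: iterable of (treated: bool, post: bool, outcome: float)."""
    sums = {}
    for treated, post, y in records:
        total, n = sums.get((treated, post), (0.0, 0))
        sums[(treated, post)] = (total + y, n + 1)

    def mean(treated, post):
        total, n = sums[(treated, post)]
        return total / n

    return (mean(True, True) - mean(True, False)) - (
        mean(False, True) - mean(False, False))

# Toy records: (enrolled at College X, self-placement period, passed transfer-level math)
toy = [
    (True, False, 0.0), (True, False, 1.0),    # College X, test-placement: mean 0.5
    (True, True, 1.0), (True, True, 1.0),      # College X, self-placement: mean 1.0
    (False, False, 0.0), (False, False, 1.0),  # controls, earlier cohorts: mean 0.5
    (False, True, 1.0), (False, True, 0.0),    # controls, same semesters: mean 0.5
]
print(did_estimate(toy))  # 0.5
```

With covariates and fixed effects added, the same δ is recovered as the coefficient on the interaction term in a regression, as in Model 1.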
To estimate the value that a self-placement regime adds over a test-placement regime on
student success, I examine outcomes and milestones considered important for community college
[29] i = individual; c = college; t = time.
[30] Since I can only observe enrollment for self-placed students, semester-cohort fixed effects are measured as the semester in which a student enrolled, not assessed. Doing otherwise may introduce bias in the results.
[31] Since I can only observe enrollment for self-placed students, institutional fixed effects are measured as the institution in which a student enrolled.
students (Melguizo, Bos, & Prather, 2013): a) whether a student withdrew from their first math course on their first attempt; b) whether a student failed their first math course on their first attempt; c) whether a student passed the math requirements for an Associate’s degree; d) whether a student passed a transfer-level math course; e) whether a student completed 30 degree-applicable credit hours (i.e., half of the credit-hour requirement for an Associate’s degree); f) whether a student completed 60 degree-applicable credit hours; and g) the total number of degree-applicable credits earned. I measure whether a student achieved the distal outcomes (i.e., outcomes c-g)
four years after their first semester of initial math enrollment. I impute zeros in cases where an
outcome is unobserved.[32] In those cases, students who attempted and failed a course are treated the same as students who never attempted a course. Failing to impute zeros for students who did not
obtain grades for certain courses would likely bias results upwards.
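The zero-imputation rule can be sketched as a single pass over per-student outcomes, where None marks an outcome never observed (the student never attempted the relevant course); this encoding is an illustrative assumption.

```python
def impute_zeros(outcomes):
    """Code unobserved (None) outcomes as 0, so students who never attempted
    a course are treated the same as students who attempted and failed it."""
    return [0 if y is None else y for y in outcomes]

# 1 = passed, 0 = attempted and failed, None = never reached the course
observed = [1, None, 0, None, 1]
print(impute_zeros(observed))  # [1, 0, 0, 0, 1]
```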
Assumptions for Identification
To produce valid causal estimates on the effect of self-placement regime relative to test-
based placement regime on student success, achievement trends between College X and the
remaining eight colleges using a test-based placement policy must have been probabilistically
equivalent prior to the switch in placement regimes. Otherwise, differences in achievement trends before the change in placement policy will lead to a biased estimate of the treatment effect of self-placement on achievement. To test whether this assumption holds, I regressed each achievement outcome
on an interaction between year and College X, controlling for cyclical achievement patterns that
may make the coefficient measuring differences in achievement trends pre-policy change difficult to interpret.

[32] Consider the case where a student never reaches a course that counts towards transfer-level math credit. In this instance, we do not know whether that student would have passed that course had they had the opportunity. However, since the goal of this study is to examine whether self-placement encourages greater success in these outcomes, attributing zero to students who never reached, and therefore never obtained, this outcome is justified.

(2) y_itc = γ_0 + λ*Spring_t + β*Fall_t + ζ*Year_t + δ*(Year_t × Self-placement_c) + ε_itc

Here, y_itc is the academic outcome of interest for individual i in college c at time t; δ measures differences in achievement trends between College X and the control colleges prior to the switch in placement regimes; λ and β control for cyclical patterns in achievement trends across enrollment cohorts; and ε_itc is the individual-level error term. Like the main model, standard errors
are clustered by the institution of enrollment.
Table 2 in Appendix B presents results from this model, and finds that trends in all
outcomes between College X and the control colleges before the change in placement regime are
practically the same, with point estimates as well as standard errors converging to zero. Figure 1 in
Appendix B shows these trends visually. These results indicate that the achievement trends
experienced by control colleges serve as valid counterfactuals for what would have happened in
College X in terms of achievement had the college not switched from using a test-based placement
to a self-placement regime.
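The logic of this parallel-trends check can be sketched by comparing pre-period least-squares slopes of yearly mean outcomes for College X and the control colleges; the yearly means below are hypothetical, and the full Equation 2 additionally controls for semester seasonality and uses individual-level data.

```python
def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical yearly mean pass rates in the pre-policy period (2005-2007):
years = [2005, 2006, 2007]
college_x_means = [0.30, 0.32, 0.34]  # treated college
control_means = [0.28, 0.30, 0.32]    # control colleges

# Under parallel trends, the two pre-period slopes should be (nearly) identical.
gap = slope(years, college_x_means) - slope(years, control_means)
print(abs(gap) < 1e-9)  # True
```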
Another assumption underlying the internal validity of difference-in-difference estimation
strategy is that the change in placement regimes at College X is unrelated to other trends that may
have impacted academic achievement. Interviews with faculty members at College X indicate that the failure to renew its test license with the ACT was truly a mistake and was not precipitated by
low student success rates in developmental education. Despite the random fashion in which DSP
was adopted, it is possible that other changes that might have also impacted achievement occurred
at the same time as self-placement. One particular concern is that the opportunity to self-place may
have attracted different sorts of students to its campus since it provides students a way out of taking
developmental math courses. To test for the existence of this threat, I switched out the dependent
variables in Model 1, and replaced them with student-level demographic characteristics. Table 3 in
Appendix B presents results from these regressions, and finds that switching from a test-based
placement regime to a self-placement regime changed the profile of students who assessed and
enrolled in math. Specifically, self-placement increased the proportion of enrollees who considered
themselves of “other” race by approximately 2 percentage points, and those who reported English
as their native language by roughly 5 percentage points. It also decreased the proportion of students
who considered themselves White by a little under 1.5 percentage points.
If the relation between race and native English status and self-placement into math is
spurious, this would lend credence to the assumption that there were no concurrent shifts at the time
when College X switched to using self-placement. And yet, this evidence suggests that DSP may
have induced shifts in other student observable and unobservable characteristics, which cannot be
detected here. These results thus intimate that it may be difficult to isolate the effect of DSP from
other changes affecting academic achievement.
Results
Table 4 in Appendix B presents summary statistics for the sample of students who assessed
and enrolled in math between the summer of 2005 and the summer of 2009, split for the colleges
that did and did not employ a self-placement regime, and by period of test-placement versus self-
placement. Descriptive results show that, on the whole, the percent of students who were female or African-American stayed relatively the same between the periods of test-placement and self-placement in College X and the control colleges. However, results also suggest that a notably
higher percentage of Latinos, and native English speakers assessed and enrolled in math during the
period of self-placement relative to the periods of test-placement in both treated and control
colleges. That said, results show that College X and control colleges experienced similar
demographic shifts.
Table 4 also shows that a higher proportion of math enrollees in College X during the period of self-placement met the minimum math requirement for an A.A., passed a transfer-level math course, completed 30 degree-applicable units, and completed 60 degree-applicable units relative to
students in College X during the period of test-placement. Students in the control colleges across
these two time periods experienced similar gains, except for passing a transfer-level math course.
Table 4 also suggests that a lower proportion of students in College X during the period of self-
placement withdrew from their first math course compared to College X students who test-placed
and enrolled in math. The propensity to withdraw from their first math course was six percentage
points lower in College X relative to the control colleges across both time periods.
Differences in Math Assignment and Enrollment Patterns between Test- and Self-Placers
In this section I examine differences in math assignment and enrollment patterns in College
X and control colleges, during test-placement and self-placement periods. During the test-based
placement, Table 5 in Appendix B shows that the majority of students in College X and control
colleges were placed in the middle to lower levels of the developmental math sequence. This means
that the majority of test-placed students needed to take a minimum of two developmental math
courses before reaching math courses counting towards upward transfer. While a small percentage of students in the control colleges assessed into transfer-level math during this period, no students in College X test-placed into that math level. Overall, these results suggest that the
majority of incoming students were deemed by a placement test to have extensive remedial math
needs. Over time, we see a larger proportion of test-placed students gaining access to transfer-level
math courses, which suggests that younger cohorts might be performing better in high school than
the older cohorts. This result implies that there might be a positive secular achievement trend over time.[33]
Table 5 in Appendix B also finds that the distribution of first math courses almost mirrors
the distribution of assigned math courses in both College X and the control colleges during test-
placement. This similarity can be explained by the fact that nearly 88 percent of students who test-
placed into the sequence complied with their placement results. According to district policy,
students are allowed to challenge their placement results if they feel that they were underplaced (i.e.
placed too low) by taking an ad-hoc test typically administered by the chair of the math department.
Test-placed students are also allowed to choose a lower, less rigorous math class if they feel that they were overplaced (i.e., placed too high), without having to talk to a counselor or a math faculty member.
Focusing on enrollment patterns between test-placers and self-placers, Table 5 shows that
self-placed students enrolled more evenly across courses constituting the entire math sequence. In
fact, a significantly larger proportion of self-placed students (13.5%) started their math trajectory by
taking a transfer-level math course compared to 8 percent of test-placed students who assessed and
enrolled in the control colleges during that same period. These patterns suggest that a self-placement
regime induced some incoming students to enroll in transfer-level math who otherwise would have
likely been placed into lower math levels had they taken a placement test. Table 5 also shows that roughly 13 percent of self-placers enrolled in the lowest level of the math sequence, suggesting
that some students may feel severely underprepared for college-level coursework in math.
Table 6 in Appendix B examines the proportion of test-placed and self-placed students who
enrolled across the different levels of the math sequence in College X by their race and sex.

[33] An alternative explanation may be that colleges set their cut scores lower during the 2009 spring and summer, therefore allowing a higher percentage of test-placed students to assess into transfer-level math.

In analyzing the results, there are multiple takeaways. For one, across all races, a small percentage of
students enrolled in transfer-level math during the period of test-placement. Asian students represented the bulk of transfer-level math students; however, they made up just 4.3
percent of all math enrollees who reported being Asian. Less than one percent of test-placed
African-Americans, and just over one percent of Latinos matriculated in transfer-level math.
Second, Table 6 shows that self-placed students from all racial groups made significant gains in
enrolling in transfer-level math under a self-placement regime. Females did as well. For example,
African-Americans experienced a 12-fold increase in enrolling in transfer-level math when College
X employed self-placement. The percent of Whites enrolled at that math level also dramatically
increased to nearly 27 percent, up from just a little under two percent when COMPASS was used.
Although Asians, females, and Latinos also experienced substantial growth in terms of
representation in this math level, this growth was weaker, ranging from a 6-fold to an 8-fold
increase.
Despite the fact that substantially larger percentages of students across the board found their way into transfer-level math under self-placement, evidence presented in Table 6 in Appendix
B shows that larger shares of African-American, Latino, and female students enrolled in the lowest
level of developmental math after College X gave up using a test-based placement regime. Results
show that the percent of African-American students who enrolled in a course four-levels below
transfer-level math increased by almost 11 points, up from 15 percent during the period of test-
placement. Larger proportions of Latinos and females also enrolled in the least rigorous math
course when given greater autonomy to make placement decisions. Nearly 16 percent of Latino
and female students, respectively, matriculated in the least rigorous math course during self-
placement, compared to only 8 percent during test-placement. On the whole, Whites, Asians, and
students who could not identify with a categorized race or ethnicity experienced overall decreases in the percent of students within their demographic group who decided to enroll in the lowest math
level. The percent of males in that level only marginally increased during DSP.
While this evidence hints that a self-placement regime may empower some marginal
students to take more difficult math courses, it also suggests that DSP may create opportunities for
students who have been socialized to think they are bad at math to incorporate these stereotyped
beliefs when making placement decisions. If such is the case, then self-placement may also create
opportunities that widen already existing racial and gender disparities in completion rates, since
evidence shows the lower the math course in which a student enrolls, the harder it becomes to
complete a degree (Bailey, Jeong, & Cho, 2010; Fong, Melguizo, & Prather, 2013). These statistics
are important to consider when determining whether self-placement is a viable substitute for, if
not supplement to, a test-based placement regime.
The Effects of a Self-Placement Regime on Student Success
Table 7 in Appendix B presents the effects of switching from a test-based placement regime
to a self-placement regime on student success for math enrollees. All models presented in this table
control for student-level demographic variables, and include time-invariant institutional-level and
semester-cohort fixed effects. Results from models that vary in the types of controls used are
available upon request. Table 7 presents results using data pooled across cohorts of students who
assessed and enrolled in the math sequence during the periods of DSP and test-placement.
Column 1 in Table 7 indicates that changing from a test-based placement regime to one that
allowed students to determine the math class most appropriate for them did not increase the
probability that a student failed their first math course. However, results presented in Column 2
suggest that self-placement decreased an enrollee’s chances of withdrawing from their first math
course by roughly 5 percentage points. This is an interesting finding, as it supports the notion that
giving students greater latitude over educational decisions may increase their persistence within the
math course they initially selected.
Turning to math-specific outcomes, results in Column 3 show that self-placers were close to
1.5 percentage points more likely to meet the minimum math requirement for an A.A. than test-placers; they were also 8 percentage points more likely to pass at least one transfer-level math
course than their counterparts (Column 4). Examining these results together with the summary
statistics suggests that students who assigned themselves to transfer-level math might have
benefitted more from self-placement than students who enrolled in a course that counted only
towards the A.A. degree requirement (i.e., one or two levels below transfer-level math). I investigate
this issue in the next section of the paper.
Columns 5-7 show the effects of DSP on credit accumulation. Starting with Column 7,
results suggest that self-placement boosted the number of degree-applicable credits obtained by more than two. This positive result appears to translate into higher proportions of self-placers relative to test-placers who obtained half and the full number of degree-credits needed for an associate’s
degree. Column 5 intimates that DSP increased the probability that a student acquired 30 degree-
applicable credits by 5.4 percentage points, and 60 degree-applicable credits by 3.8 percentage
points (Column 6).
Viewing these results collectively implies that, in the aggregate, DSP positively affects
achievement for students who took the opportunity to self-place and subsequently enrolled in the
math sequence relative to test-placed enrollees. Not only does DSP encourage enrolled students to stick with
their first math course until the end of the semester, but it also seems to help them pass math
courses that count towards a two-year and a four-year degree, and acquire the number of degree-applicable credits critical for graduation and upward transfer.
Heterogeneous Effects
This section presents evidence exploring the heterogeneous effects of DSP on the success of
different student populations. This analysis is couched in research showing persistent gender and
racial differences in math self-concept, math self-efficacy, math course-taking patterns (Eccles,
2007), achievement gaps in math (Catsambis, 1994; Vermeer, Seegers & Boekaerts, 1996), and
decisions to pursue math related careers (Jacobs, 2005). The analysis is also motivated by recent
evidence suggesting that students assigned to longer developmental course sequences have the most
difficult time reaching critical academic benchmarks compared to students assigned to shorter
developmental course sequences (Bailey et al., 2010). Thus, failing to consider sex, race, and math level in an analysis of the effectiveness of DSP on academic success would likely lead
policymakers and practitioners to come to the wrong conclusions about the effectiveness of DSP for
different student groups.
Results presented in Table 8 in Appendix B examine the interactions between DSP and an
enrollee’s initial math level, gender, and race, respectively on achievement. In the interest of space,
only results that are statistically significant are discussed. The first set of results shows the impact of DSP on students who enrolled in different math levels. On the whole, these results show
that students at each level of the math sequence benefit in some way from being allowed to self-
place into the math sequence. For example, Table 8 shows that self-placers who enrolled in a math
course four levels below transfer-level experienced significantly lower rates of failure and
withdrawal relative to their test-placed counterparts. Similarly, self-placers who enrolled in a course
considered two-levels below transfer increased their chances of passing a transfer-level math course
by nearly 10 percentage points over test-placers. These same self-placing enrollees, in contrast to test-placed enrollees, were also more likely to obtain 30 degree- and 60 degree-applicable credits,
critical benchmarks for acquiring a college degree. Although evidence presented in Table 8 shows
that self-placers benefit in one way or another, results also suggest that some students experience a
significantly higher payoff from DSP. On the one hand, self-placers who decided to enroll in
transfer-level math, in particular, significantly outperformed test-placers on all positive measures
of success. On the other hand, self-placers who enrolled in tutoring courses or a course considered
as one-level below transfer-level math did worse than test-placers at the same level. The source of
these differences may be rooted in the sorting accuracy of placement tests for specific math levels,
or the fact that some students had to exert more effort than others to reach the same milestone. For
example, students who enrolled one-level below transfer-level math could have enrolled two-levels
below transfer and still have met the minimum math requirement for an A.A. degree. It is also
possible that students who enrolled in a tutoring course realized that the course did not count towards any college degree and, in fact, did not help them advance in the developmental course
sequence.
Table 8 also shows that females as well as racial minorities benefitted from self-placing into
the math sequence but to a lesser extent than their less stigmatized counterparts. Compared to self-
placing males, who fared better on every academic measure examined here, females under a self-placement regime were more likely to fail their first math course and to fall short of the minimum math requirement for an A.A. degree. Although self-placing females experienced higher
chances of accumulating 30 degree- and 60 degree-applicable credits compared to test-placing
females, the magnitudes of these coefficients are noticeably smaller than those obtained for self-
placing male students. Table 8 also suggests that African-American and Latino enrollees responded
in similar fashion to females under a self-placement regime in terms of achievement. They, too,
were less likely to pass their first math class and to meet the minimum math requirement for an A.A. degree relative to test-placed students. Lower success rates among these groups of students may be
linked to decision-making that is more influenced by their attitudes toward and experiences in math
rather than a rational assessment of the possibility of success in each available math course.
Sensitivity Checks
Seasonality Bias
It is reasonable to think that students enrolled in math in the summer and fall would be
observably and unobservably distinct from those who enrolled in math in the winter and spring.
After all, summer and fall enrollees may be more likely to be high school graduates following a
seamless path towards college than students who chose to enroll in the winter and the spring. As
such, these two student groups may differ not only in age, but also in their degree of self-efficacy,
motivation, and level of math preparation. Ultimately, such factors have the potential to confound
estimates of the treatment effect. To control for such confounding, I re-estimate the main model,
which originally pools data across semesters before and after the change in placement regimes, this
time examining mid-term outcomes only for students who tested into or self-placed into the math
sequence during similar periods of the year (e.g., summer and fall). This restriction ensures that
students in the treated and control groups exhibit analogous enrollment propensities and
characteristics, and rids the analysis of seasonality bias.
Table 9 in Appendix B presents results from the main model, restricted to students who
assessed and enrolled in the summer and fall semesters. For the most part, these results do not differ
much from the results that pooled enrollment cohorts together. Results from this model suggest that
DSP did not appear to increase the probability of meeting the minimum math requirement for an
A.A. degree or the number of degree-applicable units completed. Differences between the pooled
and restricted semester-cohort models suggest that including winter and spring enrollees may
slightly overstate the effects of DSP, since these students tended to perform worse academically.
Falsification Test
The falsification test employed here uses a fake treatment period to test the assumption that
students in College X and the control colleges exhibited the same achievement trends prior to the
implementation of DSP. In other words, this placebo test performs another difference-in-differences
estimation on a semester in which self-placement is known not to have taken place. Because one
would expect students in the treatment and control colleges to have exhibited similar achievement
trends before the summer of 2008, estimates from the falsification test should converge to zero.
Put differently, DSP should have no effect on achievement for cohorts that were not actually
affected by it. Estimates deviating significantly from zero would suggest that other unobservable
circumstances or trends stimulated changes in achievement comparable to those attributed to
self-placement.
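The placebo logic described above can be sketched in code. Below is a minimal, illustrative difference-in-differences computation with a fake treatment cutoff; the function names and the toy numbers are mine, not the study's data:

```python
def did_estimate(records, treated_college, cutoff):
    """2x2 difference-in-differences on mean outcomes.

    records: list of (outcome, college, semester) tuples, where outcome is
    1.0/0.0 (e.g., passed the first math course) and semester is an integer
    index. cutoff is the first semester of the (real or fake) treatment period.
    """
    def mean(select):
        vals = [y for y, c, s in records if select(c, s)]
        return sum(vals) / len(vals)

    treated_change = (mean(lambda c, s: c == treated_college and s >= cutoff)
                      - mean(lambda c, s: c == treated_college and s < cutoff))
    control_change = (mean(lambda c, s: c != treated_college and s >= cutoff)
                      - mean(lambda c, s: c != treated_college and s < cutoff))
    return treated_change - control_change

# Toy pre-treatment data: College "X" improves by 0.2 across semesters while
# control college "C" improves by only 0.1, so a placebo DiD with a fake
# cutoff at semester 1 yields 0.1 rather than zero, a warning sign that
# pre-treatment trends were not parallel.
records = [(0.4, "X", 0), (0.6, "X", 0), (0.6, "X", 1), (0.8, "X", 1),
           (0.5, "C", 0), (0.5, "C", 0), (0.5, "C", 1), (0.7, "C", 1)]
placebo = did_estimate(records, "X", cutoff=1)
```

Under parallel pre-treatment trends the placebo estimate should be close to zero; a non-zero value, as in this toy example, signals that the main estimates may be absorbing pre-existing differences between colleges.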
Table 10 in Appendix B presents results produced by the falsification test using observed data
and specifying that the change from using a test-based placement regime to a self-placement regime
occurred one year prior to its actual implementation. These results show that most estimates
converge to zero in terms of practical and statistical significance, though three do not. The
significance of these three non-zero estimates may be overstated, since each coefficient is very
close to zero. If anything, they suggest that one year prior to the implementation of
self-placement, enrollees at College X were actually performing worse than those in the control
colleges. Table 10 shows that had DSP occurred during the 2007 summer and fall semesters, students
at College X would have been two percentage points more likely to withdraw from their first math
course, and three percentage points less likely to pass a transfer-level math course. Only on the
indicator measuring whether a student obtained 30 degree-applicable credit hours did students at
College X fare better than students enrolled in the control colleges. Ultimately, these estimates
suggest that the main model may be underestimating the effects of self-placement on withdrawing
from a first math course and on passing a transfer-level math course, and overestimating its
impact on obtaining 30 degree-applicable credits.
Permutation Tests
Economists have recently argued that a difference-in-difference estimation approach is
particularly susceptible to understated standard errors (Bertrand, Duflo, & Mullainathan, 2004).
Understated standard errors are produced when a policy is targeted at a small number of groups or
clusters. Analyses examining the effects of different state-policies help to explain why this problem
occurs. In the U.S., states typically enact a state-wide legislative measure that cuts or expands
access to benefits for its population. Consider state gun laws as a case. In measuring whether
changes in gun laws in one state reduced the number of violent offenses, researchers would have to
rely on the remaining 49 states as controls. The small number of “clusters” - or states - researchers
can employ in such an analysis risks violating asymptotic theory, which asserts that the validity of
standard errors, and of corresponding test statistics, rests on the number of clusters approaching
infinity (G → ∞). In studies where the number of clusters is fixed and small, drawing true causal inferences
is difficult simply because standard errors cannot be consistently estimated. This is a problem
because it can lead researchers and policymakers to draw invalid inferences about the efficacy of a
policy intervention.
Several economists have devised methods to make inference robust to cluster sampling.
Building on work conducted by Donald and Lang (2007), Bertrand, Duflo, and Mullainathan
(2004), and Wooldridge (2006), Buchmueller, DiNardo, and Valletta (2009) designed a modified
version of the classical permutation test, which improves over conventional statistical techniques in
that it is free of strict assumptions underlying parametric tests (Feinstein, 1973).
At its basic level, the classic permutation test generates placebo estimates, which function to
test the validity of main estimates produced by a difference-in-difference model. Drawing on
observed data, this test selects two groups from a set of clusters, and randomly assigns one to
receive treatment independent of whether it, in reality, participated in the intervention. The test then
calculates a difference in means to determine whether the treatment has an effect on an outcome.
This estimate is stored, and another permutation test is conducted, and another difference in mean is
calculated. This sequence of tests is conducted until all possible cluster permutations are exhausted.
Instead of relying on standard errors, researchers can ascertain statistical significance by
examining whether the main treatment estimate falls outside the 2.5th and 97.5th percentiles of
the placebo difference distribution (Buchmueller, DiNardo, & Valletta, 2009). If it does,
inferences are deemed valid or “exact”.
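The procedure just described can be sketched as follows. This is an illustrative implementation on invented data, not the study's code; with only a handful of clusters, the percentile band effectively collapses to the placebo minimum and maximum:

```python
def diff_in_means(data, treated):
    """Treatment effect estimate: mean outcome in the treated cluster minus
    the mean outcome pooled across all other clusters."""
    control = [y for name, ys in data.items() if name != treated for y in ys]
    return sum(data[treated]) / len(data[treated]) - sum(control) / len(control)

def placebo_distribution(data, treated):
    """One placebo estimate per control cluster, each computed as if that
    cluster, rather than the truly treated one, had received treatment."""
    return sorted(diff_in_means(data, name) for name in data if name != treated)

def inference_is_exact(estimate, placebos):
    """Deem inference 'exact' if the real estimate falls outside the
    2.5th-97.5th percentile band of the placebo distribution."""
    lo = placebos[int(0.025 * (len(placebos) - 1))]
    hi = placebos[int(round(0.975 * (len(placebos) - 1)))]
    return estimate < lo or estimate > hi

# Toy example: cluster "X" is treated; "A", "B", and "C" are controls.
data = {"X": [1.0, 1.2], "A": [0.1, -0.1], "B": [0.0, 0.2], "C": [-0.2, 0.0]}
real = diff_in_means(data, "X")            # approx. 1.1
placebos = placebo_distribution(data, "X")
exact = inference_is_exact(real, placebos)
```

Here the real estimate lies well outside the placebo distribution, so inference would be deemed exact; placebo estimates of comparable size to the real one would instead suggest the effect cannot be distinguished from cluster-level noise.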
Buchmueller, DiNardo, and Valletta (2009) modify the classic permutation test in their
paper examining the impacts of Hawaii's Prepaid Health Care Act, the nation's only employer health
insurance mandate. In their analyses, the authors produce 50 placebo estimates of their main
impact model; each estimate uses a different state (including the District of Columbia) as a
substitute for Hawaii. They hypothesize that if Hawaii's health insurance mandate had any real
effect, the coefficient measuring that effect would lie beyond the 2.5th and 97.5th percentiles of
the distribution of all 50 placebo estimates, and that the placebo estimates themselves should
approximate zero.
These authors argue that any placebo estimate larger in size than the main estimate would suggest that
the impacts of the state insurance mandate could not be distinguished from unobservable
characteristics that change in tandem with the change in health insurance policy.
Like Buchmueller, DiNardo, and Valletta (2009), this research also grapples with making valid
inferences about the effectiveness of a policy intervention that took place at a single
institution belonging to a district with only nine colleges. I employ Buchmueller, DiNardo, and
Valletta's (2009) permutation-test variant to detect concurrent trends that may have also boosted
achievement at the time of DSP, estimating eight "placebo" versions of the main model, in each
case with a different college serving as a substitute for College X. In other words, using
observed data, I tested the impact of self-placement on academic achievement as if one of the
eight control colleges had implemented self-placement when it actually did not. In each case, the
remaining eight colleges serve as the controls in the estimation. Akin to the main model, each
permutation test controls for student-level demographic characteristics, as well as time-invariant
institutional and semester-cohort fixed effects.
Results from the permutation tests show that for every outcome measure except passing a
transfer-level math course, at least one placebo estimate is larger than the real difference
produced by the main model. Additionally, many placebo estimates do not converge to zero (Table
11, Appendix B). These results are presented visually in Figure 2 in Appendix B for each outcome
of interest. Together, they convey that the switch to a self-placement regime took place at a time
when other unobservable shifts also induced changes in achievement for students in the control
colleges. The fact that most estimates measuring the impacts of self-placement fall within the
range of placebo estimates makes it challenging to distinguish the effects of self-placement from
normal fluctuations in achievement that occurred within this group of community colleges when
self-placement was implemented at College X. These results serve as a reminder that it is
important to test the assumptions underlying difference-in-differences estimation strategies in
order to make the right sort of inferences about the effectiveness of a policy.
Discussion
This paper presents evidence on the short- and long-term achievement of math enrollees
given the opportunity to self-place into the math sequence. Using a difference-in-differences
framework that exploits a college's failure to renew its license with ACT, the provider of the
COMPASS exam, this paper finds suggestive evidence that, in the aggregate, incoming community
college students who enrolled in math responded positively to DSP. Not only were self-placed
enrollees less likely to withdraw from their first math course, they were also more likely to pass
required math courses and to acquire the amount of degree-applicable credit necessary for a
college degree.
Despite these positive findings, the potential value of DSP on improving student success
hinges on two critical factors: 1) whether DSP has the ability to perform equally well for all
students, irrespective of their race or sex, and 2) whether experimental studies testing the
effectiveness of DSP also produce similar positive findings.
Evidence from this study casts doubt that DSP, as implemented, fully benefitted Latinos,
African-Americans, and females. With self-placement, these students enrolled in the lowest math
courses at higher rates, and while they performed better on most outcomes, they improved far less
than Whites, Asians, and males. One reason may be tied to the types of information or advice these
students factor into their placement decisions. Studies in psychology, sociology, and education
all suggest that the underrepresentation of women and ethnic and racial minorities in math- and
science-related fields stems in part from perceptions that White males are innately superior in
math ability (Buck, Clark, Leslie-Pelecky, Lu, & Cerda-Lizarraga, 2008; Ladson-Billings, 1997;
Scherz & Oren, 2006). Because such stereotypes hold sway over the types of courses students take
in K-12, and the career paths they decide to pursue later in life (Auger & Blackhurst, 2005;
Bandura, Barbaranelli, Caprara, & Pastorelli, 2001), it is likely that the math placement
decisions that females, Latinos, and African-Americans make are also informed by these
perceptions.
Secondly, selection into the lowest developmental math levels may also reflect a student's
desire to maintain, protect, or enhance his or her self-worth or self-esteem. Defensive pessimism,
a theory from psychology, suggests that people naturally seek options that allow them to avoid
failure in areas where their self-worth is at stake (Cantor & Norem, 1989; Martin, Marsh, & Debus,
2001; Crocker, 2002). Thus, students of color and female students may have chosen the least
rigorous math course to increase their chances of success.
Finally, it is possible that counselors may have encouraged higher percentages of female,
Latino, and African-American students to enroll in lower math courses under self-placement.
Research shows that students of color are overreferred to special education and mental retardation
programs (Artiles, Harry, Reschley & Daniels, 2002), and that their overrepresentation in such
programs may be due to biased perceptions of academic abilities (McLoyd, 1998). While this
evidence stems from K-12, these perceptions may well carry over to higher education and affect how
community college assessment staff and counselors appraise a student's chances of succeeding in
the various courses constituting the math sequence.
To keep biases at bay, policymakers may want to consider creating resources that help
students mindfully recognize their feelings about and perceptions of math before they choose a
math course. Equally important is aiding counselors and assessment staff in understanding their
own gender and racial biases, and in detecting whether a student is expressing anxiety, stress, or
fear when selecting a math course. Decision-makers should also consider whether a math test, like
the COMPASS or the ACCUPLACER, should serve as a supplement to self-placement, since it may help
reduce the underestimation or overestimation of math ability.
Because this study could produce only suggestive evidence, we lack causal evidence
showing that DSP improves achievement above and beyond what one might see due to random chance.
Therefore, foundations and decision-makers ought to invest in experiments that randomly assign
students the opportunity to self-place in order to get a clear sense of whether DSP adds value
over test-based placement approaches. For now, the results are too preliminary and should not be
taken as proof that DSP works.
CHAPTER FOUR
HOW DO STUDENTS MAKE MATH PLACEMENT DECISIONS WHEN GIVEN GREATER
AUTONOMY?
Directed Self-Placement (DSP)[34] is an alternative placement approach used to sort students
into developmental and transfer-level courses. The recent take-up of DSP across community
colleges is in direct response to an increasing amount of evidence showing that predominantly used
test-based placement models fail to accurately determine who should and who should not be
assigned to developmental education (Scott-Clayton, Crosta, & Belfield, 2014). Florida,
Connecticut, and several community colleges have adopted elements of DSP to curb misdiagnoses
in developmental education (Burdman, 2012; Fain, 2013; Hu, 2015), and several nationwide efforts
to improve college persistence and success have mentioned DSP as a viable alternative to extant
placement methods (Burdman, 2012).
At its core, DSP empowers students to take control over decisions determining if, and the
extent to which, they need developmental education to help them achieve their collegiate and career
goals. Support for this placement strategy is rooted in psychological theory, specifically self-
determination theory (Ryan & Deci, 2000) and engagement theory (Kuh, 2001), which, combined,
posit that granting individuals latitude over consequential life decisions inherently boosts their
commitment to achieving success.
Although DSP holds appeal among some policymakers, the adequacy of DSP as a
placement strategy has been mainly judged against whether it improves student attitudes and
satisfaction (Royer & Gilles, 1998; Royer & Gilles, 2003). While results from these studies suggest
that DSP may rouse more positive dispositions among students, they fall short of providing an
[Footnote 34: Directed Self-Placement is a term coined by Royer & Gilles (1998). In the literature, Informed Self-Placement is used synonymously with DSP.]
understanding of how community college students interact with the self-placement process itself,
and whether the information given to guide their decision-making impacts the decisions that they
make. While DSP research purports students can assess their ability to perform better than a test
(Roger & Gilles, 1998), decision research has consistently demonstrated that individuals
misestimate their own abilities based on imperfect information, overreliance on recent experiences
or irrelevant features, or misattribution of outcomes (Payne, Bettman, & Johnson, 1992). Because
of these limitations, individuals frequently employ strategies that lead them to ignore alternatives or
draw on irrelevant information (Croskerry, 2003), effectively increasing their risk of making an ill-
informed decision (Christensen, Moran, & Wiebe, 1999; Payne, Bettman, & Johnson, 1992).
Without clarity on how community college students process information and, more broadly, make
judgments at each stage of the decision-making process to reach placement decisions, states and
community colleges will continue to face challenges with respect to implementing DSP so that it
can improve over current test-based placement models.
This paper fills the gap in current literature by providing greater clarity on how students
make autonomous placement decisions, and the extent to which the informational supports that they
receive (e.g., math course descriptions, math problems) impact and inform their decision-making.
Thus, the goal of this research is not to determine whether community college students make
optimal placement decisions, but rather to clarify whether the strategies and the informational
inputs they use are likely to produce sound placement decisions. I focus this research on self-
placement into developmental math because a larger proportion of students place into
developmental math than developmental English (Bailey, Jeong & Cho, 2010), and DSP literature
on math is scant. This research matters, particularly in states like Connecticut and Florida, because
it bears on how much autonomy we give students, and how much we must invest in developing and
providing the supports they need to make decisions that have the potential to positively impact their
academic and career outcomes.
The main question guiding this study is: How do community college students make
placement decisions when given the option to self-place into math? To answer this question, I ask
the following subquestions:
1) How do students approach decision-making when tasked with choosing a math course?
2) How and to what extent do students use decision aids to make self-placement decisions?
3) What is the influence of different informational contexts on math placement decisions?
To answer these questions, I employed a concurrent mixed-methods research design
(Tashakkori & Teddlie, 2003), described below, which allowed me to conduct parallel quantitative
and qualitative studies to examine more holistically how students interact with math DSP. Through
a hypothetical exercise conducted in a controlled setting, I collected original data from a select
group of community college students who reported never having taken a math placement test or a
math course in college. By analyzing and integrating qualitative and quantitative data from
computerized surveys, interviews, and verbal protocols, this work expands what we know about the
strengths and weaknesses of current DSP implementation practices and adds to decision theory,
which has largely overlooked student decision-making in higher education.
This paper is divided as follows. I first provide a brief overview of Behavioral Decision
Theory to give conceptual structure to the paper. In this section, I discuss the theory's main
tenets, the stages of and the biases influencing decision-making, typical decision aids used to
improve decision-making, and current implementation and support features of DSP models. Second, I
describe the qualitative and quantitative studies conducted to answer the research questions
stated above, defining, for each, the sample, data collection methods, analytic methods, and how I
integrated qualitative and quantitative results. Third, I present the results. Finally, I discuss
these results and their implications for whether DSP adds value over current test-based placement
regimes, and chart paths for future research.
How Do Students Make Decisions? A Behavioral Perspective
Behavioral Decision Theory (BDT) is a theoretical framework used to explain how goal-
oriented individuals make decisions under constrained circumstances (Simon, 1955). Rather than
viewing individuals as “omniscient calculators” capable of adequately and rationally evaluating the
costs and benefits of each alternative (Lupia, McCubbins, & Popkin, 2000), behavioral theorists
argue that judgment and choice are highly sensitive to factors related to individual differences
(e.g., memory retrieval, prior experiences, demographic and psychological characteristics), the
task's complexity (e.g., choosing between multiple alternatives), and the task's display (e.g.,
the type of information and informational supports presented and the way they are presented)
(Häubl & Trifts, 2000; Payne, Bettman, & Johnson, 1992). Thus, according to BDT, judgment and
choice are constructed rather than
calculated (Slovic, Griffin, & Tversky, 1990). BDT is more descriptively rather than normatively
oriented, focusing mainly on how individuals employ strategies, collect and use information, and
draw on cues to make choices between prospects (Payne, Bettman, & Johnson, 1992).
This perspective stands in contrast to Rational Choice Theory (RCT), which argues that
people should impartially and deliberately determine choice by carefully evaluating each alternative
for its potential to maximize expected utility or interests and minimize undesirable consequences
(Beach & Lipshitz, 1990). RCT contends that choice should be based on a fixed set of preferences
(March, 1994), and determined in additive fashion until the utility of all alternatives has been
assessed (Hastie, 2001; von Neumann & Morgenstern, 1947). An individual is cast as an irrational,
if not a poor, decision maker if he or she does not choose the alternative with the highest utility. With
decades of application in various fields, researchers have come to question the validity of RCT’s
normative and prescriptive orientation in explaining human decision-making, given widespread
evidence showing that people’s actions taken to solve a dilemma frequently contradict the logic of
methodical decision-making (Beach & Lipshitz, 1990; Simon, 1967).
For the reasons stated above, BDT, in contrast to RCT, is a useful framework for describing
and understanding how community college students self-place into a math sequence. Indeed, the
task of making a math placement decision can be thought of as a potentially complex, difficult
process by which a student must make a series of judgments in order to choose a math course where
success is not entirely certain. By employing DSP to assign students to either developmental or
transfer-level math courses, community colleges force a student to consider multiple course
alternatives; gather, use, and appraise different pieces of information; and weigh the potential
benefits against the potential costs of each alternative to determine which course is best suited to
help them reach their academic and career goals. Given that community college students are diverse, not
only in terms of their academic experiences and goals, but also in terms of the ways in which they
interpret and value information and have been socialized to think and feel about math, it is likely
that they engage with the math DSP process differently, and as a result, construct choice differently.
While the literature on decision-making is replete with general descriptions and discussions about
the strategies and inputs people use to make complex decisions, and for which the outcomes are
uncertain, the literature spanning DSP offers little in the way of concrete evidence about how
community college students, with no previous knowledge about their level of preparation for
college-level math, respond to the task of choosing what they think should be their first math
course in college. In the following sections, I briefly describe the stages of the decision-making
process, and then center the rest of the literature review on describing the simplification techniques
and decision biases that shape decision-processes; identifying the differences between and the
effectiveness of decision aids used to improve decision quality and accuracy; and outlining current
implementation and support practices of DSP models.
Stages of Decision-Making
Most models explaining individual decision-making typically are composed of five distinct
stages: 1) problem recognition and approach, 2) task diagnosis, 3) selection of decision
strategies, 4) implementation of a decision strategy and choice among alternatives, and 5) the execution of
a final judgment (Beach & Mitchell, 1978). While decision-making can progress in a
straightforward, linear fashion, it can also be non-linear or circuitous (Einhorn, 1970). Research
shows that, at each stage, decision-makers commonly apply simplification techniques, or heuristics,
that reduce cognitive load but nevertheless may bias decision-making.[35]
In the first stage (the problem recognition stage), the decision-maker identifies the problem
or dilemma. Here, decision makers formulate their goals and objectives, and estimate gaps between
these targets and their performance (Beach & Mitchell, 1978; Lyles, 1981, Lyles & Mitroff, 1980;
Schwenk & Thomas, 1983). Based on work of Kahneman and Tversky, decision-makers at this
stage typically “frame” the decision problem, or orient themselves towards a perspective that helps
them to formulate their goals and focus exclusively on different elements of a task (March, 1994).
In the second stage (the task diagnosis stage), decision makers are confronted with the task
of diagnosing what they will need to problem solve. Here, decision makers ask "What do I need to
know?" or "What are the parameters of the problem?" (Beach & Mitchell, 1978). Typically, at this
point of the decision process, the decision maker retrieves a “database” of evidence from memory,
[Footnote 35: In decision research, simplification techniques and biases are often used synonymously.]
which is typically scrambled and piecemeal and does not represent the full set of information
needed to develop a sound decision strategy (Pennington & Hastie, 1988). In this stage, decision
makers also edit and focus on the information they must process based on internal cues and memory
retrieval (March, 1994). Through editing, individuals concentrate their selective attention on
features of a problem that possess personal resonance or meaning.
In the third stage (the selection strategy stage), decision-makers develop a plan of action to
solve a problem. Here, decision-makers use either compensatory or non-compensatory decision
strategies, or both (Beach & Mitchell, 1978). Compensatory strategies require decision makers to
apply complicated and sophisticated rules to synthesize information and to ignore cues that may
keep an individual from evaluating all available options (Billings & Marcus, 1983; Einhorn & Hogarth,
1981). Non-compensatory approaches are simpler to implement because they tend to ignore some
attributes of an alternative and do not require decision makers to make tradeoffs (Chu & Spires,
2003). Some non-compensatory approaches are applied by rote to decision tasks (Beach &
Mitchell, 1978), and typically governed by homiletic rules, societal conventions, and experience or
habit (Beach & Mitchell, 1978; March 1994). Table 1 in Appendix C presents examples of each
one applicable within the context of this study.
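The contrast between compensatory and non-compensatory strategies can be made concrete with a toy course-choice sketch. The course names, attribute ratings, weights, and cutoffs below are all invented for illustration; they are not drawn from the study:

```python
# Hypothetical attribute ratings (1-5) for three course alternatives.
courses = {
    "Arithmetic":           {"goal_fit": 1, "confidence": 5, "credit_value": 1},
    "Intermediate Algebra": {"goal_fit": 3, "confidence": 3, "credit_value": 3},
    "College Algebra":      {"goal_fit": 5, "confidence": 2, "credit_value": 5},
}

def weighted_additive(options, weights):
    """Compensatory: every attribute is scored and weighed, so a strength on
    one attribute (e.g., credit value) can offset a weakness on another."""
    def score(attrs):
        return sum(weights[a] * v for a, v in attrs.items())
    return max(options, key=lambda name: score(options[name]))

def satisfice(options, minimums, order):
    """Non-compensatory: take the first option, in the order considered, that
    meets every minimum cutoff; no tradeoffs across attributes are made."""
    for name in order:
        if all(options[name][a] >= cut for a, cut in minimums.items()):
            return name
    return None

# A compensatory chooser weighing goals and credit heavily picks the most
# advanced course; a satisficer who demands confidence >= 3 and scans from the
# least demanding course stops at the very first acceptable option.
compensatory_pick = weighted_additive(
    courses, {"goal_fit": 2, "confidence": 1, "credit_value": 2})
satisficing_pick = satisfice(
    courses, {"confidence": 3},
    ["Arithmetic", "Intermediate Algebra", "College Algebra"])
```

Note how the satisficer never examines the credit value of the more advanced courses at all, mirroring the self-placer who screens out unfamiliar or demanding alternatives early in the process.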
In the fourth stage of the decision process, decision-makers implement their decision
strategy, making a choice among the prospects. In the final stage, they reach an absolute decision.
Simplification Techniques and Decision Biases
Researchers who examine how people make decisions in a variety of contexts, and under a
variety of conditions, have found that they consistently violate the strict rules of rational decision-
making. While these violations can reduce cognitive load and increase efficiency and
predictability, they can also lead decision-makers to make decisions that are unsatisfactory by
the normative principles of rational decision-making. Here, I identify the decision biases
and simplification techniques that may arise when students engage in math self-placement, and
provide examples of how each one may manifest during that decision-making process. Table 2 in
Appendix C provides an inventory of these simplification techniques and decision biases, their
effects on decision-making, and examples of how they may affect decision-making under DSP.
Anchoring. Anchoring is one of the most commonly reported decision biases, and reflects a
person’s tendency to base final decisions on initial impressions or diagnoses of initial information
(Bodenhausen, Gabriel, & Lineberger, 2000; Croskerry, 2002; Tversky & Kahneman, 1974). For
example, self-placers may choose a math course based on initial feelings rather than on a thorough
examination of information provided in self-placement guides.
Ascertainment bias. Decision-makers are susceptible to this bias when judgment and
choice are pre-constructed based on what others expect or perceive of them, or on what they hope
to find (Croskerry, 2002). Stereotype threat and gender bias are examples of ascertainment bias.
Self-placers who are of color or female may choose lower placement levels because they have been
socialized into thinking that people with their characteristics are not predisposed to be good at math.
Availability bias. Availability leads decision-makers to overestimate the frequency of vivid
or easily recalled experiences and to underrate the probability of events that are unemotional or
vague in nature (Croskerry, 2002; Schwenk, 1988; Zacharakis & Shepherd, 2001; Tversky & Kahneman,
1973). For example, self-placers who have strong memories of poor performance and pedagogy in
pre-Calculus may fail to seek additional information about pre-Calculus when determining course
fit.
Confirmation bias. Confirmation bias is considered one of the most prevalent problems in
human reasoning (Nickerson, 1998). This bias occurs when a decision-maker forms a judgment on
tenuous, incomplete data, and subsequently ignores or dismisses disconfirming evidence stemming
from new information (Croskerry, 2002; Elstein, 1999). Within the context of DSP, self-placers
may settle on taking a transfer-level math course even if they fail to meet the academic
qualifications for that level.
Framing. The framing fallacy affects how an individual approaches a problem, challenging
a fundamental assumption of RCT, which states that preferences are fixed and independent of how
a dilemma is presented (Tversky & Kahneman, 1986). For instance, students tasked with choosing a
math course on their own may frame the decision-making process around their academic and career
goals, or around how they have been socialized to think and feel about math.
Order effects. Decision-makers tend to weigh recently presented information more heavily
than information presented earlier when formulating a decision (Croskerry, 2002; Elstein &
Schwarz, 2002). As such, information provided at the start or in the middle of a task can be
inadvertently excluded from the evaluation of prospects. Thus, self-placers may fail to factor
their performance on prerequisite math problems into their decision-making if that performance is
followed by descriptions of the available alternatives.
Overconfidence bias. Another common decision bias, overconfidence reflects an unwarranted certainty in the correctness of one’s decision (Croskerry, 2002; Koriat, Lichtenstein, & Fischhoff, 1980). More often than not, decision-makers think they know more than they actually do, which, as a consequence, may lead to biased (as in confirmatory) or incomplete information searching and processing (Koriat, Lichtenstein, & Fischhoff, 1980). Self-placers who are over-confident in their ability to make “good” decisions may discount information leading to potentially more suitable course alternatives.
Premature closure. Premature closure occurs when decision-makers choose an alternative too early in the decision-making process (Croskerry, 2002). The need for closure may be precipitated by apathy or by internal or external pressure to find a solution (Meltsner, 1972; Croskerry, 2002; 2005). Self-placers who are not emotionally invested in the process of DSP may choose a course prematurely to save effort and time.
Search satisficing. Search satisficing broadly identifies a non-compensatory strategy (or heuristic) used by decision-makers to discard alternatives that do not meet a predetermined minimum set of criteria, and keep those that do (Chu & Spires, 2003). For example, self-placers may automatically rule out course alternatives that are unfamiliar or too academically challenging in order to reduce the number of course alternatives to investigate.
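As an illustrative sketch only (not part of the study’s materials), the non-compensatory screening just described can be written as a simple filter in which no strength on one attribute compensates for failing a minimum criterion on another; the course attributes, names, and thresholds below are hypothetical.

```python
# Hypothetical course alternatives with attributes a self-placer might screen on;
# the familiarity flags and difficulty ratings are illustrative, not study data.
courses = [
    {"name": "arithmetic",         "familiar": True,  "difficulty": 1},
    {"name": "elementary algebra", "familiar": True,  "difficulty": 2},
    {"name": "pre-calculus",       "familiar": False, "difficulty": 4},
    {"name": "statistics",         "familiar": True,  "difficulty": 3},
]

def satisfice(alternatives, max_difficulty):
    """Non-compensatory screen: discard any alternative that fails a minimum
    criterion (unfamiliar, or harder than the cutoff); strength on one
    attribute cannot compensate for failure on another."""
    return [c for c in alternatives
            if c["familiar"] and c["difficulty"] <= max_difficulty]
```

Here `satisfice(courses, 3)` would retain only the familiar courses at or below the difficulty cutoff, mirroring how a self-placer might eliminate unfamiliar or overly challenging options before closer investigation.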
Visceral bias. Certain problems rouse positive and negative feelings within decision-makers, which can strongly influence how they frame a problem and design a plan of action to solve it (Croskerry, 2002; Kuhberger, Schulte-Mecklenbeck, & Perner, 2002). Visceral factors constrain thinking, inadvertently focusing a decision-maker’s attention on elements of decision-making that can abate negative thoughts and feelings (Loewenstein, Weber, Hsee, & Welch, 2001). Self-placers who have strong negative feelings towards math may let their emotional responses to math, rather than a cognitive evaluation, dictate their placement choice.
Moderators of Decision Bias: Decision Aids
The core goal of decision aids is to free up the limited cognitive resources a decision-maker needs to arrive at a decision (Feldman-Stewart & Brundage, 2003). Descriptions of decision-aids show little standardization in their design and implementation, which may be due in part to their various objectives, which include improving knowledge, intensifying engagement in decision-making, and reducing uncertainty in judgment (Barnato et al., 2006). In some cases, decision-aids offer structured general presentations of textual information; in others, decision-aids
ask that decision-makers clarify their values and goals (Feldman-Stewart & Brundage, 2003;
O’Connor, Drake, Fiset, Graham, Laupacis, & Tugwell, 1999). Decision-aids are also provided
through a variety of platforms. Summary letters, videos, decision boards, interactive computer-
based programs, pamphlets, and one-on-one consultations have all been employed in a variety of
contexts (i.e. business, health, education) to help decision-makers make judgments and decisions in
complex environments (Charles, Gafni, & Whelan, 1999; Gaston & Mitchell, 2005; O’Connor,
Drake, Fiset, Graham, Laupacis, & Tugwell, 1999). In what follows, I review some of the major
fault-lines separating decision-aids, and, when possible, draw on rigorous empirical evidence from
decision research and higher education literature to shed light on the effectiveness of elements of
design and implementation on improving decision quality and accuracy. Following this review, I
describe common design features shared by publicly available DSP guides, which can be considered decision aids, as a way to provide perspective on their potential utility in reining in biases that may influence placement decision processes.
The provision of generic versus personalized information. Even though many decision
aids provide generic information, there is increasing evidence that personalizing information to
meet the unique needs of the decision-maker can improve decision-making satisfaction and quality
(Davidson et al., 2007; Duhan et al., 1997; Gilly et al., 1998; Olshavsky & Granbois, 1979; Price &
Feick, 1984). For example, Davidson and colleagues (2007) discovered that the provision of
personalized versus generic information significantly increased cancer patients’ satisfaction with the information they received as well as the role they played in determining treatment. In higher
education research, Bettinger and Baker (2014) found that academic coaching, which helped
students develop cognitive skills (e.g. time management, self-advocacy, and study skills) and a
clear vision of their goals, increased their within-year and year-by-year college persistence.
There are several hypotheses that might explain why personalized information is superior to
generic information. One hypothesis is that personalization inherently requires decision-makers to
clarify their preferences, goals, and values, which may help a decision-maker home in on more appropriate alternatives (Feldman-Stewart & Brundage, 2004). Another hypothesis is that it forces
people to become more invested in their decision-making, which can serve to increase their level of
motivation to find the best decision (Edwards, Davies, & Edwards, 2008).
The display and presentation of information. The way information is sequenced,
organized, and displayed in decision-aids can influence how individuals process information and
use it to make decisions (Johnson, Payne, & Bettman, 1988; Kleinmuntz & Schkade, 1993). The
large literature examining how information displays impact decision-making shows that it can alter
information search and acquisition (Schkade & Kleinmuntz, 1994; Sundstrom, 1986), information
processing (Bettman & Kakkar, 1977), investment in effort (Johnson, Payne, & Bettman, 1988),
strategy selection (Sundstrom, 1987), and choice preference (Johnson, Payne, & Bettman, 1988).
Organization and sequencing of information. The structure and the order in which
information in decision-aids is presented can affect how information is processed and used in
decision-making (Bettman & Kakkar, 1977; Slovic, 1972). According to Slovic’s principle of
concreteness, “a decision maker tends to use only the information that is explicitly displayed in the
stimulus object and will use it only in the form in which it is displayed” (Slovic, 1972, p.14).
Experiments testing the validity of concreteness found that people, in general, process information
in the order in which it is displayed (Bettman & Kakkar, 1977; Croskerry, 2002). The organization
and sequencing of information may thus have implications for how easily and readily decision-makers process and recall information (Croskerry, 2002; Tversky, 1969).
Textual, numerical, graphic information formats. Information in decision-aids is
communicated in a variety of formats (e.g. textual, numerical, graphic) (Lohse, 1997; Zhang &
Whinston, 1995). The choice of format depends in large part on the type of data being displayed
(Brundage et al., 2005). For example, information about absolute or relative risk is often
presented in numeric form; in health aids, bar graphs are often used to demonstrate success rates for
different types of treatments. Based on current empirical evidence, it is unclear which format
produces better decisions, and in which contexts (Edwards et al., 2005). Further, the effectiveness of each format type is likely to be influenced by the demographic and cognitive characteristics (e.g. socioeconomic status, skills, knowledge) of the decision-maker (Brundage et al., 2005; Edwards et al., 2005).
Passivity versus interactivity of decision aids. The advantages of using decision-aids to
improve decision quality may be offset by the behavioral impacts of decision aid use (Glover,
Prawitt, & Spilker, 1997). In several studies, Todd and Benbasat (1993, 1994) found that the design
of decision-aids may induce decision-makers to even further minimize the resources that they invest
in decision-making. As Glover, Prawitt, and Spilker (1997) argue, such behavioral effects may be
problematic because they suggest that decision-makers use decision-aids in ways that limit learning, which may ultimately impair judgment. This evidence suggests that decision-aids that
actively involve decision-makers may facilitate deeper exploration and learning in decision-making,
which may optimize the decisions that are made.
Common implementation and support features of current DSP models. DSP
implementation models published in Royer & Gilles’s (2003) seminal book on DSP, and on
websites of colleges implementing this placement strategy share four common features. These
features include: 1) a descriptive orientation of DSP (an overview of the requirements and
informational aids; consequences of misplacement), 2) a written or visual description of the math
sequence, 3) descriptions of each course alternative, and 4) a series of math problem sets.
The first feature shared by DSP models is an orientation session that familiarizes eligible
students about DSP. This orientation typically informs students that they are charged with choosing
their first math course, and should consider taking a course that can bring about the highest chance
of success. Students are told to consider factors like previous academic performance, collegiate and
career goals, and commitment to college when calculating the likelihood of success in each course.
They are also encouraged to approach placement decisions with earnestness.
The second feature of DSP models relates to the order in which courses must be taken.
For example, Royer and Gilles (1998) inform students that they can place themselves into two
courses - English 098 or English 150 - and that English 098 precedes English 150. In similar
fashion, Shasta College and Moorpark College, two California community colleges implementing
math DSP, provide students with a visual depiction of the math sequence, which acquaints them
with a semester-based progression of math courses. Visual depictions of math sequences typically inform students neither that each course in the sequence lasts a semester nor that the length of time needed to meet different degree requirements varies by where the student starts in the sequence.
The third feature among current DSP models is abbreviated descriptions of accessible
courses. While descriptions vary, their core purpose is to provide students with basic information
about course content and the degree to which a course fits with a student’s academic and career
goals. For example, students at Governor’s State University in Illinois are given information about
the math courses required for their major or field of study. Wake Forest University informs students
about the pace at which a course is taught, and Shasta College provides students with information
about which topics are covered in each class. Most of these descriptions are in text-format, and do
not provide visual representations of math content covered in each course.
The fourth and final feature common among DSP models, particularly those used to help
students self-place into math, is math problem sets that give students the opportunity to
“objectively” evaluate their readiness for each course. While some colleges direct students to
external websites, where they can answer math problems that are not tightly linked to the math
courses being offered, other colleges, like American River College, display four problems testing a
student’s pre-requisite knowledge for each math course in their sequence. From the examples
reviewed, it was unclear whether students received feedback on their performance on math
questions.
A survey of these features indicates that the way in which they are designed does not elicit
interaction or personalize information to the needs of the student. Further, information intended to
improve decision making does not follow a set order or display format. Based on the research
presented above, it is unclear how students engage with or respond to math DSP when it is designed
and implemented in this way. How do they use the information contained in the decision aids?
What kinds of biases pervade decision-making even when decision aids are introduced?
To summarize, evidence from behavioral decision research shows us that people’s capacity
for rational thinking is compromised by the cognitive load imposed by the task, their susceptibility
to bias, and the way they interact with supports intended to improve their decision-making. Here, I
add to studies of decision-making and math self-placement by examining how community college
students engage in decision-making when tasked with choosing their first math course in college.
Research Design and Methods: A Mixed Methods Approach
This study employed a mixed methods research design strategy to develop a more complete,
nuanced understanding of how students make math placement decisions under current DSP
implementation practices. Unlike studies that rely exclusively on either qualitative or quantitative
research methods, the blending of both allowed me to achieve two goals. The first goal was to
harness the strengths of each methodology to enhance the validity of certain conclusions; the
second goal was to provide fresh insights that may have not surfaced if only one analytic
methodology were used (Creswell, Plano Clark, Gutmann, & Hanson, 2003; Greene, Caracelli, & Graham, 1989; Jick, 1979; Miles & Huberman, 1984; Kidder & Fine, 1987).
Among the various extant mixed methods designs, the research questions that guide this
study specifically called for the use of a Concurrent Mixed Methods Design (CMMD) (Tashakkori
& Teddlie, 2003). Since the aim of this study was to provide empirical evidence on not only how
students make math self-placement decisions, but also about the influence of different kinds of
information on student decision-making, it was necessary to use a combination of qualitative and
quantitative methods.
According to Greene, Caracelli, and Graham (1989), the utility of a CMMD is based on the
principle of complementarity, or the notion that qualitative and quantitative methods can be used
together to paint a thick, elaborate description of an experience or response to an intervention. Thus,
the goal of the CMMD approach is to jointly measure overlapping but also distinct features of a
phenomenon, such as behavioral responses and their interplay. In the case of this study, I
accomplish the goal of CMMD by asking different types of research questions that elicit different
forms of data, and generate a variety of inferences that create opportunities for researchers and
policymakers to understand the strengths and weaknesses of current DSP implementation practices
from a variety of angles (Greene, Caracelli, & Graham, 1989; Rossman & Wilson, 1985;
Tashakkori & Teddlie, 2003). While there is considerable debate about the sensibility of
combining applications of qualitative and quantitative methods in research given the fact that they
are premised on fundamentally different paradigmatic assumptions [36] (Morgan, 1998), I adopt the
pragmatic perspective espoused by Miles and Huberman (1984), Reichardt and Cook (1979), and
Boyatzis (1998), all of whom contend that researchers can divorce the paradigmatic and practical
features of distinct research methods to advance what is known about certain phenomena.
Implicit within mixed methods research is the fusing of stages defining qualitative and
quantitative research [37] (Creswell, Plano Clark, Gutmann, & Hanson, 2003). Determining at which
stage, and to what extent, mixing should occur depends on the purpose and the goal of the study
(Greene, Caracelli, & Graham, 1989; Morgan, 1998). Since the quantitative and qualitative components of this study addressed distinct research questions about student interaction with DSP that drew on different types of data, I chose to integrate both methods only at the final stage, the interpretation of results. Figure 1 in Appendix C graphically depicts how I
employed the CMMD to conduct this study.
In what follows, I first describe the exercise where I asked students to hypothetically self-
place in a math sequence with different informational supports. Second, I outline the sampling
strategy and sample used for this study. Third, I provide profiles of the community colleges where I
conducted this study and offer rationales that explain why they are appropriate in the context of this study. Fourth, I provide descriptions of the quantitative and qualitative strands of the study separately.

[36] For example, quantitative research is typically premised on the perspective that phenomena can be objectively analyzed and assessed. In contrast, much of qualitative research is based on the notion that knowledge is constructed from multiple viewpoints.

[37] The stages of research include data collection, data analysis, and the interpretation of qualitative and quantitative results.
Study Sites
Two community colleges (henceforth ‘College X’ and ‘College Y’), both of which belong to
a large community college district in Southern California, participated in this study. These two
community colleges were selected based on several criteria.
First, researchers, administrators, and faculty from both institutions expressed interest in
learning more about advantages and disadvantages of using DSP as a strategy for math placement.
Second, these two colleges resemble many other community colleges around the country in
that they serve large proportions of students deemed underprepared for college-level coursework,
implement a standard approach to sequencing their math courses, and attract a mix of student
populations that have a propensity to start their postsecondary education at a community
college. [38, 39, 40] Thus, by sampling students from these community colleges, and asking them to
engage in an exercise requiring them to hypothetically choose courses from those that may be
available if they were allowed to self-place into math, it may be possible to develop a better sense
of the way other community college students around the country would engage with this particular
placement strategy.
[38] In the fall of 2011 alone, analyses of administrative assessment records revealed that roughly 80 percent of students at College X, and 92 percent of students at College Y, were placed into a developmental math course.

[39] The sequence at both colleges is modeled after math progressions cemented by K-12 standards set by the state and transfer-level standards set by the University of California (UC) and California State University (CSU) systems (Ngo & Kosiewicz, 2014; Melguizo, Kosiewicz, Prather, & Bos, 2014).

[40] In Colleges X and Y, Latinos represent sizable proportions of the student population (80% at College Y; 35% at College X). African-American students make up close to 10 percent of the student population at College X. Although the student composition of these colleges jointly mirrors the community college student population at large, they also enroll white students as well as students from other distinct ethnic groups (e.g. native-born Chinese, Middle Eastern). Females are overrepresented at both colleges, making up approximately 60 percent of the student population at each.
Sampling Strategy
To be considered eligible to participate in either the qualitative or quantitative component,
students at a minimum had to demonstrate that they had not previously been assessed for
developmental math or taken a math course in college. [41] Eligibility criteria ensured that students
could not draw on their math experiences in college to inform their self-placement decisions. In
addition to these two criteria, I used sex and ethnicity or race to purposively select students for the
qualitative component in order to appreciate how students of different demographic backgrounds
made placement decisions under a DSP model. This purposive sampling strategy was motivated by
evidence showing racial and gender disparities in math achievement (Catsambis, 1994; Vermeer,
Seegers & Boekaerts, 1996) as well as math self-concept, math self-efficacy, and math course-
taking patterns (Eccles, 2007). Because the student populations at both colleges are predominantly
Latino, it was difficult to find an ethnically and racially diverse group of students. Because this
study endeavored to speak to degree-seeking, college-age student populations, several groups were excluded from the sample: students who were concurrently enrolled in high school, younger than 18, older than 65, or non-degree seeking. [42] Students were recruited either on campus or over email
by me or institutional staff, and compensated for their participation in the study.

[41] Assessment staff and the institutional research offices at both colleges provided support in preventing students who failed to meet the selection criteria from participating in the study. As an additional precautionary measure, I embedded screening questions at the start of the survey to filter out students who slipped through the first stage of selection.

[42] I imposed these restrictions to exclude students who do not resemble the typical degree-seeking community college student population.

The Hypothetical DSP Scenario
The structural backbone of this study lies in a contemplative exercise in which participants were asked to hypothetically self-place into a math sequence with various informational supports. This scenario’s design was based on examples of math DSP models that were publicly available on the internet or in print. [43] In the introduction to the scenario, students were told that
they would be enrolling in Mountain Top Community College, a fictitious community college,
which granted students who had never taken a math placement test or a math course in college the
opportunity to choose their first math course in college. There, they were also encouraged to select
a math course that best matched their academic abilities, collegiate and career goals, and
commitment to college; students were also informed that faculty and staff were not responsible for
misplacement, and that responsibility for their placement decision lay solely with them.
To aid decision-making, students received a self-placement guide containing various
informational supports. These supports included: a visual depiction of the math sequence, a set of
course descriptions, and sets of pre-requisite math problems.
The visual depiction of the math sequence presented information about the order in which
students could take developmental and transfer-level math courses (see Figure 1 in Appendix D).
The sequence was illustrated as an abstract tree: the trunk showed the progression of developmental math courses, starting with arithmetic, followed by pre-algebra and elementary algebra, with intermediate algebra at the top. Five transfer-level math courses – college algebra, statistics, pre-calculus, calculus, and Math for Liberal Arts Majors – represented the “stems,” which grew from intermediate algebra.
Through this visual, students could potentially learn, for example, that arithmetic preceded pre-algebra, and about the division between developmental and transfer-level courses. [44] In line with current implementation practices, no interactive features were embedded in the visual.

[43] Some examples include: Royer & Gilles (2003), Directed self-placement: Principles and practices. Hampton Press, NY, NY. Math Self-Placement, Shasta College - http://www.shastacollege.edu/student%20services/enrollment%20services/assessment/pages/12251.aspx; Informed Student Self-Assessment, Diablo Valley College - http://www.dvc.edu/enrollment/assessment/issaassessment.html; Math Self-Assessment, Moorpark College - http://www.moorparkcollege.edu/apply-and-enroll/orientation-and-assessment/self-placement-guides
Course descriptions included textual information about course content, course pre-
requisites, and course transferability to universities within the UC or CSU systems, presented in that
order (see Figure 2 in Appendix D). Course content introduced students to material covered in the
course; course pre-requisites informed students about the course she or he must pass before
enrolling in said course; and course transferability community communicated whether the courses
counted towards an associate’s degree or upward transfer. Math faculty from College X reviewed
all course descriptions to check if the information provided was accurate.
Math problem sets, created for each available math course, contained four multiple-choice
math problems (see Figure 3, Appendix D). To be considered academically qualified to take a
course, participants were told that they had to answer three problems correctly.
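The three-of-four qualification rule can be expressed as a short check. This is an illustrative sketch only; the function name and the answer encoding are hypothetical, not part of the study’s survey.

```python
def qualifies(answers, key, threshold=3):
    """Return True if a student answers at least `threshold` of a course's
    four multiple-choice problems correctly; a skipped problem (None)
    counts as incorrect."""
    correct = sum(1 for a, k in zip(answers, key) if a == k)
    return correct >= threshold
```

For instance, a student who answers three of four problems correctly would be deemed academically qualified for that course, while a student who skips a problem forfeits that problem’s credit.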
Students had the option of choosing one of nine math courses as their first math course:
arithmetic, pre-algebra, elementary algebra, and intermediate algebra (developmental math
courses), and college algebra, statistics, pre-calculus, calculus, and Math for Liberal Arts Majors
(transfer-level math courses). With the exception of giving students the opportunity to meet with a
counselor after a student reads through their self-placement guide, to the best of my understanding
community colleges typically implement DSP in this fashion
45
.
44
Because examples of visual depictions of math sequences did not explicitly relay the number of semesters needed to
meet any math requirement contingent on different starting points in the math sequence, this information was not
included in the visual.
45
In some instances, self-placement guides include questions that gauge a student’s motivation or self-efficacy.
Because it is unclear how these questions would affect placement decisions, they were excluded from this study.
88
Quantitative Study
The primary goal of the quantitative component of this study was to determine the extent to which different types of informational supports typically included in self-placement guides (i.e. the math sequence, course descriptions, and math problems) impacted math placement decisions, and the considerations that went into those decisions (e.g. race/ethnicity, gender, previous academic performance, math self-concept, math self-efficacy). A secondary goal was to gather student input on improving current DSP self-placement guides and support structures. Figure 2 in Appendix C shows the design of the quantitative component.
Survey Design. I used Qualtrics to run the hypothetical DSP scenario through a
computerized, online survey. Students who took the survey were randomly assigned to one of two
treatment groups (explained below) to assess the influence of including pre-requisite problems sets
on placement decisions. Descriptions of math courses were presented on a single page in the
survey; to view a description students had to scroll over it with their mouse. For each problem set,
students were required to declare that they wanted to answer problems tied to that math course by
clicking “yes”. Students would then be taken to a page containing four multiple-choice math
problems. After choosing what they thought were the correct answers for all four math problem,
students would be apprised of the number of problems they answered correctly. Because students
were not forced to provide answers to each math problem, an incorrect math answer could be
interpreted as not having answered a math problem. Students were required to reveal their
preference for answering math problems for each math course.
Subsequent to the randomization, students answered questions about their conceptions of their math ability as well as their levels of math self-efficacy. They also reported basic demographic data (e.g.
race/ethnicity, sex, age, first-generation status), and their previous math course-taking patterns and
performance (e.g. last high school math course, grade received in last high school math course; year
last took high school math).
Sample. A total of 131 students took the online survey. Approximately sixty-five percent of students took the survey at their college campus; the remainder took the survey off campus on a computer connected to the internet. Table 3 in Appendix C presents summary statistics for the full
sample, and for the sub-samples of participants from each college for the quantitative component of
the study.
Math self-efficacy. Drawing on Bandura’s work on self-efficacy, I define math self-efficacy
as a student’s “judgment of their capabilities to organize and execute courses of action required to
attain designated performance” within the domain of mathematics (Bandura, 1986, p.391).
According to Bong and Skaalvik (2003), self-efficacy represents an individual’s belief or
conviction that he or she can succeed in a target domain (i.e. math problems, math related courses)
with the skills and abilities that they have. Methods used to measure math self-efficacy typically ask
respondents to report their perceived competence in solving problems or completing math
homework. I draw on Fast et al.’s (2010) scale of perceived math self-efficacy, which was adapted
from Pajares and Miller’s (1995) math self-efficacy scale, and Midgley et al.’s (2000) Patterns of
Adaptive Learning Scales (PALS) to measure academic self-efficacy. On a 5-point Likert-type scale
(not at all true=1; very true=5), students rated their degree of math self-efficacy on four items: “I'm
certain I can master the skills taught in class this year”,“I'm sure I can figure out how to do the most
difficult math”, “I can do almost all the work in math class if I don't give up”, and “Even if the math
is hard, I can learn it”. Cronbach’s alpha measuring the scale reliability of math self-efficacy was 0.84.
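The reliability coefficients reported here were presumably produced with standard statistical software; as an illustrative sketch, Cronbach’s alpha for a k-item scale can be computed from item responses as follows. The helper function and the sample responses below are hypothetical, not study data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale, given one list of scores per item,
    aligned by respondent: alpha = k/(k-1) * (1 - sum(item vars)/var(total))."""
    k = len(items)
    n = len(items[0])

    def sample_var(xs):
        # Sample variance (ddof = 1).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent, summed across items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(sample_var(it) for it in items) / sample_var(totals))

# Hypothetical 5-point Likert responses to a 3-item scale (5 respondents).
responses = [
    [1, 2, 3, 4, 5],
    [1, 2, 3, 4, 5],
    [2, 3, 4, 5, 5],
]
```

An alpha of 0.84, as reported for the four self-efficacy items, indicates high internal consistency; the statistic rises as items covary more strongly across respondents.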
Math self-concept. Self-concept can be defined as an individual’s perceptions of his or her
own skills and competences (Shavelson, Hubner, & Stanton, 1976). According to Skaalvik (1997),
an individual’s self-concept is influenced by five factors: 1) perceptions of ability in relation to that
of others, 2) attributions of success or failure to competences and skills, 3) perceptions or appraisals
of self-concept made by others, 4) previous experiences in particular domains, and 5) self-esteem. I
adopt Marsh’s (1990) scale of academic self-concept to gauge self-concept within the domain of
math. On a 5-point Likert-type scale (not at all true=1; very true=5), students rated their degree of
math self-concept on six items: “Compared to others my age, I am good at math”, “I get good
grades in math classes”, “Work in math classes is easy for me”, “I'm hopeless when it comes to
math”, “I learn quickly in math”, and “I have always done well in math”. Cronbach’s alpha measuring the scale reliability of math self-concept was 0.89.
Randomization. A randomization within the online survey was conducted to test the impact
of including math problems in addition to course descriptions on student math placement decisions.
Following the introduction of the hypothetical scenario and a visual depiction of the math sequence,
students selected a course they felt should be their first college math course. After making their
initial course selection, students were randomly assigned to one of two treatments. Students in the
first treatment group received descriptions for each available math course. Students in the second
treatment, in addition to the math course descriptions, received problem sets. [46] Table 4 in Appendix
C shows that students in the first and the second treatment group were equal in expectation across
race, sex, age, first-generation status, average math self-efficacy, average math self-concept, and
high school math performance and course-taking, suggesting that randomization was conducted
properly. After students evaluated the additional information in their treatment, I gave them the chance
to change their initial math course if they felt another course was more appropriate. I did not
provide a course recommendation based on any information the students provided, instead allowing
[Footnote 46: I chose a two-treatment design for reasons related to statistical power.]
students to make decisions entirely on their own. Qualtrics recorded the decisions that students
made as they worked through the hypothetical DSP exercise. These decisions included the selection
of the initial course, the problem sets they answered, and the number of problems they answered
correctly.
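The assignment mechanism and the balance check behind Table 4 can be sketched as follows; the covariate values are simulated, and Welch's t-statistic stands in for whichever balance test the study used.

```python
import random
import statistics

random.seed(42)

# Simulated covariate (e.g., math self-efficacy on a 1-5 scale) for 200 respondents.
efficacy = [random.uniform(1, 5) for _ in range(200)]

# Random assignment to the two treatments:
# 1 = course descriptions only; 2 = course descriptions plus problem sets.
treatment = [random.choice([1, 2]) for _ in range(200)]

g1 = [x for x, t in zip(efficacy, treatment) if t == 1]
g2 = [x for x, t in zip(efficacy, treatment) if t == 2]

def welch_t(a, b):
    """Welch's t-statistic for a difference in covariate means (balance check)."""
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

# Under proper randomization, |t| should usually fall below ~1.96 for each covariate.
t_stat = welch_t(g1, g2)
```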
Data analysis. I conducted descriptive and regression analyses to examine data collected
through the online survey. Descriptive analyses included determining which math courses students
selected before and after receiving their treatments, and pinpointing the number of math problems
reviewed and answered correctly. To ascertain the impact of combining math problem sets and
course descriptions, relative to course descriptions alone, on changing the initial course selection, I
ran the following regression model:
yi = γ0 + β·MathProblemsi + θ·Xi + εi
Here, yi captures the probability that a student changed their initial math course; β measures
the treatment effect of combining math problems with course descriptions on course switching; θ is
a vector of coefficients on individual covariates that may influence course switching, such as
previous academic performance; and εi is the individual-level error term. The main estimation
model specified above assumes a probit link function. I also ran regression analyses to determine
which factors predicted selection of transfer-level versus developmental courses pre and post
receipt of treatment. I used an inductive coding strategy to find patterns in answers to open-ended
questions.
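The estimation model can be sketched end to end with a hand-rolled probit fit; this is not the study's actual estimation code, and the data below are simulated (a binary switching outcome, an intercept, and a treatment indicator standing in for MathProblems), so the coefficients are illustrative only.

```python
import math
import numpy as np

def norm_cdf(z: float) -> float:
    """Standard normal CDF, Phi(z)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def fit_probit(X: np.ndarray, y: np.ndarray, lr: float = 0.1, steps: int = 2000) -> np.ndarray:
    """Fit Pr(y=1) = Phi(X @ params) by gradient ascent on the log-likelihood."""
    params = np.zeros(X.shape[1])
    for _ in range(steps):
        xb = X @ params
        Phi = np.clip(np.array([norm_cdf(v) for v in xb]), 1e-10, 1 - 1e-10)
        phi = np.exp(-xb ** 2 / 2) / math.sqrt(2 * math.pi)
        score = X.T @ ((y - Phi) * phi / (Phi * (1 - Phi)))  # log-likelihood gradient
        params += lr * score / len(y)
    return params

# Simulated data: y_i = 1 if the student switched courses; the second column is
# the MathProblems treatment indicator. True parameters: gamma0 = -0.5, beta = 1.0.
rng = np.random.default_rng(0)
n = 400
treat = rng.integers(0, 2, n)
X = np.column_stack([np.ones(n), treat])
p_true = np.array([norm_cdf(-0.5 + 1.0 * t) for t in treat])
y = (rng.random(n) < p_true).astype(float)

gamma0_hat, beta_hat = fit_probit(X, y)
```

In practice a packaged estimator (e.g., a probit routine in a statistics package) would replace the hand-rolled gradient ascent; the sketch is only meant to show the mechanics of the model specified above.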
Qualitative Study
The primary goal of the qualitative component of this study was to examine how students
make placement decisions when given greater autonomy. Specifically, it sought to understand the
strategies, biases, and informational inputs community college students use when they make
autonomous placement decisions. I collected basic demographic information (e.g. race, ethnicity,
sex, and high school math course-taking) on each participant.
Research Design. Two data collection methods were used to conduct the qualitative
component of the study: open-ended interviews, and concurrent and retrospective verbal
protocols. Students first participated in an open-ended interview to obtain an unbiased account of the
information they would draw on to make a placement decision absent the support of a self-placement
guide. Following the open-ended interview, students participated in a concurrent and a
retrospective verbal protocol, where they were asked to vocalize pre-decision thoughts about how
they would reach a placement decision with the assistance of a self-placement guide. Because the
goal of this component was to understand how students made a placement decision, students received
the complete self-placement guide (a visual depiction of the math sequence, course descriptions,
and pre-requisite math problems). Informational supports were presented simultaneously, instead of
in a particular order, to determine whether student groups used their informational supports in
different ways. Figure 3 in Appendix C presents the research design of the qualitative component.
Participants. A total of 16 students were recruited for the qualitative component. Table 5 in
Appendix C presents descriptive statistics for the qualitative component of the study.
Interviews. Students first participated in a broad, unstructured interview, in which they were
asked to report the types of information they would require or draw on to make a math placement
decision if they had an opportunity to choose their first math course in college. I left this question
broad and unstructured for several reasons. First, I wanted to identify the kinds of information and
factors students initially thought were critical for informing their math placement decision. While
advocates of DSP state that it corrects for imperfections inherent in test-based placement strategies
by allowing students to factor in important non-cognitive characteristics (e.g. motivation, self-
efficacy) in placement decisions (Royer & Gilles, 1998), it is unclear whether students actually do.
Second, I wanted to get a sense of whether students mentioned decision inputs absent from DSP
research (Creswell, 2007).
Concurrent and retrospective verbal protocols. Following the short interview, students
participated in two verbal protocols, a process tracing method centered on capturing the strategies
and informational inputs that mediate the relationship between the introduction of the task and the
making of a decision (Hastie, Schkade, & Payne, 1998). I chose to use this method because it is
designed to gather information about how a decision maker identifies potential alternatives,
attributes outcomes to each alternative, and ascribes meaning or weights to various features of each
alternative. Verbal protocols can be considered verbatim records, providing very detailed,
unrestricted accounts of how people make decisions with minimal interruptions and outside
influence (Arch, Bettman, & Kakkar, 1979; Newell & Simon, 1972). Kuusela and Paul (2000)
argue that verbal protocols are particularly useful when a well-founded theory explaining decision-
making behaviors is absent from the literature. Given that this investigation seeks to expose the
unstudied strategies and informational inputs used to direct student decision-making behaviors
under a DSP model, verbal protocols were an appropriate data collection strategy.
To elicit pre-decisional thoughts, I employed concurrent and retrospective verbal protocols
to discern how students make autonomous math placement decisions.[47] According to Taylor and
Dionne (2000), combining both methods, as opposed to utilizing just one, increases the breadth and
depth of data describing problem-solving strategies and knowledge,[48] and
is particularly helpful when discerning what decision makers actually know about problem solving
[Footnote 47: In the literature, verbal protocols are also called talk-alouds.]
[Footnote 48: The challenge with using concurrent verbal protocols to capture pre-decisional thoughts is that they may change the structure of an individual's thought processes and, at times, retard the time it takes for an individual to complete a task (Schulte-Mecklenbeck, Kühberger, & Ranyard, 2011).]
and what they actually do when they solve problems. I employed concurrent verbal protocols
(CVP) to observe a student’s decision-making behaviors. I used retrospective verbal protocols to
enhance the validity of verbal report data as well as check my understanding of the decision-making
strategies and informational inputs a student employed.
Before students were tasked with selecting what they thought should be their first math
course, they engaged in two warm-up exercises that helped them become comfortable with the act
of vocalizing the thoughts guiding their decision processes (Ericsson & Oliver, 1988; Taylor &
Dionne, 2000).[49,50] After completing the warm-up exercises, I gave students an envelope containing
slips of paper marked with the names of available math courses, and a complete self-placement
guide, which included a visual of the math sequence, math course descriptions, and math problem
sets. In their self-placement guide, I also presented students with an overview of the self-placement
guide, which apprised them of information they could glean from each resource to help them
choose a math course. I informed students that the ways in which they used the contents of their
self-placement guide in steering their decision-making was entirely up to them, and that the task
required them to talk aloud as they worked through making a decision with the support of their self-
placement guide. I checked each student’s understanding of the task before I allowed them to
proceed with engaging in the exercise.
After choosing a math course, I asked students to recount the steps they took to reach that
decision by watching themselves on a video recording[51] of their engagement with the exercise. I
[Footnote 49: Warm-up activities have been shown to minimize response variability attributed to unfamiliarity with this type of data collection method (Taylor & Dionne, 2000).]
[Footnote 50: In the first warm-up exercise, students, who were told that they were financially constrained and health conscious, were tasked with choosing one of five beverages, each differing in price and caloric content. In the second exercise, students, who were told that they attended school full time, worked part-time, and had no car, were charged with choosing one of four apartments for themselves and a roommate. Each apartment varied in number of bedrooms, bathrooms, square feet, noise level, distance to a grocery store, and distance to the bus stop.]
[Footnote 51: Students gave permission to be audio and video recorded.]
specifically focused students on the moments in their decision-making where they became silent or
had noticeable difficulty expressing their thoughts. To minimize threats to validity that
commonly emerge while conducting retrospective debriefs, I asked participants matter-of-fact
questions, emphasizing the “what”[52] rather than the “why” (Taylor & Dionne, 2000). I also used
video-recordings to check my understanding of the decision-making processes students took to
solve the dilemma of self-placing into math. All interviews and verbal protocols were transcribed,
including tone, pauses, and emphases in speech when possible to capture affect as best as I could.
Data analysis. I analyzed verbal protocol data during and after the data collection stage to
describe student decision-making processes. I divided each protocol into blocks that reflect discrete
stages of decision making (i.e., problem recognition, task diagnosis, strategy selection, and decision
stage). I then divided each block into statement units, each containing a single main idea (Newell &
Simon, 1972). Each statement was coded according to one of Ranyard's (1987) six categories:
approach, diagnosis, choice, relative judgment, absolute judgment, and strategy.
Approach reflected an attempt to understand the problem task; diagnosis indicated search for
information; choice indicated elimination of an option; relative judgment indicated a comparison of
two or more alternatives; absolute judgment indicated an appraisal of a single alternative; and
strategy identified the method used to judge or choose between options.
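For concreteness, the coding scheme can be represented as a small data structure; the category descriptions paraphrase the definitions above, and the sample statement unit is invented.

```python
from enum import Enum

class RanyardCategory(Enum):
    """Statement-unit codes for verbal protocols (after Ranyard, 1987)."""
    APPROACH = "attempt to understand the problem task"
    DIAGNOSIS = "search for information"
    CHOICE = "elimination of an option"
    RELATIVE_JUDGMENT = "comparison of two or more alternatives"
    ABSOLUTE_JUDGMENT = "appraisal of a single alternative"
    STRATEGY = "method used to judge or choose between options"

# A coded statement unit can then be stored as a (text, code) pair, e.g.:
unit = ("Pre-algebra looks easier than elementary algebra.",
        RanyardCategory.RELATIVE_JUDGMENT)
```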
After identifying the decision-making processes used by each student, I created a decision-
making profile for each one describing how she or he reached a placement decision. In developing
each profile, I paid attention not only to the steps and the informational inputs students employed
but also to their expressions of affect (e.g. anxiety, assuredness) caught in their audio-recordings
and video recordings but not captured in the transcripts.
[Footnote 52: For example, “What were you thinking before you made your placement decision?”]
To identify patterns in decision-making, I first looked for commonalities in how students of
different races, ethnicities, and sexes
approached the task of self-placing into math, given that evidence suggests framing plays a key role
in shaping diagnostic and decision-making processes (Tversky & Kahneman, 1986). Given that
none were found, I then took a more inductive approach to uncover traits or characteristics shared
across profiles. Next, I looked for commonalities in how students searched for and diagnosed
information to eliminate alternatives, up to the point when they made a final placement decision. As
a sole researcher, I reviewed these profiles multiple times to ensure that I could explore them in
alternative ways and with nuance (Barbour, 2001). Decision-making patterns were finalized when I
reached the point where I could not interpret the profiles differently (Patton, 2001). From there, I
synthesized these patterns into narratives describing the most salient strategies and inputs used in
the decision-making process, in general, and across different types of students.
Integration of quantitative and qualitative findings. For this study, the goal of mixing
quantitative and qualitative methods was to find corroboration between the results produced by
both methods and to expand our understanding of how community college students approach the
decision of self-placing into math (Onwuegbuzie & Leech, 2005). As such, I first reviewed
the quantitative and qualitative findings separately to draw independent conclusions from each set
of data analyses. As a second step, I compared both sets of findings to uncover ways in which the
quantitative findings corroborated or conflicted with the qualitative findings and vice versa. Findings
that were corroborated across the different methods strengthened conclusions; findings that
conflicted provided insight into areas where more research should be conducted.
Placement Decision-Making under DSP
The results from this study, presented below, demonstrate that community college students
engage in the process of self-placement in unique and complicated ways. These ways appear to be
more a function of differences in psychological characteristics and life objectives (e.g. perceptions
of math abilities, learning versus performance orientations, academic and career goals, and previous
math experiences) and the susceptibility to decision biases and less related to a student’s sex, race,
or ethnicity. For students who took the online survey, the order in which information is presented
also seems to affect the kinds of information students draw on in their decision-making.
Regardless of the way students engaged in placement decision-making, evidence from this
study shows that community college students employ strategies that limit the search for and the
processing of information intended to help them choose a math course on their own. The results
presented hereafter raise questions about the efficacy of current DSP implementation practices
designed to improve the quality and accuracy of autonomously made math placement decisions.
I organize the results into two sections. First, I present findings
demonstrating the influence of self-placement guides on math placement decisions, and the
considerations or factors that went into them. Second, I profile three groups of students – based on
their approaches to decision-making – to examine how different types of students appear to tackle
the dilemma of choosing a math course on their own. Within this section, I present quantitative
survey results supporting the notion that students engage in more in-depth processing of a select
number of information components.
The Influence of Decision-Aids on Placement Decisions
Figure 4 in Appendix C plots the distributions of “naïve” and “informed” placements in the
math sequence. I define “naïve” placement decisions as decisions that relied solely on information
presented in the introduction of the hypothetical exercise and the visual depiction of the math
sequence. By contrast, I define “informed” placement decisions as decisions made following the
presentation of course descriptions and/or math problem sets. I obtained these distributions by
running basic, unconditional tabulations of placement decisions pre and post receipt of treatment.
Naïve placement decisions. As depicted in Figure 4 in Appendix C, the distribution of
naïve math placement decisions is skewed left, indicating that the majority of respondents believed
they were qualified to place into a course that was no more than one-level below transfer. Over two-
thirds of respondents selected a math course at the transfer-level or one-level below transfer. Within
the transfer level, roughly two-thirds of respondents chose “College Algebra”, which implies that
respondents may have believed that “College Algebra” followed in direct succession of high school
Algebra II. Figure 4 also indicates an inverse relationship between the number of levels below
transfer and the proportion of naïve placements. This evidence shows that perceptions of fit of each
course diminish in tandem with its academic rigor.
According to survey respondents, the three leading factors influencing naïve placement
decisions, in rank order, related to their high school math course-taking, the need to refresh their
math knowledge, and negative associations they had with math. Table 6 in Appendix C shows that
roughly a third of survey respondents stated that the last math course they took in high school
informed their naïve placement decision, which also played a critical role in the way students who
participated in the qualitative component of the study reached a placement decision (described
later). Estimates also show that 20 percent of respondents reported that the need to refresh their math
knowledge, or feelings of inadequacy to succeed in math, impacted where in the sequence they chose
to place themselves. By contrast, less than four percent of all respondents took the opportunity to
transfer credit to a four-year institution into account when making their
placement decision.
While regression analyses reiterate the importance of previous math course-taking in the
decision-making of students, they also suggest that it is not the most important contributor to
placement decisions. Results from Table 7 in Appendix C show that math self-concept (the belief
that one is good in math) is the most powerful predictor of the likelihood of choosing a transfer-
level math course, prior to receiving a self-placement guide. Specifically, a one point increase in the
scale measuring math self-concept was associated with a 30 percentage point increase in the
probability of selecting a transfer-level math course. Whether the student’s last high school math
course could be considered college math, and whether that course was taken in 2014, were also the
second and third strongest predictors of choosing a transfer-level math course. Interestingly,
demographic factors such as race and sex, or math self-efficacy (the belief that one can succeed in
math) did not emerge as having an influence over naïve placement decisions.
On the whole, respondents reported feeling confident in the correctness of their placement
decision, even without basic information about each course (i.e. course content, course pre-
requisites, course transferability) or an objective assessment of their chances to succeed in those
courses. On a scale from one to ten, respondents rated their degree of confidence in their placement
decision at an average of 7.13 points, which suggests that most respondents believed they had
chosen the appropriate math course. Interestingly, respondents who selected a transfer-level course
or were female felt even more confident than the average respondent, reporting average confidence
ratings of 7.43 and 7.53 points, respectively.[53]
Informed placement decisions. As shown in Figure 4 in Appendix C, informed math
placement decisions follow a slightly different distribution. While still skewed left, receipt of
treatment pushed a greater percentage of placement decisions into the lower levels of the math
[Footnote 53: Respondents who were African-American, Latino, or Pacific Islander reported confidence ratings similar to that of the average respondent.]
sequence. Results from a Chi-square test indicate a significant difference between the informed
and naïve placement distributions (χ2 = 276.44, p < 0.01). Indeed, of the
quarter of respondents (henceforth called switchers) who changed their naïve placement decision,
70 percent opted for a lower placement level. A third of altered placement decisions were one-level
below transfer, and close to a quarter were two-levels below transfer, demonstrating that
adjustments to placement preferences were, for the most part, marginal. Nevertheless, these shifts
do suggest that including information about course content, course pre-requisites, and course
transferability, and opportunities to gauge degrees of readiness for courses in the math sequence,
may temper over-confidence bias, or an over-inflated assessment of one’s abilities to succeed in
more rigorous math courses. An alternate interpretation is that they may induce stress or
anxiety about math, which may also serve to drive placement decisions into the lower levels.
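The comparison of the two placement distributions rests on a Pearson chi-square statistic, which can be sketched as follows; the counts are invented for illustration and do not reproduce the study's χ2 of 276.44.

```python
import numpy as np

def chi_square(table) -> float:
    """Pearson chi-square statistic for an observed-count contingency table."""
    obs = np.asarray(table, dtype=float)
    row_tot = obs.sum(axis=1, keepdims=True)
    col_tot = obs.sum(axis=0, keepdims=True)
    expected = row_tot * col_tot / obs.sum()   # counts expected under homogeneity
    return float(((obs - expected) ** 2 / expected).sum())

# Rows: placement level (transfer-level down to four levels below transfer);
# columns: naive vs. informed decision counts (illustrative numbers only).
counts = [
    [140, 100],
    [ 90, 105],
    [ 40,  55],
    [ 20,  25],
    [ 10,  15],
]
chi2 = chi_square(counts)
```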
Interestingly, math self-concept and previous math experiences continued to be the strongest
predictors of selection of transfer-level math post receipt of treatment (self-placement guide),
casting doubt on the effectiveness of decision-aids in stamping out the influence of conceptions of
math ability on placement decisions. Referring back to Table 7, while the sizes of these
coefficients decrease, they do so only somewhat, and remain large and statistically significant. This
evidence demonstrates that students still rely heavily on these factors to guide decision-making
despite receiving informational support from self-placement guides.
Respondents’ confidence in the correctness of their placement decisions also
increased as a result of exposing respondents to course descriptions and math problems. Post receipt
of treatment (self-placement guide), respondents reported an average confidence rating of 7.68, one-
half point higher than the one reported following their naïve placement decision. The difference in
these two means is statistically significant at the one percent level (t= -3.40, p<.01), which conveys
the notion that self-placement guides may assuage, to some extent, feelings of uncertainty in the
accuracy of placement decisions.
The influence of the inclusion of math problems in self-placement guides. Thus far, the
influence of information on math placement decisions has been reviewed in the aggregate. Here, I
examine differences in the way information aids influence placement decision-making. Controlling
for previous course-taking and performance, race, sex, first-generation status, math self-concept and
math self-efficacy, respondents assigned to the second treatment group were 11 percentage points
(p>.17) more likely to switch their naïve placement decisions relative to respondents in the first
treatment group.[54] Figure 5 in Appendix C shows that about a fifth of respondents assigned to the
first treatment changed their naïve placement decision, whereas close to a third of respondents
assigned to the second treatment carried out the same action. This statistic suggests that students
respond strongly to math problems in self-placement guides when re-evaluating their chances of
success in courses in the math sequence. The fact that this coefficient is not statistically significant
may be related to the relatively small sample size participating in this study, which may be
insufficient to detect the true effect of adding math problems to a self-placement guide on changing
a naïve placement. However, the evidence here suggests there is likely a meaningful difference,
which may have been detected with a larger sample.
Profiles of Decision-Making under DSP
Three perspectives on decision-making under math DSP emerged from analyses of verbal
protocol data. The first perspective framed decision-making around the notion that choice should
maximize the mastery or the learning of math rather than performance in math or in school. The
second perspective couched decision-making in the idea that preference should be guided by
[Footnote 54: I could not investigate the extent to which the inclusion of math problems in self-placement guides impacted switching into different math levels because I lacked statistical power.]
whether a particular course helped students reach their academic objectives (e.g. major, degree
credit, and/or upward transfer). The third perspective structured decision-making around feelings
about math or previous experiences in math. As evidenced below, students adopted mental frames
that influenced how they searched and processed information in their self-placement guides to reach
a final placement decision, even though all used non-compensatory strategies. Table 8 in Appendix
C provides information about the demographic and academic characteristics of each student, and an
overview of their approach and diagnostic and decision-making strategies used to solve the
dilemma of choosing a math course on their own.
“I’m here to learn math”. Central to the “I’m here to learn math” perspective is the notion
that the choice of a math course should maximize learning in math, competence in math, or the
mastery of new math content (Dweck & Elliott, 1983; Pintrich, 2000). Less emphasized within this
perspective was an orientation towards performance, or the need to demonstrate competence in
math or do better in math than others (Pintrich, 2000). Nevertheless, the focus on mastery of and
self-improvement in math, for some students, seemed to be informed partly by perceptions of low
levels of confidence in math ability. Students who subscribed to this perspective exhibited an
inclination towards ignoring informational supports from the self-placement guide altogether, or failing to test
hypotheses that had the potential to disconfirm initial impressions of course fit.
Student backgrounds. Six students fit this profile, and had different backgrounds with
respect to age, ethnicity, and educational experience. With the exception of Rachel and Daniel who
had recently attended high school, Jeff, Cameron, Linda, and Gloria were older and had decided to
return to school to pursue a postsecondary credential.[55] Jeff was 24 years old, Cameron was 27
years old, and Linda and Gloria were 30 years old. Both Jeff and Cameron are Caucasian males;
[Footnote 55: All names are pseudonyms.]
Daniel is a Latino male; Linda and Gloria are Latina females; and Rachel is a female of mixed
ethnicity (Latino/Caucasian).
These students enrolled in college for a variety of reasons. Jeff had completed a military
tour in the Middle East, and was using the benefits from his G.I. Bill to major in business.
Cameron, who appeared more career oriented than the other students, expressed an interest in
pursuing a career in 3-D graphic design. Rachel wavered on pursuing a degree in law, math, or
health and was persuaded to enroll in college upon her father’s insistence. Linda, Daniel, and Gloria
found value in attending school but were unsure about what they wanted to study. When asked for
the name of their last high school math class, only Rachel and Daniel could clearly remember that
they took Algebra II. Cameron, Linda, Jeff, and Gloria all struggled to answer this question, but
guessed that their last math class was either Algebra I or Algebra II.
Approach. Each of these students approached the task of choosing a hypothetical math
course by remarking that they were interested in increasing their competence in math much more
than in proving to others that they were good in math. For instance, Linda put it this way:
- I mean, I would definitely choose something that I am comfortable with to begin
- and then…
- because it’s about learning.
- It’s not about finishing a class or anything,
- so to me like understanding something and
- then kind of growing from there.
Rachel stated that she did not want to “go over stuff that I know how to do because it’s just
going to be boring and repetitive” but instead wanted to take a course that “seemed above my
level…[that] I would want to try, because I wanted to broaden my knowledge.” Gloria made this
point: “I'd rather go through the process and learn again …like work my way up.” In contrast to the
rest of the students who appeared intrinsically motivated to learn, Jeff seemed to be extrinsically
motivated to focus on learning, considering “the government is paying for me to go to school
because of the military.” Because – according to him – he would have to pay for any courses he
failed, he was afforded time to take a course that allowed him to develop a solid academic
foundation as he moved through college.
Couched within the students’ perspectives were utterances reflecting self-confidence in
math and academic abilities, in general. Whereas Cameron, Jeff, Linda, and Gloria mentioned that
they were not good at math, Daniel and Rachel expressed high assessments of their math abilities.
For example, Daniel recounted how he took Algebra 1 and Geometry in middle school, and went
straight into Algebra II as a freshman in high school, a sequence that students normally take a year
later. In contrast, Cameron stated that he was “not good at math” and would “probably suck” in the
math course he hypothetically chose. As such, Cameron, Jeff, Linda, and Gloria could be classified
into the high mastery / low math self-concept category, and Daniel and Rachel into the high mastery /
high math self-concept category. The connection between orientations towards mastery and
orientations towards performance is well documented in educational psychology (Ames, 1992; Ames
& Archer, 1988; Pintrich, 2000), and is thought to influence and motivate the educational pathways
or trajectories students take (Pintrich, 2000).
Information search and processing strategies. Before searching for information presented
in their self-placement guides, all six students reported in the interview that they would reference
their past course-taking and performance in high school math to begin the process of choosing the
right math course. Rachel also reported that she would search for information about faculty
teaching those courses (i.e. teaching style, teaching quality), the material covered in each course,
and the times at which each course section was being taught, to ensure that she could make it to class.
Data from the verbal protocol analyses show that previous math experience influenced the
initial elimination of courses for most students. Rachel and Jeff used elimination-by-aspects (EBA)
to disqualify pre-algebra
and elementary algebra because they had taken those courses in high school. Because Gloria,
Cameron, and Linda could not recall with precision which math course they took in high school,
they employed EBA to ignore every alternative but arithmetic because it was the easiest math
course. From Gloria, Cameron, and Linda’s actions, it appears that the reliance on fuzzy
reconstructions of math experiences may have driven decision-making to focus on the least rigorous
math courses.
After the first round of elimination of alternatives, Linda, Gloria, Rachel, and Daniel
employed exclusively non-compensatory strategies to search for and process information found in
their self-placement guide. Each started by examining the course descriptions; Linda, Gloria, and
Rachel read solely the course content for the courses they considered viable alternatives; Daniel
read information about course content, course pre-requisites, and course transferability for all of
the available alternatives. From there, each evaluated course information differently. Gloria, who
used the Satisficing Plus heuristic, settled on arithmetic after determining that the content covered in
that course matched her math knowledge. Linda, who came to the conclusion that arithmetic was
too easy based on her read of the material covered in that course, subsequently compared the
content of pre-algebra against the content of elementary algebra. Unable to determine preference for
either, she answered four pre-requisite problems for pre-algebra, correctly solving three. She settled
on pre-algebra after incorrectly solving a single problem in the elementary algebra problem set.
Rachel used EBA to eliminate all but pre-calculus and college algebra after deciding arithmetic and
intermediate algebra were too easy, Math for Liberal Arts would limit her academic choices, and
Statistics and Calculus were too difficult, before flipping a coin to finally select college
algebra. Daniel, who tentatively chose elementary algebra after employing the FGB heuristic to
disqualify alternatives primarily based on his comfort with course content and whether he
met the course-pre-requisites, correctly solved three pre-requisite math problems for elementary
algebra, qualifying him to take that course. He subsequently chose elementary algebra as his final
decision. While the evidence shows that students do not adopt a single way of searching for and
diagnosing information intended to help them reach a sound placement decision, it also shows that
students focused on learning may rely heavily on course content to guide their placement decision-
making.
Notably, and in stark contrast to the other students, Cameron and Jeff completely ignored all
information from their self-placement guide to immediately select a course. Figure 6 in Appendix C
provides a visual for how Rachel – one student who solely used EBA and disjunctive strategies –
reached her placement decision.
Decision biases. Besides search-satisficing, three decision biases emerged in
the analysis of verbal protocols for this group of students: premature closure, anchoring bias, and
visceral bias. Premature closure – a bias that leads individuals to make decisions early as a
result of apathy or recklessness – was particularly evident in the decision processes employed by
Jeff and Cameron. Both ignored information in their self-placement guides in making their final decision.
In explaining why they did not employ their self-placement guide, Jeff said, “I don't really
understand how this information could sway my opinion”. In the same line of thinking, Cameron
asserted in this excerpt:
- Honestly, I don't really know if I care for this information…
- I'm just barely starting so –
- And like I said, I didn't really do good in high school,
- so I'm going to act like college is going through high school all over again,
- at least in the beginning.
Although Cameron and Jeff may be considered extreme cases of premature closure bias,
Linda also exhibited the tendency to choose a math course too soon in the decision-making process.
She decided against choosing elementary algebra as her math course after incorrectly solving the
first problem in the EA pre-requisite problem set, arguing that if she could not “solve this
one,” then she “definitely” could not solve the remaining three. Her decision to forgo completing
the full set of problems may signal that some community college students seek
information that upholds their impressions of their own ability. Based on the decision actions taken by
Cameron, Jeff, and Linda, it appears that the desire to learn math, while shown to promote
motivation and persistence (Pintrich, 2000), may actually encourage students to entirely discount or
at least stop short of evaluating information intended to adjust and improve decision-
making. Personalizing self-placement guides may help to protect students against premature
closure by tailoring and aligning the information found within to their respective goals and
backgrounds. Embedding a tool that intentionally obligates students to answer all pre-requisite
problems may also help to stem beliefs that convince students of their own underpreparedness.
Nevertheless, it may also diminish the quality of decision-making if students do not want to invest
more cognitive effort than they already have.
Anchoring bias provoked by visceral bias was evinced in the way Rachel engaged with
the task of self-placing into math. In deciding between pre-calculus and college algebra,
she hesitated to answer math problems that could have tested her level of preparation for both
courses, stating “I’ll feel horrible if I can’t answer them.” Because of this anxiety, she did not
solve the pre-requisite problems for either course. Rachel’s decision to refrain from gauging her
readiness for pre-calculus and college algebra revealed an authentic fear that she would find
evidence challenging her belief of readiness for transfer-level math. In lieu of using her would-be
performance on both problem sets to determine course preference, Rachel used a coin flip to
decide between college algebra and pre-calculus, a common tactic among decision-makers
when there is little visible difference between two alternatives (Bettman, Payne, & Johnson, 1991).
In summary, placement decision-making for this group of students seems to be affected not
only by a desire to master “the basics” but also by the inclination to exclude evidence that could
potentially undermine the validity of pre-conceptions of math ability. The explicit focus on mastery
appeared to come at the expense of fully appreciating the consequence of choosing a lower math
class on the chances of reaching a degree. The fact that some students expressed ambiguity about
their goals for college may only exacerbate this problem.
“I need a math course that meets my academic goals”. Decision-making for the second
group of students was driven by the desire to take a course that satisfied general education
requirements for their academic goals. Unlike students oriented towards increasing competence in
math, most students in this group were of typical college age, had clearer conceptions of their
collegiate and career goals or interests, and exhibited higher levels of math self-efficacy. While this
group exhibited a propensity to use all, as opposed to just some, of the informational supports
from their self-placement guides, they were nevertheless still susceptible to decision biases,
particularly over-confidence and confirmation bias.
Student backgrounds. All six students reported being either 18 or 19 years old. Students
were evenly split male and female, and racially and ethnically diverse. Jonathan is an African-
American male; Cynthia, Kristen, and Araceli are Latina females; Matt is a male of mixed ethnicity
(Latino/Caucasian); and Mani is Middle Eastern of Iranian descent.
Setting them apart from others, these students at the start of their participation in the study
declared an intent to pursue a subject-specific major or degree or to transfer to a four-year
institution. For example, Jonathan was interested in pursuing a communications degree; Matt
indicated that he was taking courses to major in sociology; Cynthia wanted to pursue an accounting
degree; and Kristen wanted to study Child Development. While half of the students in this
group took Algebra II in high school, Matt took Math Analysis (the equivalent of pre-calculus) and
Mani took Statistics, demonstrating that a few of these students had taken advanced math courses in
high school.[56]
Approach. All students in this group adopted a mental frame that structured course choice
around meeting the general education requirements for an associate’s degree, a major, or upward
transfer. Matt, in articulating how he would deal with the problem of self-placing into math,
unequivocally asserted that he only wanted to take a course that allowed him to transfer to a four-
year institution. He stated adamantly: “I think about is as … someone who is very driven, and I
know that for a fact that I need to transfer, and I want to transfer sooner than later.”
Jonathan started his decision-making by recognizing that pursuing a communications degree did not
require advanced math. In this excerpt, he shared:
- say if I were to take a major in engineering or something like that
- something that involved like a high level math,
- then it would make sense for me to pick up a higher level math course,
- but I want to do communications or
- something that just virtually doesn’t really involve math,
- I wouldn’t need a course like that.
Others within this group echoed Matt and Jonathan’s logic in how to approach choosing a math
course. Kristen made it clear that she would pursue a course “if [the course] counts for anything,”
because she did not want to enroll in one that does “not give you any credits”. Similarly,
Araceli said she wanted to select something that “count[ed] as a course,” and Mani expressed desire
to “meet the [transfer] requirements on time.”
[56] One student reported taking geometry as their last math course.
Analysis of both concurrent and retrospective verbal protocols suggests that the approach of
structuring choice around meeting math requirements may have been influenced by a general
awareness that choosing a non-credit bearing or lower level math course could delay progress in
reaching a degree or completing a major. Jonathan, for example, just “wanted out of community
[college], just like, you know, two years and transfer.” Matt paralleled this sentiment by saying
“[I] don’t want to spend too much time in community college on math,” and that he knew he
“wanted to take one semester” of math. While just two students relayed the importance of time in
the way they would address self-placing into math, it nevertheless may suggest that others within
this group thought the same thing but did not, or were not able to, vocalize it. It also may suggest
that students who fit this profile have a greater awareness of the financial and time costs of
attending community college than other students.
Information search and processing strategies. In the interview, students in this group
mentioned that they would want to draw on a number of factors to guide their search for the most
appropriate math course. While course-taking and performance in high school math registered as
one data point for five students, it was just one of many cited. Half of these students reported a
desire to see course descriptions providing information about whether the course counted as credit
(Jonathan, Cynthia), had any pre-requisites (Matt), and covered material that was familiar to them
(Matt). Matt went even further to say that he wanted to see an actual syllabus for each course to
infer “how difficult [a] course was actually going to be and what [he] was going to need to know
for that class”. Besides detailed course descriptions, Araceli stated she wanted to get an objective
read of her math skills, saying: “[I] would give myself a self-test to see where I’m at and how much
I really do know the course”. Finally, Mani was the only student to report wanting to get a second
opinion on which course he should choose, sharing “I would ask my counselor to see which course
is good for me”. Together, the factors mentioned here mirror the types of supports typically given to
students who self-place into math.
As illustrated in the concurrent and retrospective verbal protocols, the approach of structuring
placement decision-making around satisfying pre-requisites for academic targets seemed to mediate
how students searched for and evaluated information in their self-placement guides. For instance,
four students, at the start of their engagement with their guides, used the EBA strategy to quickly
eliminate courses that did not count as credit towards a degree. As a first step, Matt picked up and
reviewed the visual depiction of the math sequence, stating:
- I 100% would get rid of all of the pre transferable classes,
- and that being said,
- like all of those ones go away,
- because I need something that is going to be,
- in other words,
- I want to take one course and
- that be the course
- and that [it] be okay to transfer.
Cynthia took an approach that was different from Matt’s, but similar in purpose, reading information
about pre-requisites and transferability for all course alternatives to eliminate all but college algebra
and pre-calculus. Jonathan, who expressed that he “was never good at math,” hesitated in picking a
transfer-level math course, opting instead for elementary algebra after seeing that it was only two
levels below transfer in the math sequence. Mani, who eliminated all but college algebra and
statistics, felt like he met the course pre-requisite for both courses (intermediate algebra), and was
comfortable with the content taught in each. Although personal comfort did emerge for some
students as a factor in the elimination of courses, it was largely secondary in importance to the
possibility of applying a course towards college credit.
Following the first round of elimination of courses, students employed a variety of non-
compensatory strategies to search and evaluate information from their self-placement guide.
Despite the fact that the strategies by and large mirror those used by other students (e.g. satisficing,
EBA), what is interesting is that these students in particular took cues from pre-requisite math
problems to govern their decision-making. All six students scanned, if not solved, math problems
testing pre-requisite knowledge for course alternatives that appealed to them. In fact, all students –
with the exception of Jonathan – drew on their performance on math problems to cement their final
course choice. For instance, Cynthia chose intermediate algebra after discarding pre-calculus and
college algebra because of her inability to solve the pre-requisite problems for either course.
This pattern of search and evaluation suggests that students within this group used course
descriptions and the math sequence to detect courses that could put them on track towards reaching
their academic goals, relying on math problems to adjust or narrow in on a final course. Figure 7 in
Appendix C provides a visual for how Matt reached his placement decision.
Decision biases. Three decision biases stood out among this group of students: over-
confidence bias, confirmation bias, and ascertainment bias.
Over-confidence bias was evinced in how students handled pre-requisite math problems.
Half of all students in this group scanned but did not solve the pre-requisite problems for potentially
viable courses. For instance, in judging his prospect for success in arithmetic, Matt said: “It's like
this right off the bat, I know how to do it, 100%.” Similar to Matt, Jonathan, after solving two math
problems, browsed over the remaining two, insisting: “I'm familiar with these problems, and just
like you know, a month of review or something, I'd get back to where I was with math, and, like, it
would be easy then.” Cynthia scanned the pre-requisite problems as well.
Mani decided on intermediate algebra despite correctly solving only two pre-requisite
problems for that level, providing evidence of confirmation bias. He attributed his failure to meet
the qualification threshold to solving the problems in his head rather than on paper. He
said:
- so maybe I should slow down and
- like trace my steps and try and
- answer them and
- like write them down,
- because I was trying to do them in my brain in the first place,
- and then I got like maybe it'll be better if
- I write it down to help me solve them.
The way in which Matt, Jonathan, and Mani approached the pre-requisite math problems
shows that they superficially assessed their level of preparation for courses in the math sequence.
While their actions may have reflected a desire to bolster confidence in their hypotheses that they
should be taking courses leading to transfer – suggesting the presence of decision bias – they may
also reflect a high degree of math self-efficacy. Students who feel that they can succeed in math if they
take the right course of action may take shortcuts that limit what they might gain from solving math
problems. The relationship between self-efficacy and decision-making has been largely researched
within the context of career or managerial decisions (Betz & Luzzo, 1996; Taylor & Betz, 1983;
Taylor & Popma, 1990; Wood & Bandura, 1989). Although much of this research suggests that
self-efficacy is associated with more effective analytic thinking, the present findings suggest the contrary
if we consider searching for and evaluating information as part of that analytic process.
Both Cynthia and Matt relied on other people’s perceptions of or recommendations for math
courses in college to make initial assessments of the suitability of available courses, providing
evidence of ascertainment bias. For example, Cynthia referenced that she had spoken to “people
[who have] already taken those courses, and told me about those courses, and [if] I should do them
or I shouldn’t.” Matt also mentioned that he “heard” about content in different courses from others
who were already enrolled in math in college. Interestingly, after focusing on the transfer-level
math courses, he dismissed the idea of reading the course description for Math for Liberal Arts,
stating that he knew he was “not doing liberal arts” even though he intended to study sociology, a
major typically housed in liberal arts departments. Although research has shown that belonging to a
network of individuals with experience in college can help incoming students gather critical
information for overcoming roadblocks to integration (Tierney et al., 2009), students who rely on
unsubstantiated information may make suboptimal decisions. Therefore, while incoming students
should stay connected with peers to navigate the ins and outs of college, it is, nevertheless,
imperative that faculty and staff dispel misinformation students receive.
To summarize, students who have a clear sense of their academic goals may be susceptible
to decision biases that steer them away from courses that may potentially put them off-track from
reaching these goals. Nevertheless, it seems that students categorized into this group view taking
math in college largely as a stepping stone, or a barrier for some, rather than a key component in
their postsecondary education.
“I’m not good at math” / “Math is difficult for me”. Students in this group used their
thoughts and feelings about math to lead their decision-making. Largely absent from their approach
to decision-making were vocalized references to the importance of transfer or mastery of math
content. Students in this group prioritized comfort and familiarity with the material taught in
course alternatives, and were the most faithful among all participants in adhering to the 75 percent
move-forward rule, which states that, to be deemed qualified to take a course, a student must answer
three of four questions correctly.
Student backgrounds. The four students in this group were diverse in age
and along racial and ethnic lines. However, counter to research showing lower math self-concept
among females (Eccles, 2007), three out of four students in this group were males. Chris is a White,
18 year old male; Mark is a 24 year old male of mixed race (Asian/White); Eric is a 20 year old
Latino male; and Angela is a 30 year old Latina female.
The last math course that students took in high school fell in the lower end of the
distribution in terms of academic rigor. Chris and Mark reported that their last math class in high
school was Algebra II; Eric - Algebra I; and Angela did not remember if she had taken pre-algebra
or Algebra I as her last math course. Like some students who framed their decision-making around
mastering math concepts, Chris, Eric and Angela also did not know what they intended to study in
college. Mark, on the other hand, after having served in the military, had clear career goals of being
a battalion chief of a fire department. Nevertheless, he seemed to have only a hazy understanding
of the academic path that would lead him to his goal, noting that he “only knew that to become a
battalion chief down the road, you have to have a bachelor’s in science.”
Approach. At the outset of their participation in the study, all four had noted particular
difficulty in succeeding in previous math courses or expressed low self-confidence in their math
abilities. For instance, Mark said “I didn’t do very well in [math]”, following this statement later
with “I’m not that good at math.” When referencing her experiences in her last math course, Angela
said “I remember I had a hard time – I had a difficult time with – with some of the formulas.”
Statements about being out of school were closely intertwined with statements related to low
math self-concept. For instance, Angela expressed that “it’s been so long” since she took math, and
Mark remarked, “I haven’t done [math] in six years!” Eric and Chris, who had more recently taken
math in high school, also relayed thoughts and feelings referencing low math ability. In tackling the
dilemma of self-placing into math, Eric said, with hesitation in his voice: “it's just me, though,
because I'm not very -- I was never very good at math.” Chris said that he “didn’t understand the
work or the course”, but unlike the other students in this group, he attributed his poor performance
to taking the wrong kinds of math courses in high school. He shared that the “school’s tests haven’t
placed me in classes that really were the correct classes.”
Information search and processing strategies. Chris, Angela, and Mark considered their experiences
in high school math as important pieces of information for their placement decision-making. Chris
articulated that, in addition to his performance in previous math courses, he would also search for
information about who was teaching and who was planning to take the math course. He sought such
information, in his words, in case “there was something wrong with the course, or I don't
understand something”. Chris’s statement indicates that students with low conceptions of their math
abilities may seek courses where help is accessible. Eric, when asked what kinds of information he
would use, said: “I don't even know how to answer that question if anybody were to ask me just like
-- I would need to be handed a test.” His response suggests that students who struggle to
figure out how to self-place into math may want to have that decision made for them.
Akin to the others, all four students used EBA to quickly eliminate some course alternatives
before engaging in more in-depth search and evaluation of information in self-placement guides. At
least for Mark, Angela, and Chris, the decision to eliminate certain courses appeared to be, in part,
informed by the last math course they took in high school. For example, Mark discarded all but
elementary and intermediate algebra, noting that he took Algebra II in high school; Angela honed in
on elementary algebra since she thought it naturally followed pre-algebra; and Chris eliminated
arithmetic and the transfer-level courses, with the exception of pre-calculus, since he took
intermediate algebra in high school. Eric, who appeared the least certain about his abilities to
succeed in math, targeted his cognitive efforts on estimating fit with arithmetic, the lowest math
course in the sequence.
With the exception of Mark - who reviewed the math sequence first - all students started the
information search process by examining course content. For the most part, students within this
group used cues from descriptions of course content to intentionally discard alternatives that, in
their mind, were not calibrated with their math knowledge or skills – a clear use of the EBA
strategy. For example, Eric came to the conclusion that arithmetic was the right course for him after
reflecting on his success with specific math concepts taught in that course. He said: “…the
percentages and measurements and all that, I was never good at those, so I would … just probably
start at the bottom, [at] arithmetic.” Angela and Chris engaged in a similar evaluation process.
Whereas Angela eliminated elementary algebra upon discovering that it surveyed linear equations,
expressing “I don’t know what linear equations are, so I don’t know [about elementary algebra],”
Chris eliminated pre-calculus based on the course’s overview, saying “I don't think my high school
prepared me for any of this”. In short, the actions taken by Angela, Eric, and Chris underscore the
notion that unfamiliarity or uneasiness with material taught in course alternatives can drive the
elimination of potentially suitable courses for students with low conceptions of math ability.
Interestingly, Angela and Mark were the only participants in the qualitative study who
faithfully followed criteria set for determining academic readiness for potential course alternatives.
Angela, who scored under the 75 percent move-forward threshold by correctly answering two out
of four pre-requisite math problems for elementary algebra, opted for pre-algebra as her final course
choice as a result. Mark also heavily weighed his subpar performance on problems testing readiness
for intermediate algebra in his decision to choose elementary algebra. In reflecting on his failure to
reach the 75 percent threshold, he said: “I got half right…and it was a struggle. So -- I would just go
and take elementary algebra. And I would go and see if it's any easier or too easy.” It seems as
though Angela and Mark took their performance on pre-requisite math problems as a clear
indication of their qualifications to succeed in particular courses, setting them apart from students
who either attributed their poor performance to carelessness or believed that they could succeed in a
course even without solving the pre-requisite math problems. Such evidence suggests that students
exhibiting low math self-concept may err on the side of caution when choosing their math courses
because they weigh their performance on math problems more heavily than others. Figure 8 in
Appendix C displays how Eric reached his placement decision.
Decision biases. Like all other participants in the qualitative study, students within this
group employed strategies and techniques to avoid or simplify the difficulty of choice by ignoring
or discounting information given to them. Over-confidence bias was also evident in Chris’s
decision-making process. Akin to Mani, Jonathan, and Matt, Chris chose intermediate algebra
based on his hunch that he could solve pre-requisite math problems for that level if pressed.
Aside from these decision biases, perhaps what may be most interesting about this group is
that their personal, emotional considerations regarding their task of self-placing into a math
sequence in large part preceded any rational judgments about goodness of fit for any of the
alternatives. From analyses of survey data, we also see that feelings about math and conceptions of
math ability are powerful predictors of how students make placement decisions without the
support of self-placement guides. This evidence suggests that, for some students, feelings or
emotional responses lead placement decision-making, which is in line with other psychological
research investigating decision-making (Zajonc, 1980). In fact, these affective states seem to be
aroused throughout the different stages of the decision-making process, given that conceptions of math
ability seem to persist even with the support of decision guides. Mark’s statement below
explaining why he chose elementary algebra shows that emotional reactions are inescapable, even
in the last stage of decision-making. He shared:
- I would still do the elementary algebra,
- only because it would be a confidence booster
- I wouldn’t want to waste my money on something I’m going to fail,
- because that looks really bad.
- So not only on paper, but personally.
To summarize, the decision-making of this group of students seemed to be strongly
influenced by their feelings about math, so much so that these feelings, by and large, guided their
diagnostic and decision-making processes. Similar to those who approached decision-making with the intent
of mastery, students here also relied heavily on course content to determine choice.
Further Evidence Supporting the Existence of Selective Decision-Making
Results from analyses of survey data support the notion that students employ simplification
techniques which leave them susceptible to decision biases.
In this section, I present a series of results that shows the extent to which survey respondents
assigned to the second treatment used math problems to guide their placement decision-making.[57]
Specifically, I examined the number and the types of math problem sets respondents declared an
interest in answering, as well as the number of problems they answered correctly in each set, in
order to determine the highest math level for which they were academically qualified and compare
it to their final placement level. I also examined the reasons why students assigned to different
treatment groups chose their math course, to get a sense of the existence of order effects.
Selective information searching and processing. Figure 9 in Appendix C visually
demonstrates that survey respondents fell short of examining all six problem sets available to them.
Only one-fifth of respondents answered the full array of problem sets, which suggests that the
majority of respondents were selective in evaluating their competence to succeed in each math course
available to them. In fact, evidence from Figure 10 in Appendix C shows wide dispersion in the
number of problem sets respondents answered. The majority of students reviewed fewer than three
problem sets, and around five percent reviewed none.

[57] I could not assess which course descriptions students reviewed because Qualtrics captured limited information
about how they accessed course descriptions. Yet one can conclude that if respondents selectively focused their
attention on specific math problem sets, it is likely that they also selectively focused their attention on the course
descriptions.
This pattern of selectivity also appears in the types of problem sets respondents solved. Figure
10 reveals a negative correlation between the academic rigor of a math problem set and its take-up.
While over three-quarters of respondents answered questions testing readiness for pre-algebra (the
second least rigorous math course), only a quarter of students solved problems testing
their preparation for Calculus (the most rigorous math course).[58]
There are two possible reasons why respondents elected to answer a selective number of
problem sets and focused on the least rigorous ones. The first reason may be tied to the cognitive load
generated by the number and the sequencing of math problems (Ariely, 2000). In the survey,
respondents had a chance to answer six problem sets; the least rigorous math problem sets were
presented first, and subsequent problems sets followed in order of academic rigor, with the most
difficult problems presented last. While no clear evidence from this study indicates that the design
of the math problems section impacted how respondents searched for and evaluated information,
results from other studies suggest that the amount and the flow of information can have
consequential effects on the utilization of information in their decision-making (Ariely, 2000;
Saunders & Jones, 1990; Speier, 2006). Ariely (2000) found that the resources required to not only
understand but also process additional information imposed significant costs on decision-makers, to
the point where they ignored information. The fact that a significant proportion of respondents
answered only a limited set of problems, and further, did not seek to answer the
most rigorous one, may indicate that the cognitive burden of incorporating their performance on all
of the math problems presented may have been too high to factor into their decision-making.

[58] Since Arithmetic was first in the sequence of developmental and transfer-level math courses, it had no
pre-requisites.
The second reason may be associated with the value respondents assigned to testing their
level of preparation for the more difficult math courses. Findings from the qualitative study
demonstrate that most students, at the outset of solving the dilemma of self-placing into math,
eliminated math courses as potential alternatives before diving deeper into their self-placement
guides, either because they felt the courses were too easy or because they lacked the preparation to succeed in them.
These findings are in line with work conducted by psychologists who find that perceptions of
ability can mediate an individual’s course of action (Aarts, Verplanken, & Van Knippenberg,
1998). Therefore, the decision to avoid answering harder math problems may reflect a general sentiment among students that they are underprepared for those courses.
Results further show that the majority of respondents (roughly 70 percent) assigned to the second treatment did not select a math course deemed appropriate for their level of academic qualification based on their performance on pre-requisite math problems. Over three-fourths of mismatchers (students who did not pick the level of math they were qualified for) selected a higher-level math course, while the rest picked a lower-level math course. Overall, a
little over 10 percent of students qualified for a transfer-level math course; most students – close to
two-fifths – qualified for elementary algebra. These results strongly connect to the qualitative findings from this study showing that students appear to ignore information that may conflict with their initial impressions of course fit. Regression analyses do not show that a student's high school math experiences (e.g., whether his or her last math class was college-level, or whether he or she passed it) predicted who was and was not a mismatcher.
Discussion
This research illustrates that community college students who make autonomous math
placement decisions employ diagnostic and decision strategies that compromise their ability to
make good decisions. Evidence from the qualitative and quantitative threads of this study suggests that students do not carefully calculate the pros and cons of each available course option but rather construct preferences for each one contingent upon how they cognitively and emotionally
respond to the task of self-placing into math. At each stage of the decision-making process, results
show that students used a repertoire of strategies and heuristics that not only influenced how they
approached the problem of self-placement, but also how they searched for and processed
information to actually make their placement decisions. The qualitative and quantitative analyses suggest that the selection of these strategies, and susceptibility to decision biases, appear to be linked more to a student's prior math experiences, academic goals and expectations, and perceptions of math ability, and less to race, ethnicity, or sex. This research thus contradicts
studies suggesting that students who are female and of color are likely to underestimate their math
abilities because they are more vulnerable to the unconscious belief that they are worse in math than
others (Eccles, 2007).
Meanwhile, exposure to self-placement guides seems to induce students to make only
minor, if any, changes in the way they think about selecting a math course on their own. Results
show that only a quarter of survey respondents actually changed which math course they thought
they should take after receiving their self-placement guide, and while most of these students
selected a lower math course, these downward shifts were marginal in nature. The fact that
subjective conceptions of math ability and previous math experiences continued to play powerful roles in determining the selection of a transfer-level math course, even after the introduction of the self-placement guide, suggests that current DSP implementation practices may not be designed in a
the inclusion of pre-requisite math problems in self-placement guides induced a higher percentage
of students to change their initial course choice, most students ignored their performance on those
math problems, selecting a math course above their level of qualification. Increasing the number of students participating in the randomization of information supports would give us a better sense of their impacts on placement decisions.
These results point to the complexity of student decision-making, and, further, the need to
better understand how current implementation practices, and the support features built within them,
should be designed to more effectively account for the decision-making behaviors of community
college students in order to support sounder decision-making. At present, policymakers and
practitioners have little empirical research to draw on to guide any sort of reform. As a result, they
are left with questions and speculation about how to improve DSP. How do we encourage students
to pick courses congruent with their values, preferences, and goals without confusing them or
increasing cognitive burden? To what extent would personalizing information to the needs of the student help remove blind spots (e.g., future career or academic goals, level of preparedness) that have the potential to bias their decision-making? To what degree should results from placement
tests be used in conjunction with a student’s own opinions about placement to inform final
placement decisions? How do math self-concept and math self-efficacy inform placement decision-
making? Answers to these questions would not only help define the design, the structure, and the content of self-placement supports, but also fill in theoretical gaps in decision and education research.
Because the areas for future research are numerous and far-reaching, I highlight here two potential interventions that researchers and practitioners may consider implementing together to remedy some of the shortcomings of student-driven placement decision-making. The
first intervention relates to helping students achieve a state of mindfulness, that is, increasing one's awareness of one's own thoughts and feelings before and while taking action. As Langer and Moldoveanu (2000) state, mindfulness has important consequences for how decisions are made because it can make us more sensitive to our environment and to new information, and enhance our awareness of the multiple perspectives that are often needed to solve complex problems.
While empirical evidence on the effectiveness of mindfulness training is just beginning to surface in
the literature, some emerging research suggests that it can help decision-makers block irrelevant
information and build parts of the brain that are of particular significance to controlling cognitive
thinking (Tang & Posner, 2009; Teichert, Ferrera, & Grinband, 2014). The second
intervention is related to the use of placement tests to guide autonomous placement decision-
making. Evidence from this study shows that students may be susceptible to over-estimating or
under-estimating their math abilities when making placement decisions on their own due to their
feelings or beliefs about math. Requiring students to take a placement test may help them calibrate their perceptions of their math ability against what is arguably a more objective account of their knowledge and skills in math. Because the predictive validity of placement tests has been questioned (Scott-Clayton, 2012), community colleges should encourage students to treat their
results as another metric of achievement, and ultimately give them discretion over how they use that
information to guide their decision-making.
To conclude, while it is important that students adequately assess their abilities to succeed in
the math courses available to them, it is equally important to understand how student-driven
placement decisions impact persistence and achievement to determine how much to invest in
reform. Comparing the performance of students randomly assigned to self-place against that of students assigned to test-place into math, while keeping current DSP implementation practices intact, would be an important and meaningful first step in figuring out how to move forward. While such an analysis
has yet to be conducted, Kosiewicz (2014), in the most rigorous study to date on the effects of self-
placement on performance, found suggestive evidence that students who self-placed into math
outperformed those who test-placed into the same math sequence on a number of proximal and
distal achievement measures (e.g. accumulating 30 degree applicable credits, taking a transfer-level
math course, persisting in the self-placed math course). An examination of the results from this
study and from Kosiewicz (2014) suggests that the mechanism driving better performance among
self-placers may stem from the psychological benefits of being given greater latitude over
placement decisions and not so much from how students make placement decisions or the factors
they incorporate into their decision-making. Because the results of this study are based on how
students engaged with a hypothetical DSP scenario, replicating this study in a real world setting is
important because students may exhibit different behaviors when the stakes of proper placement on
success are real. Nevertheless, this study lends important insights into some of the potential shortcomings of DSP implementation practices.
CHAPTER FIVE
CONCLUSIONS, POLICY IMPLICATIONS, AND FUTURE DIRECTIONS
Now, more than ever, community colleges are educating an increasing number of
students who come through their doors underprepared for college-level coursework. Statistics show
that while over 60 percent of community college students are deemed not ready for college-level coursework, less than half of those assigned to developmental courses as a result will progress through remediation, and those who do may still be unable to pass a course that counts toward a two- or a
four-year degree (Bailey, Jeong, & Cho, 2010). Given that community colleges disproportionately attract students who are low-income, of color, and of first-generation status (Parsad, Lewis, & Greene, 2003), these statistics are even more troubling. Thus, fixing developmental education outcomes and
processes not only has implications for improving the quality and efficiency of higher education, it
also has the potential to bridge longstanding class, racial, and ethnic disparities in collegiate access
and success.
This dissertation begins to unpack the potential of existing reform efforts that seek to deliver higher quality developmental education to better assess and meet the needs of underprepared students. Focusing on a group of community colleges with high math remediation rates and significant flexibility in the way they deliver and assign students to developmental math, this
dissertation, built on a collection of three studies, specifically investigates the adoption of
alternative approaches to deliver math, and the potential of Directed Self-Placement to serve as a viable substitute for placement tests, the most common means of identifying a student's need for
developmental education. This dissertation research demonstrates that although reforms in
developmental education are in progress, these reforms are slow to catch on, in part, due to a lack of
solid empirical evidence in support of them. While findings from this research hopefully provide researchers, policymakers, and practitioners some clarity on the opportunities and the challenges that come with changing business as usual, they also illuminate areas where research should expand. Below, I highlight three major findings from this research, and relate each
one with its implications for policy and practice. I conclude by making recommendations for future
research avenues.
The Traditional Model Prevails to Deliver Developmental Math
Results show that, for the most part, the traditional model prevailed as the dominant method
to deliver developmental math in this particular group of community colleges between 2005 and
2013. This means that students within these colleges had limited opportunities to access approaches
that deviated from how developmental education has historically been delivered – through a sequence of lecture-based, stand-alone courses that mimic a high school math progression. Further, the
alternative approaches most commonly employed by these colleges aligned the closest with the
traditional delivery model. These approaches were the supplemental instruction model and the
extended traditional model, which lengthened the number of semesters a student spent in a
developmental math course. This suggests that modularization and acceleration, two examples of
alternative approaches that radically change the nature of how to structure and teach developmental
math, were not employed at scale. Neither was the model that contextualized the teaching of developmental math to the academic and career goals of the student. Together, these results show
that the push for reform in developmental education, absent outside support, may fail to produce the transformational change we want to see in community colleges delivering developmental education.
Yet beneath the evidence demonstrating that these colleges stick to tradition, results nevertheless point to signs of innovation. For example, several community colleges combined
alternative models to address the needs of underprepared students, which may indicate a belief
among some faculty and administrators that a variety of interventions is required to reverse poor
performance. Further, some colleges have recently devoted more course sections to accelerating a student's progress through the developmental math sequence, a tacit acknowledgement of the high costs associated with lengthening the amount of time a student spends in developmental education.
Despite these glimmers of hope, students in the more rigorous levels of developmental math, compared to students in the least rigorous levels, had greater access to course sections offering alternative approaches. This evidence raises concerns that these colleges unfairly allocate these approaches to students who already have a leg up on persisting and succeeding in college, leaving those who face the severest challenges to learn math the old way.
Implications for Policy and Practice
Several reasons emerge to explain the low take up of alternative models to deliver
developmental math. One may be that state-wide regulations aimed at promoting efficiency may be
inadvertently curbing the introduction of reforms that may be beneficial to students. According to
institutional researchers who work in these colleges, it can take up to a year for faculty to receive
necessary approvals from regulatory bodies at the state, district, and institution levels to re-engineer
a math course. Because the approval process is lengthy and costly, some faculty members do not
consider changing how they structure and teach developmental math. In order to make regulation
work in favor of innovation, it is critical to identify and subsequently revamp rules that
unnecessarily discourage faculty and administrators from adequately addressing the shortcomings
of the traditional delivery model.
A second reason may be tied to the over-assignment of adjunct faculty to lower
developmental math courses (Gerlaugh, Thompson, Boylan, & Davis, 2007). Unlike tenure-track or
tenured professors, part-time faculty often lack professional development and the resources –
particularly time and money – to implement reforms that may benefit students (Wallin, 2004). This
evidence speaks to the need to thoughtfully consider who is adequately equipped to provide a
learning environment that can address the multiple needs of developmental education students. If
community colleges continue to replace full-time professors with adjunct faculty members, and if we are invested in changing the traditional delivery model, it will be necessary to provide these faculty members with opportunities to learn new ways of teaching math.
A third reason may be linked to the lack of empirical evidence supporting the use of these models.
To date, it is unclear whether any of these approaches improves success to a point where political
and financial investment would make sense. It is therefore critical to develop student-level datasets
that capture the participation of students in these programs (when they start, level of intensity), and
their performance as they progress through college. Systematically evaluating the performance of students who receive these supports against that of students who do not can help to uncover the advantages and disadvantages of these programs for achievement.
Directed Self-Placement (DSP) May Be Superior to a Placement
Test in Accurately Identifying a Student’s Needs for Developmental Math
DSP has been one of a number of reforms state policymakers and community colleges are
using to correct for misdiagnoses that result from the shortcomings of placement tests. In a case where some students were randomly allowed to self-place into math while others were not, results show that the self-placers did better than test-placers on several
academic outcomes. Specifically, they were more likely to meet the minimum math requirement for
an Associate’s degree, pass a transfer-level math class, and complete 30 and 60 credits that apply
towards a college degree compared to students who enrolled in math after taking a placement test.
Self-placers were also less likely to withdraw from their selected math course. These positive
results may stem from the fact that close to half of self-placers chose a transfer-level math course
for their placement. However, digging more deeply into the effects of DSP across different racial groups, findings indicate that while African-Americans and Latinos benefitted from self-placement relative to test-placement into math, they benefitted less than Whites and Asians did. Women also did worse than men, but nevertheless performed better under DSP than under test-placement. What
might be behind these findings? For one, a higher percentage of students from these subgroups,
under DSP, selected courses that were lower than ones their counterparts were assigned to by a
placement test. This suggests that DSP may have triggered students to unintentionally incorporate
stereotypes about their math abilities into their decision-making. In fact, some evidence suggests
that girls, African Americans, and Latinos are susceptible to adopting mental frames that convince
them that they are not good in math, or not as good in math as others (Brown & Leaper, 2010; Eccles et al., 1999; McGee & Martin, 2011). Their beliefs, while formulated during primary and secondary
school, may persist, even harden, by the time a student reaches college. Along the same lines, another possible explanation is that counselors who guide students in their decision-making also factor their own racialized or gendered beliefs about who is good or bad at math into the advice they give on which course level students should choose. Finally, since Latinos and African-Americans
are less likely to be networked into a college-going community, they may be more likely to make
poor placement decisions based on incomplete information or misinformation.
Because additional statistical tests could not isolate the influence of DSP from normal fluctuations in achievement, it is difficult to say with any certainty that DSP improves academic achievement.
Implications for Policy and Practice
The results point to a number of interventions researchers and practitioners can jointly
implement and test to determine whether DSP is an effective means of evaluating a student’s need
for developmental math. First, considering that this research could only produce a fuzzy account of the impact of DSP, it is essential to conduct a randomized controlled trial to determine with certainty whether DSP boosts achievement, and under what conditions.
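To illustrate the kind of comparison such a randomized trial would support, the following Python sketch applies a standard two-proportion z-test to a hypothetical outcome, such as passing a transfer-level math course, for students randomly assigned to self-place versus test-place. All group sizes and pass counts below are invented for illustration; they are not data from this study.

```python
from statistics import NormalDist

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions,
    using the pooled-variance normal approximation."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical trial: 180 of 400 self-placers vs. 140 of 400 test-placers
# passed a transfer-level math course.
z, p = two_proportion_z_test(180, 400, 140, 400)
```

A small p-value here would indicate that a difference of this size is unlikely to arise from random assignment alone, which is precisely the inference the observational analyses in this study could not make.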
Equally important is the need to develop longitudinal student data systems that link student
demographic data, student placement decisions, and student performance with the placement
recommendations of counselors providing self-placement guidance. Building or expanding that
kind of data system would allow researchers to carry out analyses that could detect whether student opinions or counselor opinions have greater influence over placement decisions. Absent
such an analysis, policymakers will have a difficult time determining where to invest to stamp out
the role of race and sex in placement decision-making.
Community College Students Use Decision-Making Strategies that May Compromise Their
Ability to Make Sound Placement Decisions
Results show that community college students, in making autonomous placement decisions,
systematically take irrational shortcuts that compromise the quality of their decision-making. In this
study, students, irrespective of their race or sex, had a tendency to discard unfamiliar alternatives,
overlook informational supports in their self-placement guide, and assert their readiness for specific
alternatives without examining whether they had the knowledge or skills to succeed in those
courses. At each stage of decision-making, students were susceptible to irrelevant information and
decision biases (e.g. framing, over-confidence, subjective beliefs of math ability) that limited how
they engaged with the self-placement process.
Placement decision-making appeared to be informed by three perspectives: 1) students' academic goals, 2) their orientation towards learning, and 3) their beliefs about their math abilities. These perspectives seemed to mediate the way students searched for and used information within their self-placement guides. Students oriented towards mastery mainly focused on course content, and
some students within this group ignored information from their self-placement guide; students who
structured choice around meeting their academic goals centered on using information so that it
would lead them to choose courses counting towards college credit; and students whose decision-
making was led by their anxieties about math examined a variety of informational supports, but
were the only ones to adhere to academic qualification thresholds set for course alternatives.
Nevertheless, the introduction of a self-placement guide did seem to change initial course
choice, but only for a minority of students. While shifts in placement levels were mostly downward,
they were marginal, with most students continuing to stick to transfer-level or higher-level
developmental math courses. Conceptions of math ability and previous math experiences (last math
course taken was in 2014; last math course was a transfer-level math course) were the most
powerful predictors of placement decisions before and after the introduction of a self-placement
guide. These factors, as well as the length of time a student had been away from school, also seemed to help students participating in verbal protocols disregard what they considered unsuitable alternatives.
Even though the inclusion of pre-requisite math problems in self-placement guides induced a higher
percentage of students to change their initial course choice, most students ignored their performance
on those problems, selecting a math course above their level of academic qualification. This suggests that students preferentially process information that confirms their biases. Together, this evidence suggests that DSP – as implemented here – does not effectively control the heuristics and decision biases that ordinarily skew decision-making.
Implications for Policy and Practice
Because this study was conducted using a hypothetical DSP scenario, community college
students who are actually allowed to self-place into courses may engage in different decision
behaviors and processes. Therefore, it is essential that researchers examine how these students self-place into math in real life. It is also important to increase the sample size of future studies examining the impacts of different self-placement content on placement decisions to ensure that there is enough statistical power to detect an effect.
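To make the statistical-power point concrete, the following Python sketch computes the per-group sample size needed to detect a standardized mean difference at a given power, using the standard normal-approximation formula. The effect sizes below are conventional benchmarks, not estimates from this study.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Per-group sample size needed to detect a standardized mean
    difference (Cohen's d) in a two-sided, two-group comparison,
    via the normal approximation: n = 2 * ((z_alpha + z_power) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# A small effect (d = 0.2) demands roughly 400 students per arm at 80
# percent power, while a medium effect (d = 0.5) needs far fewer.
small, medium = n_per_group(0.2), n_per_group(0.5)
```

Calculations like this one make explicit why studies of subtle informational interventions, which plausibly produce small effects, require recruiting many more participants than this study's pilot did.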
Knowing that students make ill-informed decisions because of irrational reasoning and other
psychological processes, community colleges employing self-placement should consider
programming that raises students' awareness of how their emotions and beliefs may impact their decision-making (Gino, 2013). This programming may come in the form of an orientation session that helps students identify their emotions, and understand how controlling these beliefs and feelings may help them arrive at a more informed choice (Gino, 2013). At
these orientation sessions, faculty and counselors should also consider providing examples of poor
decision-making to make students conscious of the pitfalls of limiting their search for and diagnosis of information. One example might be a student who has chosen a less rigorous math course after
answering one pre-requisite math problem for the math course just one level above. Another
example could be of a student who wants to transfer but unintentionally ignores information about
whether a course transfers to a four-year institution. Examples such as these may help students stay
on track to make better, sounder placement decisions.
Future Research Avenues
Results from this research point to interesting conundrums that call for more research. One
conundrum is that even though self-placers seem to perform better than test-placers, they also take shortcuts and are influenced by factors that should not impact decision-making. Specifically,
the quantitative analysis shows that students who self-placed into math performed academically
better across a wide range of outcomes relative to those who test-placed into the same sequence. At
the same time, the mixed methods study demonstrates that students who hypothetically self-placed
into math did not make rational decisions based on the information provided to them. Taken
together, these results might suggest that the very students who seemed to benefit from DSP may
have chosen their courses irrationally.
Future research should seek to explain this incongruity by examining the psychological
impacts of self-determination on course selection and achievement, and build and incorporate such
knowledge to improve student success. Research from psychology provides us insight about the
extent to which giving students greater decision-making power might encourage them to adopt
behaviors to improve their success, and protect students against potentially negative consequences
of poor decision-making. For example, psychologists Ryan and Deci (2000) suggest that boosting an individual's feelings of autonomy and competence can help them become more self- or intrinsically motivated. Intrinsic motivation is a critical element of
engagement, persistence, and success in school (Ryan & Deci, 2000). This research thus intimates that giving students the opportunity to decide the extent to which they need developmental education to succeed in college may satisfy an innate psychological need for autonomy, and as a result, naturally catalyze greater intrinsic motivation within these students. However, as these psychologists also
emphasize, the benefits that theoretically stem from operating within an autonomy-supportive
environment may be undermined when individuals lack the self-efficacy or competence to succeed
(Ryan & Deci, 2000). Thus, it is important to not only investigate how DSP affects motivation, and
its subsequent impacts on academic and social engagement, but also how these effects may vary
depending on how efficacious a student feels in making placement decisions on their own. Finally,
although DSP necessitates that students assess their academic skills with some support, literature
thus far has overlooked DSP’s influence on academic self-awareness, which is an important
component of self-regulated learning and scholastic success (Zimmerman, 1990). Filling in these
theoretical gaps has the potential to expand what we know about the extent to which community
college students academically engage with their college environments and become more
academically self-aware when given the opportunity to increase their voice in placement decisions.
The second conundrum is that we know little about how to design self-placement guides
such that students can counter their susceptibilities to decision biases and, in turn, improve their
decision-making. It may be that course sequences, course descriptions, and math problems – the
three informational pillars that give structure to self-placement guides – may be insufficient to help students choose an appropriate course. In interviews, some students expressed that, before they made
a placement decision, they would want to know who is teaching each course and get a sense of the quality of their instruction. Others said they would also want to take a math test. Should these types of
information also be provided to students? To what degree would they improve placement decision-
making? Further, results from the mixed methods study demonstrate that students orient their
attention to, and process, a limited realm of information when given significant flexibility in how
they can use these self-placement guides for their decision-making. That processing of information
is often distorted by the influence of bias, emotions, the order in which information is presented,
and, most likely, the burden of cognitive load.
Future studies should also examine how information presented in self-placement guides impacts diagnostic and decision-making processes and, further, the placement decisions students make. To move forward, researchers, in partnership with community colleges, may consider
running a series of experiments that alter the order and the display of information typically provided in self-placement guides to determine if these alterations change students' decision-making and placement decisions. These partnerships may also want to develop, and test the impact of, decision support
systems that personalize and integrate information presented in self-placement guides since research
shows that these two modifications may improve decision-making (Davidson et al., 2007; Todd & Benbasat, 1993, 1994). Building insights from these empirical interventions into current self-placement strategies may improve how students make autonomous placement decisions.
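An experiment of the kind proposed above can be randomized with very little machinery. The Python sketch below (all names and the seed are illustrative, not from this study) deterministically assigns each student one of the possible orderings of the three informational pillars of a self-placement guide, so that the assignment is reproducible for later analysis:

```python
import random

# The three informational pillars that give structure to self-placement guides.
PILLARS = ["course sequence", "course descriptions", "math problems"]

def assign_layout(student_id, seed=2015):
    """Assign a student a random ordering of guide sections.

    Seeding the generator from (seed, student_id) makes each student's
    assignment reproducible, which is essential for linking assigned
    layouts to later placement decisions."""
    rng = random.Random(seed * 1_000_003 + student_id)
    layout = PILLARS[:]
    rng.shuffle(layout)
    return layout

# Each student sees the same pillars, but in a randomly assigned order.
assignments = {sid: assign_layout(sid) for sid in range(1, 6)}
```

Because each layout is a permutation of the same content, any systematic difference in placement decisions across layouts can be attributed to presentation order rather than to the information itself.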
REFERENCES
Aarts, H., Verplanken, B., & van Knippenberg, A. (1998). Predicting behavior from actions in the past: Repeated decision making or a matter of habit? Journal of Applied Social Psychology, 28(15), 1355-1374.
Alberts, J. F., Sanderman, R., Eimers, J. M., & Heuvel, W. (1997). Socioeconomic inequity in health care: A study of services utilization in Curacao. Social Science and Medicine, 45(2), 213–220.
Angrist, J. D., & Pischke, J. S. (2009). Mostly harmless econometrics: An empiricist's companion. Princeton, NJ: Princeton University Press.
Arch, D. C., Bettman, J. R., & Kakkar, P. (1978). Subjects' information processing in information display board studies. In H. K. Hunt (Ed.), Advances in Consumer Research, 5, 555-559. Ann Arbor, MI: Association for Consumer Research.
Arendale, D. (2002). History of supplemental instruction (SI): Mainstreaming of developmental education. Histories of Developmental Education, 15-27.
Ariely, D. (2000). Controlling the information flow: Effects on consumers’ decision making and
preferences. Journal of Consumer Research, 27(2), 233-248.
Artiles, A. J., Harry, B., Reschly, D. J., & Chinn, P. C. (2002). Over-identification of students of
color in special education: A critical overview. Multicultural Perspectives, 4(1), 3–10.
Astin, A. W. (1985). Achieving educational excellence: A critical assessment of priorities and
practices in higher education. San Francisco, CA: Jossey-Bass.
Auger, R. W., & Blackhurst, A. E. (2005). The development of elementary-aged children’s career
aspirations and expectations. American School Counselor Association Journal, 8(4), 322 –
329.
Bahr, P. R. (2012). Deconstructing remediation in community colleges: Exploring associations
between course-taking patterns, course outcomes, and attrition from the remedial math and
remedial writing sequences. Research in Higher Education, 53, 661-693.
Bahr, P. R. (2009). Educational attainment as process: Using hierarchical discrete-time event
history analysis to model rate of progress. Research in Higher Education, 50(7), 691–714.
Bailey, T., Jeong, D. W., & Cho, S. W. (2010). Referral, enrollment, and completion in
developmental education sequences in community colleges. Economics of Education Review,
29, 255-270.
Bandura, A., Barbaranelli, C., Caprara, G.V., & Pastorelli, C. (2001). Self-efficacy beliefs as
shapers of children’s aspirations and career trajectories. Child Development, 72, 187–206.
Bandura, A. (1986). Social foundations of thought and action. A social cognitive theory. Englewood
Cliffs, NJ: Prentice-Hall.
Barbour, R. S. (2001). Checklists for improving rigour in qualitative research: A case of the tail
wagging the dog? British Medical Journal, 322, 1115–1117.
Barnato, A. E., Llewellyn-Thomas, H. A., Peters, E. M., Siminoff, L., Collins, E. D., & Barry, M. J.
(2007). Communication and decision making in cancer care: setting research priorities for
decision support/patients' decision aids. Medical Decision Making, 27(5), 626-634.
Bassett, M. J., & Frost, B. (2010). Smart math: Removing roadblocks to college
success. Community College Journal of Research and Practice, 34(11), 869-873.
Beach, L. R., & Lipshitz, R. (1993). Why classical decision theory is an inappropriate standard for
evaluating and aiding most human decision making. In G. A. Klein, J. Orasanu, R.
Calderwood, & C. Zsambok (Eds.), Decision making in action: Models and methods
(pp. 21–35). Norwood, NJ: Ablex Publishing.
Beach, L. R., & Mitchell, T. R. (1978). A contingency model for the selection of decision
strategies. Academy of Management Review, 3, 439-449.
Bensimon, E. M. (2007). The underestimated significance of practitioner knowledge in the
scholarship on student success. The Review of Higher Education, 30(4), 441-469.
Bertrand, M., Duflo, E., & Mullainathan, S. (2004). How much should we trust differences-in-
differences estimates? Quarterly Journal of Economics, 119(1), 249-275.
Bettinger, E., & Baker, R. (2014). The effects of student coaching: An evaluation of a randomized
experiment in student advising. Education Evaluation and Policy Analysis, 36(1), 3–19.
Bettinger, E., Boatman, A., & Long, B. (2013). Student supports: Developmental education and
other academic programs. The Future of Children, 23, 93–115.
Bettman, J. R., & Kakkar, P. (1977). Effects of information presentation format on consumer
information acquisition strategies. Journal of Consumer Research, 3, 233-240.
Betz, N. E., & Luzzo, D. A. (1996). Career assessment and the career decision-making self-efficacy
scale. Journal of Career Assessment, 4(4), 413-428.
Bickerstaff, S., Lontz, B., Cormier, M.S., & Xu, D. (2014). Redesigning arithmetic for student
success: Supporting faculty to teach in new ways. New Directions for Community Colleges,
67, 5-14.
Billings, R. S., & Marcus, S. A. (1983). Measures of compensatory and noncompensatory models
of decision behavior: Process tracing versus policy capturing. Organizational Behavior and
Human Performance, 31, 331-352.
Boatman, A. (2012, March). Examining the causal effects of instruction and delivery in
postsecondary remedial and developmental courses: Evidence from the Tennessee
Developmental Course Redesign Initiative. Paper presented at the annual conference of the
Association for Education Finance and Policy.
Bodenhausen, G.V., Gabriel, S., & Lineberger, M. (2000). Sadness and susceptibility to judgmental
bias: The case of anchoring. Psychological Science, 11, 320–323.
Bong, M., & Skaalvik, E. M. (2003). Academic self-concept and self-efficacy: How different are
they really? Educational Psychology Review, 15, 1– 40.
Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code
development. Thousand Oaks, CA: Sage.
Boylan, H. (2002). What works: Research-based best practices in developmental education. Boone,
NC: Continuous Quality Improvement Network/National Center for Developmental
Education.
Boylan, H.R., Bonham, B.S., & White, S.R. (1999). Developmental and remedial education in
postsecondary education. New Directions in Higher Education, 108, 87-101.
Boylan, H. R., Bliss, L. B., & Bonham, B. S. (1997). Program components and their relationship to
student performance. Journal of Developmental Education, 20, 2-9.
Bradley, E. H., Curry, L.A., & Devers, K. J. (2007). Qualitative data analysis for health services
research: Developing taxonomy, themes, and theory. Health Services Research, 42(4), 1758–
1772.
Brewer, D., & Tierney, W. G. (2011). Barriers to innovation in higher education. In B. Wildavsky,
A. P. Kelly, & K. Carey (Eds.), Reinventing higher education: The promise of innovation.
Cambridge, MA: Harvard Education Press.
Brown, C. S., & Leaper, C. (2010). Latina and European American girls’ experiences with
academic sexism and their self-concepts in mathematics and science during adolescence. Sex
roles, 63(11-12), 860-870.
Brown, R. S., & Niemi, D. N. (2007). Investigating the alignment of high school and community
college assessments in California (National Center Report# 07-3). San Jose, CA: National
Center for Public Policy and Higher Education.
Brundage, M., Feldman-Stewart, D., Leis, A., Bezjak, A., Degner, L., & Velji, K. (2005).
Communicating quality of life information to cancer patients: a study of six presentation
formats. Journal of Clinical Oncology, 23, 6949–6956.
Buchmueller, T. C., DiNardo, J., & Valletta, R. G. (2009). The effect of an employer health
insurance mandate on health insurance coverage and the demand for labor: Evidence from
Hawaii. IZA discussion papers No. 4152, 1-68.
Buck, G. A., Clark, V. L. P., Leslie-Pelecky, D., Lu, Y., & Cerda-Lizarraga, P. (2008). Examining
the cognitive processes used by adolescent girls and women scientists in identifying science
role models: A feminist approach. Science Education, 92(4), 688-707.
Burdman, P. (2012). Where to begin? The evolving role of placement exams for students starting
college. Washington, DC: Jobs for the Future.
California Community College Chancellor’s Office - Data Mart. (n.d.). Retrieved from
http://datamart.cccco.edu/datamart.aspx
California Community College Chancellor’s Office. (2011). Matriculation handbook. Retrieved
from
http://www.cccco.edu/Portals/4/SS/Matric/Matriculation%20Handbook%20(REV.%2009-
2011).pdf
Cantor, N., & Norem, J. K. (1989). Defensive pessimism and stress and coping. Social Cognition,
7, 92-112.
Cartnal, R., & Hagen, P. F. (1999). Evaluation of the Early Alert Program (ERIC Document
Reproduction Service No. ED41 541). San Luis Obispo, CA: Cuesta College.
Carnegie Foundation for the Advancement of Teaching. (n.d.). Retrieved from
http://www.carnegiefoundation.org/developmental-math
Catsambis, S. (1994). The path to math: Gender and racial-ethnic differences in mathematics
participation from middle school to high school. Sociology of Education, 199-215.
Center for Urban Education. (n.d.). Retrieved from http://www.cue.usc.edu
Charles, C., Gafni, A., & Whelan, T. (1999). Decision-making in the physician–patient encounter:
revisiting the shared treatment decision-making model. Social Science & Medicine, 49(5),
651-661.
Chernekoff, J. (2003). Introducing directed self-placement to Kutztown University. In D.J. Royer &
R. Gilles (Eds.), Directed Self-placement: Principles and practices (pp. 127-147). Cresskill,
NJ: Hampton Press.
Christensen, A. J., Moran, P. J., & Wiebe, J. S. (1999). Assessment of irrational health beliefs:
relation to health practices and medical regimen adherence. Health Psychology, 18(2), 169.
Cho, S. W., & Karp, M. M. (2013). Student success courses in the community college: Early
enrollment and educational outcomes. Community College Review, 41(1), 86-103.
Chu, P. C., & Spires, E. E. (2003). Perceptions of accuracy and effort of decision strategies.
Organizational Behavior & Human Decision Processes, 91, 203–214.
Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological
Measurement, 20, 37-46.
Collins, M. (2009). Setting up success in developmental education: How state policy can help
community colleges improve student outcomes. Boston, MA: Jobs for the Future.
Complete College America (2012). Remediation: Higher education’s bridge to nowhere.
Washington D.C.: Author.
Cook-Sather, A. (2006). Sound, presence, and power: “Student voice” in educational research and
reform. Curriculum Inquiry, 36(4), 359–390.
Creswell, J. W. (2007). Qualitative inquiry & research design: Choosing among five approaches.
Thousand Oaks, CA: Sage.
Creswell, J. W, Plano Clark, V. L., Guttmann, M. L., & Hanson, E. E. (2003). Advanced mixed
methods research design. In A.Tashakkori and C. Teddlie (Eds.), Handbook of mixed
methods in social and behavioral research (pp. 209–240). Thousand Oaks, CA: Sage.
Croskerry, P. (2005). The theory and practice of clinical decision-making. Canadian Journal of
Anesthesia/Journal Canadien D'anesthésie, 52, R1-R8.
Croskerry, P. (2003). The importance of cognitive errors in diagnosis and strategies to minimize
them. Academic Medicine, 78(8), 775-780.
Croskerry, P. (2002). Achieving quality in clinical decision making: cognitive strategies and
detection of bias. Academic Emergency Medicine, 9(11), 1184-1204.
Cullinane, J., & Treisman, P. U. (2010). Improving developmental mathematics education in
community colleges: A prospectus and early status report on the Statway initiative. New
York, NY: National Center for Postsecondary Research.
Davidson, J. E., Powers, K., Hedayat, K. M., Tieszen, M., Kon, A. A., Shepard, E., & Armstrong,
D. (2007). Clinical practice guidelines for support of the family in the patient-centered
intensive care unit: American College of Critical Care Medicine Task Force 2004–
2005. Critical Care Medicine, 35(2), 605-622.
Deci, E. L., & Ryan, R. M. (1995). Human autonomy. In M. Kernis (Ed.), Efficacy, agency, and
self-esteem (pp. 31-49). New York, NY: Springer.
Donald, S.G., & Lang, K. (2007). Inference with difference-in-differences and other panel data.
Review of Economics and Statistics, 89(2), 221-233.
Duhan, D. F., Johnson, S. D., Wilcox, J. B., & Harrell, G. D. (1997). Influences on consumer use of
word-of-mouth recommendation sources. Journal of the Academy of Marketing
Science, 25(4), 283-295.
Dweck, C.S., & Elliott, E. S. (1983). Achievement motivation. In P. Mussen & E. M. Hetherington
(Eds.), Handbook of child psychology (pp. 643-691). New York, NY: Wiley.
Eccles, J. S. (2007). Where are all the women? Gender differences in participation in physical
science and engineering. Washington, D.C.: The American Psychological Association.
Eccles, J. S., Barber, B., & Jozefowicz, D. (1999). Linking gender to educational, occupational, and
recreational choices: Applying the Eccles et al. model of achievement-related choices. In
Sexism and stereotypes in modern society: The gender science of Janet Taylor Spence. (pp.
153-192) Washington, D.C.: The American Psychological Association.
Edgecombe, N., Cormier, M. S., Bickerstaff, S., & Barragan, M. (2013). Strengthening
developmental education reforms: Evidence on implementation efforts from the scaling
innovation project. New York, NY: Columbia University, Teachers College, Community
College Research Center.
Edwards, A., Elwyn, G., Wood, F., Atwell, C., Prior, L., & Houston, H. (2005). Shared decision
making and risk communication in practice: A qualitative study of GPs' experiences. British
Journal of General Practice, 55(510), 6-13.
Edwards, M., Davies, M., & Edwards, A. (2009). What are the external influences on information
exchange and shared decision-making in healthcare consultations: a meta-synthesis of the
literature. Patient Education and Counseling, 75(1), 37-52.
Einhorn, H. J., & Hogarth, R. M. (1981). Behavioral decision theory: Processes of judgment and
choice. Journal of Accounting Research, 1-31.
Einhorn, H. J. (1970). The use of nonlinear, noncompensatory models in decision
making. Psychological Bulletin, 73(3), 221.
Elstein, A. S., & Schwarz, A. (2002). Clinical problem solving and diagnostic decision making:
selective review of the cognitive literature. British Medical Journal, 324(7339), 729-732.
Elstein, A. S. (1999). Heuristics and biases: selected errors in clinical reasoning. Academic
Medicine, 74(7), 791-794.
Engstrom, C., & Tinto, V. (2008, February). Access without support is not opportunity. Change,
40(1), 46-50.
Epper, R. M., & Baker, E. D. (2009). Technology solutions for developmental math: An overview
of current and emerging practices. Journal of Developmental Education, 26(2), 4-23.
Ericsson, K. A., & Oliver, W. L. (1988). Methodology for laboratory research on thinking: Task
selection, collection of observations, and data analysis. In Sternberg, R. J., and Smith, E. E.
(eds.), The Psychology of Human Thought (pp. 392-428). Cambridge, UK: Cambridge
University Press.
Fain, P. (2013, June). Remediation if you want it. Inside Higher Ed. Retrieved from
http://www.insidehighered.com/news/2013/06/05/florida-law-gives-students-and-colleges-
flexibility-remediation
Fast, L. A., Lewis, J. L., Bryant, M. J., Bocian, K. A., Cardullo, R. A., Rettig, M., & Hammond, K.
A. (2010). Does math self-efficacy mediate the effect of the perceived classroom
environment on standardized math test performance? Journal of Educational
Psychology, 102(3), 729.
Feinstein, A. R. (1973). Clinical biostatistics. XIII. The role of randomization in sampling,
testing, allocation, and credulous idolatry (Part 2). Journal of Clinical Pharmacology and
Therapeutics, 14, 898-915.
Felder, J. E., Finney, J. E., & Kirst, M. W. (2007). Informed self-placement at American River
College: A case study (National Center Report #07-2). San Jose, CA: National Center for
Public Policy and Higher Education.
Feldman-Stewart, D., & Brundage, M. D. (2004). Challenges for designing and implementing
decision aids. Patient Education and Counseling, 54(3), 265-273.
Feldman-Stewart, D., Brundage, M. D., Van Manen, L., Skarsgard, D., & Siemens, R. (2003).
Evaluation of a question-and-answer booklet on early-stage prostate-cancer. Patient
Education and Counseling, 49(2), 115-124.
Fong, K., Melguizo, T., & Prather, G. (2013). A different view of how we understand progression
through the developmental math trajectory. Los Angeles, CA: The University of Southern
California.
Gaston, C. M., & Mitchell, G. (2005). Information giving and decision-making in patients with
advanced cancer: a systematic review. Social Science & Medicine, 61(10), 2252-2264.
Gerdtham, U.G. (1997). Equity in health care utilization: Further tests based on hurdle models and
Swedish micro data. Health Economics, 6(3), 303-19.
Gerlaugh, K., Thompson, L., Boylan, H., & Davis, H. (2007). National study of developmental
education II: Baseline data for community colleges. Research in Developmental
Education, 20(4), 1-4.
Gilly, M. C., Graham, J. L., Wolfinbarger, M. F., & Yale, L. J. (1998). A dyadic study of
interpersonal information search. Journal of the Academy of Marketing Science, 26(2), 83-
100.
Gino, F. (2013). Sidetracked: Why our decisions get derailed, and how we can stick to the plan.
Cambridge, MA: Harvard Business Review Press.
Glover, S. M., Prawitt, D. F., & Spilker, B. C. (1997). The influence of decision aids on user
behavior: Implications for knowledge acquisition and inappropriate reliance. Organizational
Behavior and Human Decision Processes, 72(2), 232-255.
Gourgey, A. F. (1992). Tutoring developmental mathematics: Overcoming anxiety and fostering
independent learning. Journal of Developmental Education, 15(3), 10.
Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-
method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255–274.
Grubb, W. N., and Associates (1999). Honored but invisible: An inside look at teaching in
community colleges. New York, NY: Routledge.
Hastie, R. (2001). Problems for judgment and decision making. Annual Review of
Psychology, 52(1), 653-683.
Hastie, R., Schkade, D. A., & Payne, J. W. (1998). A study of juror and jury judgments in civil
cases: Deciding liability for punitive damages. Law and Human Behavior, 22, 287-314.
Haubl, G. & Trifts, V. (2000) Consumer decision making in online shopping environments: the
effects of interactive decision aids. Marketing Science, 19(1), 4–21.
Hayward, C. & Willett, T. (2014). Acceleration effects of curricular redesign in the California
Acceleration Project. Berkeley, CA: The Research and Planning Group for California
Community Colleges. Retrieved from http://collegecampaign.org/wp-
content/uploads/2014/06/RP-Evaluation-CAP.pdf
Hern, K. (2012). Acceleration across California: Shorter pathways in developmental English and
math. Change, 44(3), 60-68.
Hiebert, J., & Grouws, D. A. (2007). The effects of classroom mathematics teaching on students’
learning. Second Handbook of Research on Mathematics Teaching and Learning, 1, 371-404.
Hodara, M. (2013). Improving students' college math readiness: A review of the evidence on
postsecondary interventions and reforms (CAPSEE Working Paper). New York, NY: Center
for Analysis of Postsecondary Education and Employment.
Hodara, M., & Jaggars, S. S. (2014). An examination of the impact of accelerating community
college students' progression through developmental education. The Journal of Higher
Education, 85(2), 246-276.
Hu, S. (2015, January). Learning from a bold experiment. Inside Higher Ed. Retrieved from
https://www.insidehighered.com/views/2015/01/29/essay-making-most-floridas-remedial-
reform
Hu, S., Tandberg, D., Park, T., Nix, A., Collins, R., & Hankerson, D. (2014). Developmental
education in Florida: What do Florida state institutions plan to do? Tallahassee, FL: Florida
State University. Retrieved from
http://www.coe.fsu.edu/content/download/164608/1454410/file/Implementation_Plan_Report
_July2014.pdf
Hughes, K. L., & Scott-Clayton, J. (2011). Assessing developmental assessment in community
colleges: A review of the literature (CCRC Working Paper 19). New York, NY: Community
College Research Center, Teachers College, Columbia University.
Jacobs, J. E. (2005). Twenty‐five years of research on gender and ethnic differences in math and
science career choices: What have we learned? New Directions for Child and Adolescent
Development, 2005(110), 85-94.
Jaggars, S. S., Hodara, M., Cho, S., & Xu, D. (2015). Three accelerated developmental education
programs: Features, student outcomes, and implications. Community College Review, 43(1),
3-26.
Jenkins, D., & Cho, S. W. (2012). Get with the program: Accelerating community college students’
entry into and completion of programs of study (CCRC Working Paper No. 32). New York,
NY: Columbia University, Teachers College, Community College Research Center.
Jenkins, D., Zeidenberg, M., & Kienzl, G. (2009). Educational outcomes of I-BEST, Washington
State Community and Technical College System’s Integrated Basic Education and Skills
Training Program: Findings from a multivariate analysis (CCRC Working Paper No. 16).
New York, NY: Columbia University, Teachers College, Community College Research
Center.
Jick, T. D. (1979). Mixing qualitative and quantitative methods: Triangulation in
action. Administrative Science Quarterly, 602-611.
Karp, M.M., & Fletcher, J. (2014). Adopting new technologies for student success: A readiness
framework. New York, NY: Columbia University, Teachers College, Community College
Research Center.
Kezar, A. J., & Eckel, P. D. (2002). The effect of institutional culture on change strategies in higher
education: Universal principles or culturally responsive concepts? The Journal of Higher
Education, 73(4), 435-460.
Kidder, L. H, & Fine, M. (1987). Qualitative and quantitative methods: When stories converge. In
M. M. Mark & R. L. Shotland (Eds.), Multiple methods in program evaluation: New
directions for program evaluation (pp. 57- 75). San Francisco: Jossey-Bass.
Klein-Collins, R., & Starr, R. (2007). Rung by rung: Applying a work-based learning model to
develop missing rungs on a nursing career ladder. Retrieved from
http://www.jff.org/publications/rung-rung-applying-work-based-learning-model-develop-
missing-rungs-nursing-career.
Kleinmuntz, D. N., & Schkade, D. A. (1993). Information displays and decision
processes. Psychological Science, 4(4), 221-227.
Koriat, A., Lichtenstein, S., & Fischhoff, B. (1980). Reasons for confidence. Journal of
Experimental Psychology: Human Learning and Memory, 6, 107-118.
Kosiewicz, H. (2014, November). Giving community college students voice: The effects of
mathematics self-placement on academic achievement. Paper presented at the annual
conference of Association for the Study of Higher Education, Washington, D.C.
Kosiewicz, H., Ngo, F., & Fong, K. (2014). Injecting innovation into developmental math: Issues of
use and student access in resource-constrained community colleges. In M. Kurlaender
(Chair), Community College Pathways: Factors Affecting Persistence and Degree
Attainment at Public Two-Year Colleges. Symposium conducted at the annual conference of
the Association for Public Policy and Management, Albuquerque, NM.
Kuh, G., Kinzie, J., Schuh, J.H., Whitt, E.J. & Associates. (2010). Student success in college:
Creating conditions that matter. New York, NY: Jossey-Bass.
Kuh, G. D. (2001). The National Survey of Student Engagement: Conceptual framework and
overview of psychometric properties. Bloomington, IN: Indiana University Center for
Postsecondary Research.
Kühberger, A., Schulte-Mecklenbeck, M., & Perner, J. (2002). Framing decisions: Hypothetical and
real. Organizational Behavior and Human Decision Processes, 89(2), 1162-1175.
Kuusela, H., & Paul, P. (2000). A comparison of concurrent and retrospective verbal protocol
analysis. The American Journal of Psychology, 113, 387–404.
Ladson-Billings, G. (1997). It doesn't add up: African American students' mathematics
achievement. Journal for Research in Mathematics Education, 28, 697-708.
Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data.
Biometrics, 33(1), 159–174.
Langer, E. J., & Moldoveanu, M. (2000). The construct of mindfulness. Journal of Social
Issues, 56(1), 1-9.
Levin, B. (2000). Putting students at the center in education reform. International Journal of
Educational Change, 1(2), 155–172.
Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings.
Psychological Bulletin, 127, 267-286.
Lohse, G. L. (1997). The role of working memory on graphical information processing. Behaviour
& Information Technology, 16(6), 297-308.
Luna, A. (2003). A voice in the decision: Self-evaluation in the freshman English placement
process. Reading & Writing Quarterly, 19, 377–392.
Lupia, A., McCubbins, M. D., & Popkin, S. L. (Eds.). (2000). Elements of reason: Cognition,
choice, and the bounds of rationality. Cambridge, UK: Cambridge University Press.
Lyles, M. A. (1981). Formulating strategic problems: empirical analysis and model
development. Strategic Management Journal, 2(1), 61-75.
Lyles, M. A., & Mitroff, I. I. (1980). Organizational problem formulation: An empirical
study. Administrative Science Quarterly, 102-119.
March, J. G. (1994). Primer on decision making: How decisions happen. New York, NY: Simon
and Schuster.
Martin, A. J., Marsh, H. W., & Debus, R. L. (2001). Self-handicapping and defensive pessimism:
Exploring a model of predictors and outcomes from a self-protection perspective. Journal of
Educational Psychology, 93, 87–102.
Mathematics Diagnostic Testing Project. (2000). Winter 2000 Newsletter. San Diego, CA: Author.
McGee, E. O., & Martin, D. B. (2011). “You would not believe what I have to go through to prove
my intellectual value!” Stereotype management among academically successful black
mathematics and engineering students. American Educational Research Journal, 48(6),
1347-1389.
McLoyd, V. C. (1998). Socioeconomic disadvantage and child development. The American
Psychologist, 53(2), 185-204.
MDRC (n.d.). Developmental Education Initiative. Retrieved from
http://www.mdrc.org/project/developmental-education-initiative#featured_content
Melguizo, T., Hagedorn, L. S., & Cypers, S. (2008). Remedial/developmental education and the
cost of community college transfer: A Los Angeles County sample. The Review of Higher
Education, 31(4), 401-431.
Melguizo, T., Bos, J., & Prather, G. (2013). Using a regression discontinuity design to estimate the
impact of placement decisions in developmental math (Working paper). Los Angeles, CA:
The University of Southern California.
Melguizo, T., Kosiewicz, H., Prather, G., & Bos, J. (2014). How are community college students
assessed and placed in developmental math? Grounding our understanding in reality. The
Journal of Higher Education, 85(5), 691-722.
Meltsner, A. J. (1972). Political feasibility and policy analysis. Public Administration Review, 859-
867.
Mesa, V. (2011). Similarities and differences in classroom interaction between remedial and
college mathematics courses in a community college. Journal on Excellence in College
Teaching, 22(4), 21-55.
Midgley, C., Maehr, M. L., Hruda, L. A., Anderman, E., Anderman, L., Gheen, M., et al. (2000).
Manual for the Patterns of Adaptive Learning Scale. Ann Arbor, MI: University of Michigan.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook.
(2nd ed.). Thousand Oaks, CA: Sage.
Mitra, D. L. (2003). Student voice in school reform: Reframing student-teacher relationships.
McGill Journal of Education, 38(2), 289–304.
Morgan, D. L. (1998). Practical strategies for combining qualitative and quantitative methods:
Applications to health research. Qualitative Health Research, 8(3), 362-376.
National Center for Public Policy and Higher Education & Southern Regional Education Board.
(2010). Beyond the rhetoric: Improving college readiness through coherent state policy. San
Jose, CA: Authors.
National Center for Public Policy and Higher Education. (2008). Policy alert. San Jose, CA:
Author.
National Conference of State Legislatures. (n.d.). Hot topics in higher education: Reforming
remedial education. Retrieved from http://www.ncsl.org/research/education/improving-
college-completion-reforming-remedial.aspx.
von Neumann, J., & Morgenstern, O. (1947). Theory of games and economic behavior. Princeton,
NJ: Princeton University Press.
Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
Ngo, F., & Melguizo, T. (2015, February). How can placement policy improve math remediation
outcomes? Evidence from experimentation in community colleges. Paper presented at the
annual meeting of the Association for Education Finance and Policy, Washington, D.C.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of
General Psychology, 2(2), 175-220.
O'Connor, A. M., Drake, E. R., Fiset, V., Graham, I. D., Laupacis, A., & Tugwell, P. (1998). The
Ottawa patient decision aids. Effective Clinical Practice, 2(4), 163-170.
O'Gara, L., Karp, M. M., & Hughes, K. L. (2009). Student success courses in the community
college: An exploratory study of student perspectives. Community College Review, 36(3),
195-218.
Olshavsky, R. W., & Granbois, D. H. (1979). Consumer decision making-fact or fiction? Journal of
Consumer Research, 93-100.
Onwuegbuzie, A. J., & Leech, N. L. (2005). On becoming a pragmatic researcher: The importance
of combining quantitative and qualitative research methodologies. International Journal of
Social Research Methodology, 8(5), 375-387.
Pajares, F., & Miller, M. D. (1995). Mathematics self-efficacy and mathematics outcomes: The
need for specificity of assessment. Journal of Counseling Psychology, 42, 190-198.
Parsad, B., Lewis, L. & Green, B. (2003). Remedial education at degree-granting postsecondary
institutions in Fall 2000 (NCES Publication No. 2004-010). Washington, DC: National
Center for Education Statistics, U.S. Department of Education.
Patton, M. (2001). Qualitative research and evaluation method (3rd ed.). Thousand Oaks, CA:
Sage.
Payne, J. W., Bettman, J. R., & Johnson, E. J. (1992). Behavioral decision research: A constructive
processing perspective. Annual Review of Psychology, 43(1), 87-131.
Payne, J. W., Bettman, J. R., & Johnson, E. J. (1988). Adaptive strategy selection in decision
making. Journal of Experimental Psychology: Learning, Memory, and Cognition, 14(3),
534-552.
Pennington, N., & Hastie, R. (1988). Explanation-based decision making: Effects of memory
structure on judgment. Journal of Experimental Psychology: Learning, Memory, and
Cognition, 14(3), 521-533.
Perin, D. (2011). Facilitating student learning through contextualization: A review of evidence.
Community College Review, 39(3) 268–295.
Pfleging, E. (2002). An evaluation of the Early Alert Program at Columbia College (ERIC
Document Reproduction Service No. ED478596). Stanislaus, CA: Master of Arts Action
Research Project.
Pfeiffer, J. (2011). Interactive decision aids in e-commerce. New York, NY: Springer Science &
Business Media.
Pintrich, P. R. (2000). Multiple goals, multiple pathways: The role of goal orientation in learning
and achievement. Journal of Educational Psychology, 92(3), 544-555.
Price, L. L., & Feick, L. F. (1984). The role of interpersonal sources in external search: An
informational perspective. Advances in Consumer Research, 11(1), 250-255.
Primary Research Group, Inc. (2008). Survey of assessment practices in higher education. New
York, NY: Author.
Public Policy Institute of California. (2013). The impact of budget cuts on California’s
community colleges. Retrieved from http://www.ppic.org/main/publication.asp?i=1048
Ranyard, R. (1987). Cognitive processes underlying similarity effects in risky choice. Acta
Psychologica, 64(1), 25-38.
Reichardt, C. S., & Cook, T. D. (1979). Beyond qualitative versus quantitative methods. Qualitative
and Quantitative Methods in Evaluation Research, 1, 7-32.
Rossman, G. B., & Wilson, B. L. (1985). Numbers and words: Combining quantitative and
qualitative methods in a single large-scale evaluation study. Evaluation Review, 9, 627-643.
Royer, D. J., & Gilles, R. (1998). Directed self-placement: An attitude of orientation. College
Composition and Communication, 50(1), 54–70.
Royer, D. J., & Gilles, R. (2003). Directed self-placement: Principles and practices. Cresskill, NJ:
Hampton Press.
Rudduck, J., & Flutter, J. (2000). Pupil participation and perspective: Carving a new order of
experience. Cambridge Journal of Education, 30(1), 75–89.
Rudmann, J. (1992). An evaluation of several early alert strategies for helping first semester
freshmen at the community college and a description of the newly developed Early Alert
Software (EARS) (ERIC Document Reproduction Service No. ED 349 055). Irvine, CA:
Irvine Valley College.
Rutschow, E. Z., & Schneider, E. (2011). Unlocking the gate: What we know about improving
developmental education. New York, NY: MDRC.
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic
motivation, social development, and well-being. American Psychologist, 55(1), 68–78.
Saunders, C., & Jones, J. W. (1990). Temporal sequences in information acquisition for decision
making: A focus on source and medium. Academy of Management Review, 15(1), 29-46.
Scherz, Z. & Oren, M. (2006). How to change students’ images of science and technology. Science
Education, 90, 966–985.
Schkade, D. A., & Kleinmuntz, D. N. (1994). Information displays and choice processes:
Differential effects of organization, form, and sequence. Organizational Behavior and
Human Decision Processes, 57(3), 319-337.
Schwenk, C. (1988). The essence of strategic decision making. Lexington, MA: Lexington.
Schwenk, C., & Thomas, H. (1983). Effects of conflicting analyses on managerial decision making.
Decision Sciences, 14, 467-482.
Scott-Clayton, J., & Rodriguez, O. (2012). Development, discouragement, or diversion? New
evidence on the effects of college remediation (No. W18328). National Bureau of Economic
Research.
Scott-Clayton, J., Crosta, P., & Belfield, C. (2014). Improving the targeting of treatment: Evidence
from college remediation. Educational Evaluation and Policy Analysis, 36, 371-393.
Scott-Clayton, J., & Rodriguez, O. (2012). Development, diversion, or discouragement? A new
framework and new evidence on the effects of college remediation. Paper presented at the
annual conference of the American Education Finance and Policy, Boston, MA
Scott-Clayton, J. (2012). Do high-stakes placement exams predict college success? (CCRC
Working Paper No. 41). New York, NY: Columbia University, Teachers College,
Community College Research Center.
Scrivener, S., Weiss, M. J., Ratledge, A., Rudd, T., Sommo, C., & Fresques, H. (2015). Doubling
graduation rates: Three-year effects of CUNY’s Accelerated Study in Associate Programs
(ASAP) for developmental education students. New York, NY: MDRC
Scrivener, S., & Coghlan, E. (2011). Opening doors to student success: A synthesis of findings from
an evaluation at six community colleges. Retrieved from
http://www.mdrc.org/publications/585/ overview.html
Seale, J. (2010). Doing student voice work in higher education: An exploration of the value of
participatory methods. British Educational Research Journal, 36(6), 995–1015.
Shavelson, R. J., Hubner, J. J., & Stanton, J. C. (1976). Self-concept: Validation of construct
interpretations. Review of Educational Research, 46, 407-441.
Sheldon, C. Q., & Durdella, N. R. (2010). Success rates for students taking compressed and regular
length developmental courses in the community college. Community College Journal of
Research and Practice, 34(1–2), 39–54.
Shore, M., Shore, J., & Boggs, S. (2004). Allied health applications integrated into developmental
mathematics using problem based learning. Mathematics and Computer Education, 38(2),
183-189.
Simon, H. A. (1955). A behavioral model of rational choice. Quarterly Journal of Economics, 69,
99-118.
Skaalvik, E. (1997). Self-enhancing and self-defeating ego orientations: Relations with task and
avoidance orientation, achievement, self-perceptions, and anxiety. Journal of Educational
Psychology, 89, 71- 81.
Slovic, P., Griffin, D., & Tversky, A. (1990). Compatibility effects in judgment and choice. In R.
Hogarth (Ed.), Insights in decision making: A tribute to Hillel Einhorn (pp. 5-27). Chicago:
University of Chicago Press.
Slovic, P. (1972). Information processing, situation specificity, and the generality of risk-taking
behavior. Journal of Personality and Social Psychology, 22(1), 128-134.
Speier, C. (2006). The influence of information presentation formats on complex task decision-
making performance. International Journal of Human-Computer Studies, 64(11), 1115-
1131.
Stone, D. (2001). Policy paradox: The art of political decision-making. New York, NY: W. W.
Norton & Company.
Strong American Schools. (2008). Diploma to nowhere. Washington, DC: Author.
Sundstrom, G. A. (1987). Information search and decision making: The effects of information
displays. Acta Psychologica, 65, 165-179.
Tang, Y. Y., & Posner, M. I. (2009). Attention training and attention state training. Trends in
Cognitive Sciences, 13(5), 222-227.
Tashakkori, A., & Teddlie, C. (2003). Handbook of mixed methods in social & behavioral research.
Thousand Oaks, CA: Sage.
Taylor, K. L., & Dionne, J. P. (2000). Assessing problem-solving strategy knowledge: The
complementary use of verbal protocols and retrospective debriefing. Journal of Educational
Psychology, 92, 413–425.
Taylor, K. M., & Popma, J. (1990). An examination of the relationships among career decision-
making self-efficacy, career salience, locus of control, and vocational indecision. Journal of
Vocational Behavior, 37(1), 17-31.
Taylor, K. M., & Betz, N. E. (1983). Applications of self-efficacy theory to the understanding and
treatment of career indecision. Journal of Vocational Behavior, 22, 63-81.
Teichert, T., Ferrera, V. P., & Grinband, J. (2014). Humans optimize decision-making by delaying
decision onset. PLoS ONE, 9(3). doi:10.1371/journal.pone.0089638
The Century Foundation. (2013). Bridging the higher education divide: Strengthening community
colleges and restoring the American dream. New York, NY: The Century Foundation Press.
Thompkins, P. (2003). Directed self-placement in a community college context. In D.J. Royer & R.
Gilles (Eds.), Directed self-placement: Principles and practices (pp. 193–206). Cresskill,
NJ: Hampton Press.
Tierney, W. G., Bailey, T., Constantine, J., Finkelstein, N., & Hurd, N. F. (2009). Helping students
navigate the path to college: What high schools can do (IES Practice Guide, NCEE 2009-
4066). Washington, DC: What Works Clearinghouse.
Tinto, V. (1997). Classrooms as communities: Exploring the educational character of student
persistence. Journal of Higher Education, 68, 599-623.
Todd, P., & Benbasat, I. (1993). Decision-makers, DSS and decision making effort: An
experimental investigation. INFOR, 31(2), 1-21.
Todd, P., & Benbasat, I. (1994). The influence of DSS on choice strategies: An experimental
analysis of the role of cognitive effort. Organizational Behavior and Human Decision
Processes, 60, 36-74.
Tversky, A., & Kahneman, D. (1986). Rational choice and the framing of decisions. Journal of
Business, 251-278.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and
biases. Science, 185(4157), 1124-1131.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and
probability. Cognitive Psychology, 5(2), 207-232.
Tversky, A. (1969). Intransitivity of preferences. Psychological Review, 76(1), 105-110.
Twigg, C. (2005). Increasing success for underserved students: Redesigning introductory courses.
Saratoga Springs, NY: The National Center for Academic Transformation. Retrieved from
http://www.thencat.org/Monographs/IncSuccess.pdf.
Venezia, A., Bracco, K. R., & Nodine, T. (2010). One shot deal? Students’ perceptions of
assessment and course placement in California’s community colleges. San Francisco, CA:
WestEd.
Vermeer, H. J., Boekaerts, M., & Seegers, G. (2000). Motivational and gender differences: Sixth-
grade students' mathematical problem-solving behavior. Journal of Educational Psychology,
92(2), 308.
Visher, M. G., Weiss, M. J., Weissman, E., Rudd, T., & Wathington, H. D. (with Teres, J., & Fong,
K.). (2012). The effects of learning communities for students in developmental education: A
synthesis of findings from six community colleges. New York, NY: National Center for
Postsecondary Research.
Wallin, D.L. (2004). Valuing professional colleagues: Adjunct faculty in community and technical
colleges. Community College Journal of Research and Practice. 28(4), 373-391.
Washington State Board for Community and Technical Colleges. (2005). 2005-2006 academic year
reports. Retrieved from http://www.sbctc.ctc.edu/college/_d-acad2005-06.aspx
Weissman, E., Butcher, K. F., Schneider, E., Teres, J., Collado, H., & Greenberg, D. (2011).
Learning communities for students in developmental math: Impact studies at Queensborough
and Houston Community Colleges. New York, NY: National Center for Postsecondary
Research.
Wood, R., & Bandura, A. (1989). Social cognitive theory of organizational management. Academy
of management Review, 14(3), 361-384.
Wooldridge, J. (2006). Cluster-sample methods in applied econometrics: An extended analysis.
Unpublished manuscript, Department of Economics, Michigan State University.
Zacharakis, A. L., & Shepherd, D. A. (2001). The nature of information and overconfidence on
venture capitalists' decision making. Journal of Business Venturing, 16(4), 311-332.
Zajonc, R. B. (1980). Feeling and thinking: Preferences need no inferences. American
Psychologist, 35(2), 151-175.
Zeidenberg, M., Jenkins, D., & Calcagno, J. C. (2007). Do student success courses actually help
community college students succeed? New York, NY: Columbia University, Teachers
College, Community College Research Center.
Zhang, P., & Whinston, A. B. (1995). Business information visualization for decision-making
support-a research strategy. Paper presented at the Proceedings of the First Americas
Conference on Information Systems.
Zhao, C. M., & Kuh, G. D. (2004). Adding value: Learning communities and student engagement.
Research in Higher Education, 45(2), 115-138.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview.
Educational Psychologist, 25(1), 3-17.
APPENDIX A
Table 1
A Revised Typology of Alternative Modes of Delivery for Developmental Math
Mode of delivery | Programs | Definitions

Traditional developmental math: Traditional courses are 16-week, semester-based courses taught
solely in the classroom.

Restructured developmental math: This mode of delivery restructures developmental math
sequences to reduce the amount of time students spend in developmental math.
    Fast-track courses: compress developmental education courses into several weeks or a half
    semester.
    Modularized courses: break semester-long developmental education classes down into smaller,
    competency-based units.
    Mainstreaming: places developmental education students directly into college-level courses, and
    typically provides additional academic support in the form of tutoring or study skills courses.

Curricularly redesigned developmental math: This mode of delivery changes how students learn
developmental math by making content relevant to a student's personal experiences or learning
goals.
    Learning communities: allow students to enroll in developmental and college-level courses as a
    cohort.
    Contextualized learning: ties developmental education to the student's academic and vocational
    interests.

Supplemental instruction: This mode of delivery offers additional academic supports to
developmental education students to facilitate their achievement.
    Tutoring and supplemental instruction: tutoring is general academic assistance typically provided
    by faculty, staff, students, or through computer-assisted instruction; supplemental instruction is
    academic assistance directly related to a specific course.
    Intensive advising: reduces advisor caseloads to allow advisors to meet more frequently with
    students and provide more specialized attention.
    Student success courses: teach students basic study and life skills.

Extended traditional: This mode of delivery lengthens the amount of time a student must spend in
developmental math to meet exit requirements.
Table 2
Programs Employing Combined Models of Delivery for Developmental Math
Each program combines components drawn from three alternative models: reduced time (fast-track/
compression, modularized), curricular redesign (contextualization, learning communities), and
supplemental instruction (tutoring/lab, advising, student success courses).

ASAP: Algebra Success at Pierce | Provides students an opportunity to pass both elementary
algebra and intermediate algebra in one semester.
Adelante | Adelante First Year Experience is a comprehensive program involving student services,
linked courses, a stimulating learning environment, and committed faculty.
Automotive Learning Community | Developmental math courses designed for students pursuing a
career in the automotive industry.
FACE | Highly structured first-year transfer and associate degree program designed for recent high
school graduates. Students are automatically enrolled in a pre-selected schedule of classes and
guaranteed a full, appropriate program their first year.
Freshman Success | Implements learning communities by pairing developmental courses with
counseling classes.
MAP: Modeling with Algebra Project | A grant-funded project to help students succeed in
intermediate algebra.
PACE: Program for Adult College Education | A five-semester, 60-unit curriculum designed to
meet transfer requirements for the full-time working adult.
Passage | Academic and student support service designed to increase the success of male students.
Services feature learning communities, counseling, field trips, workshops, and mentors.
STATWAY | A national project of the Carnegie Foundation for the Advancement of Teaching to
take students at the elementary algebra level "to and through" a transfer statistics class in two
semesters.
Teacher Pathway | Partnership with California State University (CSU) and center for counseling.
The goal is to train students to become future K-6 teachers.
UMOJA | Community of instructors, counselors, students, and support services staff committed to
the academic, personal, and professional growth and self-actualization of African American
students.
Urban Teachers Fellowship | Career pathway into credentialed teaching that includes part-time
employment in after-school programs.
VCAP | College accelerated program leading to transfer and/or associate degree. Consists of a
unique selection of academic courses offered in an accelerated format.
Table 3
Distribution of Alternative Models of Delivery across Developmental Math Levels (proportions)

Developmental math level | Reduced time | Curricular redesign | Supplemental instruction | Extended traditional | Combined | Alternative MOD total
Five levels below transfer | -- | -- | 0.00 | -- | -- | 0.00
Four levels below transfer | 0.15 | 0.22 | 0.17 | -- | 0.02 | 0.09
Three levels below transfer | 0.34 | 0.06 | 0.52 | -- | 0.10 | 0.27
Two levels below transfer | 0.34 | 0.67 | 0.24 | 0.69 | 0.58 | 0.46
One level below transfer | 0.17 | 0.06 | 0.05 | 0.31 | 0.29 | 0.18
Total: Alternative MODs | 0.06 | 0.01 | 0.45 | 0.36 | 0.12 | 1.00
Table 4
Distribution of Alternative Models of Delivery within Developmental Math Levels (proportions)

Developmental math level | Reduced time | Curricular redesign | Supplemental instruction | Extended traditional | Combined
Five levels below transfer | -- | -- | 1.00 | -- | --
Four levels below transfer | 0.10 | 0.02 | 0.85 | -- | 0.03
Three levels below transfer | 0.08 | 0.00 | 0.88 | -- | 0.04
Two levels below transfer | 0.05 | 0.01 | 0.24 | 0.55 | 0.15
One level below transfer | 0.06 | 0.00 | 0.13 | 0.62 | 0.19
Figure 1. Percent of Traditional versus Alternative Delivery Models, and Distribution of
Alternative Models Used.
Figure 2. Percent of Alternative Delivery Models, by Developmental Math Level.
Figure 3a. Trends in the Overall Use of Alternative MODs, 2005-2012.
Figure 3b. Trends in the Use of Alternative MODs by Delivery Type, 2005-2012.
Figure 4. Trends in the Use of Alternative MODs by Delivery Type and Developmental Math
Level, 2005-2012.
APPENDIX B
Table 1
Sample sizes of test- and self-placed students by enrollment year and college
Note. Some colleges did not place students into math during the summer semesters, which explains samples of zero students.
Colleges A-H implemented test-based placement policies; College X implemented the
self-placement policy.

Year, semester | Total | A | B | C | D | E | F | G | H | X
Wave 1: Before switch to self-placement
2005 Summer | 438 | 0 | 217 | 0 | 39 | 2 | 133 | 2 | 45 | 0
2005 Fall | 4,300 | 557 | 678 | 260 | 581 | 269 | 548 | 938 | 158 | 311
2006 Spring | 1,467 | 272 | 362 | 106 | 208 | 6 | 16 | 316 | 69 | 112
2006 Summer | 360 | 0 | 233 | 6 | 69 | 4 | 2 | 7 | 38 | 1
2006 Fall | 4,287 | 471 | 690 | 240 | 591 | 311 | 307 | 1,112 | 162 | 403
2007 Spring | 1,837 | 245 | 407 | 84 | 218 | 173 | 192 | 263 | 150 | 105
2007 Summer | 615 | 1 | 269 | 41 | 78 | 4 | 136 | 20 | 66 | 0
2007 Fall | 4,417 | 418 | 923 | 166 | 655 | 303 | 273 | 996 | 253 | 430
2008 Spring | 1,860 | 155 | 453 | 100 | 230 | 158 | 214 | 282 | 117 | 151
Wave 2: During period of self-placement
2008 Summer | 1,192 | 97 | 416 | 48 | 175 | 12 | 235 | 31 | 144 | 34
2008 Fall | 6,309 | 585 | 1,299 | 469 | 839 | 524 | 509 | 1,419 | 313 | 352
Wave 3: After switch back to COMPASS
2009 Spring | 2,588 | 288 | 607 | 182 | 287 | 216 | 280 | 375 | 159 | 194
2009 Summer | 231 | 31 | 2 | 31 | 107 | 1 | 34 | 22 | 1 | 2
Total | 29,901 | 3,120 | 6,556 | 1,733 | 4,077 | 1,983 | 2,879 | 5,783 | 1,675 | 2,095
Table 2
Differences in achievement trends between College X and control colleges prior to the placement
policy change
This table presents differences in trends in achievement between College X and the test-based
placement colleges from the summer semester of 2005 to the spring semester of 2008, before math
self-placement took effect in the summer of 2008. Standard errors are clustered at the campus of
enrollment and are presented in parentheses.

Coefficient on College X*Year, by outcome:
(1) Failing first math course: -9.80e-07 (9.12e-06)
(2) Withdrawing from first math course: 3.51e-05*** (8.97e-06)
(3) Meeting the minimum math requirement for an AA degree: -5.44e-05** (2.18e-05)
(4) Passing a transfer-level math course: -1.67e-05 (1.42e-05)
(5) Completing 30 degree-applicable credits: 6.70e-06 (1.41e-05)
(6) Completing 60 degree-applicable credits: -8.23e-06 (1.15e-05)
(7) Number of degree-applicable units completed: -0.000231 (0.00107)
Observations: 19,581 per outcome.
*** p<0.01, ** p<0.05, * p<0.1
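The pre-trend check reported in Table 2 can be written out as a regression. This is a sketch of the specification the table note implies, using notation of my own choosing rather than the dissertation's:

```latex
Y_{ict} = \alpha + \beta\,(\mathit{CollegeX}_{c} \times \mathit{Year}_{t}) + \theta_{t} + \mu_{c} + \varepsilon_{ict}
```

Here $Y_{ict}$ is the outcome for student $i$ at college $c$ in semester $t$, $\theta_t$ and $\mu_c$ absorb semester and college effects, and $\beta$ is the College X*Year coefficient reported in the table; estimates close to zero are consistent with parallel pre-treatment trends.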
Table 3
The effect of self-placement relative to test-placement on student characteristics
This table presents changes in the demographic makeup of the student population during the
administration of math self-placement at College X. Semester-cohort fixed effects were included.
Standard errors are clustered at the campus of enrollment and are presented in parentheses.

Coefficient on Self-placement*Summer+Fall 2008, by characteristic (mean of dependent variable
in brackets):
(1) Female: -0.00365 (0.0290) [0.544]
(2) Asian/Pacific Islander: -0.00174 (0.00555) [0.143]
(3) Hispanic: 0.0110 (0.0105) [0.513]
(4) Black: -0.0111 (0.0101) [0.155]
(5) Other: 0.0217*** (0.00389) [0.065]
(6) White: -0.0137*** (0.00451) [0.124]
(7) Native English speaker: 0.0491*** (0.00871) [0.691]
(8) First English course was transfer-level: -0.00662 (0.0110) [0.133]
Observations: 29,901 per column; semester-cohort fixed effects included in all columns.
*** p<0.01, ** p<0.05, * p<0.1
Table 4
Summary statistics, by treatment college and period of self-placement versus test-based placement
This table compares demographic characteristics and outcomes for enrollees in College X who did
and did not experience self-placement against enrollees in test-based placement colleges during the
periods when College X did and did not use self-placement. The first three columns describe
College X (self-placement period, test-placement periods, difference Y2); the next three describe
the test-based placement colleges over the same periods (difference Y1); the final column is the
difference in difference (Y2-Y1).

| | X: self | X: test | Diff (Y2) | Controls: self | Controls: test | Diff (Y1) | DiD (Y2-Y1) |
| Female | 0.552 | 0.559 | -0.007 | 0.542 | 0.545 | -0.003 | -0.004 |
| Black | 0.111 | 0.122 | -0.011 | 0.156 | 0.161 | -0.005 | -0.006 |
| White | 0.134 | 0.166 | -0.032 | 0.112 | 0.123 | -0.011 | -0.021 |
| Hispanic | 0.531 | 0.491 | 0.040 | 0.534 | 0.508 | 0.026 | 0.014 |
| Asian | 0.134 | 0.166 | -0.032 | 0.129 | 0.144 | -0.015 | -0.017 |
| Native English | 0.870 | 0.794 | 0.076 | 0.719 | 0.669 | 0.050 | 0.026 |
| Withdrew from first math course | 0.184 | 0.266 | -0.082 | 0.175 | 0.193 | -0.018 | -0.064** |
| Failed first math course | 0.238 | 0.255 | -0.017 | 0.232 | 0.252 | -0.020 | 0.003 |
| Met the minimum math requirement for AA | 0.370 | 0.322 | 0.048 | 0.478 | 0.432 | 0.046 | 0.002 |
| Passed transfer-level math course | 0.236 | 0.139 | 0.097 | 0.202 | 0.177 | 0.025 | 0.072*** |
| Completed 30 degree-applicable units | 0.461 | 0.373 | 0.088 | 0.398 | 0.357 | 0.041 | 0.047 |
| Completed 60 degree-applicable units | 0.244 | 0.187 | 0.057 | 0.227 | 0.202 | 0.025 | 0.032 |
| Number of degree-applicable units | 32.28 | 27.89 | 4.39 | 31.07 | 28.10 | 2.97 | 1.42 |
*** p<0.01, ** p<0.05, * p<0.1
Table 5
Math placement and enrollment patterns for test- and self-placed students.
This table compares the percent of students in College X against the percent of students in
test-based placement colleges who placed into and enrolled across the different levels of the math
sequence before, during, and after College X employed self-placement. Dashes indicate that no
students were reported to be at that level. Within each wave (Wave 1: before the switch to
self-placement; Wave 2: during the period of self-placement; Wave 3: after the switch back to
COMPASS), rows report five through one levels below transfer, transfer-level math, tutoring, and
technical-career math, with columns giving placement level and first enrolled math course (in
percents) for College X and for the test-based placement colleges. [The cell values of this table did
not survive transcription and are omitted here.]
Table 6
Enrollment patterns by race and sex in College X during periods of test-placement and
self-placement.
This table compares demographic characteristics of enrollees who test-placed versus self-placed
into the math sequence (column entries are percents; each demographic column sums to 100 within
a placement period).

Period of test-placement:
| Level of first math course | African-American | Latino | White | Asian | Other | Female | Male |
| 4 levels below transfer | 14.76 | 8.10 | 7.04 | 5.99 | 13.19 | 8.90 | 8.36 |
| 3 levels below transfer | 30.48 | 29.05 | 17.61 | 19.01 | 18.68 | 28.80 | 20.42 |
| 2 levels below transfer | 34.29 | 40.60 | 52.46 | 42.96 | 40.66 | 39.58 | 45.49 |
| 1 level below transfer | 6.19 | 13.33 | 14.44 | 23.24 | 19.78 | 12.04 | 17.90 |
| Transfer | 0.48 | 1.07 | 1.76 | 4.23 | 2.20 | 1.78 | 1.59 |
| Tutoring | 13.81 | 7.86 | 6.69 | 4.58 | 5.49 | 8.90 | 6.73 |

Period of self-placement:
| Level of first math course | African-American | Latino | White | Asian | Other | Female | Male |
| 4 levels below transfer | 25.58 | 15.61 | 5.77 | 1.79 | 6.67 | 15.49 | 9.25 |
| 3 levels below transfer | 30.23 | 25.37 | 23.08 | 17.86 | 26.67 | 25.35 | 23.70 |
| 2 levels below transfer | 13.95 | 22.44 | 21.15 | 28.57 | 10.00 | 21.60 | 20.81 |
| 1 level below transfer | 6.98 | 19.02 | 11.54 | 21.43 | 13.33 | 15.96 | 17.34 |
| Transfer | 11.63 | 8.78 | 26.92 | 21.43 | 23.33 | 9.86 | 20.23 |
| Tutoring | 11.63 | 8.78 | 11.54 | 8.93 | 20.00 | 11.74 | 8.67 |
Table 7
The effect of switching from a test-based placement policy to a self-placement policy on
achievement, pooled cohorts
This table considers the reduced form effects of math self-placement on proximal and distal
achievement outcomes. All outcomes are measured within four years of math enrollment, and for
the entire study sample. All regressions controlled for student demographic characteristics and
semester-cohort and institutional fixed effects. Standard errors are clustered at the campus of
enrollment and are presented in parentheses.

Coefficient on Self-placement*Summer+Fall 2008, by outcome (mean of outcome variable in
brackets):
(1) Failing first math course: 9.04e-05 (0.00367) [0.193]
(2) Withdrawing from first math course: -0.0531*** (0.0112) [0.247]
(3) Meeting minimum math requirement for an AA: 0.0145** (0.00732) [0.436]
(4) Passing a transfer-level math course: 0.0820*** (0.00845) [0.181]
(5) Completing 30 degree-applicable units: 0.0542*** (0.00742) [0.369]
(6) Completing 60 degree-applicable units: 0.0375*** (0.00873) [0.207]
(7) Number of degree-applicable units completed: 2.034*** (0.592) [28.859]
Observations: 29,901. All columns include semester-cohort fixed effects, student-level
demographics, and institutional fixed effects.
*** p<0.01, ** p<0.05, * p<0.1
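The reduced-form model described in the note can be sketched as a difference-in-differences specification. The notation below is my own, not the dissertation's:

```latex
Y_{ict} = \beta\,(\mathit{SelfPlacement}_{c} \times \mathit{Post}_{t}) + X_{i}'\gamma + \theta_{t} + \mu_{c} + \varepsilon_{ict}
```

where $\mathit{Post}_t$ flags the summer and fall 2008 semesters, $X_i$ holds student demographics, $\theta_t$ and $\mu_c$ are the semester-cohort and institutional fixed effects, and standard errors are clustered by campus of enrollment. $\beta$ corresponds to the reported Self-placement*Summer+Fall 2008 coefficient.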
Table 8
Heterogeneous effects of self-placement by level of first math course, sex, and race
This table considers the reduced form effects of math self-placement on proximal and distal
achievement outcomes by math enrollment level, gender, and race. All outcomes are measured
within four years of math enrollment, and for the full study sample. Dashes indicate that the effect
of math self-placement on achievement could not be computed by STATA because no students in
either group achieved those outcomes. All regressions controlled for student demographic
characteristics and semester-cohort and institutional fixed effects. Standard errors are clustered at
the campus of enrollment and are presented in parentheses.
Columns: (1) failing first math course; (2) withdrawing from first math course; (3) meeting
minimum math requirement for an AA degree; (4) passing a transfer-level math course;
(5) completing 30 degree-applicable units; (6) completing 60 degree-applicable units; (7) number
of degree-applicable units completed.

A. Level of first math course
| Self-placement * 4 levels below transfer | -0.142*** (0.017) | -0.219*** (0.026) | 0.015 (0.009) | -- | 0.059*** (0.015) | 0.030*** (0.005) | 0.90*** (0.805) |
| Self-placement * 3 levels below transfer | 0.008 (0.021) | -0.114*** (0.019) | 0.020 (0.039) | 0.074*** (0.017) | 0.078*** (0.036) | 0.079*** (0.026) | 8.99*** (2.455) |
| Self-placement * 2 levels below transfer | 0.076*** (0.013) | -0.021 (0.025) | 0.049** (0.022) | 0.097*** (0.009) | 0.127** (0.012) | 0.038* (0.021) | 4.91*** (1.372) |
| Self-placement * 1 level below transfer | 0.017*** (0.007) | 0.064*** (0.015) | -0.136*** (0.005) | -0.119*** (0.013) | -0.062*** (0.013) | -0.070*** (0.015) | -6.07*** (0.893) |
| Self-placement * Transfer-level | -0.022 (0.017) | 0.060*** (0.012) | 0.069*** (0.023) | 0.072*** (0.025) | 0.314*** (0.0243) | 0.389*** (0.022) | 25.44*** (2.000) |
| Self-placement * Tutoring | -0.052 (0.046) | -0.226*** (0.044) | -- | -- | -0.183*** (0.017) | -0.167*** (0.014) | -18.13*** (1.502) |

B. Gender
| Self-placement * Male | -0.036*** (0.006) | -0.063*** (0.013) | 0.053*** (0.006) | 0.113*** (0.008) | 0.075*** (0.009) | 0.044*** (0.008) | 3.55*** (0.731) |
| Self-placement * Female | 0.034*** (0.005) | -0.068*** (0.013) | -0.039*** (0.008) | 0.033*** (0.006) | 0.023*** (0.008) | 0.024*** (0.009) | 0.87 (0.690) |

C. Race
| Self-placement * White+Asian | 0.035*** (0.005) | -0.076*** (0.012) | 0.043*** (0.009) | 0.110*** (0.008) | 0.069*** (0.011) | 0.051*** (0.015) | 3.06*** (0.772) |
| Self-placement * Black+Hispanic | -0.014** (0.005) | -0.059*** (0.014) | -0.022** (0.010) | 0.044*** (0.008) | 0.034*** (0.011) | 0.022** (0.009) | 1.31 (0.826) |

Mean of dependent variable: 0.247 | 0.193 | 0.438 | 0.183 | 0.370 | 0.208 | 28.94
N = 29,668. All columns include semester-cohort fixed effects, student-level demographics, and
institutional fixed effects.
*** p<0.01, ** p<0.05, * p<0.1
Table 9
The effect of switching from a test-based placement policy to a self-placement policy on
achievement, summer and fall cohorts
This table considers the reduced form effects of math self-placement on proximal and distal
achievement outcomes. All outcomes are measured within four years of math enrollment, and for
students who placed and enrolled in math during the summer and fall semesters. All regressions
controlled for student demographic characteristics and semester-cohort and institutional fixed
effects. Standard errors are clustered at the campus of enrollment and are presented in parentheses.

Coefficient on Self-placement*Summer+Fall 2008, by outcome (mean of outcome variable in
brackets):
(1) Failing first math course: 0.00233 (0.00509) [0.184]
(2) Withdrawing from first math course: -0.0536*** (0.0120) [0.247]
(3) Meeting minimum math requirement for an AA degree: 0.00219 (0.00746) [0.454]
(4) Passing a transfer-level math course: 0.0652*** (0.00782) [0.191]
(5) Completing 30 degree-applicable units: 0.0231*** (0.00698) [0.393]
(6) Completing 60 degree-applicable units: 0.0191** (0.00800) [0.225]
(7) Number of degree-applicable units completed: 0.296 (0.576) [30.64]
Observations: 22,149. All columns include semester-cohort fixed effects, student-level
demographic characteristics, and institutional fixed effects.
*** p<0.01, ** p<0.05, * p<0.1
Table 10
Falsification test
This table considers the reduced form effects of math self-placement on proximal and distal
achievement outcomes one year before the actual administration of math self-placement at College
X. All outcomes are measured within four years of math enrollment, and for the entire study
sample. All regressions controlled for student demographic characteristics and semester-cohort and
institutional fixed effects. Standard errors are clustered at the campus of enrollment and are
presented in parentheses.

Placebo coefficient, by outcome:
(1) Failing first math course: 0.00202 (0.00645)
(2) Withdrawing from first math course: 0.0206** (0.00977)
(3) Meeting the minimum math requirement for an AA degree: 0.000725 (0.00907)
(4) Passing a transfer-level math course: -0.0311*** (0.00651)
(5) Completing 30 degree-applicable credits: 0.0110* (0.00603)
(6) Completing 60 degree-applicable credits: 0.00171 (0.00640)
(7) Number of degree-applicable units completed: 0.599 (0.507)
Observations: 29,901. All columns include semester-cohort fixed effects, student-level
demographics, and institutional fixed effects.
*** p<0.01, ** p<0.05, * p<0.1
Table 11
Permutation tests
This table presents placebo estimates, which each measure the effect of math self-placement on
achievement if the administration of self-placement had occurred at one of the eight remaining
colleges employing a test-based placement model. For each outcome, the first value is the reduced
form effect from the main model and the range gives the bottom and top placebo estimates. All
regressions controlled for student demographic characteristics and semester-cohort and
institutional fixed effects. Standard errors are clustered at the campus of enrollment.

(1) Failing first math course: 9.04e-05; placebo range -0.0149 to 0.0268
(2) Withdrawing from first math course: -0.0531; placebo range -0.0696 to 0.0524
(3) Meeting minimum math requirement for an AA degree: 0.0145; placebo range -0.051 to 0.0244
(4) Passing a transfer-level math course: 0.082; placebo range -0.046 to 0.0284
(5) Completing 30 degree-applicable units: 0.0542; placebo range -0.0683 to 0.0188
(6) Completing 60 degree-applicable units: 0.0375; placebo range -0.084 to 0.0437
(7) Number of degree-applicable units completed: 2.034; placebo range -5.992 to 1.474
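The logic of these permutation tests, re-estimating the model after pretending each control college in turn had adopted self-placement, can be sketched on synthetic data. Everything below (the college count, the simulated effect size, the simple means-based DiD estimator) is illustrative and not taken from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(42)

N_COLLEGES = 9          # one "College X" plus eight comparison colleges
N_PER_CELL = 500        # simulated students per college-by-period cell
TRUE_EFFECT = 0.15      # simulated effect of self-placement on a pass rate

# Simulate a binary outcome for every college in a pre and a post period;
# only college 0 actually receives the "self-placement" treatment post-switch.
college, post, y = [], [], []
for c in range(N_COLLEGES):
    for p in (0, 1):
        rate = 0.20 + 0.01 * c + TRUE_EFFECT * (p == 1 and c == 0)
        college.append(np.full(N_PER_CELL, c))
        post.append(np.full(N_PER_CELL, p))
        y.append(rng.binomial(1, rate, N_PER_CELL))
college, post, y = map(np.concatenate, (college, post, y))

def did(pseudo_treated):
    """Simple difference-in-differences: the pre/post change at the
    pseudo-treated college minus the pre/post change everywhere else."""
    t = college == pseudo_treated
    change_t = y[t & (post == 1)].mean() - y[t & (post == 0)].mean()
    change_c = y[~t & (post == 1)].mean() - y[~t & (post == 0)].mean()
    return change_t - change_c

actual = did(0)                                    # the real estimate
placebos = [did(c) for c in range(1, N_COLLEGES)]  # one per control college

# As in Table 11, the real estimate should sit outside (or at the edge of)
# the range spanned by the eight placebo estimates.
print(f"actual: {actual:.3f}")
print(f"placebo range: {min(placebos):.3f} to {max(placebos):.3f}")
```

A permutation distribution built this way gives a sense of how large an estimate could arise by chance from college-to-college noise alone, without relying on asymptotic standard errors.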
Figure 1. Parallel Trend Pre-Implementation of DSP before Summer 2008 in College X. [Panels
plot each outcome by semester: six panels on a proportion scale and one, for units completed, on a
count scale.]
Figure 2. Series of Distributions of DSP Placebo Effects on Achievement. [Each panel is a
histogram of the Self-placement * Treated Semesters placebo coefficients for one outcome.]
Note. Thick bars represent the number of placebo tests whose coefficients fall in between two
numbers on the line graph; the red line represents the College X value.
APPENDIX C
Table 1
Compensatory and non-compensatory decision strategies. Adapted from Pfeiffer (2012), Interactive Decision Aids in E-Commerce.

Additive Difference Rule (compensatory): Employed to iteratively perform pairwise comparisons of the utilities calculated for each alternative until one alternative is left.

Conjunctive Strategy (non-compensatory): Used to eliminate an alternative if at least one attribute does not meet a pre-determined cutoff level.

Disjunctive Strategy (non-compensatory): Used to eliminate an alternative if all of its attributes fail to meet a pre-determined cutoff level.

Dominance Strategy (non-compensatory): Used to choose an alternative that dominates all other alternatives. If no alternative dominates, then no alternative is chosen or an alternative is chosen at random.

Elimination by Aspects (EBA) (non-compensatory): Establishes a cutoff value for the most important attribute and eliminates all alternatives that do not meet or exceed that cutoff value.

Frequency of Good or Bad Attribute Features Heuristic (FGB) (non-compensatory): Used to distinguish between good, neutral, and bad attribute values for each alternative; the alternative with the highest number of good attribute levels is retained.

Lexicographic Heuristic (non-compensatory): Determines the most important attribute, evaluates that attribute across all alternatives, and retains the alternative exhibiting the highest value for that attribute.

Majority of Confirming Dimensions Heuristic (non-compensatory): Similar to the Additive Difference Rule; decision-makers engage in a similar comparative process but base their choice on preference instead of utility.

Satisficing Heuristic (non-compensatory): Iteratively considers alternatives in the order in which they occur; decision-makers select the first alternative where all attribute levels are above a pre-determined cutoff level.

Satisficing Plus Heuristic (non-compensatory): Similar to the Satisficing Heuristic, but decision-makers select an alternative based on an incomplete evaluation of attributes.

Weighted Additive Rule (compensatory): Used to compute a utility for each alternative based on the sum of the weighted attribute values.
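Two of these strategies can be made concrete in code: the compensatory Weighted Additive Rule and the non-compensatory Elimination by Aspects. The sketch below is illustrative only; the course names, attributes, ratings, and weights are invented and not taken from the study.

```python
# Toy alternatives: each course rated on two attributes (0-10 scale).
# Names, attributes, and ratings are invented for illustration.
courses = {
    "Arithmetic":         {"readiness": 9, "transferability": 1},
    "Elementary Algebra": {"readiness": 7, "transferability": 3},
    "College Algebra":    {"readiness": 5, "transferability": 8},
    "Statistics":         {"readiness": 4, "transferability": 9},
}

def weighted_additive(alternatives, weights):
    """Compensatory: compute a weighted-sum utility per alternative,
    keep the one with the highest utility."""
    utility = {name: sum(weights[attr] * value for attr, value in attrs.items())
               for name, attrs in alternatives.items()}
    return max(utility, key=utility.get)

def elimination_by_aspects(alternatives, aspects):
    """Non-compensatory: for each (attribute, cutoff) in order of importance,
    drop alternatives below the cutoff; stop once one alternative remains."""
    remaining = dict(alternatives)
    for attr, cutoff in aspects:
        survivors = {n: a for n, a in remaining.items() if a[attr] >= cutoff}
        if survivors:               # never eliminate every alternative
            remaining = survivors
        if len(remaining) == 1:
            break
    return list(remaining)

# A student weighting transfer credit twice as heavily as readiness:
print(weighted_additive(courses, {"readiness": 1, "transferability": 2}))
# EBA with transferability (cutoff 5) as the most important aspect,
# then readiness (cutoff 5):
print(elimination_by_aspects(courses, [("transferability", 5), ("readiness", 5)]))
```

The contrast is visible in the outputs: the compensatory rule lets a high transferability score offset low readiness, while EBA simply discards any course that misses a cutoff, regardless of its other attributes.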
Table 2
Simplification techniques and decision biases

Anchoring: Influences people to base decisions on judgments made at the start of the decision-making process (Bodenhausen, Gabriel, & Lineberger, 2000). Example: Students may reach a placement decision without carefully considering all of the information provided in their self-placement guide.

Ascertainment bias: Pre-shapes the decision-making process around what the individual hopes or thinks she or he is expected to find; stereotyping and gender bias are examples of this bias (Terwilliger & Weiss, 2003). Example: Students, particularly women and students of color, may engage in diagnostic and decision-making processes in ways that confirm societal beliefs about their math abilities.

Availability bias: Influences people to make a decision based on how easily they can recall events (Croskerry, 2002). Example: Students who have difficulty recalling their performance in some math courses may skip reviewing those courses in reaching a placement decision.

Confirmation bias: Leads individuals to interpret evidence in ways that are partial to existing beliefs, expectations, or hypotheses (Nickerson, 1998). Example: Students may choose a course even in the face of evidence disconfirming that they have met the academic qualifications for that course.

Framing: Defines how an individual approaches a problem depending on his or her norms, habits, and expectancies (Tversky & Kahneman, 1986). Example: Students who like academic challenges may frame their decision-making approach such that they forgo searching for and evaluating courses that appear too easy.

Order effects: Affect what information individuals use to make decisions based on the order in which information is presented (Hogarth & Einhorn, 1992). Example: Students may make placement decisions that are unintentionally uninformed by pertinent information received at the start of their decision-making.

Overconfidence bias: Convinces individuals that they know more than they actually do (Croskerry, 2002). Example: Students may be compelled to pick a more rigorous math course because they succumb to the illusion that they are "good" or "not bad" at math.

Premature closure: A type of anchoring; encourages individuals sensitive to emotions and time pressures to preemptively make a decision without considering all relevant information that could produce a different decision (Croskerry, 2002). Example: Students who are apathetic about choosing an appropriate math course may intentionally ignore information in their self-placement guide to pick a course quickly.

Search satisficing: Leads individuals to make a decision without fully examining all available options (Caplin, Dean, & Martin, 2011). Example: Students may fail to examine all course alternatives, and their attributes, when reaching a final placement decision.

Visceral bias: Elicits a strong emotional response that limits an individual's diagnostic and decision processes (Nendaz & Perrier, 2012). Example: Students who are anxious about math may engage in decision-making that leads them to ignore alternatives that might further weaken their confidence in their math abilities.
Table 3
Summary statistics of participants from quantitative study, aggregated and by college

                              All           College Y     College X
                              N     Mean    N     Mean    N     Mean
Female                        130   0.508   81    0.469   49    0.571
Latino                        131   0.726   82    0.805   49    0.591
African-American              131   0.031   82    0.012   49    0.061
Asian                         131   0.045   82    0.037   49    0.061
White                         131   0.099   82    0.049   49    0.184
Age                           129   22.83   81    21.84   48    24.50
First-generation              130   0.515   81    0.518   49    0.510
Last math course
  Algebra 1                   130   0.100   81    0.086   49    0.122
  Geometry                    130   0.208   81    0.160   49    0.286
  Algebra 2                   130   0.369   81    0.383   49    0.347
  College-level math*         130   0.169   81    0.198   49    0.123
  Other                       130   0.069   81    0.074   49    0.061
  Don't remember              130   0.085   81    0.099   49    0.061
Passed last math course       131   0.802   81    0.829   49    0.755
Avg math self-efficacy        130   3.921   81    3.938   49    3.892
Avg math self-concept         129   2.780   80    2.802   49    2.744

Note. *College-level math includes trigonometry, pre-calculus, statistics, and calculus. Sample sizes vary due to non-response.
Table 4
Summary statistics, first and second treatment

                                                   First Treatment    Second Treatment
                                                   Mean      SD       Mean      SD       T stat
Female                                             0.477     0.062    0.538     0.062    -0.698
Asian                                              0.045     0.026    0.046     0.026    -0.019
White                                              0.106     0.038    0.092     0.036     0.261
African-American                                   0.030     0.021    0.031     0.022    -0.015
Latino                                             0.697     0.057    0.754     0.054    -0.725
First-generation status                            0.508     0.062    0.523     0.062    -0.174
Took college-level math as last H.S. math course   0.136     0.043    0.200     0.050    -0.970
Passed last math course                            0.758     0.053    0.846     0.045    -1.269
Level of avg math self-efficacy                    3.900     0.087    3.942     0.095     0.328
Level of avg math self-concept                     2.799     0.078    2.762     0.080     0.341
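A quick consistency check on the T stat column: treating the reported dispersion values as standard errors of the group means (an interpretation inferred from the numbers, not stated explicitly in the table) reproduces the statistic for the first row up to rounding.

```python
import math

# First row of Table 4 (Female): group means and reported dispersions,
# read here as standard errors of the group means.
mean1, se1 = 0.477, 0.062   # first treatment
mean2, se2 = 0.538, 0.062   # second treatment

# Two-sample statistic: difference in means over the SE of the difference.
t_stat = (mean1 - mean2) / math.sqrt(se1**2 + se2**2)
print(round(t_stat, 3))  # close to the reported -0.698
```

The same arithmetic applied to the other rows gives values consistent with the reported column, which is what makes the standard-error reading plausible.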
Table 5
Descriptive characteristics of participants from qualitative study (N = 16)

Latina females: 6
Latino males: 2
Mixed-race females (Latina/White): 1
White males: 3
Mixed-race males (Chinese/White; Latino/White): 2
African-American males: 1
Asian males (Iranian): 1
First-generation: 5
Age range: 18-30
Last math course taken: Algebra I, Geometry, Algebra II, Statistics, Math Analysis
Table 6
Reasons for choosing first math course, by percent reported

Reason                                                                Percent
Math course followed last math course in high school                  31%
In need of refresher math course / out of school                      22%
Negative associations or feelings about math                          17%
Confidence in succeeding in course / comfort with course material     15%
Interest in / affinity for math course                                 4%
Counted as credit towards upward transfer                              4%
Location of course in sequence                                         2%
"What college students take"                                           2%
Not reported                                                           7%

Note. Percents sum to more than 100 because some participants reported more than one reason for choosing their first math course.
Table 7
Predictors of naïve and informed placement decisions

                                                     Naïve placement      Informed placement
                                                     decision             decision
                                                     β        SD          β        SD
Took last math course in high school in 2014          0.190    0.114       0.186    0.105
Took college-level math as last math course in h.s.   0.269    0.137       0.243    0.133
Passed last math course in high school                0.120    0.121      -0.028    0.120
Average math self-efficacy                           -0.013    0.082      -0.014    0.076
Average math self-concept                             0.311    0.102       0.289    0.093
Female                                                0.027    0.099       0.026    0.092
First generation                                     -0.065    0.101      -0.096    0.092
Age as of January 1, 2015                            -0.030    0.014      -0.015    0.010
White/Asian                                           0.094    0.145      -0.002    0.138
Table 8
Overview of decision-making patterns for participants in qualitative component.

Angela (F, Latina, 30): approach "I'm not good at math"; last math course: doesn't remember; biases: search satisficing; placement decision: Pre-algebra.
Araceli (F, Latina, 18): last math course: Algebra II; biases: search satisficing; placement decision: Intermediate algebra.
Cameron (M, White, 27): last math course: doesn't remember; information focus: none; biases: premature closure; placement decision: Arithmetic.
Chris (M, White, 18): approach "I'm not good at math"; last math course: Algebra II; biases: search satisficing; placement decision: Intermediate algebra.
Cynthia (F, Latina, 18): last math course: Algebra II; biases: search satisficing, overconfidence bias; placement decision: Intermediate algebra.
Daniel (M, Latino, 20): last math course: Algebra II; biases: search satisficing; placement decision: Elementary algebra.
Eric (M, Latino, 18): approach "I'm not good at math"; last math course: Algebra I; information focus: course content; biases: search satisficing; placement decision: Arithmetic.
Gloria (F, Latina, 30): last math course: doesn't remember; information focus: course content; biases: search satisficing; placement decision: Arithmetic.
Jonathan (M, African-American, 18): last math course: Algebra II; biases: search satisficing, overconfidence bias; placement decision: Elementary algebra.
Jeff (M, White, 24): last math course: Geometry; information focus: none; biases: premature closure; placement decision: Intermediate algebra.
Kristen (F, Latina, 18): last math course: Algebra II; biases: search satisficing; placement decision: Intermediate algebra.
Linda (F, Latina, 30): last math course: doesn't remember; biases: search satisficing, premature closure; placement decision: Pre-algebra.
Mani (M, Middle Eastern, 18): last math course: Statistics; biases: search satisficing, confirmation bias; placement decision: Intermediate algebra.
Mark (M, Mixed Chinese/White, 24): approach "I'm not good at math"; last math course: Algebra II; biases: search satisficing, emotional response; placement decision: Elementary algebra.
Matt (M, Mixed Latino/White, 18): last math course: Math Analysis; biases: search satisficing, overconfidence bias; placement decision: Statistics.
Rachel (F, Mixed Latina/White, 18): last math course: Algebra II; information focus: course content; biases: search satisficing, visceral bias, anchoring; placement decision: College algebra.

The table also records Approach, Information focus, and Decision strategies for each participant; entries not tied to a specific row above are summarized here. Approaches for the remaining twelve participants split evenly between "I need math for my academic goals" (6) and "I'm here to learn" (6). Information foci combined the math sequence, course content, course pre-requisites, course transferability, and pre-requisite math problem sets. Decision strategies across the sixteen participants were EBA (9), Satisficing Plus (4), EBA with Satisficing Plus (1), EBA with Satisficing (1), and FGB with Satisficing Plus (1).
Figure 1. Concurrent Mixed Method Design Map (adapted from Creswell et al., 2003).
QUANTITATIVE strand. Data collection method: online computer survey (embedded RCT). Data analysis methods: descriptive statistics + OLS regression.
QUALITATIVE strand. Data collection methods: interview + verbal protocols. Data analysis methods: decision-coding / thematic analysis.
Results from the two strands are compared, followed by interpretation of the combined results.
Figure 2. Quantitative research design.
Survey flow: survey opens → introduction to scenario → given math sequence visual → picked initial placement → randomization into Treatment 1 (course descriptions) or Treatment 2 (course descriptions + math problems) → given chance to change initial placement decision → reported supports needed for DSP to work → reported demographic, math self-efficacy, and math self-concept data.
Figure 3. Qualitative research design.
Interview + talk-alouds: open-ended interview; concurrent verbal protocol; retrospective verbal protocol.
Figure 4. Percent of placement decisions by math level, before and after receipt of treatment
Figure 5. The impact of including math problem sets in self-placement guides on changing the naïve placement decision.
[Chart: 31% of participants changed their naïve placement decision with math problems vs. 20% without.]
Figure 6. Rachel's information search and processing patterns.
Rachel divides the courses into three blocks: appealing courses (pre-Calculus, Calculus); courses already taken (pre-algebra, Elementary Algebra); and unfamiliar courses (Arithmetic, Intermediate Algebra, Math for Liberal Arts, Statistics, College Algebra).
Diagnosis: searches the course content for each course.
Choice: eliminates both courses already taken. Strategy: EBA (aspect: previous experience).
Choice: eliminates Arithmetic, Intermediate Algebra, Math for Liberal Arts, Statistics, and Calculus. Strategy: EBA (aspects: readiness + degree of difficulty).
Choice: keeps College Algebra and pre-Calculus.
Choice: eliminates pre-Calculus. Strategy: Disjunctive.
Final decision: College Algebra.
Figure 7. Matt's information search and processing patterns.
Diagnosis: reviews the math sequence.
Choice: eliminates Arithmetic, pre-algebra, Elementary Algebra, and Intermediate Algebra. Strategy: EBA (aspect: transferability).
Choice: keeps College Algebra, Statistics, pre-Calculus, Calculus, and Math for Liberal Arts.
Diagnosis: draws on hearsay, past experiences, and course content.
Choice: keeps Statistics and pre-Calculus. Strategy: search satisficing.
Diagnosis: scans the math problems.
Choice: eliminates pre-Calculus. Strategy: EBA (aspect: problems too difficult).
Final decision: Statistics.
Figure 8. Eric's information search and processing patterns.
Diagnosis: reads the course content.
Choice: eliminates all alternatives but Arithmetic. Strategy: EBA (aspect: level of rigor).
Choice: selects Arithmetic. Strategy: satisficing.
Final decision: Arithmetic.
Figure 9. The number of math problem sets reviewed by respondents assigned to the second treatment.
Figure 10. Distribution of pre-requisite math problem sets attempted, by math level
APPENDIX D
Figure 1. Visual Depiction of the Math Sequence
Figure 2. Example of course description.

COURSE DESCRIPTIONS
PRE-ALGEBRA
Overview: The purpose of this class is to help you make the transition from arithmetic to algebra. This course reviews arithmetic operations including percentages and measurement, and provides an introduction to how to solve equations and word problems.
Pre-requisites: Arithmetic
Degree-Applicable Credit: This course does not count for credit towards an Associate's or Bachelor's degree.
Figure 3. Example of pre-requisite math problems for Elementary Algebra.
Interested in taking elementary algebra? Solve the following problems.
1) Order these numbers from lowest to highest: 1/8, -4/5, 2/5, 1/12, -3/4
2) Use simple interest to find the ending balance: $34,100 at 4% for 3 years
3) Solve: 3 × 6 × 8
4) List all of the positive factors for: 30