MATH AND THE MAKING OF COLLEGE OPPORTUNITY: PERSISTENT PROBLEMS AND POSSIBILITIES FOR REFORM

Federick Joseph Ngo

A Dissertation Presented to the FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION, UNIVERSITY OF SOUTHERN CALIFORNIA, In Partial Fulfillment of the Requirements for the Degree DOCTOR OF PHILOSOPHY (URBAN EDUCATION POLICY)

MAY 2017

Acknowledgements

This research was supported by a grant from the American Educational Research Association, which receives funds for its "AERA Grants Program" from the National Science Foundation under Grant #DRL-0941014. It was also supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305A100381 to the University of Southern California. Opinions reflect those of the author and do not necessarily reflect those of the granting agencies.

Dedication

I dedicate this dissertation to my mother, Lindsey Lor, and my father, Mark Se Ngo, who since fleeing Cambodia in 1979 have given me a life beyond my dreams. Thank you also to my wife, Diana, and daughter, Nara, for being my home. My deepest gratitude goes out to my Ph.D. advisor, Dr. Tatiana Melguizo, for her unwavering support and for the numerous opportunities she has helped to create during my doctoral training. I am also grateful for my committee members, Dr. Estela Bensimon and Dr. Ann Owens, whose insightful comments and challenging questions helped to refine this work. Many thanks as well to Dr. Katharine Strunk for her enthusiastic support throughout this journey. Finally, I also dedicate this dissertation to the amazing students I taught in Oakland, California. Your stories, hopes, and dreams inspired this work and will continue to motivate me in what is to come.

Abstract

In three studies, I provide a unique look at how policies related to math course-taking and math placement serve to create and curtail college opportunity.
The first draws upon national data to examine math course-taking in the transition to college, one of the most universal undergraduate experiences. I define and empirically demonstrate the prevalence of "redundant" course-taking – when students repeat a high school course and this repetition does not appear to be warranted by academic preparation. Using matching methods, I demonstrate how redundant mathematics decreases college persistence, lowers degree attainment, and diverts students from STEM pathways, thereby perpetuating inequality in higher education. The second and third studies investigate the mechanisms by which remedial and redundant math course-taking might arise – through assessment and placement policies and through organizational practices in mathematics – while also offering much-needed evidence to guide reforms. The findings of the second suggest that a more holistic approach to course placement, one that includes indicators of academic and non-academic attributes, can benefit students: they can gain access to higher-level courses in which they are likely to be successful. The third study demonstrates how organizational practices in mathematics, in this case extending algebra courses from one to two semesters, can be detrimental to college student outcomes. Using a regression discontinuity design, I show that extended math, though well-intentioned, significantly decreases student persistence and the likelihood of completing required math courses. I discuss how these studies together frame and forge a research agenda on math and the making of college opportunity.

Table of Contents

Introduction
High School All Over Again: The Problem of Redundant College Mathematics (Federick J. Ngo)
Mathematics Placement Using Holistic Measures: Possibilities for Community College Students (Federick J. Ngo, W. Edward Chi, and Elizabeth S. Park)
How Extending Time in Developmental Math Impacts Persistence and Success: Evidence from a Regression Discontinuity in Community Colleges (Federick J. Ngo and Holly Kosiewicz)
Math and the Making of College Opportunity: A Research Agenda

Introduction

There is no question that mathematics has long been and remains a fundamental component of U.S. undergraduate education. Not only do expectations of mathematics proficiency pervade the college admissions process, they continue to shape student opportunity throughout college (Hacker, 2016). For example, a survey conducted in 2015 by the American Association of Colleges and Universities (AAC&U) found that 92 and 94 percent of responding member institutions cited knowledge of mathematics and quantitative reasoning, respectively, as desired student learning outcomes for all undergraduates, up from 87 and 91 percent in 2008 (AAC&U, 2016). Mathematics is therefore often a general education requirement for degree completion and serves a "gatekeeping" role in higher education (Bryk & Treisman, 2010). The role of math in shaping opportunity in higher education is also apparent in science, technology, engineering, and mathematics (STEM) fields, since proficiency with advanced math topics is strongly predictive of and typically a prerequisite for entry into STEM pathways (Chen, 2013; Maltese & Tai, 2011). Advanced math courses such as pre-calculus and calculus have long stood as gatekeepers for college students pursuing degrees in STEM fields (Burdman, 2015; Treisman, 1992). The significance of mathematics in higher education has motivated a wealth of research.
Key areas of inquiry include: the teaching practices of math faculty (e.g., Grubb, 1999; Mesa, Celis, & Lande, 2014), the impact of developmental and remedial math education (e.g., Bettinger & Long, 2009; Martorell & McFarlin, 2011), issues of identity and sense of belonging in math and STEM fields (e.g., Sax et al., 2015), the stigma associated with remediation (e.g., Deil-Amen & Rosenbaum, 2002), equity in math courses (Bustillos, Rueda, & Bensimon, 2011), and racialized experiences of mathematics (e.g., McGee & Martin, 2011). This literature has contributed to our understanding of how math affects college opportunity and experiences, and it has informed a host of reforms and interventions in curriculum and instruction, student support services, and higher education policy. This dissertation, consisting of three empirical studies, expands upon the aforementioned literature by focusing on the key but less well-studied issues of math course-taking and math placement testing in the transition to college. Specifically, I document and describe the extent of two persistent problems in mathematics in higher education – redundant course-taking and remedial math placement – and propose possibilities for reform in the assessment, placement, and organizational practices that typically lead to them.

Persistent Problems

In the first study, I consider the puzzling reality that for many college students, math in college is essentially high school math all over again. While repeating courses or taking remedial courses may be beneficial to under-prepared college students, I focus on course-taking that may be redundant – when students take a math class they have already completed even though they likely could have passed a higher-level course.
Using linked high school and college records for a national sample of students available in the Education Longitudinal Study of 2002 (ELS:02), I document the first college math courses that students take and examine how they compare with math courses taken in high school. This rich variation in college course-taking and available student background data, which include the results of a 12th-grade low-stakes math assessment and other indicators of math preparation, allow me to estimate the extent of redundant college math experiences. I inspect patterns of redundant math course-taking among the college-going population and use matching methods to examine how redundant mathematics may affect grades, college persistence, and degree attainment in both two- and four-year colleges. The findings indicate that the first college math courses of a significant proportion of students – between 15 and 30 percent depending on model specification – are redundant and likely unnecessary. Matching analyses show that redundant math experiences reduce student persistence and the likelihood of degree completion, particularly in STEM fields, for which math is a notorious gatekeeping subject. This study is the first to define and provide empirical evidence of redundant mathematics as a higher education experience. As such, it provides an important launching point for further analysis of college math course-taking and its repercussions for student success in higher education.

Possibilities for Reform

The primary implication of this analysis of redundant math course-taking is not necessarily to remove math requirements in higher education altogether, but rather to better understand and improve how students choose or are assigned to their college math coursework. If students can begin math course-taking in the courses best suited for them, they might be more likely to achieve their educational goals.
In the community college setting, this decision is typically aided by placement tests (Hughes & Scott-Clayton, 2011). Although these instruments are used with the goal of determining students' readiness for college-level coursework, they are admittedly imperfect tools. Researchers have documented significant problems, including placement error and misdiagnosis of remediation needs (e.g., Scott-Clayton, Crosta, & Belfield, 2014). Others have explored the consequences of mis-placement, which can hinder the educational progression and attainment of community college students (Melguizo, Bos, Ngo, Mills, & Prather, 2016). Qualitative research examining the approaches and decisions inherent in implementing assessment and placement policies in community colleges has also unpacked issues associated with over-reliance on placement testing. For example, although colleges benefit from the batch-efficiency of commercially available placement testing products, faculty and staff often report not having adequate technical expertise or support to evaluate whether the tools and instruments they are using are effective and in the best interest of students (Melguizo, Kosiewicz, Prather, & Bos, 2014). The second study, co-authored with W. Edward Chi and Elizabeth Park and presented in Chapter 3, addresses some of these concerns with placement testing and explores new possibilities for reform in placement testing practices. Specifically, we examine the practice of placing community college students in mathematics courses using a holistic approach that considers measures beyond placement test scores. This includes academic background measures, such as high school GPA and prior math courses taken, and indicators of non-cognitive constructs, such as motivation and time use. The practice of using "multiple measures" is spreading to community college systems across the country (Burdman, 2012), but there is relatively little evidence to guide these policy efforts.
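To make the idea of multiple-measures placement concrete, the sketch below shows what such a rule could look like in principle. The weights, cutoffs, course levels, and the motivation index are all invented for illustration; they do not reproduce the placement algorithm of any college or district discussed in these studies.

```python
# Hypothetical multiple-measures placement rule (illustrative only).
# All weights and cutoffs below are invented for this sketch.

def place_student(test_score, hs_gpa, motivation):
    """Return a course level from 0 (arithmetic) to 3 (college-level).

    test_score: placement test score, 0-100
    hs_gpa:     high school GPA, 0.0-4.0
    motivation: self-reported motivation index, 0.0-1.0
    """
    # Composite score: the test still dominates, but GPA and the
    # non-cognitive indicator can move a borderline student up.
    composite = 0.6 * test_score + 7.5 * hs_gpa + 10 * motivation
    if composite >= 80:
        return 3  # college-level math
    elif composite >= 60:
        return 2  # intermediate algebra
    elif composite >= 40:
        return 1  # elementary algebra
    return 0      # arithmetic

# The same borderline test score places higher when strong auxiliary
# measures are considered alongside it.
test_only = place_student(62, 0.0, 0.0)
holistic = place_student(62, 3.6, 0.8)
print(test_only, holistic)  # prints: 0 2
```

The design point is simply that a composite of several signals can distinguish students whom a single test score would lump together, which is the intuition behind the holistic placement approach examined in the second study.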
Importantly, while there is some empirical evidence that academic measures (e.g., high school GPA; prior math course-taking) may be beneficial for assessment and placement (Ngo & Kwon, 2015; Scott-Clayton et al., 2014), there is an outstanding question as to whether indicators of non-academic/non-cognitive skills and attributes could also be incorporated into placement practices. We provide this needed evidence by examining possibilities for using academic and non-academic measures for math placement. First, we use supplemental administrative data gathered during routine placement testing in a large urban community college district in California and conduct predictive exercises that identify severe placement errors under current practice. We find that as many as one-quarter of students may be mis-assigned to their math courses by status quo practices. Then, building on a similar study that focused on academic measures (Ngo & Kwon, 2015), we evaluate student outcomes in two of the colleges where non-cognitive indicators were directly factored into placement algorithms. The results show that considering non-cognitive indicators in addition to placement test scores increased student access to higher-level courses without compromising their likelihood of success. We interpret this as a marginal increase in placement accuracy and discuss how a holistic set of measures that includes high school background measures and non-academic indicators can be used to improve placement into community college mathematics courses. The third and final study, co-authored with Holly Kosiewicz, investigates organizational practices in mathematics that impact college opportunity. We focus on the practice of extending developmental math courses from one semester to two semesters, which is thought to be beneficial for student achievement.
Even though the extended course may provide students with more time to master math concepts, the direct and indirect costs of this additional time may outweigh the academic benefits and subsequently influence students' persistence decisions. Drawing on data from four large California community colleges, we use a regression discontinuity design to identify the effect of taking two-semester extended algebra courses, relative to typical single-semester courses, on student persistence in mathematics and math credit completion. We find that enrolling in extended algebra significantly decreased students' likelihood of attempting and completing the math courses required for degree attainment. The results underscore the role that institutional practices play in determining college student outcomes. We therefore discuss reforms in the delivery of developmental math that may better support today's community college students.

Significance

As a whole, this dissertation provides a comprehensive and unique look at how policies and practices related to math course-taking and math placement serve to create and curtail opportunity in higher education. The first study is the only one to use the restricted ELS:02 postsecondary transcript data to focus on math course-taking, one of the most universal undergraduate experiences. In so doing, it provides an important national look at the problems associated with redundant math course-taking in college. The second and third studies highlight the mechanisms by which redundant and remedial math course-taking might arise – through assessment and placement practices and through organizational practices in mathematics – while also offering much-needed evidence to guide policy reforms. Specifically, the findings suggest that a more holistic approach to mathematics course placement, one that includes indicators of academic and non-academic attributes, can be useful within the open-access setting of community colleges.
Incorporating this additional information into screening processes more intentionally may provide opportunity and access to students who do not appear to be academically prepared based on placement test results alone but who are likely to succeed if given the opportunity and supports to do so. Finally, the third study showcases how organizational practices in the delivery of developmental math, in this case extending math courses from one to two semesters, can be detrimental to college student outcomes. I discuss how reforming these organizational practices and experimenting with alternative models of delivering developmental math have the potential to ensure that more college students achieve their educational aspirations as they pursue higher education.

References

American Association of Colleges and Universities (2016). Retrieved from http://www.aacu.org/sites/default/files/files/LEAP/2015_Survey_Report2_GEtrends.pdf

Bettinger, E., & Long, B. T. (2009). Addressing the needs of under-prepared students in higher education: Does college remediation work? Journal of Human Resources, 44, 736-771.

Bryk, A. S., & Treisman, U. (2010). Make math a gateway, not a gatekeeper. Chronicle of Higher Education, 56(32), B19-B20.

Burdman, P. (2012). Where to begin? The evolving role of placement exams for students starting college. Boston, MA: Jobs for the Future.

Burdman, P. (2015). Degrees of freedom: Varying routes to math readiness and the challenge of intersegmental alignment (Report 2 of a 3-part series). Berkeley, CA: Jobs for the Future. Retrieved from http://www.learningworksca.org/dof2/

Bustillos, L. T., Rueda, R., & Bensimon, E. M. (2011). Faculty views of underrepresented students in community college settings: Cultural models and cultural practices. In P. R. Portes & S. Salas (Eds.), Vygotsky in 21st century society: Advances in cultural historical theory and praxis with non-dominant communities (pp. 199-213). New York: Peter Lang.

Chen, X. (2013). STEM attrition: College students' paths into and out of STEM fields (NCES 2014-001). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Deil-Amen, R., & Rosenbaum, J. E. (2002). The unintended consequences of stigma-free remediation. Sociology of Education, 249-268.

Grubb, N. (1999). Honored but invisible: An inside look at America's community colleges. New York: Routledge.

Hacker, A. (2016). The math myth: And other STEM delusions. The New Press.

Hughes, K. L., & Scott-Clayton, J. (2011). Assessing developmental assessment in community colleges. Community College Review, 39(4), 327-351.

Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational experiences with earned degrees in STEM among US students. Science Education, 95(5), 877-907.

Martorell, P., & McFarlin, I., Jr. (2011). Help or hindrance? The effects of college remediation on academic and labor market outcomes. The Review of Economics and Statistics, 93(2), 436-454.

McGee, E. O., & Martin, D. B. (2011). "You would not believe what I have to go through to prove my intellectual value!" Stereotype management among academically successful black mathematics and engineering students. American Educational Research Journal, 48(6), 1347-1389.

Melguizo, T., Bos, J. M., Ngo, F., Mills, N., & Prather, G. (2016). Using a regression discontinuity design to estimate the impact of placement decisions in developmental math. Research in Higher Education, 57(2), 123-151.

Melguizo, T., Kosiewicz, H., Prather, G., & Bos, J. (2014). How are community college students assessed and placed in developmental math? Grounding our understanding in reality. The Journal of Higher Education, 85(5), 691-722.

Mesa, V., Celis, S., & Lande, E. (2014). Teaching approaches of community college mathematics faculty: Do they relate to classroom practices? American Educational Research Journal, 51(1), 117-151.

Ngo, F., & Kwon, W. (2015). Using multiple measures to make math placement decisions: Implications for access and success in community colleges. Research in Higher Education, 56(5), 442-470.

Sax, L. J., Kanny, M. A., Riggers-Piehl, T. A., Whang, H., & Paulson, L. N. (2015). "But I'm not good at math": The changing salience of mathematical self-concept in shaping women's and men's STEM aspirations. Research in Higher Education, 1-30.

Scott-Clayton, J., Crosta, P., & Belfield, C. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371-393.

Treisman, U. (1992). Studying students studying calculus: A look at the lives of minority mathematics students in college. The College Mathematics Journal, 23(5), 362-372.

High School All Over Again: The Problem of Redundant College Mathematics

Federick J. Ngo

One puzzling reality of college mathematics course-taking is that for many students, math in college is essentially high school math all over again. Despite increasing high school graduation requirements, with nearly twenty states now requiring four years of high school math (Domina & Saldana, 2012; Education Commission of the States, 2012), national estimates from the period 2003-2009 indicate that nearly 60 percent of students attending two-year colleges and over 30 percent of students attending four-year colleges take at least one developmental/remedial math or English course in college (Chen, 2016). Many students enrolling in these courses fail to persist and earn a college degree (Bailey, Jeong, & Cho, 2010), and studies of college remediation have raised concern about how it affects other outcomes such as dropout (Lesik, 2007), transfer from two- to four-year colleges (Crisp & Delgado, 2014; Martorell & McFarlin, 2011), and STEM degree completion (Crisp, Nora, & Taggart, 2009).
Taking a remedial math course (e.g., algebra 1) in college or repeating an upper-level high school math course (e.g., algebra 2, pre-calculus, or calculus) can provide students the opportunity to shore up skills and be more successful in subsequent coursework. However, research on math sorting reveals that some students enrolling in lower-level math courses in college are actually some of the nation's highest-achieving students. Consider, for example, the following incongruities: according to the National Education Longitudinal Study of 1988, one-tenth of the top quartile of 12th-grade students took remedial courses in college (Attewell, Lavin, Domina, & Levey, 2006); such courses are typically at the intermediate algebra level or below. About fifteen years later, according to the Education Longitudinal Study of 2002 (ELS:02), this had grown to one-fifth of the highest quartile of students (Lauff & Ingels, 2015). Even when colleges systematically use placement tests to facilitate math course-taking, many students end up in inappropriate courses. Studies of community college placement testing estimate that as many as one-quarter of students may be assigned to incorrect math courses, most often to courses that are at a lower level than what they are predicted to be able to pass (Ngo, Chi, & Park, forthcoming; Scott-Clayton, Crosta, & Belfield, 2014). These sorting issues suggest that there may be a nuanced but prevalent problem as students make the transition to college and repeat high-school-level courses in college that are not warranted by their skill level and academic background – in other words, a problem of redundant college mathematics. As such, the goals of this study are to document college math course-taking for a national sample of students, estimate the extent of these redundant college math experiences, and examine how redundancy may affect college persistence and degree attainment.
Using linked high school and college transcript data from ELS:02, I first inspect math course-taking just prior to high school graduation and then as students begin math course-taking in college. I describe the extent of misalignment in math course-taking patterns, when students take a lower-level course in college than expected in typical math trajectories. Since some lower-level course-taking might be beneficial (i.e., useful to students in need of math remediation), and some might be detrimental (i.e., inefficient and costly to those who could have passed a higher-level course), I devise a strategy to disentangle the two. I do so by considering academic background measures such as prior math course-taking, high school achievement, and information on specific math skills available in the data, using them in an array of predictive models to determine the math courses students could have taken and passed in college. Since students of similar skill enroll in a wide range of math courses in college, I use ordered logit models to predict the college math course best suited for each student. I compare these predictions to observed course enrollment and thereby estimate the extent of unnecessary redundancy in students' first math courses taken in college. The novelty of the study is that, in the absence of national college placement tests, which are a primary means by which students are sorted to college coursework, I use the results of a non-high-stakes ELS:02 math assessment administered in the 12th grade as a proxy for math skills at the start of college. Related scholarship on mathematics in the transition to college has examined the role of commercially developed tests such as the SAT and ACT (e.g., Rothstein, 2004), but these are only taken by a select sample of students. Studies of college entrance exams and placement test results have only been able to generalize to local or state populations (e.g., Scott-Clayton et al., 2014).
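The comparison step described above – contrasting the model-predicted best-suited course with the course a student actually took – can be sketched in a few lines. In the study, an ordered logit model produces each student's predicted pass probabilities across course levels; in this sketch those probabilities, the course levels, and the pass threshold are supplied as hypothetical inputs for illustration.

```python
# Minimal sketch of the redundancy-flagging logic: enroll-below-prediction.
# Pass probabilities here are invented; in the study they come from
# fitted ordered logit models.

PASS_THRESHOLD = 0.5  # assumed cutoff for "likely to pass"

def best_suited_level(pass_probs):
    """Highest course level whose predicted pass probability clears the
    threshold; pass_probs maps course level (int) -> probability."""
    eligible = [lvl for lvl, p in pass_probs.items() if p >= PASS_THRESHOLD]
    return max(eligible) if eligible else min(pass_probs)

def is_redundant(observed_level, pass_probs):
    """A first course is flagged redundant when the student enrolled
    below the highest level they were predicted to pass."""
    return observed_level < best_suited_level(pass_probs)

# Hypothetical student predicted to pass up through level 2.
probs = {0: 0.95, 1: 0.85, 2: 0.60, 3: 0.30}
print(is_redundant(1, probs))  # True: enrolled below predicted level
print(is_redundant(2, probs))  # False: enrolled at predicted level
```

The aggregate redundancy estimates reported in the study amount to applying a comparison of this kind to each student's first college math course under several model specifications.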
The common assessment available in the ELS:02 data enables me to compare math skills across a national sample. I can therefore also identify the types of students (e.g., by race, SES, and college context) most likely to be subject to redundant math experiences, thus revealing important implications for student equity. Finally, I consider a redundant mathematics experience as a "treatment" and use propensity score matching methods to compare the outcomes of students identified as in a redundant mathematics course with similar students whose first math course was not identified as redundant. This modeling approach attempts to account for student selection into postsecondary institutions and courses and therefore provides a more rigorous examination of the relationships between math redundancy and college grades, persistence, and completion. Specifically, I investigate how redundant college mathematics affects math course grades, first- and second-year college credit accrual, total GPA and credits, degree completion, and STEM degree attainment, with subgroup analyses in both two- and four-year colleges. The study moves beyond typical remedial/non-remedial dichotomies, since the population of interest includes students whose math course-taking in college is redundant and not necessarily remedial. I next review research that has examined college math course-taking. I then discuss how redundant course-taking can affect college persistence and attainment, characterizing it both as a problem of inefficiency and a problem of opportunity that can perpetuate inequality in higher education. I describe the ELS:02 data and outline the methods used to identify redundant mathematics and its effects. I conclude with a discussion of how institutions of higher education can ameliorate the problem of mathematics redundancy.
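The matching step just described can be illustrated with a toy example. This sketch assumes propensity scores have already been estimated (e.g., by a logistic model on the background covariates) and shows only nearest-neighbor matching with replacement and the resulting average treatment effect on the treated (ATT); the units and scores are made up, and the full study conditions on far richer covariates.

```python
# Toy nearest-neighbor matching on a precomputed propensity score.
# Units and scores below are hypothetical, for illustration only.

def att_nearest_neighbor(treated, controls):
    """treated/controls: lists of (propensity_score, outcome) tuples.
    Match each treated unit to the control with the closest score
    (with replacement) and average the outcome differences."""
    diffs = []
    for score_t, outcome_t in treated:
        # Nearest control by absolute propensity-score distance.
        _, outcome_c = min(controls, key=lambda c: abs(c[0] - score_t))
        diffs.append(outcome_t - outcome_c)
    return sum(diffs) / len(diffs)

# Hypothetical units: (propensity score, credits earned in year one).
treated = [(0.8, 20), (0.6, 24), (0.7, 18)]   # redundant first course
controls = [(0.81, 30), (0.59, 28), (0.33, 40), (0.72, 26)]
print(att_nearest_neighbor(treated, controls))  # negative: fewer credits
```

Matching on the propensity score compares each "treated" (redundant-course) student only with observably similar peers, which is the sense in which the approach accounts for selection into institutions and courses.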
Literature Review

The significance of mathematics in college has prompted higher education and mathematics education scholars alike to examine several aspects of college math experiences. Researchers have focused on math instruction (e.g., Grubb, 1999; Mesa, Celis, & Lande, 2014), student performance in math courses (Treisman, 1992), math faculty (Carrell & West, 2010), attrition from math pathways (Bahr, 2012), the link between college math course-taking and STEM participation (Chen, 2013; Crisp et al., 2009; Wang, 2015), as well as the labor market returns to math coursework (Belfield, 2015). Research in educational psychology and college student development has likewise made important contributions to our understanding of the nature of students' college math experiences. This scholarship has noted the central roles of math beliefs and attitudes such as self-concept (Sax et al., 2015), sense of belonging in math fields (Good, Rattan, & Dweck, 2012), math anxiety (Perry, 2004), and math self-efficacy (Pajares & Miller, 1994). Much of this work is focused on understanding student interest, enrollment, and persistence in STEM fields (Crisp et al., 2009; Good et al., 2012; Sax et al., 2015; Wang, 2013; 2016). Importantly, a handful of scholars have examined the intersections of race and mathematics in higher education, such as how students manage race-based stereotypes about math ability on college campuses (McGee & Martin, 2011), how to support students of color in calculus (Treisman, 1992), and how faculty perceive and promote equity in math courses (Bustillos, Rueda, & Bensimon, 2011).

College Mathematics Course-Taking

Despite the importance of mathematics in college, there are relatively few studies of math course-taking that examine what courses students actually take using student transcript data, particularly studies of national scope.
Some recent scholarship has capitalized on the availability of longitudinal transcript data through the National Center for Education Statistics (NCES). For example, Wang (2016) used data mining techniques with national data to investigate course-taking and STEM transfer pathways, and the NCES itself has provided analytical reports on aspects of college math course-taking (Chen, 2013; 2016). However, most research studies examining math course-taking draw from college-level administrative data and have primarily been conducted in the community college setting (e.g., Adelman, 2005; Bahr, 2012; Hagedorn et al., 2007). A number of these transcript studies focus on postsecondary math remediation, and appropriately so given significant college remediation rates. Developmental/remedial math courses, which are typically high school or even middle school level courses, are designed to prepare students to successfully complete college math requirements (Bettinger, Boatman, & Long, 2013). However, while they can equip students with the skills necessary to be successful in subsequent coursework, a main concern is that they can also extend time to degree and thereby increase the opportunity costs associated with pursuing postsecondary credentials (Melguizo, Bos, Ngo, Mills, & Prather, 2016; Ngo & Kosiewicz, 2017). It is no surprise, then, that evaluations of the effectiveness of postsecondary remediation have produced mixed findings (Melguizo, Bos, & Prather, 2011). For example, in some contexts remediation appears to benefit students (e.g., Bettinger & Long, 2009; Lesik, 2007), especially those placed in the lowest levels of math (Chen, 2016; Boatman & Long, 2010). In others it appears to confer a penalty by diverting students from progression towards completion of college-level courses (Martorell & McFarlin, 2011; Scott-Clayton & Rodriguez, 2015).
Nevertheless, though there is lingering doubt about the merits of postsecondary remediation as an intervention, these studies have clarified the pivotal role that college math course-taking plays in shaping college students' academic trajectories.

Redundant College Mathematics

Enrolling in postsecondary math remediation and repeating coursework have traditionally been understood as consequences of academic under-preparation in high school, but research on sorting practices such as placement testing has indicated that for many students, postsecondary math remediation may be unnecessary and redundant (e.g., Scott-Clayton et al., 2014). From this perspective, part of the high proportion of students observed taking remedial courses is attributable to policies and practices that mis-assign students to their courses. Estimates from studies in community colleges suggest that placement tests incorrectly assign as many as one-quarter of incoming students in math, with the majority of errors being assignments to remedial courses (Ngo et al., forthcoming; Scott-Clayton et al., 2014). In fact, students in two-year colleges have a nineteen percent higher likelihood of being enrolled in math remediation than academically equivalent students who attend four-year colleges (Monaghan & Attewell, 2015), most likely due to the open-access nature of the community college system, variation in readiness standards across states, and subsequent reliance on placement tests (Hughes & Scott-Clayton, 2011; Porter & Polikoff, 2011). There is also concern about the capacity of colleges to accurately set, evaluate, and calibrate their assessment and placement policies, which may also explain placement error (Melguizo, Kosiewicz, Prather, & Bos, 2014).
There is little scholarship on math course-taking and its implications in the four-year college setting, and one goal of the present study is to examine the math course-taking patterns of students in both two- and four-year colleges. While errors and inaccuracies in math sorting processes may explain why redundant mathematics may be a prevalent phenomenon in undergraduate education, research exploring student perspectives on mathematics provides alternative explanations. Students may themselves opt into lower-level redundant courses for various reasons. For one, students may choose to repeat lower-level math coursework because they feel unprepared for and unable to be successful in higher-level college courses (Venezia, Bracco, & Nodine, 2010). While such a choice may result from self-assessment of skills, research on math beliefs and attitudes suggests these decisions may also stem from self-concept, underlying math anxiety, fears, and doubts about math self-efficacy (Cox, 2009; Perry, 2004; Sax et al., 2015). Not feeling a sense of belonging in mathematics-intensive fields may also deter students, particularly female students, from enrolling in and persisting in math and other STEM coursework (Good et al., 2012). There is also evidence that anxieties, fears, and low self-confidence in math may be particularly acute among females, first-generation college students, and students of color (Cox, 2009; Ellis, Fosdick, & Rasmussen, 2016; Sax et al., 2015). For example, Kosiewicz (2016) examined students’ choice of mathematics courses in a community college where students were given the opportunity to self-select courses rather than follow a placement test referral (because the college had accidentally not renewed its testing license).
While overall students chose higher-level math courses than prior cohorts who followed placement test referrals, African-American students and women tended to choose lower-level math courses when given the freedom to choose. Fong and Melguizo (2016) similarly found that students of color tended to elect to take lower-level math tests, below their high school math level, thus limiting the scope of courses into which they could enroll. Similar patterns of student choice are emerging in Florida, which has implemented a statewide policy of self-placement in which students are not bound by placement test results and referrals (Park et al., 2016). Overall, the studies on math sorting and student math beliefs discussed above point towards a clear problem – that redundant mathematics course-taking may be widespread in the transition to college. Students may be taking math classes they have already completed in high school, classes for which they are likely over-prepared. However, our current understanding of college math course-taking and its implications is rather limited. Those studies that have examined math sorting have typically relied on placement testing data in community college systems (e.g., Ngo et al., forthcoming; Scott-Clayton et al., 2014) and have therefore been unable to provide inferences for the national population of undergraduates. Those that have examined national data (e.g., Attewell et al., 2006; Monaghan & Attewell, 2015) exclusively focused on the extent and impact of remediation and have not provided a more nuanced look at who is incorrectly sorted to remedial coursework and redundant math experiences.
Finally, while a number of studies have examined the link between high school math course-taking and college outcomes, generally finding a positive benefit to advanced math course-taking with respect to college success (e.g., Aughinbaugh, 2012; Byun, Irvin, & Bell, 2014; Goodman, 2012; Gottfried & Bozick, 2016; Kim, Kim, DesJardins, & McCall, 2015; Long, Conger, & Iatarola, 2012; Wang, 2013), these studies have not examined how college math course-taking, and specifically, experiences of redundancy, mediates these relationships. This study addresses these gaps in the literature by using linked high school and college transcripts for a national sample of students to move beyond the remedial/non-remedial dichotomy and examine the problem of redundant college mathematics. I also investigate the implications of a redundant mathematics experience for college student success. The research can be summarized by the following questions:
1) What is the extent of redundant math course-taking in the transition to college?
2) Which types of students (e.g., by race, gender, SES, academic preparation, college type) are most likely to experience redundant college mathematics?
3) What is the relationship between redundant college mathematics and subsequent college outcomes such as grades, persistence, and degree attainment?
Theoretical Explanations
Defining and identifying redundant mathematics course-taking in college is important because the courses students take are significant markers of college student experiences and are among the primary points of contact between students and institutions of higher education (Adelman, 2005; Hagedorn & Kress, 2008; Wang, 2015).
Since theories used to study higher education have typically linked student experiences to college engagement, persistence, and attainment (Carter, Locks, & Winkle-Wagner, 2013; Melguizo, 2011), I also characterize redundant course-taking as a college experience that may affect college student outcomes. Here I draw from theories used to examine college persistence and attainment to provide hypotheses and explanations for how and why redundant college mathematics may be both beneficial and problematic for students pursuing higher education.
The Costs & Benefits of Redundant Mathematics
The first set of hypotheses stems from economic logics, which characterize students as making rational choices to pursue and persist in postsecondary education for the purpose of accumulating human capital resources such as skills or knowledge (Becker, 1964). They will continue to invest in their schooling if they believe it will garner future returns in the form of credentials and labor market wages (Oreopoulos & Petronijevic, 2013; Paulsen, 2001). If students’ course choices are made to optimize individual benefits, then the promise of higher grades may also incentivize redundant math course-taking. Students take courses that they think are useful and that they can be successful in (Paulsen, 2001). From this perspective, repeating a course is one way that students may be able to increase their chances of earning higher grades or acquiring the skills necessary to be successful in subsequent coursework. In this sense there may be marginal short-term benefits to taking a redundant math course, leading to the following hypothesis:
Hypothesis 1a: Students in redundant mathematics will have higher grades in their first math course and a higher overall GPA.
At the same time, economic logics also make salient the cost of redundant course-taking, which may outweigh these aforementioned benefits.
Warranted remediation and repetition – when students take appropriate lower-level math courses – may yield economic benefits in the form of grades and academic skills, but redundancy – when students unnecessarily repeat math courses they have already taken – may incur excessive direct and indirect costs in the form of wasted time, money, and resources. These accumulated costs can be detrimental to and discourage college persistence (Melguizo, Hagedorn, & Cypers, 2008; Ngo & Kosiewicz, 2017). A second hypothesis is:
Hypothesis 2: Students in redundant college mathematics will be less likely to persist in college.
Redundancy and College Opportunity
A redundant math course might also set students on a particular course-taking track. This is a point of concern because the math tracking literature has described how high school math course-taking results in differential and stratified opportunity to learn and shapes how students perceive their educational opportunity (Gamoran, 1987; McFarland, 2006; Oakes, 1985). Redundant math experiences in college in particular might influence how students perceive college opportunity due to the messaging and labels associated with taking a lower-level math course, one which is a repeat of a high school course. The danger is that students may come to internalize the messages and labels and accept the fiction that they are in need of lower-level math when they actually may not be (Deil-Amen & Rosenbaum, 2002). This may work to undermine educational progress and have repercussions for students’ aspirations. Redundant mathematics may therefore be one mechanism by which college experiences serve to “cool out” students’ educational aspirations, another concept described in the higher education literature (Clark, 1960; Deil-Amen & Rosenbaum, 2002).
This happens as students become demoralized and lower their expectations or re-think their college plans as they encounter obstacles to degree completion (Clark, 1960; Karabel, 1972). Tinto (1993) might characterize redundant course-taking as an example of “mismatch” between student skills and institutional demands. This mismatch may be detrimental to academic integration and ultimately reduce commitment and facilitate student disengagement from the institution. Researchers who have empirically tested cooling out theory in the context of community college remediation have not necessarily found evidence of discouragement due to remedial placement in community colleges but rather diversion into non-credit-bearing pathways (Martorell, McFarlin, & Xue, 2015; Scott-Clayton & Rodriguez, 2015). This suggests that a key function of redundant math courses may not just be to reduce persistence as in the human capital logic described above, but to divert students from particular pathways, such as those that result in degree attainment or STEM participation. Indeed, given the link between math preparation and STEM entry (Wang, 2013; 2016), redundant math experiences in college might send students signals about their capability of success in a math-intensive field and steer them away from STEM fields altogether. This literature suggests another set of hypotheses for how redundant college mathematics may affect college students:
Hypothesis 2a: Students in redundant college mathematics may earn credits comparable to those of their non-redundant math peers but may be less likely to earn a college degree.
Hypothesis 2b: Students who start college math course-taking in a redundant math course will be less likely to earn STEM credits and a degree in a STEM field.
Redundant Mathematics and Inequality
It is also important to identify redundancy in college mathematics course-taking because it may be a nuanced way by which inequality in higher education is perpetuated. The sociology of education literature has long described math sorting and ability tracking as structures that curtail opportunity and maintain social stratification (Gamoran, 1987; McFarland, 2006; Oakes, 1985). Studies within this literature have also consistently documented that students with higher SES and parental education are more likely to take advanced math courses in high school (e.g., pre-calculus or calculus), as are white and Asian students and male students (e.g., Domina & Saldana, 2012; Riegle-Crumb & Grodsky, 2010). Differences in the high school and postsecondary outcomes of students subjected to this tracking and sorting cannot be completely explained by prior academic performance, underscoring the role of access and opportunity to advanced math courses (Attewell & Domina, 2008). Stratification along these lines is also evident in higher education. For example, despite increasing college-going rates for historically under-represented groups, many of these students begin college in remedial courses (Bailey et al., 2010). This is particularly prevalent in two-year colleges, which disproportionately serve students of color and first-generation college students (Carnevale & Strohl, 2010). The reliance on placement testing in community colleges, despite studies that have shown that placement tests are only weakly correlated with success in college courses (e.g., Scott-Clayton et al., 2014), suggests that in addition to high remediation rates, more students may be identified as redundant in the two-year college setting relative to the four-year college setting.
This leads to three additional hypotheses about redundant college mathematics:
Hypothesis 3a: Students who complete lower levels of math in high school (those who do not take advanced math courses) will be more likely to have a redundant math experience in college.
Hypothesis 3b: Redundant mathematics will be more prevalent in two-year colleges than in four-year colleges.
Hypothesis 3c: Students of color and low-income students will be more likely to be subject to redundant mathematics experiences in college.
If there is evidence of rampant redundancy in college math course-taking, then this is indicative both of inefficiency in higher education and, if redundancy is more likely for certain groups of college students, of social stratification and inequality perpetuated in the transition to college. I test these hypotheses with the data and methods described next.
Data
The data for the study come from the restricted-use files of ELS:02. The initial cohort of 16,197 10th-grade ELS:02 students was surveyed in 2002, during which math and English tests were administered and survey information was collected. What makes the study feasible is a follow-up conducted in 2004. The results of a 12th-grade assessment provide a unique low-stakes measure of student math skills just prior to the transition to college. Students were contacted again in 2006, when most were in their second year of college, and the last ELS:02 follow-up, conducted in 2012, provided data on postsecondary experiences and attainment. Students’ postsecondary transcripts were collected in 2014 via the Postsecondary Education Transcript Study (PETS), making it possible to link high school and college course-taking. The ELS:02 PETS data include course-level data compiled from transcripts and college catalogs, so I identify the types and levels of math courses that students took in college, as well as the grades that students received in these courses.
Sample Selection
Seventy-one percent (N=11,506) of students enrolled in a postsecondary institution by the end of PETS data collection in 2013. I limited this sample to those who did so within one year of on-time high school graduation (June 2004-June 2005) (86% of college sample; N=9,840), since delayed college enrollment may itself affect math course-taking decisions. I further limited the sample to only students whose records show that they took math in college (84%; N=8,252), and I identified the first math course taken. While many of the descriptive analyses are conducted on these “math-takers,” for the predictive analyses I only kept those students whose first math class was at or below the calculus level (69%; N=5,670), since course-taking after calculus is not as linear and students typically have a range of options (e.g., linear algebra, differential equations, etc.). (See Table 1 for a complete description of each sample.)
Methods
I first determined the extent of repetition and misalignment in math course-taking between high school and college. The ELS:02 data include a variable that indicates the highest level of math each student completed in the high school math pipeline. I re-categorized these levels based on typical math sequences in U.S. high schools and to align with the structure of the PETS college transcript data. The five levels are: 1) arithmetic and non-academic math; 2) pre-algebra, algebra 1, and geometry; 3) algebra 2; 4) pre-calculus, trigonometry, and statistics; and 5) calculus. I then used the PETS data and identified each student’s first math course in college, its level, and the grade the student earned in the course. Due to the considerable variation in course names across colleges and to maintain anonymity, the NCES staff compiled similar courses under a set of College Course Map (CCM) codes (Bozick & Ingels, 2008).
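To make the categorization step concrete, the five-level scheme can be sketched as a keyword lookup. This is purely illustrative: the study assigns levels from NCES CCM codes and course descriptors, not keyword matching, and the keyword lists and function name below are my own assumptions.

```python
# Illustrative sketch of mapping course titles to the five-level math scheme.
# The keyword lists are assumptions for demonstration, not the study's rules.
LEVELS = {
    1: ["arithmetic", "basic skills", "developmental"],
    2: ["pre-algebra", "algebra 1", "geometry"],
    3: ["algebra 2"],
    4: ["pre-calculus", "trigonometry", "statistics"],
    5: ["calculus"],
}

def classify_course(title):
    """Return the math level (1-5) for a course title, or None for 'other'.

    Levels are checked in ascending order so that "pre-calculus" (level 4)
    is matched before its substring "calculus" (level 5).
    """
    t = title.lower()
    for level in sorted(LEVELS):
        if any(keyword in t for keyword in LEVELS[level]):
            return level
    return None  # unclassified courses are marked "other"
```

A title such as "Linear Algebra" falls through to `None`, mirroring how courses outside the five-level sequence are marked "other" in the study.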
I used the descriptors provided for each set of courses to assign a level of math corresponding to the following categories: 1) arithmetic, developmental math, and basic skills math; 2) pre-algebra, algebra 1, and geometry; 3) algebra 2; 4) pre-calculus, statistics, trigonometry, and introductory college math; and 5) calculus.[2] The levels align closely with the high school math levels described above to facilitate comparison in the transition to college. All courses that did not fit within this categorization were marked as “other.”[3] This was approximately 30 percent of all students who took any math course in college.
Footnote 2: Pre-collegiate business mathematics (F3TCCMCODE=27.9990), identified by NCES as remedial with the F3TCREMATRIB variable, was grouped with pre-algebra/algebra 1/geometry. The non-remedial version was grouped with algebra 2.
Identifying Math Redundancy
I then devised a strategy to identify redundancy in math course-taking, which I define as taking a math course in college that is 1) lower than the level that a student is predicted to pass, and 2) the same as or lower than the level of the highest math course taken and passed in high school. I did so by predicting the course that students could have passed in college using information about 12th-grade math skills captured by the ELS math assessment and the other available background data. The variation in math course-taking across colleges enabled me to determine whether students of similar skill could have passed different types of math courses. Conceptually, the approach is similar to much of the research on college match, which identifies college match and under-match for a given student based on the observed enrollment patterns of other students (e.g., Smith, Pender, & Howell, 2013).
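The two-part redundancy definition can be expressed as a simple predicate. This is a sketch with illustrative names; the levels are the 1-5 scheme described above.

```python
# Sketch of the two-part redundancy definition from the text.
# Levels run 1 (arithmetic) through 5 (calculus); names are illustrative.

def is_redundant(first_college_level, predicted_pass_level, highest_hs_level_passed):
    """A first college math course is 'redundant' when it is
    (1) below the level the student is predicted to pass, and
    (2) at or below the highest level taken and passed in high school."""
    return (first_college_level < predicted_pass_level
            and first_college_level <= highest_hs_level_passed)
```

Note that both conditions must hold: a student retaking a course at their predicted level, or taking a course above anything passed in high school, is not flagged.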
Specifically, I used multinomial ordered logit modeling to determine the math level that each student has the highest probability of passing, based on the sample of students who actually took and passed their first math courses with a B or better.[4] Multinomial ordered logit models are useful when the dependent variable has a meaningful sequential order (Hausman & Ruud, 1987), which in this case is comprised of the five levels of math. Ordered logit and probit models have been used in a number of social science contexts, and in education they have been used to study ordered outcomes such as academic progress (e.g., Bailey et al., 2010) and educational attainment (e.g., Breen, Luijkx, Müller, & Pollak, 2009).
Footnote 3: The most prevalent types of “other” courses are those classified as “Algebra and Number Theory” and include such courses as Linear Algebra and Matrix Algebra (68%). The second most prevalent category was “Computational Mathematics” (e.g., logic, linear programming, discrete math).
Footnote 4: Even though earning a C would be considered passing in most college contexts, the B or better threshold may reduce bias associated with grading practices.
The baseline multinomial ordered logit model is:

$M_i^* = \alpha_0 + \boldsymbol{\gamma}'X_i + \varepsilon_i$, $\quad M_i = \sum_{j=1}^{5} 1(M_i^* > c_j)$ (1)

where $M_i^*$ is the latent math level that each student is predicted to pass with a B or better, and $M_i$ is the discrete predicted math level, one of five possible levels, changing value when $M_i^*$ crosses an unknown threshold $c_j$. The model enabled me to predict the likelihood that each student has of passing each of the five levels of math. I identified the highest probability and considered this as indicative of the course level that students should have taken as their first math class in college. Comparing this with actual enrollment allowed me to identify cases of unwarranted math redundancy.
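The prediction step of equation (1) can be sketched in pure Python: given a fitted linear index $\boldsymbol{\gamma}'X_i$ and cutpoints $c_j$, the ordered logit category probabilities are differences of the logistic CDF, and the predicted level is the most probable category. The coefficients and cutpoints below are made-up illustrative values, not estimates from the ELS:02 data; in practice they would come from fitting the model.

```python
import math

def logistic_cdf(z):
    """Logistic CDF, clamped to avoid overflow at the infinite end cutpoints."""
    if z < -35.0:
        return 0.0
    if z > 35.0:
        return 1.0
    return 1.0 / (1.0 + math.exp(-z))

def level_probabilities(eta, cutpoints):
    """P(M = j) = F(c_j - eta) - F(c_{j-1} - eta) for the five math levels,
    where eta is the linear index gamma'X_i, c_0 = -inf, and c_5 = +inf."""
    cuts = [-math.inf] + list(cutpoints) + [math.inf]
    return [logistic_cdf(cuts[j] - eta) - logistic_cdf(cuts[j - 1] - eta)
            for j in range(1, 6)]

def predicted_level(eta, cutpoints):
    """The level with the highest predicted probability of a B-or-better pass."""
    probs = level_probabilities(eta, cutpoints)
    return max(range(1, 6), key=lambda j: probs[j - 1])
```

A student with a high linear index (strong 12th-grade math skills in the fitted model) is predicted to pass a higher level, so enrolling below that level satisfies the first condition of the redundancy definition.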
Since passing a math course with a B or better in college may be dependent on a number of factors, I estimated the logit models using the following sets of predictors $\boldsymbol{\gamma}'X_i$ (described further below): Model 1) ELS math assessments alone; Model 2) all variables in 1 + student background variables; Model 3) all variables in 2 + high school context variables; and Model 4) all variables in 3 + college context variables. For the first model, ELS math assessments alone, the redundant students I identified are those who could have passed a higher-level math course based on the math skills associated with passing each level of math alone. For the fourth model, the redundant students I identified are those who could have passed a higher-level math course based on the math skills, student background, high school context, and college context variables associated with passing each level of math.[6] Since it is possible that each set of predictors identifies different sets of students as being in a redundant math course, I also created an indicator of whether the student was identified as redundant under all model specifications, which I call model 5: Model 5) identified as redundant under models 1-4. This is perhaps the most conservative identification since students are considered redundant under all estimation approaches. Moving forward, I present estimates related to model 1 and model 4, as well as this cross-specification indicator from model 5.
Footnote 5: By design, students whose first math class in college was at the calculus level were never identified as redundant.
Covariates. The specific covariates within each set of predictors are those that are identified in the extant literature as being related to success in college math. In the baseline model I first focused on prior math skills, since math preparation is predictive of success in math courses (Adelman, 2006). I capitalized on the availability of detailed ELS:02 math testing data.
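The model-5 cross-specification indicator reduces to a conjunction of the four per-model flags. The dictionary of flags below is an illustrative stand-in for the four estimated indicators.

```python
# Sketch of the conservative model-5 indicator: a student is flagged as
# redundant only when every specification (models 1-4) flags them.

def conservative_redundancy(flags_by_model):
    """True only if models 1 through 4 all identify the student as redundant."""
    return all(flags_by_model[m] for m in (1, 2, 3, 4))
```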
The NCES computed the criterion-referenced probability of proficiency in the following five mathematics subareas based on the results of the 12th-grade ELS:02 assessment: 1) simple arithmetical operations on whole numbers; 2) simple operations with decimals, fractions, powers, and roots; 3) simple problem solving, requiring the understanding of low-level mathematical concepts; 4) understanding of intermediate-level mathematical concepts and/or multi-step solutions to word problems; and 5) complex multi-step word problems and/or advanced mathematics material (Bozick & Ingels, 2008). I included all five subareas of math skill. The second set of predictors, student background variables, includes race, gender, and a composite variable of socioeconomic status provided by NCES.[7] These are important to include because there are race, gender, class, and SES differences among students that might explain selection into college, college majors, and math coursework, along with the likelihood of passing a math class (Perna & Titus, 2005; Riegle-Crumb, 2006; Riegle-Crumb, King, Grodsky, & Muller, 2012; Walpole, 2003). For example, researchers have documented how race and gender are interconnected in explaining female students’ major/field choice in college (Perez-Felkner, McDonald, Schneider, & Grogan, 2012).
Footnote 6: I also ran models 3 and 4 without math test score information, since research has consistently demonstrated that math test scores (e.g., placement tests) are only weakly correlated with academic performance in college (e.g., Rothstein, 2004; Scott-Clayton et al., 2014). The results are very similar to those in models 3 and 4.
Footnote 7: This indicator is the first of two provided by NCES. It is based on five equally weighted, standardized components: father/guardian education, mother/guardian education, family income, father/guardian occupation, and mother/guardian occupation.
The third set of predictors includes variables related to students’ high school academics and high school context, which have been shown to be predictive of college outcomes (Engberg & Wolniak, 2010). I included high school GPA, which may be a proxy for work habits and self-control (Duckworth, Quinn, & Tsukayama, 2012); whether the student was in an academic or vocational concentration (a proxy for tracking); and the highest math course taken, which is among the strongest predictors of success in college courses (Adelman, 2006). I also included a dummy variable indicating whether the student took a remedial course in high school, and I controlled for the number of years of math required for a high school diploma. Finally, this set of predictors includes a unique composite variable of math self-efficacy generated by NCES. This variable is derived from various survey questions in the 12th-grade follow-up and is important to consider since math self-efficacy has been shown to influence math achievement (Pajares & Miller, 1994). The fourth and final set of predictors incorporates variables related to college choice and institutional characteristics. Even though the analytical sample only includes students who enrolled in a postsecondary institution within one year of on-time high school graduation, I included an NCES-generated indicator of a postsecondary enrollment gap and an indicator of whether the student was a part- or full-time student, since time out of formal schooling prior to college enrollment, part-time status, and concurrent employment may be related to grades in math courses (Adelman, 2006; Darolia, 2014; Stinebrickner & Stinebrickner, 2003). Further, the type and quality of institution attended may be associated with students’ academic progress (Melguizo, 2008). I therefore added dummy indicators of private and public control and whether the first institution attended is a two-year, four-year, or for-profit institution.
The ELS data also include a variable documenting whether the institution reported using an entrance examination, which can help to account for how students are sorted into coursework, either by examination or by student self-selection into their math classes. Table 1 provides summary statistics for all variables used in the analyses.
[Insert Table 1 here: Sample and Covariates]
Missing data. Missing data are a primary concern of higher education research with longitudinal survey data (Cox, McIntosh, Reason, & Terenzini, 2014). While hundreds of explanatory variables are available in the ELS:02 data, the inclusion of each additional covariate in a regression can result in a reduction in sample size due to missing data. There are a number of strategies to deal with this problem, including mean substitution, creating an indicator for each observation with missing data, imputing values based on the observed data, or list-wise deletion, in which observations with missing data are dropped (Cox et al., 2014). I proceed with dropping observations with missing variables, but I also assess the sensitivity of the results using other strategies for addressing missing data. These are presented after the results section.
Effects of Redundant Mathematics: Propensity Score Matching
In addition to describing the extent of math course redundancy and which students are most prone to this problem, another primary goal of the study is to understand the effects of redundant math course-taking on students’ college experiences and outcomes. Here I capitalize on the variation in math sorting processes in the transition to college nationwide. Students of similar skill and background may enter into different course-taking experiences – some may be subject to redundant mathematics while others may not – by virtue of the postsecondary institutions they attend.
A straightforward ordinary least squares linear regression approach would likely result in biased estimates since there is self-selection into colleges and coursework. One way of overcoming this selection bias is to use propensity score matching (PSM), a method that can provide less-biased estimates arguably closer to a causal estimate of the effect of math redundancy on postsecondary outcomes (Murnane & Willett, 2010; Rosenbaum & Rubin, 1985). PSM relies on modeling selection into treatment conditional on observable variables. The resultant propensity scores can be used to match individual units that received the treatment with similar observational units that did not. The difference in the outcomes of these matched groups is the Average Treatment Effect on the Treated (ATT), but it should only be cautiously interpreted as a causal effect of the treatment, since the design relies heavily on the assumption that selection into treatment is correctly modeled (Dehejia & Wahba, 2002; Heckman, Ichimura, Smith, & Todd, 1998). Here I describe the approach to PSM and the balance, sensitivity, and robustness checks I conducted to determine the validity of my estimates of the effect of redundant math course-taking.
Estimating propensity scores. I first used logistic regression to obtain a probability of having a redundant math experience based on observable characteristics for all students. Previous studies have used PSM with “remediation” as the treatment (Attewell et al., 2006), but the present study adds further nuance since the treatment condition is a redundant college math experience, as identified using the predictive techniques (i.e., ordered logit models) described above. Redundancy, as I have defined it, is a more nuanced math experience than math remediation, since I specifically identify redundant math students as those who are repeating a level of math they have already completed and who are predicted to pass a higher-level course.
For comparison purposes I provide PSM estimates for “any math remediation” and “math misalignment” (when students take the same or a lower math course/level in college). One key task in PSM is to adequately model selection into the treatment condition, here a redundant math course. Since an array of student background and institutional context variables may be related to the likelihood of having a redundant math experience, I include the same predictors $X_i$ as in model 4 above (i.e., math test scores, student background, high school academics/context, college characteristics). Propensity score estimation involves regressing a dummy indicator of a redundant math experience (determined from the previously described ordered logit modeling) on the chosen variables using equation (2), and for each redundancy indicator (models 1-5):

$P[Redund]_i = \alpha_0 + \beta_1 Test_i + \beta_2 HS_i + \beta_3 College_i + \boldsymbol{\gamma}'X_i + \varepsilon_i$ (2)

While the full set of covariates from model 4 is more likely to model selection into a redundant math experience, each additional covariate exacerbates the missing data problem since not all responses are available for each covariate. Indeed, the missing data from the model 4 covariates reduce the analytical sample from 5,670 students to 3,500 in the logistic regression. To assess the sensitivity of the results to missing data and covariate choice, I run the analyses with and without the two most frequently missing variables (math self-efficacy and HS math requirements). I also include an indicator of whether 12th-grade students reported interest in a STEM career since this may predict student self-selection into certain types of math courses.
Balance between matched samples. Students with similar propensity scores have a similar likelihood of taking a redundant math course.
I therefore matched students who were identified as being in redundant math with other students who have similar propensity scores but who were not identified as redundant. In other words, these students were predicted to be in redundant math but their first math course was not a redundant one. I constructed these matched comparison groups by employing a nearest-neighbor matching algorithm (Dehejia & Wahba, 2002). I used a caliper of .25 standard deviations of the standardized log propensity score, the recommendation in the literature (Rosenbaum & Rubin, 1985; Stuart, 2010). Figure 1 shows the distribution of propensity scores before and after matching for models 1, 4, and 5, along with the resultant matched sample sizes.
[Insert Figure 1: Matching: Propensity Score Distributions]
Since PSM approximates experimental design by modeling selection into treatment on observed covariates, it is imperative to assess the quality of the matched samples by examining covariate balance before and after matching (Murnane & Willett, 2010; Stuart, 2010). One balance metric that can be used for this purpose is the standardized bias in the covariates of the unmatched and matched samples (Stuart, 2010). This is calculated as the standardized difference in means between the treated and control groups for each relevant covariate. Matching should reduce the standardized bias in each covariate, and the recommendation is that the absolute standardized bias should be below 0.25 (Rubin, 2001). Figure 2 shows this analysis of covariate bias.[8] It is apparent that the matching procedure reduces the standardized bias for nearly all covariates, to well below the recommended threshold of 0.25 (Rubin, 2001). I therefore proceeded with estimation of treatment effects using these matched samples.
[Insert Figure 2: Covariate Bias Reduction]
Footnote 8: The full results are available in the Appendix.
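The matching and balance steps can be sketched in pure Python. This is a simplified illustration under stated assumptions: matching is greedy 1:1 without replacement on the logit of the propensity score, the caliper is 0.25 SD of that logit across all units, and the data structures and names are my own, not those of the study.

```python
import math

def logit(p):
    """Log-odds of a propensity score."""
    return math.log(p / (1.0 - p))

def nn_caliper_match(treated, control, caliper_sd=0.25):
    """Greedy 1:1 nearest-neighbor matching on the logit propensity score,
    discarding matches beyond the caliper. `treated` and `control` are
    lists of (unit_id, propensity) pairs; returns matched (t_id, c_id) pairs."""
    scores = [logit(p) for _, p in treated] + [logit(p) for _, p in control]
    mean = sum(scores) / len(scores)
    sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / (len(scores) - 1))
    caliper = caliper_sd * sd
    available = {cid: logit(p) for cid, p in control}  # unmatched controls
    pairs = []
    for tid, p in treated:
        if not available:
            break
        lt = logit(p)
        cid = min(available, key=lambda c: abs(available[c] - lt))
        if abs(available[cid] - lt) <= caliper:
            pairs.append((tid, cid))
            del available[cid]  # match without replacement
    return pairs

def standardized_bias(x_treated, x_control):
    """Difference in covariate means scaled by the pooled SD; absolute
    values below 0.25 are conventionally considered acceptable balance."""
    mt = sum(x_treated) / len(x_treated)
    mc = sum(x_control) / len(x_control)
    vt = sum((x - mt) ** 2 for x in x_treated) / (len(x_treated) - 1)
    vc = sum((x - mc) ** 2 for x in x_control) / (len(x_control) - 1)
    return (mt - mc) / math.sqrt((vt + vc) / 2.0)
```

Computing `standardized_bias` for each covariate before and after matching reproduces the balance check summarized in Figure 2.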
A regression of outcomes on the treatment indicator with the matched sample gives the ATT of a redundant math experience on various college outcomes y_i. These include the following sets of outcomes: i) Grades: grades in the first math course; total GPA in college; ii) Persistence: first year credits 9 completed, second year credits completed; iii) Attainment/Completion: total credits earned; whether the student earned a degree (a generated dummy variable for AA or BA degree completion); iv) STEM outcomes: total STEM credits earned; an NCES-provided indicator of whether the student completed a STEM degree. The baseline PSM model used to estimate the relationship between redundant college mathematics and these outcomes is given by (3):

y_i = α_0 + β_1 Treat_i + ε_i    (3)

The coefficients on the Treat_i dummy variable can be interpreted, though with caution, as the ATT of redundant college mathematics course-taking on each postsecondary outcome y_i.

Sensitivity and robustness. Since bias is not completely reduced after matching, I ran additional models that included the propensity score (4) and all covariates (5) as controls in the ATT estimation:

y_i = α_0 + β_1 Treat_i + β_2 P[Redundant]_i + ε_i    (4)
y_i = α_0 + β_1 Treat_i + γ′X_i + ε_i    (5)

Another concern with PSM is that only matched observations are included in the estimation. As a supplementary analysis, I weighted each observation in the full treatment group with the inverse of the propensity score, and each observation in the control group with the inverse of one minus the propensity score (i.e., the inverse of the propensity of not being in the treatment group). I standardized this inverse probability treatment weight (IPTW) and then estimated the treatment effect using this weighted sample.

9 College credits were scaled by NCES to be comparable across colleges (Ingels et al., 2015).
The IPTW estimator has the advantage of weighting the full sample and not dropping unmatched observations as in the nearest neighbor methods described above (Imbens, 2004; Stuart, 2010). I also tested the sensitivity of the IPTW modeling to inclusion of the propensity score and covariates, and to trimming to prevent unusually large or small weights from being extremely influential in the estimation.

Validity & Generalizability

A primary limitation of the analysis is that the specific sorting mechanisms that determine math courses within each college and all variables associated with a student’s math course choice are not observable. This weakens the causal claim of the PSM analysis since the validity of PSM estimates relies heavily on the assumption that the researcher can adequately model selection into the treatment condition. The ATT results should therefore only cautiously be interpreted as causal estimates of the relationship between redundant math course-taking and college outcomes for the analytical sample of ELS:02 students who enrolled in college within one year of on-time high school graduation and who took a college math course.

Results

Math Course-Taking in College

I first present an overview of math course-taking in college. Table 1 indicates that of all students who enrolled in college within one year of finishing high school, most (85 percent) take a math course at some point during their college career, with pass rates of 70-80 percent (C or better) and 50-60 percent (B or better). Table 2 shows that the most common types of courses taken as the first math enrollment were in the categories of algebra (high school level), linear algebra, and calculus. Just about 60 percent of math-takers enrolled in the courses listed under these three categories. Statistics and pre-calculus level courses were the next most popular.
Overall, 32 percent of all math-takers took a remedial math course in college, as indicated by a remedial flag in the college transcript or a remedial designation in the course catalog.

[Table 2: First College Math Course, Descriptive]

Linking college records to high school records reveals significant misalignment in math course-taking in the transition to college. As shown in Table 3, about four percent of students who took and passed at least one semester of calculus in high school enrolled in a math course that was two or more levels below calculus (algebra 2 level or lower). Nearly 25 percent of students who completed the pre-calculus level in high school enrolled in a course two levels below or lower (algebra 1 or developmental/basic math). Disaggregating these results by college type reveals that there appears to be more downward math movement in two-year colleges than four-year colleges, with about 55 percent of students’ first math courses being within the three lower levels of math. Overall, 57 percent of math-takers’ first math course was at the same level or lower than their highest high school math course.

[Table 3: Highest HS Math vs. First College Math]

These results concerning the degree of math misalignment may not be unexpected since screening processes at the start of college such as placement testing sort students on the basis of math skill and not on the basis of math course-taking. However, there also appears to be considerable math mismatch in terms of math skill. The results in Table 4 indicate that 7 percent of the highest scoring quartile of 12th graders took high school level courses in college (algebra 2 or below), but 12 percent of the lowest quartile took pre-calculus or calculus. Notably, while just about 5 percent of the highest quartile of test-takers took high school level courses in a four-year institution, nearly 20 percent of those who attended two-year colleges did so.
[Table 4: Math Quartile vs. First College Math]

Redundant College Mathematics

Which of the students observed repeating math courses in college may not have needed to do so? I answer this question by predicting the math course that students should have taken based on rich student background data and variation in college math course-taking within the sample. The ordered logit models first identify variables that are associated with students passing their first college math course with a B or better. As shown in Table 5, higher levels of math proficiency are predictive of earning a B or better, as is higher math self-efficacy. Students whose highest math was a lower-level math course (e.g., geometry) were less likely to pass their first math course in college with a B or better. The ordered logit models indicate that overall, study sample students had the highest probability of passing the pre-calculus, trigonometry, and statistics level. 10

[Table 5: Predicted Passable Class, Ordered Logit Models]

The ordered logit model enabled me to determine the math course best suited for each student in the sample from within the ordered sequence of courses. I compared these predicted course levels with observed course-taking to identify redundant college mathematics courses, and these results are summarized in Table 6. In the first ordered logit model, which is based on five subscores from the ELS 12th grade math assessment alone, 18 percent of all math-takers who started college within a year of on-time high school graduation can be considered as in a redundant math experience. That is, their first math course in college was at the same level as or below the highest math course taken in high school and they had a higher probability of passing a higher-level course.
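This identification rule can be sketched directly. In the fragment below, the numeric level coding and the predicted passable levels are hypothetical stand-ins for the ordered logit output, used only to show the comparison logic: a student is flagged as redundant when the first college course is at or below the highest high school level and a higher level was predicted to be passable.

```python
# Hypothetical sketch of the redundancy flag. Levels are coded
# 0 (basic math) .. 5 (calculus); the rows are illustrative students.
import pandas as pd

df = pd.DataFrame({
    "hs_level":      [5, 3, 2, 4],  # highest high school math level
    "college_level": [3, 3, 2, 5],  # first college math level
    "predicted":     [5, 4, 2, 5],  # level with highest predicted pass prob.
})
df["redundant"] = (df["college_level"] <= df["hs_level"]) & \
                  (df["predicted"] > df["college_level"])
print(df["redundant"].tolist())  # [True, True, False, False]
```

The third student repeats a course but is predicted to pass nothing higher (warranted repetition), and the fourth moves up to a new level, so neither is flagged.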
10 One interesting finding with these ordered logits is that algebra 2 was not determined to be the appropriate course for any students; there was instead a skew towards higher or lower courses. This means that either the sample sizes are too small or the skills and backgrounds that are ‘necessary’ for passing algebra 2 are the same as those needed for pre-calculus.

The redundancy estimates increase as covariates are added to the model, with the biggest jump in model 3 after high school academics and context variables are added. 11 In model 4, with all student background, high school academics, and college context variables, 27 percent of students who took math in college can be considered as in a redundant college mathematics course. Since models 1 through 4 may identify different students as redundant due to sensitivity to covariate inclusion, I focus for the remainder of the paper on the results of model 5. Students identified as redundant under model 5 were those identified as redundant under all four model specifications. This sample designation reveals that 22 percent of students in the study sample can be considered as in a redundant college mathematics experience, or 15 percent of all math-takers. I still present results from models 1 and 4 for comparison purposes and also provide remedial course-taking rates and misalignment rates, which can serve as a comparison for the magnitude of the redundancy indicators. Remedial first math courses were those flagged as “remedial” by NCES based on transcript analysis. Misaligned first math courses are those I identified as out of sync with typical math trajectories. For example, if a student’s first math course in college was pre-calculus, but the student had completed calculus in high school, this would be considered misaligned.
Thirty-two percent of math-takers began math course-taking in what is a remedial math course, and 57 percent were in a misaligned course; that is, their first math course was at the same or lower level than their highest high school math course.

[Insert: Table 6: Redundancy Rates]

Alternative samples. I conducted comparison redundancy analyses with different samples to determine if there is bias in the identification of redundant math students that stems from covariate or sample selection. The non-redundant control group described above includes all students whose math class was predicted to not be redundant, whether they were in a higher-, same-, or lower-level course than their high school math course. I conducted a supplementary analysis and limited the non-redundant control group to only students whose first college math was, like the redundant students, at the same or lower level than their highest high school math. These students can be thought of as being in a “warranted” remedial/redundant course. This comparison enables me to determine if redundant students, despite being in the same lower-level courses as other students, had worse outcomes. These results are also shown in Table 6, and the percentages should be compared to the results for the calculus and below students. Although the redundancy estimates are larger for models 1-4, the estimate under model 5 is equivalent. The original analytical sample includes all students who began college within one year of high school graduation.

11 I also ran a model with no math test scores and only demographic information, high school background, and high school context variables. The redundancy estimates were about three percentage points higher, and the resulting matching estimates are also similar.
Since it is possible that this sample includes students who attended college within one year of high school graduation but took math later in their college careers, I also limited the sample to those whose first math course was within this time window. In addition, I removed statistics courses, which were coded as equivalent to the pre-calculus level. This exclusion is worth testing since statistics is not typically a prerequisite for calculus. It is possible that students whose first math course was in statistics are those more likely to enter social science fields such as psychology or sociology. The portion of students identified as redundant decreases from 22 to 20 percent when considering only those students who took math within one year of college enrollment, and from 22 to 17 percent when statistics students are excluded. Finally, I also provide redundancy estimates when students who attended for-profit colleges were excluded, and when only two-year and four-year college students were included. It is notable that redundancy is slightly higher in two-year colleges than in the other institution types.

Who Experiences Math Redundancy?

Table 7 presents a closer examination of the characteristics of students in redundant college mathematics. Since there is variation across redundancy indicators, I discuss here only the characteristics that are significant for all redundancy indicators. These are primarily the high school and college context variables, and not the math skills or student background variables. Students identified as in redundant math consistently had lower high school GPAs, completed lower-level math courses in high school, and were less likely to have been in an “academic” track. Comparison of means tests for the college context variables indicated that students in private colleges are less likely than students in both public colleges and for-profit colleges to be subject to redundant college mathematics.
Students in two-year colleges were also more likely to have a redundant college mathematics experience than students in four-year colleges, as were students in colleges that did not report using entrance examinations. Students who delayed college enrollment and part-time students also were more likely to take a redundant math course. Although there were racial and class differences in redundancy rates under models 1 through 4, such as higher redundancy rates for African-American students in models 3 and 4 and for lower-SES students in models 1, 3, and 4, none of these were consistent across all five models.

[Table 7: Which students are in redundant college mathematics?]

The Effects of Redundant College Mathematics

I next use propensity score matching (PSM) to create matched samples of students to estimate the effects of redundant math experiences on college persistence and attainment outcomes. In the first step, I estimated the propensity of being identified as redundant under each model specification. Table 8 shows these results, and the estimated coefficients also indicate the variables most strongly associated with starting college mathematics in a redundant course. Level 3 math skills (simple problem solving with algebra) are associated with a higher likelihood of redundancy across all redundancy indicators, as is taking lower-level math courses in high school (the indicator for math track). The only other variable consistently predictive of redundancy is attending a private college, for which a negative relationship is reported. It is interesting to note that, like the results of the analysis in Table 7, student background characteristics are not significant while high school and college context variables are significantly predictive of redundant math.
[Table 8: PSM Stage 1 – Propensity of Redundancy]

I used the estimated coefficients to calculate propensities for redundant mathematics and constructed a matched sample of students with similar propensity scores. The treatment group consists of those I identified as in a redundant math experience and the control group consists of students with similar propensity scores but who were not identified as redundant. These students’ first math courses in college can be considered warranted or appropriate. In the second stage of PSM I estimated differences between these groups on a number of college persistence and completion outcomes. The tables presented show the difference between the treatment and control groups in the unmatched and matched samples. I focus on the redundancy indicators from models 1, 4, and 5, and I provide seven different specifications (S1-S7) to assess sensitivity of the results. S1 shows a baseline unmatched difference between students in redundant and non-redundant math. S2 shows the estimated difference using OLS with covariates. S3 shows the results after matching, with propensity scores added in S4 and covariates added in S5. S6 and S7 use inverse probability weighting with covariates and trimming, respectively. For comparison purposes, I also conducted the PSM analyses with “math remediation” and “math misalignment” (taking the same or lower level math) as the treatment indicators. This provides a reference point for understanding how redundant math is different from remedial math course-taking and lower-level course-taking.

Grades. As hypothesized, students identified as in a redundant first college math course were more likely to earn a B grade or better in their first math course compared to matched but non-redundant math students (see Table 9).
The results of specifications 3-5 for model 5 indicate that compared to non-redundant math students, redundant math students were about 10 percentage points (p<.01) more likely to earn a B or better in their first math course. Interestingly, their total college GPAs were about .08 grade points lower (p<.05), though this is not consistent across redundancy models.

[Table 9: PSM Stage 2 – Grade Outcomes]

College persistence. There were no significant differences between redundant and non-redundant students in the number of credits completed in the first year (see Table 10). However, there were significant differences in second year credits earned, and this was somewhat consistent across the other redundancy models. In model 5, students identified as in redundant college math in the PSM and IPTW estimation earned about 2 fewer credits (p<.05) than their non-redundant matched peers.

[Table 10: PSM Stage 2 – Persistence Outcomes]

College attainment and completion. Although model 5 students appear to earn fewer total college credits through the end of PETS transcript collection in 2013, this difference is not statistically significant (see Table 11). Students whose first math was redundant were about 9 percentage points (p<.01) less likely to earn a college degree than their counterparts.

[Table 11: PSM Stage 2 – Completion Outcomes]

STEM credits and degrees. With respect to STEM participation (see Table 12), students in redundant math were significantly less likely to earn STEM credits. PSM specification 5 for each redundancy model indicates that redundant math students earned about 4 to 5 fewer units in STEM fields than their non-redundant counterparts, though this is significant at the .10 level for model 5. In contrast, students in misaligned math or remedial math were not less likely to earn STEM credits.
Subsequently, of the students who earned a college degree, those who began math course-taking in a redundant math course were less likely to earn a degree in a STEM field. Although these estimates are of varying magnitude across specifications and across redundancy models, they are all statistically significant. In the preferred combination of redundancy model 5 and specification 5, students in redundant college math were 9.4 percentage points less likely than their counterparts to earn a STEM degree.

The PSM models shown do not account for one potentially important factor in math course choice. STEM aspirations have been documented as significant predictors of STEM participation and attainment in college (Wang, 2013; 2016). I therefore included a variable that may capture STEM aspirations: whether the student aspired to have a career in a STEM field at age 30. This question was asked in the base year survey in 2002 and again in the first follow-up in 2004. I use the 2004 response since this is perhaps most relevant to any course choices at the start of college. Including this variable reduces the PSM sample size as expected. While the effects of redundant math on STEM credit completion are no longer significant, the effects on STEM degree attainment remain, though they are of smaller magnitude. Across models and specifications, redundant math students are about 5 to 7 percentage points less likely to earn a STEM degree.

[Table 12: PSM Stage 2 – STEM Outcomes]

Robustness Checks

I tested the sensitivity of the PSM results by conducting the analyses with alternative samples and with modifications to the matching procedure. These are outlined in Table 13, and I only show the preferred specification from redundancy model 5 and specification 5 (PSM with covariates). Upon replicating the PSM analyses for warranted remedial students, the results show that there were no significant differences in first and second year credit completion.
Differences in degree attainment and STEM degree completion remained but were of smaller magnitude. Redundant math students, as compared to students whose remediation was warranted, were about 5 percentage points less likely to complete a degree in a STEM field. 12 PSM estimation including only the students who took math within one year of college enrollment, and then excluding students whose first math was statistics, produced similar results, although again, the STEM degree outcomes were smaller in magnitude when statistics was excluded.

The PSM estimates for two- and four-year colleges separately add some nuance to the results described above. Redundancy appeared to negatively affect credit and degree attainment in two-year colleges, but there was only a significant effect of redundant math on STEM degrees in the four-year college setting. These results suggest that redundant course-taking in four-year colleges might serve to divert students from STEM fields but not necessarily deter them from earning a degree. However, these results are drawn from much smaller sample sizes than the main results, so there are concerns about power in the calculation of the treatment effect estimates, particularly for the two-year-only sample.

I also checked the sensitivity of the main PSM results to estimation approaches. I allowed for five nearest-neighbor matches instead of just one. This has the effect of increasing the sample size of the PSM estimation sample, thereby increasing both power and validity of the estimated effects. The total college GPA effect is no longer significant, but the other results are comparable. Table 13 also shows the complete PSM results with the STEM occupation variable included. The results are robust to these alternative matching specifications.

12 These results are available from the author upon request.
[Table 13: Robustness & Sensitivity of PSM]

Missing Data

The PSM estimates did not drastically change in magnitude and maintained significance for the same outcomes when missing data were replaced using mean substitution. When I conducted the analyses after excluding the math self-efficacy and HS math requirement variables, the two variables with the most frequently missing data, the magnitude, direction, and significance of the ATT estimates remained the same as in the main results. Lastly, I imputed zeroes for missing outcome data. This primarily impacts the degree attainment variables since these data had more missing observations. The estimates are also robust to this change, suggesting that attrition from the sample may be unrelated to redundant math experiences in college.

Discussion

There is no doubt that college mathematics plays a considerable role in shaping the postsecondary opportunities and experiences of undergraduate students. While there is a well-established literature on how math experiences such as tracking and repetition affect students in grade school, there is notably less empirical research on the ways that mathematics influences educational experiences and opportunities in college. This study sought to document one such experience, redundant college mathematics course-taking, and to examine how it defines college opportunity in both the two- and four-year college settings. The study therefore extends the scholarship on college math course-taking beyond the remedial/non-remedial dichotomy and adds nuance to our understanding of college students’ math experiences. The findings of the study confirm that a substantial portion of students begin their math course-taking in college in a course level they have already completed in high school.
After considering these students’ 12th grade math skills and academic backgrounds, it appears that a sizable percentage in fact would be predicted to pass a higher-level course if given the opportunity to do so. Depending on model specification, anywhere from about 15 to 30 percent of students entering college within one year of high school graduation may be starting college math in a redundant course. These students are predicted to be able to successfully pass a higher-level math course with a B or better but enrolled instead in a lower-level course that they had already completed in high school.

Modeling the likelihood of redundant math experiences revealed that the primary factors associated with a reduced likelihood of redundancy were related to high school and college contexts. Students who took more advanced courses in the high school math pipeline were less likely to be identified as taking redundant math in college, as were students who attended private colleges and colleges that reported using some placement exam.

The PSM estimates indicate a penalty to this undue redundancy. Students subject to a redundant math experience, compared to those whose math course was identified as appropriate, earned on average two fewer credits after their first two years of college and were less likely to earn a college degree. Supporting the diversion hypothesis, redundant math students were significantly less likely to attempt STEM credits and complete a STEM degree. These students presumably could have passed a higher math course and reaped the subsequent benefits therein had they not enrolled in a redundant math course.

These findings substantiate the idea that redundant mathematics course-taking is a relevant category of college student experiences that can affect higher education outcomes. They also confirm a number of the hypotheses outlined in the literature on college student persistence and attainment.
That redundant math students were significantly less likely to persist and earn a college degree suggests that the experience was costly and potentially inefficient. That students were diverted from STEM pathways suggests that it may have “cooled out” or changed students’ aspirations. This may be especially pertinent for students of color, for whom two-year colleges are an onramp to postsecondary STEM participation (Bahr, Jackson, McNaughtan, Oster, & Gross, 2016; Malcom, 2010). Finally, that observationally similar students’ likelihood of redundant math was linked to their high school educational opportunity and college choice reveals a nuanced way by which inequality is perpetuated in and through higher education.

Policy Implications & Future Research

Given that students in redundant college mathematics are those who could have passed a higher-level course, and that students may have access to higher-level courses by virtue of the type of postsecondary institution they attend, the primary implication of the results is that math sorting processes in the transition to college should be questioned and improved. If students begin college in more appropriate courses, then this can increase efficiency in institutional investments in postsecondary remediation and also reduce the direct and indirect costs of higher education for students pursuing their credentials. Further, the finding that the use of college entrance exams was associated with a reduced likelihood of redundancy is an important contribution to current policy debates. Rather than do away with placement testing as some community college systems have done, the implication seems to be to improve practices associated with placement testing, with attention towards the way students of color and female students experience assessment and placement.
Doing so is easier said than done given the climate of misaligned secondary and postsecondary standards, but research has provided some direction for practitioners. Colleges should focus attention on methods to assess placement error and the impact of placement decisions made using placement tests (Melguizo et al., 2016; Scott-Clayton et al., 2014), experiment with alternatives to placement tests such as math diagnostics (Ngo & Melguizo, 2016), and consider multiple measures such as high school grades, indicators of prior math course-taking, and other relevant predictors of college success when making math course placement decisions (Ngo & Kwon, 2015; Ngo et al., forthcoming; Scott-Clayton et al., 2014). This research has primarily been conducted in community colleges, but it is imperative to extend this work to the four-year setting as well.

The findings also clarify how college math experiences affect STEM participation and attainment. Although incoming undergraduates may have STEM aspirations and believe they have a high probability of completing a major, many come to believe that their abilities are not adequate after a few semesters of college course-taking and choose other majors (Chen, 2013; Maltese & Tai, 2011; Stinebrickner & Stinebrickner, 2011). This analysis of college math course-taking highlights how mathematics can play a role in the deflation of these aspirations and diversion from STEM pathways. I suggested mechanisms related to opportunity cost, signaling, and remedial labeling as potential ways to understand how redundant math experiences might affect college persistence and attainment. Future research can seek to better understand exactly how redundant mathematics experiences in college might work to discourage or alter aspirations, particularly with respect to the STEM pipeline.
Finally, the findings also suggest that both two- and four-year colleges and universities ought to reconsider why the mathematics requirements that keep students from progressing through college are in place and whether they continue to serve their purpose. This study therefore also motivates further questioning of the institutional and disciplinary “logics” that maintain status quo practices in higher education, particularly in mathematics-oriented fields (Posselt, 2015). Indeed, a recent study demonstrated that allowing community college students to take college-level math (statistics) instead of remedial algebra courses improved student outcomes; students enrolled in co-requisite remediation (statistics and a workshop) had better passing rates and credit accumulation than those who started in remedial algebra courses (Logue, Watanabe-Rose, & Douglas, 2016). There likewise has been increased willingness to change general education requirements in the four-year setting, with some colleges removing math requirements altogether (Joselow, 2016), and to transform postsecondary math experiences by introducing more math pathways (Logue, 2016). It remains to be seen how changing math requirements in college in this way can provide increased opportunity for students in their postsecondary careers, but such efforts can potentially reduce redundancy and inefficiency in college math course-taking.

Limitations

While the study capitalized on the rich student data available in ELS:02 to identify redundant math experiences, one limitation is that college course-level explanatory variables (e.g., faculty characteristics) were not available.
Since the identification of redundant math experiences was based on whether or not students in the ELS data passed their first college math courses, it may be important to consider the role of instructors’ grading practices and differences in the rigor of similarly named courses (Chingos, 2016; Umbach & Wawrzynski, 2005). However, mathematics is typically less subjective than other subjects, and so grading practices may be more consistent across instructors. In addition, since I use the B or better criterion to identify redundancy rather than the C or better criterion, this may reduce bias that stems from differences between instructors and institutions (Scott-Clayton et al., 2014).

In addition, the validity of PSM estimates largely depends on the assumption that selection into treatment is adequately modeled. The rich longitudinal and multi-dimensional ELS:02 data certainly make it more likely that this assumption is met, but without observing the actual assignment and sorting mechanisms inherent in college mathematics sorting there will be a lingering reservation about the validity of the results. A study using more rigorous quasi-experimental methods to compare the outcomes of students placed in a redundant math experience versus an appropriate math course might provide more internally valid estimates of the effects of redundant mathematics. However, absent national high school exit exams and national college placement testing, this is not yet a feasible analysis. This study therefore provides the best possible attempt to understand redundant college mathematics at the national level, and future research should replicate and extend the analyses using other nationally representative datasets and administrative data from large education systems.

Conclusion

This study provides a unique look at the first college math courses of a national sample of students and investigates how these courses square with high school math preparation.
In defining and identifying the problem of redundant college mathematics, I draw attention to a nuanced yet unexamined way that math structures opportunity and experiences while in college. The results suggest that the first college math courses of a significant proportion of college students are redundant and likely unnecessary. Further, the inefficiency and curtailed opportunity associated with this math experience appear to reduce student persistence and the likelihood of degree completion, particularly in STEM fields, for which math is a gatekeeper course. Redundant college mathematics is therefore both inefficient and inequitable. As such, attention to the processes that create and perpetuate this redundancy can decrease unwarranted math course-taking and enable all students to start college at a point from which they will be more likely to achieve their educational goals.

References

Adelman, C. (2005). Moving into town—and moving on: The community college in the lives of traditional-age students. Washington, DC: U.S. Department of Education.
Adelman, C. (2006). The toolbox revisited: Paths to degree completion from high school through college. Washington, DC: Office of Vocational and Adult Education, U.S. Department of Education.
American Association of Colleges and Universities (2016). Retrieved from http://www.aacu.org/sites/default/files/files/LEAP/2015_Survey_Report2_GEtrends.pdf
Attewell, P., & Domina, T. (2008). Raising the bar: Curricular intensity and academic performance. Educational Evaluation and Policy Analysis, 30(1), 51-71.
Attewell, P., Lavin, D., Domina, T., & Lavey, T. (2006). New evidence on college remediation. Journal of Higher Education, 886-924.
Aughinbaugh, A. (2012). The effects of high school math curriculum on college attendance: Evidence from the NLSY97. Economics of Education Review, 31(6), 861-87
Bahr, P. R. (2012).
Deconstructing remediation in community colleges: Exploring associations between course-taking patterns, course outcomes, and attrition from the remedial math and remedial writing sequences. Research in Higher Education, 53, 661–693.
Bahr, P. R., Jackson, G., McNaughtan, J., Oster, M., & Gross, J. (2016). Unrealized potential: Community college pathways to STEM baccalaureate degrees. The Journal of Higher Education, 1-49.
Bailey, T., Jeong, D. W., & Cho, S. W. (2010). Referral, enrollment, and completion in developmental education sequences in community colleges. Economics of Education Review, 29(2), 255-270.
Becker, G. S. (1964). Human capital: A theoretical analysis with special reference to education. National Bureau of Economic Research.
Belfield, C. (2015). The labor market returns to math in community college: Evidence using the Education Longitudinal Study of 2002 (CAPSEE Working Paper). Center for Analysis of Postsecondary Education and Employment.
Bettinger, E. P., Boatman, A., & Long, B. T. (2013). Student supports: Developmental education and other academic programs. The Future of Children, 23(1), 93-115.
Bettinger, E., & Long, B. T. (2009). Addressing the needs of under-prepared students in higher education: Does college remediation work? Journal of Human Resources, 44, 736–771.
Boatman, A., & Long, B. T. (2010). Does remediation work for all students? How the effects of postsecondary remedial and developmental courses vary by level of academic preparation (NCPR Working Paper). National Center for Postsecondary Research.
Bozick, R., & Ingels, S. J. (2008). Mathematics coursetaking and achievement at the end of high school: Evidence from the Education Longitudinal Study of 2002 (ELS:2002) (NCES 2008-319). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.
Breen, R., Luijkx, R., Müller, W., & Pollak, R. (2009).
Nonpersistent inequality in educational attainment: Evidence from eight European countries. American Journal of Sociology, 114(5), 1475-1521.
Brint, S., & Karabel, J. (1989). The diverted dream: Community colleges and the promise of educational opportunity in America, 1900-1985.
Bryk, A. S., & Treisman, U. (2010). Make math a gateway, not a gatekeeper. Chronicle of Higher Education, 56(32), B19-B20.
Burdman, P. (2015). Degrees of freedom: Varying routes to math readiness and the challenge of intersegmental alignment (Report 2 of a 3-part series). Berkeley, CA: Jobs for the Future. Retrieved from http://www.learningworksca.org/dof2/
Bustillos, L. T., Rueda, R., & Bensimon, E. M. (2011). Faculty views of underrepresented students in community college settings: Cultural models and cultural practices. In P. R. Portes & S. Salas (Eds.), Vygotsky in 21st century society: Advances in cultural historical theory and praxis with non-dominant communities (pp. 199-213). New York: Peter Lang.
Byun, S. Y., Irvin, M. J., & Bell, B. A. (2014). Advanced math course taking: Effects on math achievement and college enrollment. The Journal of Experimental Education, 1-30.
Carnevale, A. P., & Strohl, J. (2010). How increasing college access is increasing inequality, and what to do about it. In Rewarding strivers: Helping low-income students succeed in college (pp. 71-190).
Carrell, S. E., & West, J. E. (2010). Does professor quality matter? Evidence from random assignment of students to professors. Journal of Political Economy, 118(3).
Carter, D. F., Locks, A. M., & Winkle-Wagner, R. (2013). From when and where I enter: Theoretical and empirical considerations of minority students’ transition to college. In Higher education: Handbook of theory and research (pp. 93-149). Springer Netherlands.
Chen, X. (2013). STEM attrition: College students’ paths into and out of STEM fields (NCES 2014-001).
National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, Washington, DC.
Chen, X. (2016). Remedial coursetaking at U.S. public 2- and 4-year institutions: Scope, experiences, and outcomes (NCES 2016-405). Washington, DC: National Center for Education Statistics, U.S. Department of Education. Retrieved [date] from http://nces.ed.gov/pubsearch.
Chingos, M. M. (2016). Instructional quality and student learning in higher education: Evidence from developmental algebra courses. The Journal of Higher Education, 87(1), 84-114.
Clark, B. R. (1960). The "cooling-out" function in higher education. American Journal of Sociology, 569-576.
Cox, B. E., McIntosh, K., Reason, R. D., & Terenzini, P. T. (2014). Working with missing data in higher education research: A primer and real-world example. The Review of Higher Education, 37(3), 377-402.
Cox, R. D. (2009). The college fear factor: How students and professors misunderstand one another. Harvard University Press.
Crisp, G., & Delgado, C. (2014). The impact of developmental education on community college persistence and vertical transfer. Community College Review, 42(2), 99-117.
Crisp, G., Nora, A., & Taggart, A. (2009). Student characteristics, pre-college, college, and environmental factors as predictors of majoring in and earning a STEM degree: An analysis of students attending a Hispanic serving institution. American Educational Research Journal, 46(4), 924-942.
Darolia, R. (2014). Working (and studying) day and night: Heterogeneous effects of working on the academic performance of full-time and part-time students. Economics of Education Review, 38, 38–50.
Dehejia, R. H., & Wahba, S. (2002). Propensity score-matching methods for nonexperimental causal studies. Review of Economics and Statistics, 84(1), 151-161.
Deil-Amen, R., & Rosenbaum, J. E. (2002). The unintended consequences of stigma-free remediation. Sociology of Education, 249-268.
Domina, T., & Saldana, J. (2012). Does raising the bar level the playing field? Mathematics curricular intensification and inequality in American high schools, 1982–2004. American Educational Research Journal, 49(4), 685-708.
Duckworth, A. L., Quinn, P. D., & Tsukayama, E. (2012). What No Child Left Behind leaves behind: The roles of IQ and self-control in predicting standardized achievement test scores and report card grades. Journal of Educational Psychology, 104(2), 439-451.
Ellis, J., Fosdick, B. K., & Rasmussen, C. (2016). Women 1.5 times more likely to leave STEM pipeline after calculus compared to men: Lack of mathematical confidence a potential culprit. PLoS ONE, 11(7), e0157447. doi:10.1371/journal.pone.0157447
Engberg, M. E., & Wolniak, G. C. (2010). Examining the effects of high school contexts on postsecondary enrollment. Research in Higher Education, 51(2), 132-153.
Fong, K. E., & Melguizo, T. (2016). Utilizing additional measures of high school academic preparation to support students in their math self-assessment. Community College Journal of Research and Practice, 1-27.
Fong, K. E., Melguizo, T., & Prather, G. (2015). Increasing success rates in developmental math: The complementary role of individual and institutional characteristics. Research in Higher Education, 1-31.
Gamoran, A. (1987). The stratification of high school learning opportunities. Sociology of Education, 135-155.
Good, C., Rattan, A., & Dweck, C. S. (2012). Why do women opt out? Sense of belonging and women's representation in mathematics. Journal of Personality and Social Psychology, 102(4), 700.
Goodman, J. S. (2012). The labor of division: Returns to compulsory math coursework. Harvard University, John F. Kennedy School of Government.
Gottfried, M. A., & Bozick, R. (2016). Supporting the STEM pipeline: Linking applied STEM course-taking in high school to declaring a STEM major in college. Education Finance and Policy.
Grubb, N. (1999).
Honored but invisible: An inside look at America’s community colleges. New York: Routledge.
Hacker, A. (2016). The math myth: And other STEM delusions. The New Press.
Hagedorn, L. S., & Kress, A. M. (2008). Using transcripts in analyses: Directions and opportunities. New Directions for Community Colleges, 2005(143), 7–17.
Hagedorn, L. S., Maxwell, W. E., Cypers, S., Moon, H. S., & Lester, J. (2007). Course shopping in urban community colleges: An analysis of student drop and add activities. The Journal of Higher Education, 78(4), 464-485.
Hausman, J. A., & Ruud, P. A. (1987). Specifying and testing econometric models for rank-ordered data. Journal of Econometrics, 34(1-2), 83-104.
Heckman, J. J., Ichimura, H., Smith, J., & Todd, P. (1998). Characterizing selection bias using experimental data. Econometrica, 66, 1017–1098.
Hughes, K. L., & Scott-Clayton, J. (2011). Assessing developmental assessment in community colleges. Community College Review, 39(4), 327-351.
Imbens, G. (2004). Nonparametric estimation of average treatment effects under exogeneity: A review. Review of Economics and Statistics, 86(1), 4–29.
Ingels, S. J., Pratt, D. J., Alexander, C. P., Bryan, M., Jewell, D. M., Lauff, E., Mattox, T. L., & Wilson, D. (2015). Education Longitudinal Study of 2002 (ELS:2002): Postsecondary education transcript study data file documentation (NCES 2014-033). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, Washington, DC. Retrieved from http://nces.ed.gov/pubsearch.
Joselow, M. (2016, June 16). No math required. Inside Higher Ed. Retrieved from https://www.insidehighered.com/news/2016/06/16/debate-over-whether-all-undergraduates-should-take-mathematics-course
Karabel, J. (1972). Community colleges and social stratification. Harvard Educational Review, 42(4), 521-562.
Kim, J., Kim, J., DesJardins, S. L., & McCall, B. P. (2015).
Completing Algebra II in high school: Does it increase college access and success? The Journal of Higher Education, 86(4), 628-662.
Kosiewicz, H. (2016). Giving community college students voice. Unpublished doctoral dissertation. Los Angeles: University of Southern California.
Lauff, E., & Ingels, S. J. (2015). Education Longitudinal Study of 2002 (ELS:2002): A first look at the postsecondary transcripts of 2002 high school sophomores (NCES 2015-034). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.
Lesik, S. A. (2007). Do developmental mathematics programs have a causal impact on student retention? An application of discrete-time survival and regression-discontinuity analysis. Research in Higher Education, 48(5), 583-608.
Logue, A. W., Watanabe-Rose, M., & Douglas, D. (2016). Should students assessed as needing remedial mathematics take college-level quantitative courses instead? A randomized controlled trial. Educational Evaluation and Policy Analysis, 38(3), 578-598.
Logue, J. (2016, April 21). Pushing new math paths. Inside Higher Ed. Retrieved from https://www.insidehighered.com/news/2016/04/21/tpsemath-working-reform-math-education
Long, M. C., Conger, D., & Iatarola, P. (2012). Effects of high school course-taking on secondary and postsecondary success. American Educational Research Journal, 49(2), 285-322.
Malcom, L. E. (2010). Charting the pathways to STEM for Latina/o students: The role of community colleges. New Directions for Institutional Research, 2010(148), 29-40.
Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational experiences with earned degrees in STEM among US students. Science Education, 95(5), 877-907.
Martorell, P., & McFarlin Jr., I. (2011). Help or hindrance? The effects of college remediation on academic and labor market outcomes. The Review of Economics and Statistics, 93(2), 436-454.
Martorell, P., McFarlin Jr., I., & Xue, Y. (2015).
Does failing a placement exam discourage underprepared students from going to college? Education Finance and Policy, 10(1), 46-80.
McFarland, D. A. (2006). Curricular flows: Trajectories, turning points, and assignment criteria in high school math careers. Sociology of Education, 79(3), 177-205.
McGee, E. O., & Martin, D. B. (2011). “You would not believe what I have to go through to prove my intellectual value!” Stereotype management among academically successful black mathematics and engineering students. American Educational Research Journal, 48(6), 1347-1389.
Melguizo, T. (2008). Quality matters: Assessing the impact of attending more selective institutions on college completion rates of minorities. Research in Higher Education, 49(3), 214-236.
Melguizo, T. (2010). Are students of color more likely to graduate from college if they attend more selective institutions? Evidence from a cohort of recipients and nonrecipients of the Gates Millennium Scholarship Program. Educational Evaluation and Policy Analysis, 32(2), 230-248.
Melguizo, T. (2011). A review of the theories developed to describe the process of college persistence and attainment. In Higher education: Handbook of theory and research (pp. 395-424). Springer Netherlands.
Melguizo, T., Bos, J. M., Ngo, F., Mills, N., & Prather, G. (2016). Using a regression discontinuity design to estimate the impact of placement decisions in developmental math. Research in Higher Education, 57(2), 123-151.
Melguizo, T., Hagedorn, L. S., & Cypers, S. (2008). Remedial/developmental education and the cost of community college transfer: A Los Angeles County sample. The Review of Higher Education, 31(4), 401-431.
Melguizo, T., Kosiewicz, H., Prather, G., & Bos, J. (2014). How are community college students assessed and placed in developmental math? Grounding our understanding in reality. The Journal of Higher Education, 85(5), 691-722.
Mesa, V., Celis, S., & Lande, E. (2014).
Teaching approaches of community college mathematics faculty: Do they relate to classroom practices? American Educational Research Journal, 51(1), 117-151.
Monaghan, D. B., & Attewell, P. (2015). The community college route to the bachelor’s degree. Educational Evaluation and Policy Analysis, 37(1), 70-91.
Murnane, R. J., & Willett, J. B. (2010). Methods matter: Improving causal inference in educational and social science research. Oxford University Press.
Ngo, F., Chi, W. E., & Park, E. S. (Forthcoming). Mathematics course placement using holistic measures: Possibilities for community college students. Teachers College Record.
Ngo, F., & Kosiewicz, H. (2017). How extending time in developmental math impacts persistence and success: Evidence from a regression discontinuity in community colleges. The Review of Higher Education, 40(2), 267-306.
Ngo, F., & Kwon, W. (2015). Using multiple measures to make math placement decisions: Implications for access and success in community colleges. Research in Higher Education, 56(5), 442-470.
Ngo, F., & Melguizo, T. (2016). How can placement policy improve math remediation outcomes? Evidence from community college experimentation. Educational Evaluation and Policy Analysis, 38(1), 171-196.
Oakes, J. (1985). Keeping track. New Haven, CT: Yale University Press.
Oreopoulos, P., & Petronijevic, U. (2013). Making college worth it: A review of the returns to higher education. The Future of Children, 23(1), 41-65.
Pajares, F., & Miller, M. D. (1994). Role of self-efficacy and self-concept beliefs in mathematical problem solving: A path analysis. Journal of Educational Psychology, 86(2), 193.
Park, T., Woods, C. S., Richard, K., Tandberg, D., Hu, S., & Jones, T. B. (2016). When developmental education is optional, what will students do? A preliminary analysis of survey data on student course enrollment decisions in an environment of increased choice. Innovative Higher Education, 41(3), 221-236.
Paulsen, M. B. (2001). The economics of human capital and investment in higher education. In M. B. Paulsen (Ed.), The finance of higher education: Theory, research, policy, and practice (pp. 55-94). Agathon Press.
Perez-Felkner, L., McDonald, S. K., Schneider, B., & Grogan, E. (2012). Female and male adolescents' subjective orientations to mathematics and the influence of those orientations on postsecondary majors. Developmental Psychology, 48(6), 1658.
Perna, L. W., & Titus, M. A. (2005). The relationship between parental involvement as social capital and college enrollment: An examination of racial/ethnic group differences. Journal of Higher Education, 485-518.
Perry, A. B. (2004). Decreasing math anxiety in college students. College Student Journal, 38(2), 321.
Porter, A. C., & Polikoff, M. S. (2011). Measuring academic readiness for college. Educational Policy, 0895904811400410.
Posselt, J. R. (2015). Disciplinary logics in doctoral admissions: Understanding patterns of faculty evaluation. The Journal of Higher Education, 86(6), 807-833.
Riegle-Crumb, C. (2006). The path through math: Course sequences and academic performance at the intersection of race-ethnicity and gender. American Journal of Education, 113(1), 101.
Riegle-Crumb, C., & Grodsky, E. (2010). Racial-ethnic differences at the intersection of math course-taking and achievement. Sociology of Education, 83(3), 248-270.
Riegle-Crumb, C., King, B., Grodsky, E., & Muller, C. (2012). The more things change, the more they stay the same? Prior achievement fails to explain gender inequality in entry into STEM college majors over time. American Educational Research Journal, 49(6), 1048-1073.
Rosenbaum, P. R., & Rubin, D. (1985). Constructing a control group using multivariate matched sampling methods that incorporate the propensity score. American Statistician, 39, 33–38.
Rothstein, J. M. (2004). College performance predictions and the SAT. Journal of Econometrics, 121(1), 297-317.
Rubin, D. B. (2001).
Using propensity scores to help design observational studies: Application to the tobacco litigation. Health Services & Outcomes Research Methodology, 2, 169–188.
Sax, L. J., Kanny, M. A., Riggers-Piehl, T. A., Whang, H., & Paulson, L. N. (2015). “But I’m not good at math”: The changing salience of mathematical self-concept in shaping women’s and men’s STEM aspirations. Research in Higher Education, 1-30.
Scott-Clayton, J., Crosta, P., & Belfield, C. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371-393.
Scott-Clayton, J., & Rodriguez, O. (2015). Development, discouragement, or diversion? New evidence on the effects of college remediation. Education Finance and Policy, 10(1), 4-45.
Smith, J., Pender, M., & Howell, J. (2013). The full extent of student-college academic undermatch. Economics of Education Review, 32, 247-261.
Stinebrickner, R., & Stinebrickner, T. R. (2003). Working during school and academic performance. Journal of Labor Economics, 21(2), 473-491.
Stinebrickner, T. R., & Stinebrickner, R. (2011). Math or science? Using longitudinal expectations data to examine the process of choosing a college major (No. w16869). National Bureau of Economic Research.
Stuart, E. A. (2010). Matching methods and causal inference: A review and a look forward. Statistical Science, 25(1), 1-21.
Tinto, V. (1993). Building community. Liberal Education, 79(4), 16-21.
Treisman, U. (1992). Studying students studying calculus: A look at the lives of minority mathematics students in college. The College Mathematics Journal, 23(5), 362-372.
Umbach, P. D., & Wawrzynski, M. R. (2005). Faculty do matter: The role of college faculty in student learning and engagement. Research in Higher Education, 46(2), 153-184.
Venezia, A., Bracco, K., & Nodine, T. (2010). One-shot deal? Students' perceptions of assessment and course placement in California's community colleges.
San Francisco, CA: WestEd. Walpole, M. (2003). Socioeconomic status and college: How SES affects college experiences and outcomes. The Review of Higher Education, 27, 45–73. Wang, X. (2013). Why students choose STEM majors: Motivation, high school learning, and postsecondary context of support. American Educational Research Journal, 50(5), 1081- 1121. Wang, X. (2016). Course-taking patterns of community college students beginning in STEM: Using data mining techniques to reveal viable STEM transfer pathways. Research in Higher Education, 57(5), 544-569. REDUNDANT COLLEGE MATHEMATICS 62 Figures N=5670 (unmatched) N=1330 (matched) N=5670 (unmatched) N=1420 (matched) N=5670 (unmatched) N=1030 (matched) Figure 1. Propensity of redundant math course-taking, before and after matching REDUNDANT COLLEGE MATHEMATICS 63 Figure 2. Covariate bias before and after matching (Model 5). 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 Math Prof L1 Math Prof L2 Math Prof L3 Math Prof L4 Math Prof L5 Female Asian Black Hispanic Other White SES Math Self-Eff. HS GPA HS Math Rem. Hi HS Math Acad. Track Voc. Track Yrs. Math Req. STEM Occ 30? Private Coll. Two Yr. Coll. For-Prof. Coll. PS Gap Entr. Exam Part-time Stud. Matched Unmatched REDUNDANT COLLEGE MATHEMATICS 64 Table 1. Characteristics of ELS:02 Study Samples STUDY SAMPLES (A) All ELS:2002 (B) Any college Enrollment by 2013 (C) College Enrollment by June 2005 (D) All Math- Takers in (C) (E) (C)'s First Math was Calc. or Lower N=16,200 N=11,510 N=9,840 N=8,250 N=5,670 71% of (A) 86% of (B) 84% of (C) 69% of (D) Variable Var. Name N Mean SD Min Max Mean SD Mean SD Mean SD Mean SD Math Prof. Level 1 F1TX1MPP 13650 0.96 0.12 0.02 1 0.98 0.09 0.99 0.07 0.99 0.05 0.98 0.08 Math Prof. Level 2 F1TX2MPP 13650 0.80 0.36 0 1 0.85 0.31 0.89 0.28 0.90 0.25 0.86 0.30 Math Prof. Level 3 F1TX3MPP 13650 0.65 0.44 0 1 0.71 0.42 0.76 0.39 0.79 0.37 0.73 0.41 Math Prof. Level 4 F1TX4MPP 13650 0.37 0.41 0 1 0.44 0.42 0.47 0.42 0.50 0.42 0.47 0.43 Math Prof. 
Level 5 F1TX5MPP 13650 0.05 0.16 0 1 0.06 0.17 0.07 0.18 0.07 0.19 0.08 0.20 Female BYSEX 15370 0.50 0.50 0 1 0.55 0.50 0.55 0.50 0.55 0.50 0.54 0.50 Asian BYRACE_3 15240 0.10 0.29 0 1 0.10 0.30 0.11 0.31 0.11 0.32 0.12 0.33 Black BYRACE_2 15240 0.13 0.34 0 1 0.12 0.33 0.11 0.32 0.11 0.31 0.12 0.32 Hispanic BYHISPAN 15240 0.15 0.35 0 1 0.13 0.33 0.11 0.32 0.11 0.31 0.12 0.32 White BYRACE_1 15240 0.57 0.50 0 1 0.60 0.49 0.61 0.49 0.62 0.48 0.60 0.49 Other BYRACE_R 15240 0.06 0.23 0 1 0.05 0.22 0.05 0.22 0.05 0.21 0.05 0.21 SES (Composite) BYSES1 15240 0.04 0.74 -2.11 1.82 0.16 0.73 0.23 0.72 0.26 0.71 0.22 0.73 Math Self-Efficacy F1MATHSE 10220 0.02 0.99 -2.039 1.811 0.06 1.00 0.09 1.00 0.11 1.00 0.08 1.02 High School GPA F1RGPP2 14800 3.91 1.54 0 6 4.24 1.43 4.48 1.29 4.57 1.24 4.42 1.33 Took Remedial Math in HS BYS33E 14080 0.10 0.30 0 1 0.09 0.29 0.09 0.28 0.08 0.28 0.09 0.28 HS Math 1 (Basic) F1RMAPIP 14810 0.04 0.20 0 1 0.02 0.14 0.01 0.10 0.01 0.08 0.01 0.11 HS Math 2 (Pre-Alg/Alg) F1RMAPIP 14810 0.27 0.44 0 1 0.20 0.40 0.14 0.35 0.12 0.32 0.17 0.37 HS Math 3 (Algebra 2) F1RMAPIP 14810 0.23 0.42 0 1 0.23 0.42 0.23 0.42 0.23 0.42 0.23 0.42 HS Math 4 (Pre-calc) F1RMAPIP 14810 0.32 0.47 0 1 0.37 0.48 0.42 0.49 0.44 0.50 0.37 0.48 HS Math 5 (Calculus) F1RMAPIP 14810 0.14 0.34 0 1 0.17 0.38 0.20 0.40 0.21 0.41 0.22 0.41 Academic Concentrator F1RACADC 14810 0.27 0.44 0 1 0.33 0.47 0.38 0.49 0.41 0.49 0.37 0.48 REDUNDANT COLLEGE MATHEMATICS 65 Occupation Concentrator F1ROCCUC 14810 0.15 0.36 0 1 0.14 0.34 0.13 0.33 0.12 0.33 0.12 0.33 Years/Math Req'd in HS F1A07B 11630 2.90 0.62 1 4 2.91 0.63 2.92 0.63 2.94 0.63 2.91 0.63 Private College F2PS1CTR 10490 0.26 0.44 0 1 0.26 0.44 0.26 0.44 0.24 0.43 0.27 0.44 Two-Year College F2PS1LVL 10480 0.35 0.48 0 1 0.33 0.47 0.33 0.47 0.31 0.46 0.34 0.47 For-Profit College F2PS1SEC 10480 0.05 0.22 0 1 0.05 0.21 0.04 0.20 0.02 0.14 0.05 0.22 Gap in College Enrollment F2ENRGAP 10510 0.07 0.25 0 1 0.07 0.25 0.07 0.26 0.07 0.26 0.08 
0.27 College has Entrance Exam F2PS1EEX 13000 1.66 1.46 0 4 2.01 1.39 2.35 1.21 2.42 1.21 2.30 1.22 Part-time Student F2PS1FTP 10510 0.14 0.34 0 1 0.13 0.34 0.12 0.33 0.10 0.30 0.13 0.34 Note: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. REDUNDANT COLLEGE MATHEMATICS 66 Table 2. First Math Courses in College College Course Map Code Categories Example Courses Frequency Percent Below Calc.? Pre-Collegiate Algebra (Inter., Elem., Basic) Pre-Algebra, Algebra 1, Algebra 2 1,780 21.6 Algebra and Number Theory Linear Algebra; Combinatorics 1,750 21.3 Calculus Calculus I-IV 1,360 16.5 Mathematics, General Introduction to College-Level Math 620 7.6 Analytic Geometry, Functions, Pre-Calculus Pre-Calculus 540 6.5 Statistics, General Descriptive/Inferential Statistics 540 6.5 Pre-Collegiate Math General Developmental Math 450 5.5 Trigonometry Trigonometry 140 1.7 Computational Mathematics Computer Theory; Discrete Math 130 1.6 Geometric Analysis Set Theory 120 1.5 Collegiate Business Math Business Algebra 110 1.4 Applied Mathematics, General Fundamental Analysis of Real Variables 90 1.1 Arithmetic Arithmetic 90 1.1 Mathematical Statistics and Probability Probability Theory 90 1.0 Math Appreciation; Math in Society Math in the Modern World 80 1.0 Math for Social Science Math for Economics 60 0.7 Technical Math; Vocational Math Nursing Math 60 0.7 Mathematics and Statistics Specialized Statistics 40 0.5 Pre-Collegiate Business Math Business Arithmetic; Consumer Math 40 0.5 Number Systems Algebra/Geometry for Teachers 30 0.4 Mathematics, Other 30 0.3 Analysis and Functional Analysis Partial Differential Equations 20 0.3 Financial Mathematics Probability Theory; Finance 20 0.2 Computational and Applied Mathematics Differential Equations 20 0.2 Advanced Mathematics Topics Game Theory 10 0.2 Descriptive/Pre-Collegiate/Plane Geometry Geometry <10 Mathematics and Statistics, Other Math Economics <10 Mathematical Biology Chaos 
and Nonlinear Systems <10 Advanced Statistics Regression <10 Topology and Foundations Mathematical Logic, Proof Theory <10 Statistics, Other <10 Applied Mathematics, Other <10 Technical Math Using Scientific Calculators <10 Total 8,250 Note: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. REDUNDANT COLLEGE MATHEMATICS 67 Table 3. Highest High School Math Level and First College Math Level First College Math Level Highest HS Math Level Dev. Math Pre-alg./ Alg 1/ Geom. Algebra 2 Pre-calc/ Trig/Stats Calculus Other All Study Sample General Math 36.96 10.87 19.57 21.74 0 10.87 Pre-Alg./Alg 1./Geom. 23.31 28.87 14.81 12.2 1.09 19.72 Algebra 2 11.63 21.19 15.35 17.41 1.49 32.93 Pre-calc/Trig/Stats 4.36 9.85 8.64 26.23 11.72 39.2 Calculus 1.22 0.85 1.76 23.65 51.79 20.73 Total 7.78 12.76 9.49 21.99 16.61 31.37 Four-Year Colleges General Math 10 20 10 40 0 20 Pre-Alg/Alg 1/Geom 15.53 22.83 10.5 21.46 3.65 26.03 Algebra 2 9.07 16.55 11 23.47 2.38 37.53 Pre-calc/Trig/Stats 3.4 7.75 6.68 28.31 14.23 39.62 Calculus 1.21 0.67 1.61 23.25 53.9 19.35 Total 4.27 7.88 6.12 25.74 23.23 32.76 Two-Year Colleges General Math 45.45 9.09 24.24 12.12 0 9.09 Pre-Alg/Alg 1/Geom 25.78 31.72 16.72 7.97 0.16 17.66 Algebra 2 13.75 26.28 20.56 10.58 0.61 28.22 Pre-calc/Trig/Stats 7.2 16.67 14.9 19.32 4.29 37.63 Calculus 0.68 2.74 3.42 26.03 32.19 34.93 Total 14.43 22.93 16.73 13.69 3.58 28.65 REDUNDANT COLLEGE MATHEMATICS 68 Table 4. ELS:02 12th Grade Math Assessment Quartile and First College Math Level First College Math Level 12th Grade Math Assessment Quartile Dev. Math Pre- alg./ Alg 1/Geom. 
Algebra 2 Pre-calc/ Trig/Stats Calculus Other All Study Sample Lowest 28.09 28.31 14.61 11.57 0.45 16.97 Second 11.43 23.05 14.7 15.55 2.06 33.21 Third 4.08 11.54 10.02 26.32 8.89 39.16 Highest 1.58 2.1 3.58 26.74 36.92 29.08 Total 7.46 12.37 9.13 22.49 16.98 31.57 Four-Year Colleges Lowest 22.71 23.39 15.93 15.59 0.68 21.69 Second 8.48 18.55 9.46 21.25 3.93 38.33 Third 3.08 9.05 7.64 29.43 10.78 40.02 Highest 1.49 1.45 2.51 26.59 40.06 27.89 Total 4.24 7.65 5.91 26.03 23.28 32.89 Two-Year colleges Lowest 30.78 31.51 14.75 8.2 0.36 14.39 Second 13.94 28.01 20.66 9 0.25 28.14 Third 6.31 18.15 16.15 18.31 4.31 36.77 Highest 2.08 7.12 10.68 27.3 15.13 37.69 Total 14.06 23.05 16.56 14.06 3.57 28.69 REDUNDANT COLLEGE MATHEMATICS 69 Table 5. Ordered Logistic Regression, Predictors of Passing First Math Level in College with a B or Better Model 1 Model 2 Model 3 Model 4 Math Prof. Level 1 2.012 1.986 0.975 0.986 (1.10) (1.14) (1.53) (1.56) Math Prof. Level 2 0.675* 0.591* 0.555 0.529 (0.28) (0.29) (0.35) (0.37) Math Prof. Level 3 0.346 0.345 0.129 0.038 (0.18) (0.19) (0.22) (0.24) Math Prof. Level 4 2.836*** 2.752*** 1.878*** 1.811*** (0.13) (0.14) (0.18) (0.18) Math Prof. 
Level 5 2.197*** 2.097*** 1.140*** 0.921*** (0.20) (0.21) (0.25) (0.26) Female 0.089 -0.077 -0.087 (0.07) (0.09) (0.09) Asian 0.352*** 0.232 0.352** (0.11) (0.13) (0.14) Black -0.025 0.125 0.01 (0.13) (0.16) (0.17) Hispanic -0.286* -0.268 -0.266 (0.12) (0.14) (0.15) Other -0.216 -0.024 -0.072 (0.17) (0.21) (0.22) SES (Composite) 0.199*** 0.196** 0.138* (0.05) (0.07) (0.07) Math Self-Efficacy 0.090* 0.07 (0.05) (0.05) High School GPA 0.290*** 0.237*** (0.05) (0.05) Took Remedial Math in HS 0.009 -0.039 (0.16) (0.16) HS Math 1 (Basic) -2.876* -3.036** (1.15) (1.18) HS Math 2 (Pre-Alg/Alg) - 1.550*** - 1.617*** (0.23) (0.24) HS Math 3 (Algebra 2) - 1.678*** - 1.727*** (0.18) (0.19) HS Math 4 (Pre-calculus) - 1.395*** - 1.424*** (0.12) (0.13) Academic Concentrator 0.282** 0.218* REDUNDANT COLLEGE MATHEMATICS 70 (0.11) (0.11) Occupation Concentrator -0.116 -0.114 (0.13) (0.14) Years of Math Req'd in HS -0.055 -0.095 (0.07) (0.07) Private 0.481*** (0.11) Two-Year College -0.045 (0.16) For-Profit College -0.957* (0.44) Gap in College Enrollment -0.008 (0.20) College has Entrance Exam 0.157* (0.06) Part-time Student -0.238 (0.17) Cut 1 Constant 1.691 1.633 -0.202 -0.439 (0.98) (1.01) (1.40) (1.43) Cut 2 Constant 3.167** 3.093** 1.343 1.18 (0.98) (1.01) (1.40) (1.43) Cut 3 Constant 3.976*** 3.919*** 2.246 2.137 (0.98) (1.01) (1.40) (1.44) Cut 4 Constant 6.304*** 6.294*** 4.994*** 4.958*** (0.98) (1.02) (1.40) (1.44) N 3220 3060 2260 2120 Note: Note: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. * p<0.05, ** p<0.01, ***p<0.001. Reference groups: White; Calculus level in high school Model 1: Math skills only Model 2: Math skills and student demographics Model 3: Math skills, student demographics, and high school background variables Model 4: Math skills, student demographics, high school background, and college context variables REDUNDANT COLLEGE MATHEMATICS 71 Table 6. 
Remediation, Misalignment, and Redundancy in First College Math Courses

| Sample | N | Math Remediation | Math Misalignment | Redundancy M1 | M2 | M3 | M4 | M5 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| All math-takers (w/ postsecondary enrollment by June 2005) | 8250 | 0.31 | 0.57 | 0.18 | 0.19 | 0.27 | 0.27 | 0.15 |
| All students in calculus and below | 5670 | 0.46 | 0.82 | 0.26 | 0.28 | 0.39 | 0.40 | 0.22 |
| *Alternative samples* | | | | | | | | |
| Only students in same or lower math level | 4670 | 0.62 | 0.82 | 0.32 | 0.34 | 0.47 | 0.48 | 0.22 |
| Only students who took math w/in 1 year of college enrollment | 4220 | 0.47 | 0.83 | 0.24 | 0.26 | 0.37 | 0.38 | 0.20 |
| No Statistics | 5140 | 0.50 | 0.83 | 0.29 | 0.30 | 0.37 | 0.39 | 0.23 |
| No For-Profits | 5550 | 0.45 | 0.82 | 0.26 | 0.28 | 0.39 | 0.39 | 0.22 |
| Two-Year Colleges Only | 1850 | 0.77 | 0.84 | 0.27 | 0.29 | 0.43 | 0.43 | 0.23 |
| Four-Year Colleges Only | 3680 | 0.29 | 0.82 | 0.26 | 0.28 | 0.36 | 0.37 | 0.22 |

Note: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. Each sample restriction includes only students who enrolled in college by June 2005. M1-M5 refer to the five models of math redundancy.

Table 7.
Test of Group Mean Differences in Characteristics, Redundant and Non-redundant Math Students

| Characteristic | Remedial Math | Misaligned Math | Redundant M1 | Redundant M2 | Redundant M3 | Redundant M4 | Redundant M5 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Math Proficiency Level 1 | -0.0237*** | -0.00173 | 0.0166*** | 0.0138*** | 0.00279 | 0.00139 | 0.0159*** |
| Math Proficiency Level 2 | -0.200*** | -0.0106 | 0.124*** | 0.102*** | 0.0250** | 0.0111 | 0.122*** |
| Math Proficiency Level 3 | -0.389*** | -0.00862 | 0.163*** | 0.133*** | 0.0218 | 0.000422 | 0.165*** |
| Math Proficiency Level 4 | -0.516*** | 0.00915 | 0.0419** | 0.0117 | -0.0654*** | -0.0762*** | 0.0707*** |
| Math Proficiency Level 5 | -0.134*** | 0.0483*** | 0.0116 | -0.0014 | -0.0469*** | -0.0501*** | 0.0159* |
| Female | 0.0600*** | 0.0442* | -0.0294 | -0.0117 | 0.0258 | 0.0269 | -0.0168 |
| Asian | -0.0709*** | 0.0404*** | -0.0277** | 0.0176 | -0.0058 | -0.00821 | -0.00612 |
| Black | 0.0969*** | 0.0162 | -0.00601 | -0.0131 | 0.0247** | 0.0366*** | -0.00278 |
| Hispanic | 0.0978*** | 0.0262* | 0.0245* | -0.00953 | 0.00401 | 0.00922 | -0.00291 |
| White | 0.0103 | -0.0026 | 0.0218*** | 0.0102 | 0.00696 | 0.00987 | 0.0169* |
| Other | -0.134*** | -0.0802*** | -0.0127 | -0.00515 | -0.0299* | -0.0475*** | -0.00509 |
| SES (Composite) | -0.455*** | 0.00209 | -0.0485* | 0.0293 | -0.0682*** | -0.0787*** | -0.000885 |
| Math Self-Efficacy (Composite) | -0.601*** | 0.0311 | -0.00725 | -0.0323 | -0.00717 | -0.0442 | 0.042 |
| High School GPA | -1.227*** | 0.206*** | -0.310*** | -0.305*** | -0.134*** | -0.174*** | -0.214*** |
| Took Remedial Math in HS | 0.0303*** | 0.00405 | 0.00499 | 0.00926 | 0.0088 | 0.00516 | 0.0115 |
| Math Pipeline Level | -1.106*** | 0.784*** | -0.211*** | -0.211*** | -0.0746** | -0.0858** | -0.163*** |
| Academic Concentrator | -0.375*** | 0.172*** | -0.102*** | -0.0985*** | -0.0432** | -0.0567*** | -0.0796*** |
| Occupation Concentrator | 0.0578*** | -0.0116 | 0.00494 | -0.00233 | -0.00491 | -0.0028 | -0.00559 |
| Years of Math Required in HS | -0.0862*** | 0.0757** | -0.0348 | -0.0337 | 0.00731 | 0.00452 | -0.0418 |
| Private College | -0.211*** | -0.0547*** | -0.0821*** | -0.0885*** | -0.0874*** | -0.0498*** | -0.0718*** |
| Two-Year College | 0.410*** | 0.0356* | 0.116*** | 0.118*** | 0.0954*** | 0.0518*** | 0.0915*** |
| For-Profit College | 0.0129** | -0.0130* | 0.00884* | 0.00958* | 0.0116** | 0.0166*** | 0.0112* |
| Gap in College Enrollment | 0.0681*** | -0.000842 | 0.0322*** | 0.0270*** | 0.0249*** | 0.0228** | 0.0286*** |
| College has Entrance Exam | -1.284*** | -0.0257 | -0.352*** | -0.349*** | -0.378*** | -0.276*** | -0.295*** |
| Part-time Student | 0.119*** | 0.0134 | 0.0497*** | 0.0482*** | 0.0306*** | 0.0186* | 0.0382*** |
| N | 5670 | 5670 | 5670 | 5670 | 5670 | 5670 | 5670 |

Note: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. * p<0.05, ** p<0.01, *** p<0.001.

Table 8. Predictors of Redundancy, First Stage of Propensity Score Matching

| Predictor | Remedial Math | Misaligned Math | Redundant M1 | Redundant M2 | Redundant M3 | Redundant M4 | Redundant M5 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Math Proficiency Level 1 | 1.168 (1.29) | -1.135 (1.37) | 23.451 (18.26) | 38.646* (18.27) | 7.127 (4.29) | 5.490 (3.35) | 15.491 (22.69) |
| Math Proficiency Level 2 | -0.081 (0.39) | -0.971* (0.39) | 1.495 (1.13) | -0.677 (1.06) | 0.268 (0.53) | 0.225 (0.47) | 2.729 (1.47) |
| Math Proficiency Level 3 | -0.669** (0.25) | -0.239 (0.27) | 1.566*** (0.27) | 1.966*** (0.27) | 0.929*** (0.24) | 0.594* (0.24) | 0.793** (0.30) |
| Math Proficiency Level 4 | -1.584*** (0.18) | -1.653*** (0.22) | 0.170 (0.17) | -0.190 (0.17) | -0.399* (0.16) | -0.364* (0.17) | 0.216 (0.19) |
| Math Proficiency Level 5 | -0.900* (0.46) | 0.739* (0.37) | 1.483*** (0.24) | 1.206*** (0.24) | -0.326 (0.24) | -0.437 (0.24) | 0.937*** (0.25) |
| Female | -0.072 (0.10) | 0.246* (0.11) | 0.143 (0.09) | 0.232* (0.09) | -0.055 (0.09) | -0.066 (0.09) | 0.115 (0.10) |
| Asian | -0.427** (0.16) | -0.081 (0.18) | -0.396** (0.15) | 0.158 (0.13) | -0.143 (0.13) | -0.098 (0.13) | -0.121 (0.15) |
| Black | -0.162 (0.17) | -0.069 (0.19) | -0.155 (0.18) | -0.296 (0.18) | -0.182 (0.17) | -0.204 (0.17) | -0.265 (0.21) |
| Hispanic | 0.217 (0.162) | 0.213 (0.184) | 0.261 (0.149) | -0.233 (0.16) | -0.295 (0.154) | -0.255 (0.154) | -0.164 (0.178) |
| Other | 0.141 (0.24) | 0.120 (0.25) | 0.231 (0.21) | -0.112 (0.22) | 0.117 (0.20) | 0.200 (0.20) | 0.029 (0.23) |
| SES (Composite) | -0.247*** (0.07) | -0.083 (0.08) | -0.107 (0.07) | 0.184** (0.07) | 0.012 (0.07) | -0.076 (0.07) | -0.011 (0.08) |
| Math Self-Efficacy (Composite) | -0.057 (0.05) | -0.093 (0.06) | -0.046 (0.05) | -0.029 (0.05) | 0.008 (0.05) | -0.036 (0.05) | -0.025 (0.05) |
| High School GPA | -0.206*** (0.05) | -0.181** (0.06) | -0.192*** (0.05) | -0.176*** (0.05) | 0.165*** (0.05) | 0.082 (0.05) | -0.022 (0.06) |
| Took Remedial Math in HS | -0.004 (0.17) | 0.208 (0.19) | -0.097 (0.16) | -0.074 (0.16) | -0.011 (0.15) | -0.038 (0.15) | 0.012 (0.18) |
| Math Pipeline Level | -0.520*** (0.08) | 2.130*** (0.11) | -0.392*** (0.08) | -0.376*** (0.08) | 0.235** (0.07) | 0.235** (0.07) | -0.280*** (0.08) |
| Academic Concentrator | -0.076 (0.12) | -0.134 (0.14) | -0.191 (0.11) | -0.220* (0.11) | -0.041 (0.10) | -0.156 (0.10) | -0.116 (0.12) |
| Occupation Concentrator | -0.056 (0.15) | 0.002 (0.16) | 0.167 (0.14) | 0.254 (0.13) | 0.024 (0.13) | 0.079 (0.13) | 0.120 (0.15) |
| Years of Math Required in HS | -0.166* (0.08) | -0.145 (0.08) | 0.041 (0.07) | 0.017 (0.07) | 0.015 (0.07) | -0.035 (0.07) | 0.005 (0.08) |
| N | 3500 | 3500 | 3500 | 3500 | 3500 | 3500 | 3500 |

Note: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. Standard errors in parentheses.

Table 9. Estimated Differences between Redundant and Non-Redundant Math-Takers, 7 Specifications

Propensity Score: x
Covariates: x x X x

| Outcome / Comparison | S1 (Unmatched OLS) | S2 (Unmatched OLS) | S3 (PSM) | S4 (PSM) | S5 (PSM) | S6 (IPTW) | S7 (IPTW w/trim) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| *Pass First Math w/ B or better* | | | | | | | |
| Redundant: Model 5 | 0.120*** (0.020) | 0.103*** (0.020) | 0.099** (0.030) | 0.099** (0.030) | 0.102*** (0.030) | 0.114*** (0.020) | 0.113*** (0.020) |
| N | 3500 | 3500 | 1030 | 1030 | 1030 | 3500 | 3500 |
| Redundant: Model 4 | 0.146*** (0.020) | 0.119*** (0.020) | 0.155*** (0.030) | 0.154*** (0.030) | 0.153*** (0.030) | 0.118*** (0.020) | 0.118*** (0.020) |
| N | 3500 | 3500 | 1420 | 1420 | 1420 | 3500 | 3500 |
| Redundant: Model 1 | 0.103*** (0.020) | 0.125*** (0.020) | 0.141*** (0.030) | 0.141*** (0.030) | 0.139*** (0.030) | 0.129*** (0.020) | 0.126*** (0.020) |
| N | 3500 | 3500 | 1330 | 1330 | 1330 | 3500 | 3500 |
| Misalignment | 0.179*** (0.020) | 0.181*** (0.020) | 0.335*** (0.070) | 0.335*** (0.070) | 0.311*** (0.030) | 0.237*** (0.030) | 0.225*** (0.030) |
| N | 3500 | 3500 | 3330 | 3330 | 3330 | 3500 | 3500 |
| Remediation | -0.137*** (0.020) | 0.064** (0.020) | 0.084* (0.040) | 0.084* (0.040) | 0.073* (0.040) | 0.041+ (0.020) | 0.046+ (0.020) |
| N | 3500 | 3500 | 2030 | 2030 | 2030 | 3500 | 3500 |
| *Total College GPA* | | | | | | | |
| Redundant: Model 5 | 0.009 (0.040) | -0.069* (0.030) | -0.110* (0.050) | -0.111* (0.050) | -0.081* (0.040) | -0.075* (0.030) | -0.077* (0.030) |
| N | 3480 | 3480 | 1020 | 1020 | 1020 | 3480 | 3480 |
| Redundant: Model 4 | 0.054+ (0.030) | -0.056* (0.030) | -0.031 (0.040) | -0.031 (0.040) | -0.048 (0.030) | -0.065* (0.030) | -0.065* (0.030) |
| N | 3480 | 3480 | 1410 | 1410 | 1410 | 3480 | 3480 |
| Redundant: Model 1 | -0.043 (0.030) | -0.027 (0.030) | 0.005 (0.050) | 0.005 (0.050) | -0.008 (0.040) | -0.029 (0.030) | -0.031 (0.030) |
| N | 3480 | 3480 | 1320 | 1320 | 1320 | 3480 | 3480 |
| Misalignment | 0.051 (0.030) | -0.023 (0.030) | 0.232*** (0.060) | 0.231*** (0.060) | 0.035 (0.050) | 0.002 (0.040) | -0.010 (0.040) |
| N | 3480 | 3480 | 3310 | 3310 | 3310 | 3480 | 3480 |
| Remediation | -0.589*** (0.030) | -0.082* (0.030) | 0.017 (0.080) | 0.024 (0.070) | 0.009 (0.060) | -0.031 (0.040) | -0.068+ (0.040) |
| N | 3480 | 3480 | 2010 | 2010 | 2010 | 3480 | 3480 |

Note: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. Standard errors in parentheses. *** p<0.001, ** p<0.01, * p<0.05, + p<0.10.

Table 10.
Estimated Differences between Redundant and Non-Redundant Math-Takers, 7 Specifications

Propensity Score: x
Covariates: x x x x

| Outcome / Comparison | S1 (Unmatched OLS) | S2 (Unmatched OLS) | S3 (PSM) | S4 (PSM) | S5 (PSM) | S6 (IPTW) | S7 (IPTW w/trim) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| *First Year Credits* | | | | | | | |
| Redundant: Model 5 | 0.328 (0.630) | -0.292 (0.600) | -0.737 (0.840) | -0.733 (0.830) | -0.922 (0.740) | 0.195 (0.820) | 0.174 (0.820) |
| N | 3460 | 3460 | 1010 | 1010 | 1010 | 3460 | 3460 |
| Redundant: Model 4 | 0.851 (0.530) | -0.397 (0.500) | 0.119 (0.700) | 0.115 (0.700) | -0.408 (0.620) | -0.205 (0.550) | -0.205 (0.550) |
| N | 3460 | 3460 | 1400 | 1400 | 1400 | 3460 | 3460 |
| Redundant: Model 1 | -1.240* (0.540) | -0.742 (0.530) | -0.899 (0.720) | -0.850 (0.700) | -1.029 (0.660) | -0.121 (0.650) | -0.198 (0.650) |
| N | 3460 | 3460 | 1310 | 1310 | 1310 | 3460 | 3460 |
| Misalignment | -0.167 (0.530) | -0.763 (0.530) | 3.592* (1.510) | 3.583* (1.470) | 0.284 (0.880) | 0.081 (0.730) | -0.347 (0.690) |
| N | 3460 | 3460 | 3280 | 3280 | 3280 | 3460 | 3460 |
| Remediation | -7.901*** (0.390) | -1.979*** (0.510) | -0.312 (1.150) | -0.179 (1.060) | -0.676 (0.870) | -0.867 (0.580) | -1.247* (0.580) |
| N | 3460 | 3460 | 1990 | 1990 | 1990 | 3460 | 3460 |
| *Second Year Credits* | | | | | | | |
| Redundant: Model 5 | -0.952 (0.620) | -1.701** (0.570) | -2.011* (0.880) | -2.008* (0.880) | -2.055* (0.810) | -1.737** (0.640) | -1.794** (0.630) |
| N | 3460 | 3460 | 1010 | 1010 | 1010 | 3460 | 3460 |
| Redundant: Model 4 | 0.300 (0.530) | -1.147* (0.480) | -0.636 (0.720) | -0.645 (0.700) | -1.131+ (0.620) | -1.172* (0.490) | -1.172* (0.490) |
| N | 3460 | 3460 | 1400 | 1400 | 1400 | 3460 | 3460 |
| Redundant: Model 1 | -2.470*** (0.560) | -1.980*** (0.510) | -2.364** (0.820) | -2.310** (0.790) | -2.430*** (0.720) | -1.654** (0.590) | -1.765** (0.580) |
| N | 3460 | 3460 | 1310 | 1310 | 1310 | 3460 | 3460 |
| Misalignment | -0.805 (0.580) | -2.248*** (0.590) | 1.179 (1.920) | 1.178 (1.910) | -2.264* (1.150) | -2.092* (0.830) | -2.385** (0.800) |
| N | 3460 | 3460 | 3280 | 3280 | 3280 | 3460 | 3460 |
| Remediation | -9.063*** (0.430) | -2.803*** (0.580) | -1.859 (1.210) | -1.723 (1.140) | -2.223* (1.050) | -1.298* (0.650) | -1.940** (0.630) |
| N | 3460 | 3460 | 1990 | 1990 | 1990 | 3460 | 3460 |

Note: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. Standard errors in parentheses. *** p<0.001, ** p<0.01, * p<0.05, + p<0.10.

Table 11. Estimated Differences between Redundant and Non-Redundant Math-Takers, 7 Specifications

Propensity Score: x
Covariates: x x x x

| Outcome / Comparison | S1 (Unmatched OLS) | S2 (Unmatched OLS) | S3 (PSM) | S4 (PSM) | S5 (PSM) | S6 (IPTW) | S7 (IPTW w/trim) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| *Total College Credits* | | | | | | | |
| Redundant: Model 5 | 1.969 (2.750) | -2.505 (2.510) | -3.443 (3.940) | -3.440 (3.930) | -3.393 (3.570) | -2.027 (2.560) | -2.190 (2.540) |
| N | 3490 | 3490 | 1020 | 1020 | 1020 | 3490 | 3490 |
| Redundant: Model 4 | 6.613** (2.330) | -0.392 (2.130) | 1.521 (3.160) | 1.533 (3.070) | -0.895 (2.690) | 0.273 (2.300) | 0.273 (2.300) |
| N | 3490 | 3490 | 1410 | 1410 | 1410 | 3490 | 3490 |
| Redundant: Model 1 | -4.907* (2.490) | -3.821+ (2.290) | -7.420* (3.550) | -7.389* (3.490) | -7.537* (3.130) | -2.379 (2.370) | -2.758 (2.350) |
| N | 3490 | 3490 | 1330 | 1330 | 1330 | 3490 | 3490 |
| Misalignment | 1.217 (2.440) | -6.616** (2.560) | 6.609 (7.200) | 6.612 (7.230) | -6.467 (4.000) | -7.026* (3.480) | -8.160* (3.220) |
| N | 3490 | 3490 | 3320 | 3320 | 3320 | 3490 | 3490 |
| Remediation | -32.145*** (1.930) | -0.545 (2.420) | 5.181 (5.060) | 5.394 (4.400) | 2.582 (3.760) | 3.784 (2.540) | 2.152 (2.600) |
| N | 3490 | 3490 | 2020 | 2020 | 2020 | 3490 | 3490 |
| *Earned College Degree* | | | | | | | |
| Redundant: Model 5 | -0.036 (0.020) | -0.060** (0.020) | -0.089** (0.030) | -0.089** (0.030) | -0.097*** (0.030) | -0.056* (0.020) | -0.057* (0.020) |
| N | 2920 | 2920 | 860 | 860 | 860 | 2920 | 2920 |
| Redundant: Model 4 | -0.014 (0.020) | -0.055** (0.020) | -0.024 (0.030) | -0.024 (0.030) | -0.040 (0.020) | -0.065** (0.020) | -0.065** (0.020) |
| N | 2920 | 2920 | 1200 | 1200 | 1200 | 2920 | 2920 |
| Redundant: Model 1 | -0.054** (0.020) | -0.045* (0.020) | -0.012 (0.030) | -0.014 (0.030) | -0.028 (0.030) | -0.041+ (0.020) | -0.045* (0.020) |
| N | 2920 | 2920 | 1110 | 1110 | 1110 | 2920 | 2920 |
| Misalignment | -0.024 (0.020) | -0.074** (0.020) | 0.045 (0.090) | 0.045 (0.090) | -0.053 (0.040) | -0.092** (0.030) | -0.082** (0.030) |
| N | 2920 | 2920 | 2770 | 2770 | 2770 | 2920 | 2920 |
| Remediation | -0.203*** (0.020) | -0.053* (0.020) | -0.027 (0.040) | -0.028 (0.040) | -0.036 (0.040) | -0.045+ (0.030) | -0.053* (0.020) |
| N | 2920 | 2920 | 1720 | 1720 | 1720 | 2920 | 2920 |

Notes: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. Standard errors in parentheses. *** p<0.001, ** p<0.01, * p<0.05, + p<0.10.

Table 12. Estimated Differences between Redundant and Non-Redundant Math-Takers in STEM Attainment, 3 PSM Specifications

Propensity Score: x x
Covariates: x x

| Outcome / Comparison | Baseline S3 (PSM) | Baseline S4 (PSM) | Baseline S5 (PSM) | w/STEM Aspiration S3 (PSM) | w/STEM Aspiration S4 (PSM) | w/STEM Aspiration S5 (PSM) |
| --- | --- | --- | --- | --- | --- | --- |
| *STEM Credits Earned* | | | | | | |
| Redundant: Model 5 | -4.316+ (2.620) | -4.311+ (2.610) | -4.488+ (2.390) | -2.513 (2.810) | -2.501 (2.790) | -2.200 (2.420) |
| N | 1020 | 1020 | 1020 | 970 | 970 | 970 |
| Redundant: Model 4 | -3.627 (2.290) | -3.634 (2.240) | -4.956* (1.990) | -2.879 (2.270) | -2.866 (2.250) | -3.723+ (1.960) |
| N | 1410 | 1410 | 1410 | 1360 | 1360 | 1360 |
| Redundant: Model 1 | -5.361* (2.410) | -5.346* (2.400) | -5.449* (2.180) | -3.300 (2.540) | -3.302 (2.500) | -3.241 (2.140) |
| N | 1330 | 1330 | 1330 | 1260 | 1260 | 1260 |
| Misalignment | 7.092 (5.740) | 7.096 (5.810) | -1.426 (2.610) | 7.341 (7.030) | 7.343 (7.040) | -2.237 (3.020) |
| N | 3320 | 3320 | 3320 | 3140 | 3140 | 3140 |
| Remediation | 3.139 (2.610) | 3.241 (2.280) | 2.287 (2.140) | 5.205* (2.530) | 5.314* (2.140) | 3.333+ (1.870) |
| N | 2020 | 2020 | 2020 | 1880 | 1880 | 1880 |
| *Earned STEM Degree* | | | | | | |
| Redundant: Model 5 | -0.102** (0.030) | -0.101** (0.030) | -0.094** (0.030) | -0.065* (0.030) | -0.065* (0.030) | -0.055+ (0.030) |
| N | 710 | 710 | 710 | 670 | 670 | 670 |
| Redundant: Model 4 | -0.105*** (0.030) | -0.104*** (0.030) | -0.102*** (0.030) | -0.050+ (0.030) | -0.050+ (0.030) | -0.064** (0.020) |
| N | 980 | 980 | 980 | 970 | 970 | 970 |
| Redundant: Model 1 | -0.061* (0.030) | -0.063* (0.030) | -0.071** (0.030) | -0.070* (0.030) | -0.072* (0.030) | -0.065* (0.030) |
| N | 880 | 880 | 880 | 820 | 820 | 820 |
| Misalignment | 0.055 (0.050) | 0.053 (0.060) | -0.016 (0.030) | 0.015 (0.070) | 0.012 (0.070) | -0.055+ (0.030) |
| N | 2280 | 2280 | 2280 | 2180 | 2180 | 2180 |
| Remediation | -0.005 (0.020) | 0.003 (0.020) | 0.003 (0.020) | -0.029 (0.030) | -0.020 (0.030) | -0.022 (0.030) |
| N | 1130 | 1130 | 1130 | 1070 | 1070 | 1070 |

Notes: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. Standard errors in parentheses. *** p<0.001, ** p<0.01, * p<0.05, + p<0.10.

Table 13.
Sensitivity of Model 5/Specification 5 Treatment Effect Estimates to Alternative Samples and Covariates

| Outcome | Main Results (M5/S5) | Warr. Rem. | One Year Math | No Stats | Two Year Only | Four Year Only | 5 Nearest Neigh. | w/STEM Occ @30 | Mean Sub. | No Math Self-Eff & HS Reqs. | With Imputed Zeroes |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Pass 1st Math w/B or Better | 0.102*** (0.030) | 0.062+ (0.03) | 0.139*** (0.04) | 0.122*** (0.03) | 0.168** (0.06) | 0.143*** (0.03) | 0.109*** (0.02) | 0.089** (0.03) | 0.147*** (0.02) | 0.137*** (0.03) | 0.109*** (0.02) |
| N | 1030 | 980 | 750 | 950 | 270 | 800 | 2090 | 970 | 1560 | 1300 | 2090 |
| Total College GPA | -0.081* (0.040) | -0.051 (0.05) | -0.022 (0.05) | -0.070 (0.05) | -0.032 (0.10) | -0.034 (0.04) | -0.049 (0.03) | -0.064 (0.04) | -0.069* (0.03) | -0.086* (0.04) | -0.039 (0.03) |
| N | 1020 | 980 | 750 | 950 | 270 | 800 | 2080 | 960 | 1560 | 1290 | 2090 |
| First Year Credits | -0.922 (0.740) | 0.457 (0.82) | -0.078 (0.84) | 0.040 (0.80) | -1.947 (1.74) | -0.457 (0.82) | -0.254 (0.62) | -0.975 (0.75) | -0.570 (0.60) | -0.392 (0.64) | -0.201 (0.62) |
| N | 1010 | 970 | 740 | 940 | 260 | 800 | 2070 | 960 | 1540 | 1290 | 2090 |
| Second Year Credits | -2.055* (0.810) | -0.708 (0.82) | -1.236 (0.92) | -1.567+ (0.83) | -3.964* (1.66) | -2.177** (0.84) | -1.790** (0.63) | -1.810* (0.75) | -1.674** (0.62) | -1.664* (0.68) | -1.746** (0.63) |
| N | 1010 | 970 | 740 | 940 | 260 | 800 | 2070 | 960 | 1540 | 1290 | 2090 |
| Total Credits Earned | -3.393 (3.570) | -0.741 (3.65) | -0.629 (4.12) | -5.713 (3.64) | -18.127* (7.49) | -1.735 (3.48) | -2.535 (2.81) | -5.656 (3.62) | -4.616+ (2.68) | -5.341+ (3.24) | -2.035 (2.81) |
| N | 1020 | 980 | 750 | 950 | 270 | 800 | 2090 | 970 | 1560 | 1300 | 2090 |
| Earned College Degree | -0.097*** (0.030) | -0.059+ (0.03) | -0.086* (0.03) | -0.091** (0.03) | -0.114+ (0.07) | -0.030 (0.03) | -0.071** (0.02) | -0.057+ (0.03) | -0.042+ (0.03) | -0.065* (0.03) | -0.022 (0.02) |
| N | 860 | 840 | 640 | 810 | 230 | 680 | 1750 | 830 | 1320 | 1100 | 2090 |
| STEM Credits Earned | -4.488+ (2.390) | -2.841 (2.41) | -2.691 (2.69) | -7.022** (2.54) | -10.447* (4.71) | -2.951 (2.64) | -4.376* (1.95) | -2.200 (2.42) | -7.626*** (1.96) | -4.486+ (2.30) | -4.186* (1.95) |
| N | 1020 | 980 | 750 | 950 | 270 | 800 | 2080 | 970 | 1560 | 1300 | 2090 |
| Earned STEM Degree | -0.094** (0.030) | -0.052+ (0.03) | -0.103** (0.04) | -0.061* (0.03) | -0.099 (0.06) | -0.052+ (0.03) | -0.092*** (0.02) | -0.055+ (0.03) | -0.092*** (0.02) | -0.086** (0.03) | -0.078*** (0.02) |
| N | 710 | 670 | 510 | 670 | 120 | 600 | 1480 | 670 | 1020 | 850 | 2090 |

Notes: All unweighted sample sizes were rounded to the nearest 10 per IES guidelines for restricted-use data. Standard errors in parentheses. *** p<0.001, ** p<0.01, * p<0.05, + p<0.10.

HOLISTIC MEASURES & MATH PLACEMENT 80

Mathematics Course Placement Using Holistic Measures: Possibilities for Community College Students

Federick J. Ngo, W. Edward Chi, & Elizabeth S. Park

What does it mean to be ready for college? And how do colleges know? In the community college setting, the answers to these questions are usually informed by a placement test that students take when they begin or restart their educational careers. Over 90 percent of all community colleges in the country use a placement test to determine students' readiness for college-level coursework (Fields & Parsad, 2012). At the same time, nearly 60 percent of all incoming community college students in the nation enroll in a remedial course, most likely because they are deemed under-prepared for college-level coursework (Bailey, Jeong, & Cho, 2010; NCPPHE & SREB, 2010).

Yet placement tests are admittedly imperfect instruments. Recent research has estimated that nearly 25 percent of students may be mis-placed into their math courses by commonly-used placement tests (Mattern & Packman, 2009; Scott-Clayton, Crosta, & Belfield, 2014), with potentially serious consequences for educational attainment (Melguizo, Bos, Ngo, Mills, & Prather, 2016). Researchers examining these issues have found that using additional information, such as that available in high school transcripts and math diagnostics, could improve placement accuracy and reduce the rate of placement errors (Ngo & Kwon, 2015; Ngo & Melguizo, 2016; Scott-Clayton et al., 2014).
In tandem with these findings, a number of states (e.g., Florida, North Carolina, Texas) have enacted legislation promoting the incorporation of multiple indicators of students' academic readiness into community college placement policies (Burdman, 2012). California, the setting for the study, has mandated the use of these "multiple measures" since the early 1990s, but there is wide variation in, and few evaluations of, these practices (Melguizo, Kosiewicz, Prather, & Bos, 2014; Perry, Bahr, Rosin, & Woodward, 2010). What remains unknown is whether using a holistic placement approach that includes non-cognitive measures can improve placement accuracy in community colleges.

Non-cognitive measures are those not specifically related to academic content knowledge or skills, such as, but not limited to, students' college plans (e.g., use of time) and their beliefs about the importance of math or college (Sedlacek, 2004). While indicators of some of these constructs are implicitly, and sometimes explicitly, used in the selection and sorting processes of four-year institutions (e.g., via college admissions essays and letters of recommendation), they remain largely untested and unused in the community college setting. Further, non-cognitive attributes have been shown to be predictive of students' postsecondary success (Sedlacek, 2004; Melguizo, 2010) but have rarely been examined in the context of community colleges, which serve large numbers of students of color, low-income students, and first-generation college students (Horn, Nevill, & Griffith, 2006). We therefore ask two research questions:

1) What possibilities are there for using a more holistic placement approach that includes non-cognitive measures to better identify college readiness among community college students?

2) Do non-cognitive measures improve placement accuracy in developmental math?
We focus on the use of indicators of non-cognitive constructs for course placement in a large urban community college district (LUCCD) in California, and we conduct two sets of analyses to answer these questions. The first is a predictive exercise that examines possibilities for using indicators of non-cognitive constructs in placement decisions. To do so, we use methods outlined by Scott-Clayton et al. (2014) to estimate severe placement errors, but capitalize on the availability of non-cognitive questionnaire data that LUCCD colleges simultaneously collected during the time of placement testing. We calculate placement error rates for all colleges under existing placement policies (e.g., placement test scores and other academic background measures) and compare these to estimates of placement error when indicators of non-cognitive measures are included in the prediction equation. We emphasize here that the non-cognitive questionnaire items are proxy indicators of students' non-cognitive attributes and not necessarily measures validated in prior literature.

In the second set of analyses, we examine actual placement algorithms in two colleges that factor these non-cognitive indicators, such as motivation and college plans, into placement decisions. We examine the outcomes of students who were able to take a higher-level course due to additional points they earned based on non-cognitive indicators, comparing them to peers placed in the same level but who scored higher on placement tests. Our study adds insight to the broader question of whether a more holistic approach to mathematics course placement that includes indicators of non-cognitive attributes can be useful within the open-access setting of community colleges.
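The placement-error logic sketched above can be made concrete. The toy script below works in the spirit of Scott-Clayton et al. (2014): fit a model predicting success in the college-level course, then flag remediated students who were likely to succeed (severe under-placement) and college-level students who were unlikely to succeed (severe over-placement). All data are synthetic, and the variable names, cutoff, and probability thresholds are illustrative assumptions, not the study's actual instruments or rules; the study also estimates success models more carefully than this simplified full-sample fit.

```python
# Hedged sketch: severe placement error rates under a test-only vs. a
# "holistic" prediction model. Synthetic data; illustrative thresholds.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
test_score = rng.normal(50, 10, n)   # hypothetical placement test score
hs_gpa = rng.normal(2.8, 0.5, n)     # hypothetical high school GPA
noncog = rng.normal(0, 1, n)         # hypothetical non-cognitive index
# Synthetic "passes college-level math" outcome
latent = -6 + 0.08 * test_score + 0.9 * hs_gpa + 0.4 * noncog
passed = rng.binomial(1, 1 / (1 + np.exp(-latent))).astype(float)

def fit_logit(X, y, iters=25):
    """Plain Newton-Raphson logistic regression (intercept added)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    b = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X1 @ b))
        w = p * (1 - p)
        b += np.linalg.solve(X1.T @ (X1 * w[:, None]), X1.T @ (y - p))
    return X1, b

def severe_error_rate(X, y, scores, cutoff=50.0, p_hi=0.7, p_lo=0.3):
    """Share of students flagged as severely under- or over-placed."""
    X1, b = fit_logit(X, y)
    p = 1 / (1 + np.exp(-X1 @ b))
    remediated = scores < cutoff          # placed below college level
    under = remediated & (p > p_hi)       # likely to succeed, held back
    over = ~remediated & (p < p_lo)       # unlikely to succeed, placed up
    return (under.sum() + over.sum()) / len(y)

err_test_only = severe_error_rate(test_score.reshape(-1, 1), passed, test_score)
err_holistic = severe_error_rate(
    np.column_stack([test_score, hs_gpa, noncog]), passed, test_score)
print(f"severe error rate, test only: {err_test_only:.3f}; "
      f"with GPA + non-cognitive index: {err_holistic:.3f}")
```

The richer model changes which students look mis-placed; comparing the two error rates is the flavor of the comparison the predictive exercise performs.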
In contrast to four-year colleges with selective admissions, selection and sorting in the community college setting primarily happens during remedial screening, where students who are deemed not college-ready may be referred to remedial coursework (Hughes & Scott-Clayton, 2011). Since this typically hinges on the result of a placement test, incorporating non-cognitive information into the screening process may provide opportunity and access for students who do not appear to be academically prepared based on their placement test results alone. Indeed, our analyses first show that a substantial portion of students, as many as one quarter, may be considered mis-placed under current test-based placement practices, and that high school background and non-cognitive indicators may offer an improvement over status quo practices. When we test this hypothesis in two colleges that actually factor non-cognitive information into placement rules, we find that this holistic approach can increase access to higher-level math courses without compromising the likelihood of success in those courses.

The paper proceeds as follows. We first discuss the role of placement testing and selection processes in community colleges. We highlight research on non-cognitive measures, and then draw upon expanding conceptions of college readiness and validation theory to frame the study. We then describe the LUCCD and our two analytical approaches, one that investigates possibilities for using non-cognitive indicators and one that evaluates existing placement practices already using such indicators. We present the findings and discuss how our work can add insight to current reforms in assessment and placement in community colleges.
Background

Placement Tests and Academic Measures of College Readiness

Given that community colleges are open-access institutions serving a diverse body of students, they need some means of identifying academic preparedness and directing students towards appropriate coursework. Placement tests are commonly used in community colleges for this purpose, and two tests, the College Board's ACCUPLACER and ACT Inc.'s COMPASS, have dominated the market (Conley, 2010). These tests are multiple-choice, adaptive, and computer-administered, and are used to assess subjects like math, English, and reading. Regarded as a cost-efficient way to assess students' academic abilities, the placement test score is the primary measure that determines where students start in their educational trajectory (Hughes & Scott-Clayton, 2011).

Despite the near ubiquity of placement testing, studies investigating the predictive validity of placement tests have found that correlations between test scores and student achievement are weaker than those between student background variables and achievement (Armstrong, 2000). In fact, Jenkins, Jaggars, and Roksa (2009), examining data from Virginia community colleges, found no significant relationship between reading and writing placement tests and whether students passed gatekeeper English courses, though they did find a relationship between math placement tests and whether students passed gatekeeper math courses. These findings, along with concerns about the accuracy of placement tests, have fostered growing interest in using multiple measures and a more holistic approach to improve placement decisions (Burdman, 2012; Smith, 2016). Measures such as the level of prior math courses taken and high school GPA are known to be strong predictors of college course completion and success and can be used to identify readiness for college-level work (e.g., Armstrong, 2000; Noble & Sawyer, 2004).
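The predictive-validity pattern described above, where background measures such as high school GPA correlate more strongly with course success than a single test score does, can be illustrated with a toy simulation. Everything here is synthetic: the effect sizes and variable names are invented for illustration, not estimates from the literature.

```python
# Toy illustration: a noisy one-shot test vs. cumulative GPA as
# predictors of course success. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
hs_gpa = rng.normal(2.8, 0.5, n)
# A single test sitting adds substantial noise around true preparation
test_score = 0.3 * (hs_gpa - 2.8) + rng.normal(0, 1, n)
# Success depends more on sustained performance than on the test
success = (0.8 * hs_gpa + 0.2 * test_score + rng.normal(0, 0.5, n)) > 2.3

r_test = np.corrcoef(test_score, success.astype(float))[0, 1]
r_gpa = np.corrcoef(hs_gpa, success.astype(float))[0, 1]
print(f"r(test, success) = {r_test:.2f}; r(HS GPA, success) = {r_gpa:.2f}")
```

Because the test is a noisy snapshot while GPA aggregates many observations of the same underlying preparation, the GPA correlation comes out larger, which is the qualitative pattern reported by Armstrong (2000) and others.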
Adding to this evidence, Ngo and Kwon (2015) found that community college students who were placed using academic background measures (e.g., prior math and GPA) in addition to test scores performed no differently from their peers who earned higher test scores. This study and others (e.g., Fong & Melguizo, 2016; Marwick, 2004) suggest that using multiple measures may increase access to higher-level math without compromising students' likelihood of success in those courses.

Research on Non-Cognitive Measures

One understudied question is whether non-cognitive measures can also improve placement decisions. The logic for incorporating these measures stems from research in educational psychology, which demonstrates that an array of non-cognitive attributes beyond cognitive skills are predictive of college success and future outcomes (Duckworth, Peterson, Matthews, & Kelly, 2007; Duckworth & Yeager, 2015; Noonan, Sedlacek, & Veerasamy, 2005; Porcea, Allen, Robbins, & Phelps, 2010; Robbins et al., 2006). In lieu of providing a comprehensive review of all non-cognitive factors and measures associated with college student success, we focus only on those we believe to be related to the measures used by LUCCD colleges to assess and place students in developmental math sequences. These include use of time, motivation, and social support. To our knowledge, the measures used by LUCCD are not directly tied to particular constructs or scales in the literature. We therefore discuss the general literature on each of these non-cognitive areas.

One such area is students' use of time.
Researchers examining college students' time use have found that, in certain populations, working while in college predicts weaker academic outcomes (Ehrenberg & Sherman, 1987; Pascarella et al., 1998; Stinebrickner & Stinebrickner, 2003) and that time spent studying predicts improved academic outcomes (Michaels & Miethe, 1989; Rau & Durand, 2000; Stinebrickner & Stinebrickner, 2004; 2008). These studies suggest that employment while attending school is associated with a decreased likelihood of persistence and lower academic outcomes. Measuring students' intended use of time may therefore be an important consideration in the remedial screening process, as community college students in particular are more likely to work while attending college (Horn et al., 2006).

Motivation is another well-studied non-cognitive construct that is predictive of college success (Pintrich & Shunk, 2002; Robbins et al., 2006). Theories of motivation can explain an individual's choices, effort, and persistence in various tasks (Pintrich, 2003; Covington, 2000). For example, the concept of expectancy value within motivation research suggests that individuals make certain decisions or enact certain behaviors because they are motivated by the expected results of those behaviors (Eccles & Wigfield, 2002). Relatedly, motivation may stem from utility or task value, which refers to how and whether individuals perceive tasks as having positive value or utility because they facilitate important future goals (Husman & Lens, 1999). These values can therefore directly influence performance, persistence, and task choice. A student's motivation, as understood through these values, may encourage persistence in the face of challenging or boring academic learning contexts and therefore be predictive of success in those contexts (Miller & Brickman, 2004).

One's sense of social support may also influence college outcomes (Noonan et al., 2005).
This may be related to the concept of mattering, defined as the feeling that one is personally important to someone else (Cooper, 1997; Gossett et al., 1996; Marshall, 2001; Rosenberg & McCullough, 1981; Schlossberg, 1989; Tovar, Simon, & Lee, 2009). In studies of college students, a stronger sense of mattering is linked to pro-academic behaviors and affects (Dixon & Kurpius, 2008; Dixon Rayle & Chung, 2007; France & Finney, 2010). In another study, Dennis, Phinney, and Chuateco (2005) assessed the extent to which motivation to attend college and the availability of social support from family and peers influenced academic success in ethnic minority college students. They included survey items such as how supportive family and peers were of students' college aspirations, and students' beliefs about attending college. The researchers found that personal interest, desire to attain a rewarding career, and intellectual curiosity were all related to successful adjustment in college. Finally, Sedlacek (2004) demonstrated that non-cognitive measures of adjustment, motivation, and leadership are predictors of postsecondary success, particularly for under-represented minority students.

Use of Non-Cognitive Measures

While studies find positive associations between non-cognitive measures and college outcomes, whether it is beneficial to use non-cognitive measures, or indicators of them, to inform selection processes remains an outstanding question. Some evidence from four-year institutions has suggested that incorporating non-cognitive measures into college admissions can be favorable (Noonan et al., 2005; Sedlacek, 2004; Sternberg, Gabora, & Bonney, 2012).
However, despite calls in the literature for the use of holistic assessments of student readiness for college coursework (Boylan, 2009; Hughes & Scott-Clayton, 2011), non-cognitive measures are rarely used to make course placement decisions in community colleges (Gerlaugh, Thompson, Boylan, & Davis, 2007). We found just one completed study examining placement using non-cognitive measures in the community college setting. The Educational Testing Service (ETS) conducted a study to investigate the usefulness of SuccessNavigator, a commercial product that incorporates psychosocial/non-cognitive measures, including personality, motivation, study skills, intrapersonal and interpersonal skills, and other factors beyond cognitive ability (Markle et al., 2013). Examining a set of community colleges, Rikoon et al. (2014) compared mathematics course passing rates between students placed in math courses using standard institutional practice (i.e., the COMPASS placement test) and those placed using the ETS SuccessNavigator mathematics course placement index in conjunction with test scores. They found no statistically significant differences in course passing rates between the two groups. Since students placed using the ETS instrument and test scores were just as successful as their peers in a higher-level course, this suggests that non-cognitive measures may be useful for course placement.

The goal of the present study is to complement this work by examining non-cognitive measure use in another community college setting. We also analyze survey data rather than data gathered from a proprietary instrument, which may be a more viable option for resource-strapped colleges. We frame the study using expanding conceptions of college readiness and modern approaches to validation, which we discuss next.
Conceptual Framework

College Readiness

The policy interest in using both high school background measures and non-cognitive measures is in accordance with expanding notions of college readiness. Historically, college readiness has been measured by students' academic ability and cognitive skills, but the construct has expanded to include non-cognitive attributes and college knowledge that are thought to be essential for success in college (Almeida, 2015; Duncheon, 2015; Roderick, Nagaoka, & Coca, 2009; Sedlacek, 2004). The underlying logic is that what determines whether students will be successful in college is broader than cognitive skill or academic background. The research described in the previous section demonstrates that non-cognitive attributes play an important role in explaining persistence and success in college. In line with these expanding notions, we test whether broader concepts of college readiness can be imported into assessment and placement processes in community colleges. Could selection into developmental courses be improved by expanding concepts of college readiness beyond the academic and cognitive attributes of students?

Validation

Modern validation theory makes it possible to identify and evaluate the usefulness of non-cognitive measures of college readiness. In modern validation theory, a validation argument considers the interpretation, purposes, and uses of a measure in addition to its predictive properties; it emphasizes the examination of outcomes that result from the uses of the measure (Kane, 2006). While past conceptions of validation relied mainly on determining the correlation between a measure, such as test scores, and college outcomes, this theory suggests that the intended use and purpose should guide how the analysis is conducted to determine predictive properties (Kane, 2001).
Therefore, the validity of a placement measure is based on the actual or proposed decisions made using the measure, and not simply the correlation between the measure and subsequent outcomes. In the current context, the measures used to make course placement decisions in developmental math would therefore be evaluated in terms of the relevant student outcomes: placement and success in the highest-level course possible (Kane, 2006), and the frequency with which these accurate placements occur (Sawyer, 1996, 2007; Scott-Clayton, 2012). If a measure places students into higher-level courses and they succeed in those courses, then using the measure improved placement accuracy. If the measure led to placements resulting in worse outcomes, then using it did not improve accuracy.

In order to validate non-cognitive measures in the context of community colleges, there must be a context where measures of non-cognitive attributes, or in this case indicators of them, are actually used to make placement decisions. Our study takes advantage of the fact that some colleges factor what we argue are indicators of non-cognitive attributes into their placement process. This enables us to assess the validity of these placement measures (Sawyer, 1996, 2007; Kane, 2006). In a related analysis, we also predict the potential usefulness of these indicators for identifying and avoiding placement error in colleges that rely mainly on placement tests.

Setting & Data

The setting for the study is a Large Urban Community College District (LUCCD) in California that enrolls over 100,000 students each semester.4 As open-access institutions, the nine colleges in the district serve a widely diverse body of students, with more than a quarter of students over 35 years of age, and over 40 percent indicating that their native language is not English.
Close to 90 percent of students report having completed a high school level education.5 This student population differs from the national community college student population: about two-thirds of all students in these California colleges identify as African-American or Latina/o, whereas the majority of students who enter a community college in the U.S. are White, and just over one-third identify as African-American or Latina/o (NCES, 2014).

Each college has considerable autonomy over the choice and use of placement tests. The colleges are also required by California law to utilize some combination of "multiple measures" to inform placement decisions (Perry et al., 2010). In the LUCCD, colleges chose to consider items from self-reported background questionnaires as multiple measures. Table 1 shows the placement tests and additional measures used to make course assignments in developmental math in each college. Table 2 shows the types of self-reported background information from the Educational Background Questionnaire (EBQ) also collected at the time of assessment.

[Insert Tables 1 and 2 about here]

The data for the study consist of all student-level assessment and enrollment records for students assessed between 2005 and 2012, tracked through fall 2013. We focus on the sample of students who took math placement assessments, had not already earned a college degree, were not concurrently enrolled in high school, and were under the age of 65. Since we are interested in non-cognitive measures, we focus on six colleges that collect indicators of this information: B, C, D, F, G, and J.6 The available data enable us to examine important short- and long-term student outcomes, such as passing the math course in which students are placed and earning 30 degree-applicable units. Table 3 presents a demographic profile of each college included in our analyses.
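The sample restrictions above can be expressed as a simple record filter. A minimal sketch follows; the field names are hypothetical stand-ins, since the actual LUCCD record layout is not public.

```python
# Sketch of the paper's sample restrictions; field names are hypothetical.
def in_analysis_sample(student):
    """True if a record meets the inclusion criteria described in the text."""
    return (
        student["took_math_assessment"]        # took a math placement test
        and not student["has_college_degree"]  # no prior college degree
        and not student["concurrent_high_school"]  # not still in high school
        and student["age"] < 65                # under the age of 65
    )

students = [
    {"took_math_assessment": True, "has_college_degree": False,
     "concurrent_high_school": False, "age": 24},   # kept
    {"took_math_assessment": True, "has_college_degree": True,
     "concurrent_high_school": False, "age": 30},   # excluded: prior degree
    {"took_math_assessment": False, "has_college_degree": False,
     "concurrent_high_school": False, "age": 19},   # excluded: no math test
]

sample = [s for s in students if in_analysis_sample(s)]
```

In practice this filter would run over the full district assessment file before any of the placement-error analyses below.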
The table shows that each college is unique in its student composition. However, the overall pattern is that Latinas/os and African Americans are the two largest racial groups served by all the colleges in the study. In addition, the table delineates how students were placed in each level of the developmental math sequence by college. While the placement distribution varies across colleges, the majority of students were placed in the three lowest levels (elementary algebra, pre-algebra, or arithmetic), and few students placed into intermediate algebra or above.

[Insert Table 3 about here]

Methods

Which Students May Have Been Placed in Error?

We examined the possibility of using non-cognitive measures for placement in developmental math by capitalizing on indicators of this information collected via the EBQ in each college. Our first methodological approach enabled us to assess whether utilizing indicators of non-cognitive attributes can help to identify and thus avoid placement errors. Based on our analysis of each college's EBQs, three areas of non-cognitive attributes were common across sets of colleges: motivation, college plans (e.g., use of time), and social supports (see Table 2).

The analytical approach we used to understand how non-cognitive measures can identify placement errors follows the sequence described in Scott-Clayton et al. (2014). The procedure estimates the overall proportions of students placed successfully and students placed in error for each level of math in a developmental sequence. Students placed successfully are those who were either placed into a math class they were predicted to pass or placed one level below a math class they were predicted to fail.
Students placed in error are those who were either over-placed (placed into a math class that they were predicted to fail) or under-placed (placed one level below a math class that they were predicted to pass). We identified placement error for every level of the developmental math sequence, from arithmetic to college-level math. We also performed the procedure using different combinations of cognitive and non-cognitive indicators (described further below). In the case of college-level math (CM), the respective logit models are:

logit[Fail CM]_i = α_0 + β_1 Cog_i + β_2 NonCog_i + β_3 Test_i + γ′X_i + ε_i   (1)

logit[Pass CM w/ B or better]_i = α_0 + β_1 Cog_i + β_2 NonCog_i + β_3 Test_i + γ′X_i + ε_i   (2)

Here Test_i is the test score, Cog_i and NonCog_i are the proposed additional placement measures, and X_i is a vector of student-level demographic characteristics, including age, race, gender, language, and residence status, added as controls for factors that may be associated with college success. The obtained coefficients are extrapolated to students placed in the course below (e.g., intermediate algebra is one level below college math) to predict each student's probabilities of success and failure in intermediate algebra. We used these probabilities to identify students placed successfully and students placed in error at each level. Specifically, we identified severe placement errors, defined by two criteria: 1) students predicted to fail the upper-level course they were placed into, or 2) students predicted to pass the upper-level course with a B or better who were placed into a course one level below.7 We estimated the proportion of severe placement errors at each level of math in the developmental sequence for each college.
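The prediction and classification steps behind equations (1) and (2) can be sketched as follows. This is a minimal illustration, not the authors' code: the logit coefficients are made-up placeholders for the fitted models, while the 50% probability thresholds follow note 7.

```python
import math

def logistic(z):
    """Inverse logit link: maps a linear index to a probability."""
    return 1.0 / (1.0 + math.exp(-z))

def predicted_probs(cog, noncog, test, b_fail, b_pass):
    """Predicted P(fail upper course) and P(pass upper course with B or better)."""
    z_fail = b_fail[0] + b_fail[1] * cog + b_fail[2] * noncog + b_fail[3] * test
    z_pass = b_pass[0] + b_pass[1] * cog + b_pass[2] * noncog + b_pass[3] * test
    return logistic(z_fail), logistic(z_pass)

def classify(placed_upper, p_fail, p_pass_b):
    """Label severe placement errors using the 50% probability rule."""
    if placed_upper and p_fail >= 0.5:
        return "over-placed"      # predicted to fail the course received
    if not placed_upper and p_pass_b >= 0.5:
        return "under-placed"     # predicted to earn a B or better one level up
    return "successful"

# Hypothetical coefficients (intercept, Cog, NonCog, Test) for the two logits.
b_fail = (2.0, -0.5, -0.4, -0.06)
b_pass = (-3.0, 0.5, 0.4, 0.05)

# A student placed one level below, with strong background indicators.
p_fail, p_pass_b = predicted_probs(cog=4.0, noncog=3.0, test=60.0,
                                   b_fail=b_fail, b_pass=b_pass)
label = classify(placed_upper=False, p_fail=p_fail, p_pass_b=p_pass_b)
```

In the actual analysis the two logits are estimated on students observed in the upper course and then extrapolated to students placed one level below, as the text describes.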
Since we are interested in comparing various placement scenarios, we calculated the percent of severe placement errors using different combinations of measures: 1) with placement test scores alone, Test_i;8 2) with additional academic background measures (e.g., HS GPA or prior math experience), Cog_i; and 3) with non-cognitive indicators obtained from colleges' EBQs (e.g., motivation or college plans), NonCog_i. This analysis enabled us to determine whether high school background and non-cognitive measures, a more holistic profile, could improve upon placement results based on cognitive measures (placement tests) alone.9 The calculated rate of severe placement errors for each set of alternative criteria is the proportion of students who can be considered as having been placed in error by status quo practices, and therefore the set of students for whom placement errors could be avoided. The rate of severe placement errors is not a measure of the placement accuracy of each set of measures. Instead, it estimates the amount of error in existing placement policy.

Do Non-Cognitive Measures Actually Improve Placement Accuracy?

We then took advantage of the placement decision rules in two colleges (Colleges F and J) where proxy indicators of non-cognitive constructs were actually factored into placement algorithms. College F awarded up to two additional points to students based on their college plans (units enrolled and expected employment), how important a college education was to them, and how long they had been out of school.10 College F used the COMPASS, with 100 test points possible. College J awarded one additional point to students who indicated that math was important, and four other possible points for cognitive measures (high school math background and receipt of a diploma or GED). College J used the ACCUPLACER, with 120 points possible.
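The scenario comparison amounts to tallying severe-error rates under each set of measures. A small sketch of that tally follows; the per-student classifications here are invented for illustration, standing in for the model output produced under each covariate set.

```python
# Sketch of comparing severe-error rates across the three placement scenarios.
def severe_error_rate(classifications):
    """Share of students labeled as severe placement errors."""
    errors = sum(c in ("over-placed", "under-placed") for c in classifications)
    return errors / len(classifications)

# Hypothetical classifications of the same five students under each scenario;
# richer criteria can flag more of the existing placements as avoidable errors.
scenarios = {
    "test_only": ["successful", "successful", "over-placed",
                  "successful", "successful"],
    "plus_hs_background": ["successful", "under-placed", "over-placed",
                           "successful", "successful"],
    "plus_noncognitive": ["under-placed", "under-placed", "over-placed",
                          "successful", "successful"],
}

rates = {name: severe_error_rate(cls) for name, cls in scenarios.items()}
```

As the text emphasizes, a higher rate under an alternative scenario flags more of the status quo placements as potentially avoidable errors; it is not a measure of that scenario's own accuracy.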
Since we could identify student responses on each college's EBQ along with raw placement test scores, we could examine the success outcomes of students whose final placements were due to the additional points earned from these non-cognitive indicators. Although the additional points may seem nominal relative to the placement tests, they did provide a benefit to some students: about 1.8 percent and 26.4 percent11 of students in Colleges F and J, respectively, were placed in a higher-level course based on their combined score. Modern validation theory would suggest that measures used for placement in developmental courses are valid if they increase access to higher-level math courses without compromising the likelihood of success in those courses (Kane, 2006; Sawyer, 1996). Therefore, we compared the outcomes of students whose resulting placements were "boosted up" due to non-cognitive indicators with the outcomes of students in the same level of math whose placements were the result of their placement test scores.

Specified as a linear regression model, the main variable of interest (Boost_i) is a dummy indicator that equals one for students whose responses to the non-cognitive-oriented questions on the EBQ resulted in a "multiple measure boost" to a higher-level course. We tested the relationship between earning this multiple measure boost and the outcome of interest (y_i), passing the course in which the student was placed. We also examined the relationship between receiving a placement boost and the outcome of earning 30 degree-applicable credits, which is half the units required for an associate's degree.
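The logic of flagging "boosted" students can be sketched as follows: a student is boosted when multiple-measure points push the combined score across a placement cutoff that the raw test score alone would not have cleared. The cutoff values and level names below are illustrative, not the colleges' actual rules.

```python
# Illustrative cutoffs: minimum combined score required for each level,
# listed from highest level to lowest. Not Colleges F's or J's actual values.
CUTOFFS = [
    (46, "college-level math"),
    (31, "intermediate algebra"),
    (16, "elementary algebra"),
    (0, "pre-algebra or arithmetic"),
]

def place(score):
    """Return the highest level whose cutoff the score meets."""
    for cutoff, level in CUTOFFS:
        if score >= cutoff:
            return level
    return CUTOFFS[-1][1]  # scores below zero fall to the lowest level

def is_boosted(raw_score, mm_points):
    """True when multiple-measure points change the placement outcome."""
    return place(raw_score + mm_points) != place(raw_score)
```

For example, a raw score of 45 with 2 multiple-measure points crosses the illustrative college-level cutoff of 46, while a raw score of 40 with the same points stays in the same level and is not counted as boosted.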
The linear probability model is:

y_i = α_0 + β_1 Boost_i + β_2 MMPoints_i + β_3 Test_i + γ′X_i + ε_i   (3)

β_1 captures the difference in average outcomes between students who were assigned to the course due to additional points from non-cognitive indicators and students who had higher raw test scores. We include MMPoints_i, the number of multiple measure points,12 Test_i, the raw placement test score (normalized), and all X_i covariates as before. We also estimated models controlling for math level and cohort. For each model we compared boosted students to other students just above the cutoff, as well as to all students in the same math level.

Unlike College F, which used only non-cognitive indicators to augment its placement algorithm, College J awarded points for both academic and non-cognitive indicators. Thus, for College J we could differentiate students whose boost was solely from academic measures from those who obtained points for both academic and non-cognitive indicators.13 We identified the differences between these students by including an interaction term with a dummy variable NC_i. The variable NC_i is an indicator of the student's motivation and equals one when students marked that math was very important to their goals. It equals zero for students who responded not important or somewhat important. The model with this interaction is:

y_i = α_0 + β_1 Boost_i + β_2 NC_i + β_3 Boost_i × NC_i + β_4 MMPoints_i + β_5 Test_i + γ′X_i + ε_i   (4)

The interaction term enabled us to determine whether there were differential effects for those who earned points for both cognitive and non-cognitive indicators relative to those whose boost was from cognitive measures alone.
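A model of the form in equation (4) can be sketched as an ordinary least squares fit. The data-generating process, sample size, and coefficient values below are entirely synthetic (the true interaction effect is set to zero), and the real models include the full demographic vector X along with level and cohort controls.

```python
import numpy as np

# Synthetic data mimicking the structure of equation (4); all values invented.
rng = np.random.default_rng(0)
n = 500
boost = rng.integers(0, 2, n)        # multiple-measure boost indicator
nc = rng.integers(0, 2, n)           # "math is very important" indicator
mm_points = rng.integers(0, 5, n)    # total multiple-measure points
test = rng.normal(0, 1, n)           # normalized raw placement score

# Outcome: pass the placed course. No true boost or interaction effect.
y = (0.5 + 0.0 * boost + 0.05 * nc + 0.1 * test
     + rng.normal(0, 0.3, n) > 0.5).astype(float)

# Design matrix: intercept, Boost, NC, Boost*NC, MMPoints, Test.
X = np.column_stack([np.ones(n), boost, nc, boost * nc, mm_points, test])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[3] is the Boost-by-motivation interaction; because the synthetic data
# contain no true interaction, its estimate should be close to zero, which
# parallels the null interaction the paper reports for College J.
```

A linear probability model is used here, as in the paper; the same design matrix could equally be passed to a logit estimator.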
Findings

Possibilities for Non-Cognitive Measures

We first investigated possibilities for the use of non-cognitive indicators either in addition to or in place of common placement tests. We did so by estimating the percent of severe placement errors at each placement cutoff within each college. As described above, the procedure involves using different sets of alternative placement criteria (e.g., placement tests, high school background measures, and non-cognitive indicators) to model successful placement of students, defined as: students predicted to pass the math course of interest with a B or better and placed into that math course, and students predicted to fail the math course of interest and placed into one level below that math course. Severe placement errors are those where students predicted to fail the math course of interest were placed into that math course, and those where students predicted to pass the math course of interest with a B or better were placed into one level below that math course.14

The full results are presented in Tables 4 and 5. Table 4 shows the results of the analysis when the entire sample of students within each pair of math levels is included (e.g., all students in intermediate algebra (IA) or elementary algebra (EA)), and Table 5 shows the results for students within a five-point bandwidth around the cutoff. The first column of each table shows the estimated percent of severe placement errors under the status quo practice (which is essentially placement test score alone). There is a range of estimates across colleges, from nearly zero error to over thirty percent of students identified as placed in error. In comparing error rates across placement scenarios, a higher rate of errors does not suggest the scenario's placement criteria would produce more placement errors.
Rather, a higher estimate suggests the alternative placement criteria identify a larger fraction of placement errors in existing practice that can possibly be avoided.

We also decomposed the total estimated placement error into its two respective parts: the percent of students considered under-placed and the percent considered over-placed. This decomposition also reveals variation across levels and colleges, which is expected given variation in the placement cutoff set at each college and institutional differences between colleges. Nevertheless, a common pattern emerges. Comparing columns 1-4 with columns 5-7 reveals that the proportion of placement error identified in existing practice rises when high school background and non-cognitive indicators are utilized to model student success. This is the finding in nearly all colleges and levels, and it suggests that more students can be considered as having been placed in error by existing practice when these alternative criteria, rather than test scores, are used to model the likelihood that students would pass or fail the higher-level course. Patterns are consistent across the full and narrow bandwidth specifications, though the severe placement error estimates are larger when all students are considered. Moving forward, we focus on the narrower bandwidth of five points right around each placement cutoff to better understand sorting and accuracy around each cutoff.

Given the similarity in patterns across colleges and levels, we also calculated the average severe errors for each column to better summarize these trends. Figures 1 and 2 visually present these average results. The first bar of Figure 1 shows the severe placement error of existing placement practices, which typically consider a student's placement test score and additional points from multiple measures (academic) when making placement decisions.
On average, the estimate is that 8.7 percent of students within a five-point bandwidth around the cutoff can be considered severe placement errors (11 percent for the full bandwidth). Figure 2 decomposes errors into the proportion of under- and over-placement, and these results indicate that more students appear to be under-placed when high school background and non-cognitive indicators are used to examine placement errors.15 Students who are identified as under-placed presumably could have passed the higher-level course.

[Insert Figure 1 and Figure 2 about here]

The primary goal of the exercise is to understand whether alternative measures identify more or less placement error than the status quo practice. Overall, the results indicate that substantially more students can be considered as having been placed in error when high school background and non-cognitive indicators, rather than placement test scores, are used to identify placement errors. For example, if the estimating equation for success/failure in the upper-level course consists of high school background measures (e.g., HS GPA, prior math, etc.) or non-cognitive measures (e.g., motivation), then it appears that 22-25 percent of students in the full bandwidth and 17-21 percent in the five-point bandwidth have been placed in error by current practices. Since the students who can be considered errors are those who could have passed the upper-level course if placed there or who would have fared better in the lower-level course, it follows that these alternative schemes may offer an improvement over current practices. The answer to whether non-cognitive indicators of motivation, social support, and college plans might improve placement accuracy is less clear.
We see that the use of high school background (HSB) and non-cognitive indicators together identified the largest amount of error, suggesting that these measures may be suitable alternatives to current practices. Comparing the HSB-only results with those for HSB and non-cognitive indicators together, we see that non-cognitive indicators may make marginal improvements, but our supplementary analysis also revealed that the HSB measures were generally positively correlated with the non-cognitive indicators.

Non-Cognitive Measures in Use

We therefore complement this analysis with evidence from two colleges in the district, Colleges F and J, which actually do factor indicators of non-cognitive constructs into their placement algorithms. These colleges award supplemental points that are added to (or in some cases subtracted from) the raw placement test scores to determine the final score used for math placement. As a result, students' placement results may be directly related to some non-cognitive attributes. In fact, the policy of using these non-cognitive indicators increased access to higher-level courses for students who otherwise would have been in a lower-level math course based on placement test scores alone. As mentioned above, 1.8 percent and 26.4 percent of students in Colleges F and J, respectively, were placed in a higher-level course based on their final scores after the multiple measures were considered.

The question of interest is whether students who received a multiple measure boost generally performed differently from their higher-scoring peers in terms of passing the placed math course and completing 30 degree-applicable units.16 Tables 6 and 8 present the results from this analysis. Based on our estimation of equation (3), we found no evidence of differences for students around the placement cutoffs for College F (Table 6) or College J (Table 8).
These results are robust to model specifications that include placement level dummies, cohort fixed effects, and student background variables. The null results in Colleges F and J suggest that the use of non-cognitive indicators increased access to higher-level courses without compromising the likelihood of success in those courses.

[Insert Table 6 here]

Since the multiple measure boost in these two colleges consisted of additional points drawn from a number of survey questions (e.g., college enrollment and employment plans, importance of math, and time since last enrollment in College F; highest math, HS diploma/GED, and importance of math in College J), we also investigated heterogeneity by boost type. That is, we attempted to disentangle the multiple measure boosts that were largely due to points from academic measures from those that included points from the non-cognitive indicators. This was only possible in College J, where there was enough variation in answer choice and a large enough sample size to identify a group of students who earned a multiple measure boost into a higher-level course but indicated that they considered math to be not important or only somewhat important to their educational goals (see Table 7, N=547).

[Insert Table 7 here]

We therefore included a dummy variable that equaled one for each student who said math was important, and interacted this with the boost variable as shown in equation (4). The results in Table 8 show no differential relationship between boost and student outcomes for the interaction term. This suggests that boosted students who indicated that math was important and those who did not exhibited no differences in their outcomes; the two groups had statistically equivalent probabilities of passing the course.
While this can be interpreted as evidence of the irrelevance of the motivation measure, we remind the reader that this result can still be considered an improvement in placement accuracy. The students who received a boost due to a non-cognitive indicator were able to access a higher-level course, and their likelihood of success in the course was the same as their peers'.

[Insert Table 8 here]

Discussion

The study contributes to concerns about selection and sorting processes at the start of community college, which have been of increasing policy interest in recent years. A number of states (e.g., Colorado, Florida, Texas) have begun to consider multiple measures, including non-cognitive measures, for developmental student advising, assessment, and placement practices (Bracco et al., 2014). However, there has been scant evaluation of these practices to inform policy, and our study attempts to fill this gap.

We drew upon expanding conceptions of college readiness to frame our investigation of possibilities for a more holistic approach to placement that includes the use of non-cognitive indicators in contexts where they are currently not used. We found that using high school background and non-cognitive indicators may help to identify, and thus reduce and avoid, some placement errors associated with test-based placement. This is key descriptive evidence indicating that alternative placement criteria may offer a means of improving upon status quo practices, and further research should continue to test this hypothesis.

We were also able to validate the use of non-cognitive indicators in two colleges. The results from Colleges F and J, which essentially incorporate non-cognitive characteristics in the identification of low-scoring students who could be moved to a higher-level course, reveal that students placed under this approach performed no differently from their higher-scoring peers.
Interestingly, they also performed no differently from their peers whose boost was based primarily on academic background measures. The study therefore contributes to the burgeoning literature on using multiple measures such as high school transcript information to optimally place students (Fong & Melguizo, 2016; Scott-Clayton et al., 2014; Ngo & Kwon, 2015), but adds the important component of evaluating the use of indicators of non-cognitive attributes. Similar to the aforementioned studies of academic measures, we find that non-cognitive indicators may make marginal improvements in placement accuracy over high school background factors alone or test scores alone.

Limitations

A limitation of these analyses is that we were unable to observe instructor characteristics, which may be important determinants of developmental math student outcomes (Chingos, 2016). We also relied on instructors' grades as a metric of success. If math instructors adjust their instruction or grading in order to meet the needs of students, this would bias the estimate of the relationship between academic background measures, such as placement test scores, and course success. We reasoned that since math is a fairly hierarchical subject, instructors' grading practices may not vary as much as they do in other subjects (e.g., English). To mitigate this bias, we also chose to focus on the B-or-better criterion, which is a more cautious approach to identifying error than a C-or-better criterion (Scott-Clayton et al., 2014); there is likely less variation in grading practices related to awarding an A or B grade than a C grade.

A second limitation is that results may be more emblematic of where placement cutoffs are set than of the accuracy of the measures themselves. The boost analysis, for example, may reflect not the validity of the additional measures used but rather the measurement error inherent in placement testing.
Test scores are noisy measures, and students a few points apart may be very similar. To address this concern, we ran models comparing students to similar-scoring peers right above the cutoff, as well as to all students in a given level. We found the results to be consistent across model specifications for College F, but we found differences between the "around" and "entire" groups in College J (see Table 8). However, the significant negative coefficients for "entire" in College J are likely related to the fact that there is as much as a 30-point range of placement scores that result in assignment to the same courses. Students in the "entire" regressions in College J may therefore be substantially different along unobservable characteristics.

Future Considerations

While the results from both sets of analyses point toward a potential advantage to using high school background measures and, to a lesser extent, non-cognitive measures in the assessment and placement process, a study in which students were experimentally placed using these measures would provide stronger evidence on their usefulness. Indeed, the results from Colleges F and J provide slightly more convincing evidence that non-cognitive indicators improve placement accuracy, but again, these students were placed using a combination of placement test scores, high school background, and non-cognitive indicators. Further research should examine student success under conditions where non-cognitive indicators are the sole placement measure in order to determine their actual usefulness as an alternative to academic placement measures. Further, while we chose colleges' existing questionnaire items and classified some as cognitive and others as non-cognitive, we are unsure of their psychometric properties and validity in the traditional sense.
It stands to reason that different cognitive and non-cognitive measures would yield different results than what we obtained. We encourage more work investigating differences in measures that can potentially be used for placement. To this end, we encourage a thorough exploration of the burgeoning literature on non-cognitive constructs and their scales in predicting college outcomes, for their applicability to placement policy at the community college level.

Finally, we also note that these may be noisy measures of high school background and non-cognitive attributes, since they are gathered from a self-reported student questionnaire administered at the time of placement testing. A placement policy that incorporated such additional self-reported information could be susceptible to misinterpretation, reference bias, faking, or "gaming" (Duckworth & Yeager, 2015). Colleges therefore need to exercise caution and consider evaluating the measures and the accessibility of placement policy information. Incorporating high school transcript information to automate these decisions may be more efficient and accurate, but this would necessitate data-sharing agreements between K-12 and community college districts.

Conclusion

Ultimately, these findings concerning non-cognitive indicators are related to the fact that psychosocial attributes such as motivation, and non-academic characteristics such as students' use of time and their degree of social support, are useful for explaining why some students do well in school while others fall behind (Pintrich, 2003; Pajares, Frank, & Rayner, 2001). Our findings suggest that gauging these various measures at the start of each student's college career may assist colleges as they sort students into community college coursework.
Although evaluation and selection on non-cognitive skills may unfairly focus attention on student characteristics rather than on the role of institutions, using non-cognitive measures may nevertheless promote equity. Holistic placement practices that increase access to upper-level courses without compromising the likelihood of student success in those courses provide opportunities for community college students to progress faster and further in their college careers.

Notes

1. We recognize that there is debate over terminology, with some scholars preferring terms such as character skills, social and emotional competencies, dispositions, personality, temperament, 21st-century skills, and personal qualities (Duckworth & Yeager, 2015). We choose the term non-cognitive since it provides a contrast to the academic/cognitive measures discussed and because these terms “refer to the same conceptual space” (Duckworth & Yeager, 2015, p. 239).
2. ACT, Inc. has recently decided to phase out the use of the COMPASS (Fain, 2015).
3. The Multiple Measures Assessment Project reported collecting non-cognitive data from one college for analysis (Bahr et al., 2014).
4. About one-quarter of all community college students in the U.S. are enrolled in California community colleges, many of which are located in urban centers (Foundation for Community Colleges, n.d.).
5. Source: California Community College Chancellor’s Office DataMart (http://datamart.cccco.edu/datamart.aspx)
6. We could not identify student survey responses in College H (2009-2012) because we did not have access to the actual questionnaire. College H did collect indicators of some non-cognitive attributes from 2005-2009 (shown in Table 2), but the college used a diagnostic test instead of the ACCUPLACER or COMPASS. We therefore did not include College H in the analysis.
7. Students are considered likely to pass if their predicted probability of success in the upper-level course is 50% or greater, and considered likely to fail if the predicted probability of failure is 50% or greater.
8. We use the adjusted score, which includes multiple measure points for each college.
9. We also run pooled college models by standardizing student test scores. For these pooled analyses, we identify items on colleges’ EBQs that measure common constructs, such as use of time, motivation, and social support.
10. College F subtracted points for high hours of schooling and work (use of time), for low importance of education (motivation), and for returning students who had been out of education for more than 5 years.
11. This is 221 out of 12,224 students in College F and 2,054 out of 7,782 students in College J.
12. BOOST and MMPOINTS are correlated, but this should not be a multicollinearity issue. We ran the models without the multiple measure points and did not observe significant differences in the magnitude, direction, or statistical significance of the boost variable.
13. There were 157 cases of boosts determined solely by non-cognitive measures in College J and 547 determined by cognitive measures alone.
14. We also conducted the same analyses with a “Pass at all” outcome. These results are available from the authors upon request.
15. We caution against interpreting the results as indicators of the accuracy of the current system. This is because accuracy is related to the predictive validity of the placement instrument, and we do not perform an analysis that measures predictive validity per se. Other methods may be used to determine whether placement tools are accurate and, relatedly, whether cutoffs are set correctly (Melguizo et al., 2016).
16. The regression results for the longer-term outcomes are available in the Appendix.

References

Armstrong, W. B. (2000).
The association among student success in courses, placement test scores, student background data, and instructor grading practices. Community College Journal of Research and Practice, 24(8), 681-695.
Almeida, D. (2015). The roots of college readiness. In W. G. Tierney & J. Duncheon (Eds.), The problem of college readiness (pp. 3-44). Albany: SUNY Press.
Bahr, P. R., Hayward, C., Hetts, J., Lamoree, D., Newell, M., Pellegrin, N., & Willett, T. (2014). Multiple measures for assessment and placement (White Paper). Retrieved from http://www.rpgroup.org/system/files/MMAP_WhitePaper_Final_September2014.pdf
Bailey, T., Jeong, D. W., & Cho, S.-W. (2010). Referral, enrollment, and completion in developmental education sequences in community colleges. Economics of Education Review, 29(2), 255–270. http://doi.org/10.1016/j.econedurev.2009.09.002
Bracco, K. R., Dadgar, M., Austin, K., Klarin, B., Broek, M., Finkelstein, N., … & Bugler, D. (2014). Exploring the use of multiple measures for placement into college level courses: Seeking alternatives or improvements to the use of a single standardized test. San Francisco, CA: WestEd.
Burdman, P. (2012). Where to begin? The evolving role of placement exams for students starting college. Boston, MA: Jobs for the Future.
Carnevale, A. P., & Strohl, J. (2010). How increasing college access is increasing inequality, and what to do about it. In R. D. Kahlenberg (Ed.), Rewarding strivers: Helping low-income students succeed in college. New York: Century Foundation Press, 137.
Chingos, M. M. (2016). Instructional quality and student learning in higher education: Evidence from developmental algebra courses. The Journal of Higher Education, 87(1), 84-114.
Conley, D. (2010). Replacing remediation with readiness (NCPR Working Paper). Retrieved from http://www.postsecondaryresearch.org/conference/pdf/ncpr_panel2_conley.pdf
Cooper, J. (1997). Marginality, mattering, and the African American student: Creating an inclusive college environment.
College Student Affairs Journal, 16(2), 15–20.
Covington, M. V. (2000). Goal theory, motivation, and school achievement: An integrative view. Annual Review of Psychology, 51, 171–200.
Dennis, J. M., Phinney, J. S., & Chuateco, L. I. (2005). The role of motivation, parental support, and peer support in the academic success of ethnic minority first-generation college students. Journal of College Student Development, 46(3), 223-236.
Dixon Rayle, A., & Chung, K.-Y. (2007). Revisiting first-year college students’ mattering: Social support, academic stress, and the mattering experience. Journal of College Student Retention: Research, Theory and Practice, 9(1), 21–37.
Dixon, S. K., & Kurpius, S. E. R. (2008). Depression and college stress among university undergraduates: Do mattering and self-esteem make a difference? Journal of College Student Development, 49(5), 412–424.
Duckworth, A. L., Peterson, C., Matthews, M. D., & Kelly, D. R. (2007). Grit: Perseverance and passion for long-term goals. Journal of Personality and Social Psychology, 92(6), 1087–1101.
Duckworth, A. L., & Yeager, D. S. (2015). Measurement matters: Assessing personal qualities other than cognitive ability for educational purposes. Educational Researcher, 44(4), 237–251.
Duncheon, J. (2015). The problem of college readiness. In W. G. Tierney & J. Duncheon (Eds.), The problem of college readiness (pp. 3-44). Albany: SUNY Press.
Ehrenberg, R. G., & Sherman, D. R. (1987). Employment while in college, academic achievement, and postcollege outcomes: A summary of results. Journal of Human Resources, 22(1), 1.
Eccles, J., & Wigfield, A. (2002). Motivational beliefs, values, and goals. Annual Review of Psychology, 53, 109-132.
Fields, R., & Parsad, B. (2012). Tests and cut scores used for student placement in postsecondary education: Fall 2011. Washington, DC: National Assessment Governing Board.
Fong, K., & Melguizo, T. (2015).
Utilizing additional measures to buffer against students’ lack of math confidence and improve placement accuracy in developmental math. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
France, M. K., & Finney, S. J. (2010). Conceptualization and utility of university mattering: A construct validity study. Measurement and Evaluation in Counseling and Development, 43(1), 48–65.
Gossett, B. J., Cuyjet, M. J., & Cockriel, I. (1996). African Americans’ and non-African Americans’ sense of mattering and marginality at public, predominantly white institutions. Equity and Excellence in Education, 29(3), 37–42.
Gerlaugh, K., Thompson, L., Boylan, H., & Davis, H. (2007). National study of developmental education II: Baseline data for community colleges. Research in Developmental Education, 20(4), 1-4.
Heckman, J., Stixrud, J., & Urzua, S. (2006). The effects of cognitive and noncognitive abilities on labor market outcomes and social behavior. Journal of Labor Economics, 24(3), 411–482.
Horn, L., Nevill, S., & Griffith, J. (2006). Profile of undergraduates in US postsecondary education institutions, 2003-04: With a special analysis of community college students (Statistical Analysis Report, NCES 2006-184). National Center for Education Statistics.
Hughes, K. L., & Scott-Clayton, J. (2011). Assessing developmental assessment in community colleges. Community College Review, 39(4), 327–351.
Husman, J., & Lens, W. (1999). The role of the future in student motivation. Educational Psychologist, 34, 113–125.
Jenkins, D., Jaggars, S. S., & Roksa, J. (2009). Promoting gatekeeper course success among community college students needing remediation: Findings and recommendations from a Virginia study. New York, NY: Community College Research Center. Retrieved from http://eric.ed.gov/?id=ED507824
Kane, M. T. (2001). Current concerns in validity theory. Journal of Educational Measurement, 38(4), 319-342.
Kane, M. T. (2006). Validation.
In R. L. Brennan (Ed.), Educational measurement (4th ed., pp. 17–64). Westport, CT: ACE/Praeger Publishers.
Lewallen, W. C. (1994). Multiple measures in placement recommendations: An examination of variables related to course success. Lancaster, CA: Antelope Valley College. (ERIC Document No. 381 186)
Markle, R., Olivera-Aguilar, M., Jackson, R., Noeth, R., & Robbins, S. (2013). Examining evidence of reliability, validity and fairness for the SuccessNavigator assessment (Research Report No. RR-13-12). Princeton, NJ: Educational Testing Service. http://dx.doi.org/10.1002/j.2333-8504.2013.tb02319.x
Marshall, S. K. (2001). Do I matter? Construct validation of adolescents’ perceived mattering to parents and friends. Journal of Adolescence, 24(4), 473–490.
Marwick, J. D. (2004). Charting a path to success: The association between institutional placement policies and the academic success of Latino students. Community College Journal of Research and Practice, 28(3), 263–280.
Mattern, K. D., & Packman, S. (2009). Predictive validity of ACCUPLACER scores for course placement: A meta-analysis (Research Report No. 2009-2). New York, NY: College Board.
Medhanie, A. G., Dupuis, D. N., LeBeau, B., Harwell, M. R., & Post, T. R. (2012). The role of the ACCUPLACER mathematics placement test on a student’s first college mathematics course. Educational and Psychological Measurement, 72(2), 332-351.
Melguizo, T., Bos, J., Ngo, F., Mills, N., & Prather, G. (2016). Using a regression discontinuity design to estimate the impact of placement decisions in developmental math. Research in Higher Education, 57(2), 123-151.
Michaels, J. W., & Miethe, T. D. (1989). Academic effort and college grades. Social Forces, 68(1), 309.
Miller, R. B., & Brickman, S. J. (2004). A model of future-oriented motivation and self-regulation. Educational Psychology Review, 16, 9–33.
National Center for Public Policy and Higher Education & Southern Regional Education Board (NCPPHE & SREB). (2010). Beyond the rhetoric: Improving college readiness through coherent state policy. Atlanta, GA: NCPPHE. Retrieved from http://publications.sreb.org/2010/Beyond%20the%20Rhetoric.pdf
Ngo, F., & Kwon, W. (2015). Using multiple measures to make math placement decisions: Implications for access and success in community colleges. Research in Higher Education, 56(5), 442-470.
Ngo, F., & Melguizo, T. (2016). How can placement policy improve math remediation outcomes? Evidence from community college experimentation. Educational Evaluation and Policy Analysis, 38(1), 171-196.
Noble, J. P., & Sawyer, R. L. (2004). Is high school GPA better than admission test scores for predicting academic success in college? College and University Journal, 79(4), 17–22.
Noonan, B. M., Sedlacek, W. E., & Veerasamy, S. (2005). Employing noncognitive variables in admitting and advising community college students. Community College Journal of Research and Practice, 29(6), 463-469.
Pajares, F., & Rayner, S. (2001). Self-beliefs and school success: Self-efficacy, self-concept, and school achievement. Perception, 239-266.
Pascarella, E. T., Edison, M. I., Nora, A., Hagedorn, L. S., & Terenzini, P. T. (1998). Does work inhibit cognitive development during college? Educational Evaluation and Policy Analysis, 20(2), 75.
Perry, M., Bahr, P. M., Rosin, M., & Woodward, K. M. (2010). Course-taking patterns, policies, and practices in developmental education in the California Community Colleges. Mountain View, CA: EdSource. Retrieved from http://www.edsource.org/assets/files/ccstudy/FULL-CC-DevelopmentalCoursetaking.pdf
Pintrich, P. (2003). A motivational science perspective on the role of student motivation in learning and teaching contexts. Journal of Educational Psychology, 95(4), 667-686.
Pintrich, P., & Schunk, D. (2002).
Motivation in education. Englewood Cliffs, N.J.: Merrill.
Porchea, S. F., Allen, J., Robbins, S., & Phelps, R. P. (2010). Predictors of long-term enrollment and degree outcomes for community college students: Integrating academic, psychosocial, socio-demographic, and situational factors. The Journal of Higher Education, 81(6), 750-778.
Rau, W., & Durand, A. (2000). The academic ethic and college grades: Does hard work help students to “make the grade”? Sociology of Education, 73(1), 19.
Rikoon, S., Liebtag, T., Olivera-Aguilar, M., Robbins, S., & Jackson, T. (2014). A pilot study of holistic assessment and course placement in community college: Findings and recommendations. Retrieved from https://www.ets.org/Media/Research/pdf/RM-14-10.pdf
Robbins, S., Allen, J., Casillas, A., Peterson, C. H., & Le, H. (2006). Unraveling the differential effects of motivational and skills, social, and self-management measures from traditional predictors of college outcomes. Journal of Educational Psychology, 98(3), 598–616.
Roderick, M., Nagaoka, J., & Coca, V. (2009). College readiness for all: The challenge for urban high schools. The Future of Children, 19(1), 185-210.
Rosenberg, M., & McCullough, B. C. (1981). Mattering: Inferred significance and mental health among adolescents. Research in Community and Mental Health, 2, 163–182.
Sawyer, R. (1996). Decision theory models for validating course placement tests. Journal of Educational Measurement, 33(3), 271–290.
Sawyer, R. (2007). Indicators of usefulness of test scores. Applied Measurement in Education, 20(3), 255-271.
Sedlacek, W. E. (2004). Beyond the big test: Noncognitive assessment in higher education. San Francisco, CA: Jossey-Bass.
Schlossberg, N. K. (1989). Marginality and mattering: Key issues in building community. New Directions for Student Services, 1989(48), 5–15.
Schunk, D. H. (2012). Social cognitive theory. In K. R. Harris, S. Graham, & T. Urdan (Eds.), APA educational psychology handbook (Vol. 1, pp. 101-123).
Scott-Clayton, J. E. (2012). Do high-stakes placement exams predict college success? (Working Paper No. 41). New York, NY: Community College Research Center. Retrieved from http://academiccommons.columbia.edu/catalog/ac:146482
Scott-Clayton, J. E., Crosta, P. M., & Belfield, C. R. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371–393. http://doi.org/10.3102/0162373713517935
Silver, B., Smith, E., & Greene, B. (2001). A study strategies self-efficacy instrument for use with community college students. Educational and Psychological Measurement, 61(5), 849-865.
Smith, A. (2016, May 26). Determining a student’s place. Inside Higher Ed. Retrieved from http://www.insidehighered.com/news/2016/05/26/growing-number-community-colleges-use-multiple-measures-place-students
Stinebrickner, R., & Stinebrickner, T. R. (2003). Working during school and academic performance. Journal of Labor Economics, 21(2), 473–491.
Stinebrickner, R., & Stinebrickner, T. R. (2004). Time-use and college outcomes. Journal of Econometrics, 121(1–2), 243–269.
Stinebrickner, R., & Stinebrickner, T. R. (2008). The causal effect of studying on academic performance. The B.E. Journal of Economic Analysis & Policy, 8(1).
Sternberg, R. J., Gabora, L., & Bonney, C. R. (2012). Introduction to the special issue on college and university admissions. Educational Psychologist, 47(1), 1-4.
Tovar, E., Simon, M. A., & Lee, H. B. (2009). Development and validation of the college mattering inventory with diverse urban college students. Measurement and Evaluation in Counseling and Development, 42(3), 154–178.

Figures & Tables

Note: *Placement Test is representative of status quo placement practice, which in some cases is placement test plus additional multiple measures (see Table 1).
Figure 1.
Severe placement errors identified in status quo practice (Bandwidth = 5 points around placement cutoff).
[Figure 1 is a bar chart; the severe placement error rate under each placement scenario is:] 1) Placement Test*: 8.7%; 2) Test+HSB: 6.1%; 3) Test+NC: 6.1%; 4) Test+HSB+NC: 6.1%; 5) HSB: 17.4%; 6) HSB+NC: 20.1%; 7) NC: 21.4%.

Note: *Placement Test is representative of status quo placement practice, which in some cases is placement test plus additional multiple measures (see Table 1).
Figure 2. Type of severe placement error identified (Bandwidth = 5 points around placement cutoff).
[Figure 2 is a stacked bar chart; the underplacement/overplacement shares of severe placement errors under each scenario are:] 1) Placement Test*: 42.5%/52.8%; 2) Test+HSB: 34.5%/65.5%; 3) Test+NC: 33.4%/66.6%; 4) Test+HSB+NC: 34.9%/65.1%; 5) HSB: 56.6%/43.4%; 6) HSB+NC: 58.2%/41.8%; 7) NC: 64.5%/35.5%.

Table 1. Multiple measures used for math placement
(Columns: College; Point Range; Academic Background: HS Diploma/GED, HS GPA, Prior Math; College Plans; Motivation)
A: 0 to 4; +
B: 0 to 3; + + +
C: N/A
D: 0 to 2; +
E: 0 to 3; +
F: -2 to 2; +/- +/-
G: 0 to 3; + + +
H: 0 to 4; +
J: -2 to 5; + +/- +/-
Note: (+) indicates measures for which points are added, and (-) indicates measures for which points are subtracted. Academic Background includes whether the student received a diploma or GED, high school GPA, and prior math course-taking (including achievement and highest level completed). College plans include hours planned to attend class, hours of planned employment, and time out of formal education. Motivation includes importance of college and importance of mathematics. Multiple measure information was not available for one of the nine LUCCD colleges. The study time period is 2005 to 2012, but information shown here for College G is 2011-2012, and for College H is 2005-2009.

Table 2.
Types of information collected via Education Background Questionnaires
(Columns: College; Academic Background: HS Diploma/GED, HS GPA, Prior math; College Plans; Motivation; Social Support)
A: x
B: x x x x x x
C: x x x x x x
D: x x x x
E: x x
F: x x x
G: x x x x x
H: x x x x x
J: x x x
Note: Academic Background includes whether the student received a diploma or GED, high school GPA, and prior math course-taking (including achievement and highest level completed). College plans include hours planned to attend class, hours of planned employment, and time out of formal education. Motivation includes importance of college and importance of mathematics. Students were also asked about social support – how important is it for the people closest to you that you go to college? The study time period is 2005 to 2012, but information shown here for College G is 2011-2012, and for College H is 2005-2009.

Table 3. College assessment, placement, and demographic profiles, six LUCCD colleges 2005-2012
(Columns: College B | College C | College D | College F | College G | College J; each cell gives N and %)
Placement Levels
> Intermediate Algebra: 153 2.0% | 99 0.5% | 2481 9.7% | 518 4.2% | 710 9.4% | 28 0.4%
Intermediate Algebra: 715 9.2% | 742 3.8% | 6355 24.9% | 1099 8.9% | 1411 18.7% | 630 8.1%
Elementary Algebra: 1049 13.4% | 3160 16.0% | 6409 25.1% | 3825 30.8% | 1049 13.9% | 1927 24.8%
Pre-Algebra: 4965 63.6% | 2623 13.3% | 1604 6.3% | 4310 34.7% | 2603 34.6% | 1766 22.7%
Arithmetic: 930 11.9% | 9194 46.5% | 8673 34.0% | 2652 21.4% | 1754 23.3% | 2757 35.4%
< Arithmetic: 3934 19.9% | 674 8.7%
Student Demographics
Female: 4447 56.9% | 10031 50.8% | 13472 52.8% | 6837 55.1% | 3891 51.7% | 5288 68.0%
African-American: 367 4.7% | 7745 39.2% | 1965 7.7% | 5516 44.5% | 125 1.7% | 6102 78.4%
Latina/o: 5977 76.5% | 9367 47.4% | 11661 45.7% | 3827 30.9% | 5508 73.2% | 1206 15.5%
Asian/Pacific Islander: 376 4.8% | 905 4.6% | 2512 9.8% | 765 6.2% | 1244 16.5% | 95 1.2%
White (Non-Hispanic): 607 7.8% | 584 3.0% | 6908 27.1% | 1156 9.3% | 122 1.6% | 50 0.6%
Other: 485 6.2% | 1151 5.8% | 2476 9.7% | 1140 9.2% | 528 7.0%
| 329 4.2%
Total Assessed in Math: 7812 | 19752 | 25522 | 12404 | 7527 | 7782
Placement Test: ACCUPLACER | ACCUPLACER | ACCUPLACER | COMPASS | ACCUPLACER | COMPASS
Years in Sample: 2005-2009 | 2005-2012 | 2005-2012 | 2005-2012 | 2011-2012 | 2005-2009
Notes: For each math level, we calculated the average value for the 2005-2012 academic years. For College B the average values are calculated from 2005-2009. College C had 454 students who placed below Arithmetic and the data runs from 2008-2012. College J had 138 students who placed below Arithmetic. Colleges B, G, and J have different time periods due to placement policy changes. Source: 2005-2012 LUCCD transcript data

Table 4. Severe placement errors (SPE), under-placement (UP), and over-placement (OP) identified in current practice under different placement scenarios, Bandwidth=All Students, Pass with B or better
(Scenarios: 1) Placement Test*; 2) Test+HSB; 3) Test+NC; 4) Test+HSB+NC; 5) HSB; 6) HSB+NC; 7) NC. For each scenario, the three columns are Total SPE, UP, OP.)
B IA/EA: .02 .40 .60 | .04 .35 .65 | .03 .38 .63 | .05 .33 .67 | .09 .72 .28 | .10 .70 .30 | .08 .84 .16
B EA/PA: .01 .36 .64 | .03 .21 .79 | .02 .30 .70 | .03 .20 .80 | .09 .81 .19 | .10 .80 .20 | .11 .93 .07
C IA/EA: .03 .83 .17 | .04 .86 .14 | .04 .84 .16 | .10 .91 .09 | .22 .97 .03 | .28 .97 .03 | .23 .97 .03
C EA/PA: .23 .95 .05 | .37 .97 .03 | .28 .88 .12 | .36 .91 .09 | .39 .97 .03 | .42 .93 .07 | .43 .93 .07
C PA/AR: .08 1.00 .00 | .10 1.00 .00 | .08 1.00 .00 | .10 1.00 .00 | .57 1.00 .00 | .57 1.00 .00 | .64 1.00 .00
D IA/EA: .18 .01 .99 | .16 .01 .99 | .18 .02 .98 | .16 .02 .98 | .16 .04 .96 | .16 .04 .96 | .19 .07 .93
D PA/AR: .09 .00 1.00 | .08 .99 .01 | .09 .00 1.00 | .08 .99 .01 | .11 .99 .01 | .11 .99 .01 | .21 1.00 .00
F IA/EA: .06 .03 .45 | .07 .05 .95 | .07 .05 .95 | .08 .06 .94 | .10 .30 .70 | .11 .30 .70 | .10 .42 .58
F EA/PA: .08 .96 .04 | .05 .83 .17 | .10 .97 .03 | .07 .87 .13 | .07 .91 .09 | .08 .87 .13 | .08 .98 .02
F PA/AR: .37 .00 1.00 | .37 .00 1.00 | .37 .00 1.00 | .37 .00 1.00 | .39 .01 .99
| .39 .01 .99 | .39 .01 .99
G IA/EA: .08 .57 .43 | .17 .69 .31 | .11 .57 .43 | .18 .66 .34 | .07 .27 .73 | .08 .34 .66 | .08 .35 .65
G EA/PA: .17 .02 .98 | .22 .01 .99 | .19 .01 .99 | .23 .02 .98 | .21 .02 .98 | .22 .03 .97 | .24 .05 .95
G PA/AR: .00 .00 1.00 | .01 .16 .84 | .00 .00 1.00 | .01 .06 .94 | .01 .75 .25 | .01 .57 .43 | .01 .85 .15
AVERAGES IA/EA: .07 .37 .53 | .10 .39 .61 | .08 .37 .63 | .11 .40 .60 | .13 .46 .54 | .15 .47 .53 | .13 .53 .47
AVERAGES EA/PA: .12 .57 .43 | .17 .51 .49 | .15 .54 .46 | .17 .50 .50 | .19 .68 .32 | .20 .66 .34 | .21 .72 .28
AVERAGES PA/AR: .14 .25 .75 | .14 .54 .46 | .14 .25 .75 | .14 .51 .49 | .27 .69 .31 | .27 .64 .36 | .31 .71 .29
TOTAL: .11 .45 .49 | .13 .53 .47 | .13 .44 .56 | .14 .53 .47 | .22 .67 .33 | .23 .66 .34 | .25 .71 .29
Notes: Intermediate algebra (IA); elementary algebra (EA); pre-algebra (PA); arithmetic (AR); high school background (HSB); non-cognitive (NC).

Table 5. Severe placement error (SPE), under-placement (UP), and over-placement (OP) identified in current practice under different placement scenarios, BW=5, Pass with B or better
(Scenarios: 1) Placement Test*; 2) Test+HSB; 3) Test+NC; 4) Test+HSB+NC; 5) HSB; 6) HSB+NC; 7) NC. For each scenario, the three columns are Total SPE, UP, OP.)
B IA/EA: .10 .87 .13 | .09 .86 .14 | .09 .86 .14 | .09 .80 .20 | .09 .72 .28 | .10 .70 .30 | .08 .84 .16
B EA/PA: .01 .13 .87 | .02 .16 .84 | .02 .11 .89 | .02 .22 .78 | .09 .81 .19 | .10 .80 .20 | .11 .93 .07
C IA/EA: .05 .96 .04 | .05 .96 .04 | .06 .95 .05 | .06 .93 .07 | .22 .97 .03 | .28 .97 .03 | .23 .97 .03
C EA/PA: .39 .97 .03 | .42 .93 .07 | .43 .93 .07
C PA/AR: .01 .71 .29 | .01 .78 .22 | .01 .70 .30 | .01 .83 .17 | .57 1.00 .00 | .57 1.00 .00 | .64 1.00 .00
D IA/EA: .07 .02 .98 | .06 .02 .98 | .07 .02 .98 | .06 .02 .98 | .16 .04 .96 | .16 .04 .96 | .19 .07 .93
D PA/AR: .04 .89 .11 | .05 .91 .09 | .04 .89 .11 | .04 .92 .08 | .11 .99 .01 | .11 .99 .01 | .21 1.00 .00
F IA/EA: .03 .05 .39 | .04 .09 .91 | .03 .07 .93 | .04 .09 .91 | .10 .30 .70 | .11 .30 .70 | .10 .42 .58
F EA/PA: .07 .91 .09 | .08 .87 .13 | .08 .98 .02
F PA/AR: .25 .01 .99 | .25
.01 .99 | .25 .01 .99 | .25 .01 .99 | .39 .01 .99 | .39 .01 .99 | .39 .01 .99
G IA/EA: .01 .00 1.00 | .02 .00 1.00 | .02 .00 1.00 | .02 .00 1.00 | .07 .27 .73 | .08 .34 .66 | .08 .35 .65
G EA/PA: .07 .00 1.00 | .06 .00 1.00 | .07 .02 .98 | .07 .02 .98 | .21 .02 .98 | .22 .03 .97 | .24 .05 .95
G PA/AR: .00 .50 .50 | .01 .00 1.00 | .01 .06 .94 | .01 .00 1.00 | .01 .75 .25 | .01 .57 .43 | .01 .85 .15
AVERAGES IA/EA: .05 .38 .51 | .05 .39 .61 | .06 .38 .62 | .05 .37 .63 | .13 .46 .54 | .15 .47 .53 | .13 .53 .47
AVERAGES EA/PA: .16 .37 .63 | .04 .08 .92 | .04 .07 .93 | .05 .12 .88 | .12 .58 .42 | .20 .66 .34 | .21 .72 .28
AVERAGES PA/AR: .08 .53 .47 | .08 .42 .58 | .08 .41 .59 | .08 .44 .56 | .27 .69 .31 | .27 .64 .36 | .31 .71 .29
TOTAL: .09 .42 .53 | .06 .34 .66 | .06 .33 .67 | .06 .35 .65 | .17 .57 .43 | .20 .58 .42 | .21 .65 .35
Notes: Intermediate algebra (IA); elementary algebra (EA); pre-algebra (PA); arithmetic (AR); high school background (HSB); non-cognitive (NC).

Table 6. Regression results: Whether students passed the placed math course within one year of assessment, College F
(Columns: Model 1 (1) Around, (2) Entire; Model 2 (3) Around, (4) Entire; Model 3 (5) Around, (6) Entire; Model 4 (7) Around, (8) Entire. Standard errors in parentheses.)
Multiple Measure Boost: -0.035 | -0.032 | -0.061 | -0.041 | -0.079 | -0.048 | -0.068 | -0.047
(0.06) (0.03) (0.06) (0.03) (0.06) (0.03) (0.06) (0.03)
Multiple Measure Points: 0.007 | -0.008 | 0.01 | -0.009 | 0.009 | -0.01 | -0.003 | -0.019**
(0.01) (0.01) (0.01) (0.01) (0.01) (0.01) (0.01) (0.01)
Test Score (z): 0.007 | 0.025*** | -0.014 | 0.018** | -0.01 | 0.019** | -0.014 | 0.016*
(0.01) (0.00) (0.01) (0.01) (0.01) (0.01) (0.01) (0.01)
Placed Math Level
1 Level Below: 0.059 | -0.077** | 0.038 | -0.083*** | 0.05 | -0.067**
(0.05) (0.03) (0.05) (0.03) (0.05) (0.03)
2 Levels Below: -0.068 | -0.162*** | -0.073 | -0.162*** | -0.052 | -0.135***
(0.05) (0.02) (0.05) (0.02) (0.05) (0.02)
3 Levels Below: -0.104* | -0.112*** | -0.063 | -0.104*** | -0.04 | -0.074***
(0.05) (0.02) (0.05) (0.02) (0.05) (0.02)
4 Levels Below: -0.155*** | -0.159*** | -0.126***
(0.02) (0.02) (0.02)
Age at Assessment
20 - 24: 0.006 | -0.024*
(0.02) (0.01)
25 - 34: -0.008 | 0.002
(0.02) (0.01)
35 - 54: -0.057* | -0.017
(0.03) (0.01)
55-65: -0.058 | -0.046
(0.08) (0.03)
Female: 0.019 | 0.033***
(0.02) (0.01)
Race
Asian/PI: -0.048 | 0.014
(0.04) (0.02)
African-American: -0.070* | -0.061***
(0.03) (0.01)
Latina/o: -0.001 | 0.014
(0.03) (0.02)
Other: -0.017 | -0.008
(0.04) (0.02)
English not prim. lang.: 0.059* | 0.031*
(0.03) (0.01)
Perm. Res.: -0.008 | 0.032*
(0.03) (0.02)
Other Visa: 0.106* | 0.091***
(0.05) (0.02)
Cohort Fixed Effects: No | No | No | No | Yes | Yes | Yes | Yes
Constant: 0.189*** | 0.243*** | 0.253*** | 0.372*** | 0.246*** | 0.355*** | 0.250*** | 0.333***
(0.01) (0.00) (0.04) (0.02) (0.05) (0.03) (0.06) (0.03)
R-squared: -0.001 | 0.004 | 0.01 | 0.011 | 0.024 | 0.018 | 0.038 | 0.032
N: 2328 | 12377 | 2328 | 12377 | 2328 | 12377 | 2328 | 12377
* p<0.05, ** p<0.01, *** p<0.001
Notes: The estimates shown are coefficients of the linear probability model, with standard errors in parentheses. Students in College F could earn up to an additional 2 points based on responses on the Educational Background Questionnaire. Models: M1 includes Boost, Multiple Measure Points, and Test Score; M2 includes Boost, Multiple Measure Points, Test Score, and Math Level; M3 includes Boost, Multiple Measure Points, Test Score, Math Level, and Cohort Fixed Effects; M4 includes Boost, Multiple Measure Points, Test Score, Math Level, Cohort Fixed Effects, and demographic controls. Around restricts sample to students 5 points above the cutoff; Entire includes all students within the math level. Reference groups: Students in transfer-level math, students 18-20 years old; Race=white.

Table 7.
Composition of multiple measure boost in College J
(Columns: Total | Cognitive + Non-Cognitive | Cognitive Only | Non-Cognitive Only)
Number of students receiving boost: 2054 (26.4% of 7,782) | 1350 | 547 | 157
Percentage of all boosts: 65.7 | 26.6 | 7.6
Measures: Diploma/GED; Importance of math; Highest math passed with C or better

Table 8. Regression results: Whether students passed the placed math course within one year of assessment, College J
(Columns: Model 1 (1) Around, (2) Entire; Model 2 (3) Around, (4) Entire; Model 3 (5) Around, (6) Entire; Model 4 (7) Around, (8) Entire. Standard errors in parentheses.)
MM Boost: -0.032 | -0.065** | -0.048 | -0.069*** | -0.048 | -0.067** | -0.046 | -0.067**
(0.03) (0.02) (0.03) (0.02) (0.03) (0.02) (0.03) (0.02)
NC (Imp. Math): -0.024 | -0.054*** | -0.024 | -0.041** | -0.027 | -0.042*** | -0.03 | -0.046***
(0.02) (0.01) (0.02) (0.01) (0.02) (0.01) (0.02) (0.01)
MM Boost*NC: -0.035 | 0.026 | -0.029 | 0.007 | -0.028 | 0.005 | -0.024 | 0.007
(0.03) (0.02) (0.03) (0.02) (0.03) (0.02) (0.03) (0.02)
MM Points: 0.058*** | 0.076*** | 0.058*** | 0.061*** | 0.058*** | 0.061*** | 0.058*** | 0.061***
(0.01) (0.01) (0.01) (0.01) (0.01) (0.01) (0.01) (0.01)
Test Score (z): 0.018 | 0.042*** | -0.026 | 0.011 | -0.03 | 0.008 | -0.033* | 0.006
(0.01) (0.01) (0.02) (0.01) (0.02) (0.01) (0.02) (0.01)
Placed Math Level
1 Level Below: -0.494** | -0.125 | -0.482** | -0.128 | -0.448* | -0.117
(0.18) (0.08) (0.18) (0.08) (0.18) (0.08)
2 Levels Below: -0.670*** | -0.219** | -0.661*** | -0.225** | -0.624*** | -0.206**
(0.18) (0.08) (0.18) (0.08) (0.18) (0.08)
3 Levels Below: -0.534** | -0.12 | -0.525** | -0.128 | -0.489** | -0.106
(0.18) (0.08) (0.18) (0.08) (0.18) (0.08)
4 Levels Below: -0.660*** | -0.222** | -0.656*** | -0.233** | -0.614*** | -0.208**
(0.18) (0.08) (0.18) (0.08) (0.18) (0.08)
Age at Assessment
20 - 24: -0.044* | -0.041***
(0.02) (0.01)
25 - 34: -0.016 | -0.02
(0.02) (0.01)
35 - 54: 0.045* | 0.014
(0.02) (0.01)
55-65: -0.081 | -0.053
(0.07) (0.04)
Female: 0.015 | 0.022*
(0.02) (0.01)
Race
Asian/PI: -0.092 | 0.109
(0.13) (0.07)
African-American: -0.124 | 0.04
(0.11) (0.06)
Latina/o: -0.009 | 0.129*
(0.11)
(0.06)
Other: -0.012 | 0.108
(0.11) (0.06)
English not prim. lang.: -0.017 | -0.011
(0.03) (0.02)
Perm. Res.: 0.078* | 0.125***
(0.04) (0.02)
Other Visa: 0.022 | 0.055
(0.06) (0.04)
Cohort Fixed Effects: No | No | No | No | Yes | Yes | Yes | Yes
Constant: 0.141*** | 0.113*** | 0.742*** | 0.343*** | 0.712*** | 0.332*** | 0.771*** | 0.247*
(0.02) (0.01) (0.18) (0.08) (0.18) (0.08) (0.21) (0.10)
R-squared: 0.011 | 0.04 | 0.034 | 0.056 | 0.036 | 0.057 | 0.052 | 0.071
N: 3416 | 7782 | 3416 | 7782 | 3416 | 7782 | 3416 | 7782
* p<0.05, ** p<0.01, *** p<0.001
Notes: The estimates shown are coefficients of the linear probability model, with standard errors in parentheses. Students in College J could earn up to an additional 5 points based on responses on the Educational Background Questionnaire. Models: M1 includes Boost, Multiple Measure (MM) Points, and Test Score; M2 includes Boost, Multiple Measure Points, Test Score, and Math Level; M3 includes Boost, Multiple Measure Points, Test Score, Math Level, and Cohort Fixed Effects; M4 includes Boost, Multiple Measure Points, Test Score, Math Level, Cohort Fixed Effects, and demographic controls. Around restricts sample to students 5 points above the cutoff; Entire includes all students within the math level. Reference groups: Students in transfer-level math, students 18-20 years old; Race=white.

APPENDIX

Table A1.
Regression results: Whether students completed 30 degree-applicable units, College F

                          Model 1                Model 2                Model 3                Model 4
                          (1) Around  (2) Entire  (3) Around  (4) Entire  (5) Around  (6) Entire  (7) Around  (8) Entire
Multiple Measure Boost    0.006  0.009  -0.012  0.002  -0.019  0.005  -0.028  0.004
                          (0.06)  (0.03)  (0.06)  (0.03)  (0.06)  (0.03)  (0.06)  (0.03)
Multiple Measure Points   -0.011  -0.025***  -0.009  -0.025***  -0.01  -0.026***  -0.021  -0.033***
                          (0.01)  (0.01)  (0.01)  (0.01)  (0.01)  (0.01)  (0.02)  (0.01)
Test Score (z)            0.016*  0.043***  0.002  0.020***  -0.007  0.011  -0.013  0.006
                          (0.01)  (0.00)  (0.01)  (0.01)  (0.01)  (0.01)  (0.01)  (0.01)
Placed Math Level (Models 2-4, columns 3-8)
  1 Level Below           0.004  -0.048*  0.018  -0.029  0.036  -0.017
                          (0.06)  (0.02)  (0.06)  (0.02)  (0.06)  (0.02)
  2 Levels Below          -0.024  -0.050**  -0.019  -0.048*  0.004  -0.032
                          (0.05)  (0.02)  (0.05)  (0.02)  (0.05)  (0.02)
  3 Levels Below          -0.072  -0.087***  -0.084  -0.098***  -0.066  -0.086***
                          (0.05)  (0.02)  (0.05)  (0.02)  (0.05)  (0.02)
  4 Levels Below          -0.142***  -0.149***  -0.134***
                          (0.02)  (0.02)  (0.02)
Age at Assessment (Model 4, columns 7-8)
  20-24                   -0.086***  -0.077***
                          (0.02)  (0.01)
  25-34                   -0.127***  -0.073***
                          (0.03)  (0.01)
  35-54                   -0.076**  -0.061***
                          (0.03)  (0.01)
  55-65                   -0.026  -0.079*
                          (0.08)  (0.03)
Female                    0.016  0.026***
                          (0.02)  (0.01)
Race (Model 4, columns 7-8)
  Asian/PI                0.052  0.019
                          (0.05)  (0.02)
  African-American        -0.001  0.002
                          (0.03)  (0.01)
  Latina/o                0.026  0.032*
                          (0.03)  (0.01)
  Other                   0  0.02
                          (0.04)  (0.02)
English not prim. lang.   0.051  0.040***
                          (0.03)  (0.01)
Perm. Res.                -0.008  0.033*
                          (0.04)  (0.02)
Other Visa                0.188***  0.101***
                          (0.05)  (0.02)
Cohort Fixed Effects      No  No  No  No  Yes  Yes  Yes  Yes
Constant                  0.240***  0.220***  0.275***  0.300***  0.291***  0.341***  0.290***  0.327***
                          (0.01)  (0.00)  (0.05)  (0.02)  (0.06)  (0.03)  (0.06)  (0.03)
R-squared                 0.001  0.013  0.002  0.018  0.004  0.027  0.025  0.041
N                         2328  12377  2328  12377  2328  12377  2328  12377

* p<0.05, ** p<0.01, *** p<0.001
Notes: The estimates shown are coefficients of the linear probability model, with standard errors in parentheses.
Students in College F could earn up to an additional 2 points based on responses on the Educational Background Questionnaire. Models: M1 includes Boost, Multiple Measure Points, and Test Score; M2 includes Boost, Multiple Measure Points, Test Score, and Math Level; M3 includes Boost, Multiple Measure Points, Test Score, Math Level, and Cohort Fixed Effects; M4 includes Boost, Multiple Measure Points, Test Score, Math Level, Cohort Fixed Effects, and demographic controls. Around restricts sample to students 5 points above the cutoff; Entire includes all students within the math level. Reference groups: Students in transfer-level math, students 18-20 years old; Race=white.

Table A2. Regression results: Whether students completed 30 degree-applicable credits, College J

                          Model 1                Model 2                Model 3                Model 4
                          (1) Around  (2) Entire  (3) Around  (4) Entire  (5) Around  (6) Entire  (7) Around  (8) Entire
MM Boost                  -0.006  -0.019  -0.016  -0.018  -0.014  -0.018  -0.015  -0.018
                          (0.02)  (0.02)  (0.02)  (0.02)  (0.02)  (0.02)  (0.02)  (0.02)
NC (Imp. Math)            -0.043*  -0.050***  -0.040*  -0.044***  -0.040*  -0.043***  -0.042*  -0.046***
                          (0.02)  (0.01)  (0.02)  (0.01)  (0.02)  (0.01)  (0.02)  (0.01)
MM Boost*NC               -0.022  -0.013  -0.017  -0.015  -0.018  -0.016  -0.015  -0.015
                          (0.03)  (0.02)  (0.03)  (0.02)  (0.03)  (0.02)  (0.03)  (0.02)
MM Points                 0.039***  0.048***  0.034***  0.038***  0.033***  0.037***  0.032***  0.037***
                          (0.01)  (0.01)  (0.01)  (0.01)  (0.01)  (0.01)  (0.01)  (0.01)
Test Score (z)            0.034**  0.031***  -0.004  0.01  0.001  0.011*  -0.001  0.008
                          (0.01)  (0.00)  (0.01)  (0.01)  (0.01)  (0.01)  (0.01)  (0.01)
Placed Math Level (Models 2-4, columns 3-8)
  1 Level Below           -0.330*  -0.07  -0.333*  -0.074  -0.343*  -0.07
                          (0.16)  (0.07)  (0.16)  (0.07)  (0.16)  (0.07)
  2 Levels Below          -0.429**  -0.144*  -0.428**  -0.145*  -0.433**  (0.13)
                          (0.16)  (0.07)  (0.16)  (0.07)  (0.16)  (0.07)
  3 Levels Below          -0.441**  -0.166*  -0.438**  -0.168*  -0.442**  -0.156*
                          (0.16)  (0.07)  (0.16)  (0.07)  (0.16)  (0.07)
  4 Levels Below          -0.499**  -0.211**  -0.493**  -0.210**  -0.494**  -0.196**
                          (0.16)  (0.07)  (0.16)  (0.07)  (0.16)  (0.07)
Age at Assessment (Model 4, columns 7-8)
  20-24                   -0.068***  -0.058***
                          (0.02)  (0.01)
  25-34                   -0.039*  -0.024*
                          (0.02)  (0.01)
  35-54                   0.022  0.01
                          (0.02)  (0.01)
  55-65                   -0.063  -0.039
                          (0.06)  (0.04)
Female                    0.011  0.028**
                          (0.01)  (0.01)
Race (Model 4, columns 7-8)
  Asian/PI                -0.201  0.075
                          (0.11)  (0.06)
  African-American        -0.091  0.016
                          (0.09)  (0.05)
  Latina/o                -0.081  0.024
                          (0.10)  (0.05)
  Other                   -0.016  0.075
                          (0.10)  (0.05)
English not prim. lang.   0.063*  0.009
                          (0.03)  (0.02)
Perm. Res.                0.136***  0.184***
                          (0.03)  (0.02)
Other Visa                -0.062  0.070*
                          (0.05)  (0.03)
Cohort Fixed Effects      No  No  No  No  Yes  Yes  Yes  Yes
Constant                  0.116***  0.100***  0.570***  0.291***  0.586***  0.310***  0.682***  0.267**
                          (0.02)  (0.01)  (0.16)  (0.07)  (0.16)  (0.07)  (0.18)  (0.09)
R-squared                 0.009  0.022  0.021  0.032  0.022  0.032  0.042  0.052
N                         3416  7782  3416  7782  3416  7782  3416  7782

* p<0.05, ** p<0.01, *** p<0.001
Notes: The estimates shown are coefficients of the linear probability model, with standard errors in parentheses.
Students in College J could earn up to an additional 5 points based on responses on the Educational Background Questionnaire. Models: M1 includes Boost, Multiple Measure Points, and Test Score; M2 includes Boost, Multiple Measure Points, Test Score, and Math Level; M3 includes Boost, Multiple Measure Points, Test Score, Math Level, and Cohort Fixed Effects; M4 includes Boost, Multiple Measure Points, Test Score, Math Level, Cohort Fixed Effects, and demographic controls. Around restricts sample to students 5 points above the cutoff; Entire includes all students within the math level. Reference groups: Students in transfer-level math, students 18-20 years old; Race=white.

EXTENDING TIME IN DEVELOPMENTAL MATH 127

How Extending Time in Developmental Math Impacts Student Persistence and Success: Evidence from a Regression Discontinuity in Community Colleges

Federick J. Ngo and Holly Kosiewicz

Improving the outcomes of students in developmental or remedial math remains a puzzle in higher education. Nationally, about 60 percent of incoming community college students are deemed academically under-prepared for college-level math and referred to take remedial coursework (Bailey, Jeong, & Cho, 2010). Of these students, only about 40 percent persist and complete the gatekeeper college-level math courses typically required to earn postsecondary credentials. The lower students' initial level of math remediation, the less likely they are to earn their desired credentials or transfer to four-year institutions (Bahr, 2012; Fong, Melguizo, & Prather, 2015). While some of these outcomes can be explained by students' academic preparation, they also may be related to the amount of time students spend in remediation (Edgecombe, 2011), to the additional costs of taking remedial courses (Melguizo, Hagedorn, & Cypers, 2008), or to the quality of instruction in these courses (Grubb, 1999).
The puzzle is that the very intervention aimed at preparing students to be successful in gatekeeper college-level courses may at the same time be an obstacle and deterrent to their persistence in college. Concerns with low persistence and completion rates have motivated proponents of reform to reconsider the delivery of developmental math (Burdman, 2012; Rutschow & Schneider, 2011). Moving away from the traditional sequence of multiple semester-long courses taught in lecture-based formats, practitioners and policy-makers have explored alternative models of delivery that accelerate student progress, contextualize curriculum and instruction, or provide additional supports to students in developmental courses (Rutschow & Schneider, 2011). But in an examination of math course offerings in a set of urban community colleges in California, Kosiewicz, Ngo, and Fong (2016) found that a far more prevalent model of delivery was to extend time in remediation, essentially slowing down the delivery of math content. Specifically, some of the colleges implemented placement policies that assigned students to either an extended two-semester 1 or a traditional one-semester elementary algebra course, both of which serve as prerequisites for college-level math. Both types of courses start with roughly the same content from day one, but the extended algebra courses move at a slower pace and force students to make two enrollment decisions: one to enroll in the first half of the course, and another to enroll in the second. Lengthening the amount of time in math in this way is thought to be an intervention that improves academic achievement. The underlying logic is that lower-skilled students may need more time and instruction to master necessary algebraic concepts (Aronson, Zimmerman, & Carlos, 1999).
However, while there is some evidence that increasing the amount of instructional time in algebra benefits middle and high school students (Cortes, Goodman, & Nomi, 2015; Taylor, 2014), it is unclear whether this practice is beneficial to community college students, for whom the additional costs in terms of time and money may outweigh the benefits of remediation. Does extending time in algebra by a semester help community college students persist and succeed in developmental math and college? We investigate this research question using administrative data from four large urban community colleges in California. The nature of math course placement policies within these colleges provides the opportunity to use a regression discontinuity (RD) design to estimate the impact of assigning students to extended math courses relative to single-semester courses on student achievement outcomes. While other researchers have used RD designs in the setting of developmental math to identify the effects of placement in remediation, they have predominantly examined placement into disparate math courses (e.g., the effects of placement in elementary versus intermediate algebra on student outcomes). It is thus difficult to disentangle whether differences in outcomes are attributable to differences in time spent in remediation or to the academic preparation within remedial courses. This study is unique because we are able to focus on the effects of requiring students to spend additional time in math remediation. Students around the placement cutoffs are considered statistically identical, but one group must take two semesters of algebra instead of one. This paper proceeds as follows. We first review the literature on increasing time in algebra, drawing largely from studies of middle and high school settings.

1 Here we use extended elementary algebra synonymously with two-semester elementary algebra.
We then introduce the literature on student persistence decisions in the community college setting and highlight the ways it interacts with developmental math reforms. We next describe the data and the methodological approach, a regression discontinuity design, that we used to estimate the effects of enrollment in the extended algebra sequence, focusing on four California community colleges that assign students who earn lower placement scores to an extended two-semester elementary algebra course instead of a typical one-semester course. Results from this study suggest that extending the amount of time in algebra and adding the need for an extra persistence decision is more harmful than beneficial for students at the margin of the cutoff. These students are much less likely to enroll in and complete gatekeeper courses and persist towards credential attainment. We conclude with a discussion of how developmental math reforms can increase persistence and success for community college students.

Does Increasing Time in Math Help?

Increasing instructional time is not a new idea, as several states have implemented reforms to either increase instructional minutes in classes or extend the school year (Aronson et al., 1999; Patall, Cooper, & Allen, 2010). At the core of these initiatives is the notion that allocating more time to instruction will increase the amount of time students can actively participate in classroom exercises and learn course material. With additional time, teachers may feel less hurried and have more time to cover material in more depth (Aronson et al., 1999; Patall et al., 2010). Most of the evidence on the advantages of increasing time in math stems from middle and high school settings and focuses on algebra, a gatekeeper course that can be a significant barrier to overall success in school (Cortes et al., 2015; Stein, Kauffman, Sherman, & Hillen, 2011).
This evidence suggests that increasing time in math is beneficial for middle and high school students and improves student achievement. Taylor (2014) examined a policy in Miami-Dade County schools and found that increasing math instruction by requiring that students concurrently take two courses instead of one had positive effects on the math achievement of middle school students. Similarly, Cortes et al. (2015) studied the Chicago Public School system, which implemented an algebra policy that required students scoring below the national median on an 8th grade math test to take two periods of freshman algebra instead of the typical one-period course. Students affected by this "double-dose" policy received approximately 45 minutes of lecture-based instruction in algebra alongside non-affected students, plus an additional 45 minutes of non-course-based instruction in which they worked in small groups, engaged with course-based materials absent from textbooks, and focused more on concepts they had difficulty mastering (Nomi & Allensworth, 2009). Evaluations of this double dose of algebra demonstrate that it largely generated positive effects on student achievement. Employing a regression-discontinuity framework, these researchers found that the double-dose algebra policy increased math test scores, closed the black-white achievement gap on the math portion of the ACT by fifteen percent, and improved high school graduation and college enrollment rates (Cortes et al., 2015). However, because the double-dose algebra policy not only doubled instructional time but also provided professional development training and greater instructional flexibility to teachers, Cortes and colleagues acknowledge that it is difficult to disentangle which mechanisms improved academic performance.
The claim that extending instructional time alone boosted achievement is further weakened by a body of literature that consistently reveals little or no correlation between allocated time and achievement (Aronson et al., 1999).

Investment and Persistence in College Math Remediation

Whether the positive benefits produced by doubling the dose of algebra in middle and high school actually translate into similar benefits at the college level is difficult to determine, mainly because college students have more autonomy than high school students in making educational investment decisions. Postsecondary students can choose to enroll or not to enroll in the courses that are offered to help them succeed. From an economic perspective, these decisions can be considered investments in human capital (Becker, 1964), and involve a host of factors such as intended course of study and career goals, the likelihood of completion, total costs and debt value, and expected lifetime earnings (Oreopoulos & Petronijevic, 2013). More specifically, Paulsen (2001) characterizes the choice to enroll in a course as a persistence decision that is related to the expected return on human capital investment. Therefore, when examining college student choices, it is important to consider not only achievement outcomes in higher education such as credentials obtained or subsequent earnings, but also measures of persistence from semester to semester, which are also indicative of students' human capital investments. A potential problem with the practice of extending math from one semester to two, which we note is different from the double-dose algebra policies described above, is that the additional time requirement may influence students' persistence decisions.
Rather than doubling the amount of time within one school day or semester, this practice of extending courses inherently increases the number of points at which a student can exit the developmental math sequence. The first human capital investment decision is to enroll in the two-semester course sequence knowing that the road will be longer and the costs higher. Students face a second persistence decision after completing the first course in the two-course sequence, where they must again consider the costs of an additional semester of math remediation. Evidence shows that community college students may be sensitive to policies that lengthen the amount of time they need to spend to reach their academic goals. In fact, students who are assigned to longer developmental course sequences have the greatest difficulty obtaining an associate's degree or meeting the requirements for upward transfer (Bailey et al., 2010; Crisp & Delgado, 2014; Fong et al., 2015). While this may reflect rational decisions based on economic choice models, it is important to note that student persistence can also be understood through sociological and psychological lenses (Melguizo, 2011). Sociological theories would highlight institutional practices and uncover the roles of academic, social, and cultural capital in explaining patterns of student achievement (Bourdieu, 1986). For example, it may be that students in developmental math courses systematically fail to receive the high-quality instruction and support services necessary to keep them committed to pursuing their goals, and that academic environments are constructed in ways that reproduce existing inequalities. Psychological theories might suggest that persistence is heavily influenced by non-cognitive abilities such as a student's level of motivation, sense of self-efficacy, and preference for long-term goals.
The concept of self-efficacy, for example, would suggest that people choose actions that they determine to be consistent with their perceived abilities (Bandura, 1995). It is possible that placement exam results could shape students' perceptions of their abilities and thus influence college persistence decisions (Deil-Amen & Tevis, 2010). Drawing from these frameworks, theories of college student persistence have underscored the important roles of such factors as social and academic integration (Tinto, 1993), support systems (e.g., parents and financial aid) (Crisp & Nora, 2009), high quality instruction (Pascarella, Salisbury, & Blaich, 2011), and validating interactions with faculty (Barnett, 2011; Rendón, 1994) in influencing college student persistence. While these sociological and psychological theories provide important frameworks for understanding college student persistence, the practice of extending math remediation over two semesters inherently increases costs and thus can be characterized as a human capital investment decision (Paulsen, 2001). We therefore frame our study using human capital theory, which suggests that students fail to persist because the direct costs (i.e., tuition, books, fees) and the indirect costs (i.e., opportunity costs) of taking an additional math course far outweigh the private returns that result from investing in that course, or because the desired goal feels too far off and unattainable and therefore not worth the investment (Manski & Wise, 1983; Paulsen, 2001). A policy that lengthens time in developmental math may thus deter students from persisting and succeeding in college, even if it promises to build student skill and knowledge. As we will describe next, we can specifically test how increasing the cost of remediation in terms of time and money affects student persistence in community colleges.
Our study contributes empirical evidence to the literature by examining the effects of extending developmental math from one to two semesters. While other empirical studies have evaluated alternative models of delivery such as acceleration, student success courses, learning communities, and supplemental instruction, our study examines the impact of a far more prevalent model of delivery in some California community colleges, which may also exist in other community colleges across the country. Although extending time in developmental math may provide academic benefits to students, a human capital framework and theories of college student persistence posit that the additional time students must spend in math remediation may also deter or discourage students from persisting towards their academic goals. Therefore, determining the achievement and persistence impacts of extending time in math remediation can provide important direction for community colleges and their delivery of developmental math.

Setting

The setting for the study is four large urban community colleges in a large metropolitan area of California. About one-quarter of all community college students in the U.S. are enrolled in California community colleges, many of which are located in urban centers (Foundation for Community Colleges, n.d.). Being open-access institutions, these four community colleges serve a widely diverse body of students, with more than a quarter of students over 35 years of age, and over 40 percent indicating that their native language is not English. Close to 90 percent of students report having completed a high school level education. 2 This student population is different from the national community college student population since about two-thirds of all students in these California colleges identify as African-American or Latina/o. In contrast, the majority of students who enter a community college in the U.S.
are White, and just over one-third identify as African-American or Latino (NCES, 2014).

2 Source: California Community College Chancellor's Office DataMart (http://datamart.cccco.edu/datamart.aspx)

Placement Into and Delivery of Developmental Math

Students in these four colleges with the goal of earning an associate's degree or transferring to a four-year university must take placement tests in math and English to determine the courses in which they should enroll. These courses are often of a remedial or developmental nature and are designed to prepare students to be successful in college-level courses. The developmental math sequence can include as many as four or five courses, and typically follows the progression of arithmetic (AR), pre-algebra (PA), elementary algebra (EA), and intermediate algebra (IA). Students must pass these latter two courses to be considered eligible for an associate's degree 3 or to transfer 4 to the California State University or University of California four-year systems, respectively. Due to the decentralized nature of California community college governance, colleges have considerable discretion in determining how students are placed into the developmental math sequence and how the sequence is taught. This includes autonomy in selecting placement instruments 5, setting cut scores, and choosing additional measures to incorporate into placement decisions (e.g., high school GPA or prior math courses), as well as flexibility in designing the courses themselves (Melguizo, Kosiewicz, Prather, & Bos, 2014; Perry, Bahr, Rosin, & Woodward, 2010). Three of the four colleges in the study used the ACCUPLACER placement test and one used the COMPASS to place students in courses. These instruments, developed by the College Board and ACT Inc., respectively, are among the most commonly-used placement tests in the country (Fields & Parsad, 2012).
They are computer-based and rely on an adaptive algorithm to identify student skill in arithmetic, algebra, and college-level math (Mattern & Packman, 2009).

3 In 2009, California increased the minimum math requirement for an associate's degree from elementary algebra to intermediate algebra.
4 While the State of California determines the academic requirements for the associate's degree, two- and four-year institutions establish local articulation agreements that determine which math courses transfer and which do not.
5 California is moving towards a common placement system for community colleges (see Burdman, 2015).

Extended Elementary Algebra Courses

Despite some variation in choice of placement test and associated cutoffs, the four colleges that we focus on in the study deliver developmental math in the same way. They have veered from the typical developmental math sequence described above by lengthening the amount of time some students spend in EA from one to two semesters and assigning students to these courses based on placement test results. Figure 1 portrays this extended course sequence and Table 1 shows the cut scores used to determine whether students are assigned to one or two semesters of EA based on placement test results. According to our examination of descriptions in course schedules from each college, as well as those available from math department websites, students in either type of course are expected to cover roughly the same content from day one. Both types of algebra courses typically begin with a review of pre-algebra topics (e.g., algebraic expressions, exponents), move on to linear equations, and conclude with quadratic functions. The only difference is that students in extended EA courses cover these topics over the duration of two semesters instead of one.

[Insert Figure 1.
Math Sequence with Extended Algebra]

[Insert Table 1: Placement Policies]

Although Table 1 shows that these colleges use different cut scores and placement tests to sort students into the regular versus extended EA course, a common characteristic of these placement policies is that assignment to either course can be determined by a single point. For example, students in College B who obtained a placement score of 45 on ACCUPLACER's Elementary Algebra subtest would have been placed into a two-semester EA course. Had those students obtained an additional point on their placement score, they would have been placed into the typical one-semester EA course. In the next section, we describe the data used for the study and argue that the way in which these placement policies are implemented creates a natural experiment that can be exploited to estimate the causal effects of placement into a two- versus one-semester EA course on academic achievement.

Data

We utilized student-level administrative records that link students' demographic data, assessment records, course enrollment, and academic outcomes. The sample from the four colleges included student cohorts assessed for math for the first time between the spring 2005 and the spring 2012 semesters. We tracked enrollment and performance outcomes for the 2005-2008 cohorts through spring 2012, and the 2008-2012 cohorts through fall 2013. 6 This study window was chosen to include only those cohorts that were subject to placement in an extended EA course and those where no other placement policy changes occurred. These are the 2009-2012 cohorts in College A, the 2005-2011 cohorts in Colleges B and C, and all cohorts in College D.
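The single-point placement rule described in the previous section can be sketched as a small function. This is an illustrative sketch only: the cut score of 45 comes from the College B ACCUPLACER example in the text, and the function and constant names are our own, not the colleges' systems.

```python
# Assumption: College B's Elementary Algebra cutoff, per the example in the
# text (45 or below -> extended two-semester EA; 46 or above -> one-semester EA).
EXTENDED_EA_MAX_SCORE = 45

def assign_ea_course(score: int) -> str:
    """Return the elementary algebra placement implied by a placement score."""
    if score <= EXTENDED_EA_MAX_SCORE:
        return "extended (two-semester) EA"
    return "one-semester EA"

# A single point separates the two placements at the cutoff.
placements = {score: assign_ea_course(score) for score in (44, 45, 46)}
```

Because assignment flips on one point, students scoring 45 and 46 are plausibly very similar, which is exactly the variation the regression discontinuity design exploits.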
To create a sample that most closely resembled typical degree-seeking community college students, we excluded students who were concurrently enrolled in high school, over the age of 65, or had already received an associate's or bachelor's degree at the time of testing. Table 2 presents summary statistics for the students placed into extended versus one-semester EA courses in the four colleges. The pooled sample from Colleges A-D includes 12,805 students, of which 6,228 (48.6 percent) were placed in an extended course sequence based on their test score. Because we used a regression discontinuity framework for this analysis (described further below), we show statistics for the full sample alongside those for a restricted sample of students who scored within a one, half, third, and quarter standard deviation (SD) test score bandwidth from the placement cutoff in each college. While there are some differences in group means for demographics and outcomes between those placed in two- and one-semester algebra, the samples are more similar within narrower bandwidths.

6 The last cohort in the sample consists of those assessed in spring 2012. We can observe their enrollment outcomes for spring and fall of 2012, and spring and fall of 2013. We also included any summer or winter quarter enrollments.

Outcomes

We also present sample mean outcomes at the bottom of Table 2. The short-term outcomes of interest are whether the student: 1) enrolled in any math course within a year after taking the math assessment, 2) passed EA (i.e., completed the EA sequence), 3) attempted 7 the next course in the sequence, IA, and 4) passed IA. We note that we imputed zeroes for these short-term outcomes. If, for example, a student attempted and passed EA but decided not to enroll in IA, then they were assigned a value of zero for both attempting and passing IA.
We reasoned that the goal of offering developmental math courses and assigning students to them via placement policies is to prepare students to pass these upper-level math courses, and if students do not progress, that is largely a consequence of the initial placement decision. Also, while students who earn a C in a developmental math course are allowed to enroll in the next course, we also examined the "B or better" outcome because it is a less contentious evaluative outcome. It is more likely that a student just below the cutoff and assigned to extended EA could have earned a C in one-semester EA, but it presumably would be more difficult for that same student to earn a B. In addition, research has shown that placement tests are more accurate at identifying students who would earn a B or better versus some lower criterion (Scott-Clayton et al., 2014). Thus our estimate using the B or better criterion may be more related to course placement than to placement test accuracy. The data also enabled us to examine longer-term outcomes of interest in the community college setting. We calculated the total units completed within one year of the assessment, as well as total degree-applicable units completed through spring 2012 for the 2005-2008 assessment cohorts, and through fall 2013 for the 2009-2012 cohorts. This means that we could track student degree progress for up to seven years for the 2005 cohort, and one and a half years for the 2012 cohort. While we could not observe whether or not students actually earned a degree or transferred to a four-year college, the number of degree-applicable units completed served as an indicator of progress towards these goals.

7 An attempt is defined as an enrollment past the no-penalty drop date, after which the student would receive a mark on his/her transcript. The no-penalty drop date is usually about 1-2 weeks after classes begin.
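The outcome coding described above, imputing zeroes rather than leaving missing values for students who never attempt the next course, can be sketched as follows. The records and field names here are hypothetical, purely to illustrate the rule.

```python
# Hypothetical student records. Student 3 passed EA but never enrolled in IA,
# so neither IA field was ever observed for them.
students = [
    {"id": 1, "attempted_IA": True, "passed_IA": True},
    {"id": 2, "attempted_IA": True, "passed_IA": False},
    {"id": 3},  # never attempted intermediate algebra
]

for s in students:
    # Impute zeroes for unobserved outcomes instead of leaving them missing:
    # not progressing is treated as a consequence of the initial placement.
    s["attempted_IA"] = int(s.get("attempted_IA", False))
    s["passed_IA"] = int(s.get("passed_IA", False))
```

After this pass, every student has a 0/1 value for both outcomes, so sample means are computed over the full placed cohort rather than only over enrollees.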
[Insert Table 2 here: Descriptive statistics of sample and outcomes]

Overall, the descriptive results show low rates of persistence and completion for students placed into two-semester or one-semester algebra. Only 26 percent go on to attempt IA, and only about 18 percent complete the course with a C or better (the minimum both for an associate's degree and as a prerequisite for some STEM courses). Only nine percent earn a B or better. When disaggregating these results by initial placement, we observe that 19 percent of students assigned to extended EA ever attempted IA and 7 percent passed it with a B or better, whereas about 33 and 11 percent of students placed directly in one-semester EA achieved these outcomes. Conditioning student persistence outcomes on enrollment (i.e., without imputing zeroes for unobserved outcomes) presents a different but equally important picture of student progression (Fong et al., 2015). In Figure 2, we present progression rates conditional on attempting each course. Doing so enables us to examine persistence decisions (attempting courses) and completion rates (passing courses) for students who chose to progress through the math sequence. We see that at the first persistence decision point, students placed in extended EA courses are less likely than students placed in one-semester EA to attempt the first course (71 versus 89 percent). Yet surprisingly, after one semester of coursework these students have a lower rate of attrition (19 percent) than one-semester EA students (27 percent). However, the third persistence decision results in another decrease in cohort size. Of those who pass the second course in the extended EA sequence, 23 percent choose not to enroll in IA, resulting in just 13 percent of the original sample completing the developmental math sequence, compared to 30 percent of students initially placed in one-semester EA.
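The distinction between unconditional and enrollment-conditional rates can be made concrete with the approximate pooled figures reported above (26 percent of the full sample attempt IA, 18 percent pass with a C or better); the arithmetic below is illustrative, not an additional result.

```python
# Shares of the full assessed sample (zeroes imputed for non-attempters).
attempted_ia = 0.26   # attempted intermediate algebra
passed_ia = 0.18      # passed IA with a C or better

# Pass rate conditional on attempting the course: much higher than the
# unconditional 18 percent, since it excludes students who never enrolled.
conditional_pass_rate = passed_ia / attempted_ia  # roughly 0.69
```

This is why the two framings tell different stories: most of the loss happens at the enrollment decisions, not in the courses themselves.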
While this is suggestive of a problem with extended EA, these progression and completion rates are conditional on attempting the course and may reflect the role of unobserved factors such as student motivation and effort. We next describe the empirical strategy we used to account for these unobservable factors and to isolate the impact of extended EA courses on college student persistence and completion.

[Insert Figure 2. Conditioning outcomes on enrollment]

Empirical Strategy

Regression Discontinuity Design

Each college in the study utilizes a placement policy where a cutoff score differentiates assignment into two-semester versus one-semester algebra. We were therefore able to use a regression discontinuity (RD) approach and exploit the exogenous variation in student outcomes resulting from a sharp cutoff in initial course assignment. This enabled us to estimate the causal effect of placement into two-semester EA relative to one-semester EA on student outcomes. Because the RD approach assumes baseline equivalency for students at the margin of a cutoff score, differences in outcomes between groups can be attributed directly to assignment to extended EA courses. A basic model for the RD design for each campus is:

$$Y_{it} = \beta_1 \mathrm{Assign}_{it} + \beta_2 \mathrm{Test}_{it} + \beta_3 (\mathrm{Assign} \times \mathrm{Test})_{it} + \boldsymbol{\gamma}' X_{it} + \varphi_t + \varepsilon_{it} \quad (1)$$

where i indexes students and t indicates the assessment cohort. Y_it are the outcomes of interest, and β1 is our coefficient of interest, capturing the estimated treatment effect of assignment to extended EA at each campus. β2 describes the relationship between the test score running variable and student outcomes, and β3 describes the two-way interaction between assignment to extended EA and test scores.8
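To make equation (1) concrete, the sketch below estimates the discontinuity by OLS on simulated data. It is an illustrative sketch only, not the study's code: the variable names, the simulated scores, and the assumed -0.10 jump are invented for the example, and the covariates and cohort dummies are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def sharp_rd(score, outcome, cutoff=0.0):
    """Estimate the jump in `outcome` at `cutoff` by OLS on
    [1, assign, score, assign*score], mirroring equation (1)
    without the covariates X or the cohort dummies."""
    assign = (score < cutoff).astype(float)  # below the cutoff -> extended EA
    X = np.column_stack([np.ones_like(score), assign, score, assign * score])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    return beta[1]  # beta_1: effect of assignment to extended EA

# Simulated data with a true -0.10 discontinuity at the cutoff.
score = rng.uniform(-1, 1, 5000)
outcome = 0.4 + 0.2 * score - 0.10 * (score < 0) + rng.normal(0, 0.1, 5000)
effect = sharp_rd(score, outcome)  # should be close to -0.10
```

With real data, the covariate vector X and the cohort dummies would simply enter as additional columns of the design matrix.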
The coefficients within γ′ capture the influence of the following student-level demographic variables on the outcomes of interest: age at assessment, sex, race/ethnicity, primary language, and residence/visa status. These were gathered from student records. Although covariates are not necessary in an RD design, since discontinuities in outcomes should only be correlated with treatment status, including them can increase precision because these particular variables may explain some of the variation in persistence outcomes (Fong et al., 2015). The model also includes cohort dummies φ_t to account for variation by semester.

Pooled model. While the results of individual campus estimation provide an understanding of the effect of extended EA at each campus, they have limited external validity and may be more reflective of campus characteristics than of the extended EA treatment. We therefore estimated the remaining models with a pooled four-campus sample. This increases statistical power and external validity, and inclusion of covariates can account for some of the variation in outcomes that is related to differences in student populations across campuses.

A concern with pooling the four campuses in this way, however, is the threat to the internal validity of the β1 coefficient. There may be unobservable differences across campuses that are correlated with student outcomes. For example, if College A had more experienced faculty than College B, or had other unobserved characteristics that influenced student outcomes, then this would potentially bias a pooled RD treatment effect estimate. In addition, each campus used a different test and different cut scores for placement in extended EA, meaning that each campus cut score may be at a different point along the distribution of student ability.
8 Since RD estimation hinges on correct specification of functional form, we also tested higher-order polynomials of the running variable to account for the possibility of non-linearity (Lee & Lemieux, 2010).

To address these concerns, we normalized each student’s test score around the mean test score for each college and assessment cohort and centered this around the standardized cutoff value for each college. Importantly, we also included fixed effects by campus, μ_j, that enable us to account for systematic differences between colleges, such as type of placement test and choice of cutoff, and to identify the RD estimate within each college.9 Similar approaches have been used in other RD studies where treatments are assigned at multiple cutoffs. For example, van der Klaauw (2002) examined the impact of receiving financial aid on college enrollment in a context where aid was awarded in different doses at different thresholds of student ability. He pooled observations from different samples along the ability distribution and included sample-specific intercepts to obtain a weighted average of the treatment effect of financial aid. Taking a similar approach, the pooled RD model across campuses j that we use is:

$$Y_{ijt} = \beta_1 \mathrm{Assign}_{ijt} + \beta_2 \mathrm{Test}_{ijt} + \beta_3 (\mathrm{Assign} \times \mathrm{Test})_{ijt} + \boldsymbol{\gamma}' X_{ijt} + \mu_j + \varphi_t + \varepsilon_{ijt} \quad (2)$$

Estimation of this model gives the average treatment effect of extended EA across the four-campus sample. Including covariates and cohort dummies can also account for variation related to enrollment trends across the district and improve the precision of the RD coefficient.

Issues of Non-Compliance

The estimate for β1 obtained from equations (1) and (2) above would be unbiased if there were perfect compliance with placement results, that is, if all students who were assigned to extended EA actually enrolled in it.
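Before turning to compliance, the within-college normalization and cutoff-centering described above can be sketched as follows. The helper function and its inputs are hypothetical, not the study's code, and the study additionally normalizes within assessment cohort, which is omitted here for brevity.

```python
import numpy as np

def center_scores(scores, college_ids, cutoffs):
    """Standardize raw placement scores within each college, then shift
    them so that 0 marks that college's (standardized) cutoff. After
    this, scores are comparable across campuses that use different
    tests and different cut points."""
    scores = np.asarray(scores, dtype=float)
    college_ids = np.asarray(college_ids)
    centered = np.empty_like(scores)
    for c, cut in cutoffs.items():
        m = college_ids == c
        mu, sd = scores[m].mean(), scores[m].std()
        # Standardize, then subtract the standardized cutoff.
        centered[m] = (scores[m] - mu) / sd - (cut - mu) / sd
    return centered
```

In the pooled model, the centered score then serves as the running variable, with campus fixed effects absorbing remaining between-college differences.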
However, we did observe a group of non-compliers: students who simply did not enroll and, among students who enrolled, those who did not comply with their placement decision. Table 3 presents overall enrollment and compliance rates for the full sample.10 About 85 percent of students enrolled in any course in any department and 62 percent enrolled in a math course. It is important to note that students assigned to extended EA in these colleges were about three percentage points less likely to enroll in any course (p<.001), and about nine percentage points (p<.001) less likely to enroll in a math course after the assessment, compared with students assigned to a typical one-semester course. This is consistent with findings indicating that students assigned to the lower levels of the developmental math sequence are less likely to “show up” after the placement test (Bailey et al., 2010; Fong et al., 2015).

[Insert Table 3 here: Enrollment and Compliance]

[Insert Figure 3 here: Compliance]

Figure 3 also reveals imperfect compliance around the cutoff, and we attribute this mainly to a California community college policy that allows students to challenge their placement results. Doing so generally involves providing evidence of prior math achievement and obtaining permission from an instructor to enroll in an alternative course. Overall, we observe that about 81 percent of students placed in one- or two-semester algebra who subsequently enrolled complied with their placement.

9 We reason that differences in placement tests and cutoffs may not be a major concern because quantitative studies have found relatively low correlations between these placement tests and student outcomes (Scott-Clayton et al., 2014). Furthermore, qualitative research has shown that test choice and placement cutoffs are typically chosen and set fairly arbitrarily (Melguizo, Kosiewicz, Prather, & Bos, 2014).
However, the disaggregated rates differ between groups. Among students assigned to extended EA, 71 percent complied with their placement, and about 20 percent enrolled in a higher-level course (e.g., one-semester EA, IA, or transfer-level math). Among students assigned to one-semester EA, nearly 90 percent complied with their placement and four percent enrolled in a higher-level course.

These differences in compliance pose a threat to the internal validity of RD estimates. Compliers may differ from non-compliers along observable characteristics such as race or language status, or unobservable characteristics such as effort or motivation. If these variables predict non-compliance, then they may also be correlated with the outcomes of interest and result in biased estimates of the effect of extended EA courses. We discuss a strategy to remedy this problem in the next section.

10 See Table 2 for compliance rates by college.

Fuzzy Regression Discontinuity Design

Instrumental variables (IV) estimation can be used in scenarios where there is imperfect compliance with assignment to treatment; in the RD setting this is commonly referred to as a “fuzzy” RD design (Murnane & Willett, 2010). Here, the endogenous variable is enrollment in the extended EA course (i.e., compliance with the placement rule), and the exogenous instrument is assignment to, or placement into, extended EA based on the placement cutoff. The key assumptions underlying the IV strategy are that the instrument, in this case assignment to extended EA, strongly predicts treatment compliance, and that assignment is correlated with the outcomes of interest only through compliance with the treatment assignment (Murnane & Willett, 2010). We outline the two-stage least squares estimation and then discuss tests of the instrument below.

In the first stage of the fuzzy RD, we predicted compliance with the treatment (i.e.
enrollment in extended EA) using the same predictors as the model above:

$$\mathrm{Enroll}_{ijt} = \delta_1 \mathrm{Assign}_{ijt} + \delta_2 \mathrm{Test}_{ijt} + \delta_3 (\mathrm{Assign} \times \mathrm{Test})_{ijt} + \boldsymbol{\gamma}' X_{ijt} + \mu_j + \varphi_t + \varepsilon_{ijt} \quad (3)$$

We used the estimated coefficients to obtain the predicted probabilities of compliance with the treatment. The predicted probability of compliance serves as the regressor of interest in the second-stage regression. Per the recommendation of Murnane and Willett (2010), we included the same baseline covariates from the first stage in the second-stage model shown here:

$$Y_{ijt} = \beta_1 \widehat{\mathrm{Enroll}}_{ijt} + \beta_2 \mathrm{Test}_{ijt} + \beta_3 (\mathrm{Assign} \times \mathrm{Test})_{ijt} + \boldsymbol{\gamma}' X_{ijt} + \mu_j + \varphi_t + \varepsilon_{ijt} \quad (4)$$

Here, β1 is the local average treatment effect (LATE) of complying with placement in an extended EA course on student success outcomes. In other words, β1 approximates the expected impact of being placed into extended EA on academic achievement. Before presenting the results of analyses using this IV estimation method in an RD setting, we discuss two checks we performed to test key assumptions underlying the validity of the RD approach.

Manipulation of the running variable. A requirement for obtaining internally valid estimates in the RD setting is that some running variable, here the test score, discontinuously assigns students to the treatment and control conditions based on an exogenously determined cutoff, and that students cannot manipulate their scores or treatment status (Murnane & Willett, 2010). One way this exogeneity assumption might be invalidated is through retesting, since students may be able to repeat tests in order to achieve a certain score.
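The two-stage procedure in equations (3) and (4) can be sketched in code before turning to the validity checks. This is a hedged illustration on simulated data: covariates and fixed effects are omitted, all names and parameters are invented, and the plug-in second stage yields correct point estimates but not valid 2SLS standard errors.

```python
import numpy as np

def fuzzy_rd(score, enrolled, outcome, cutoff=0.0):
    """Two-stage least squares for the fuzzy RD: assignment below the
    cutoff instruments actual enrollment in extended EA (eqs. 3-4),
    without covariates or fixed effects. Returns the LATE point estimate."""
    assign = (score < cutoff).astype(float)
    exog = np.column_stack([np.ones_like(score), score, assign * score])
    Z = np.column_stack([exog, assign])  # exogenous columns + instrument
    # First stage: predicted probability of complying (enrolling).
    d_hat = Z @ np.linalg.lstsq(Z, enrolled, rcond=None)[0]
    # Second stage: swap enrollment for its first-stage prediction.
    X2 = np.column_stack([exog, d_hat])
    return np.linalg.lstsq(X2, outcome, rcond=None)[0][-1]

# Simulated data: imperfect compliance, true enrollment effect of -0.15.
rng = np.random.default_rng(1)
n = 5000
score = rng.uniform(-1, 1, n)
assign = score < 0
enrolled = np.where(assign, rng.random(n) < 0.8, rng.random(n) < 0.1).astype(float)
outcome = 0.5 + 0.2 * score - 0.15 * enrolled + rng.normal(0, 0.1, n)
late = fuzzy_rd(score, enrolled, outcome)
```

A production analysis would compute proper 2SLS standard errors via a dedicated IV routine rather than reusing OLS ones from the second stage.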
While retesting may be a cause for concern in other testing contexts, students in the LUCCD typically are not allowed to retest until one year after the original assessment date, and cutoffs are not made public, so students would not know what score they would need to attain. Furthermore, both tests used for this analysis, the ACCUPLACER and COMPASS, are computer-adaptive placement tests that would be difficult to manipulate to attain scores just above or just below the cutoffs.11 We therefore include students who retested in our analyses.12 We also ran all models with retesters excluded and obtained results of very similar magnitude and statistical significance to those presented below.

Covariate balance and continuity. A second key assumption underlying the internal validity of the RD design concerns the continuity of covariates at the test score cutoff. Discontinuities in these covariates would bias the estimate of the treatment effect (Murnane & Willett, 2010). We visually examined this by plotting trends in covariates around the placement cutoffs using a local polynomial smoothing function (see Appendix).

11 If students were aware of the test score cutoff and manipulated their scores in order to surpass it, or alternatively, if colleges utilized assessment practices that resulted in non-smooth distributions of test scores, then this would threaten the validity of the RD estimates. We therefore conducted McCrary (2008) density tests to examine manipulation of the running variable around the cutoff in each college and provide these results and corresponding figures in the Appendix. There is no evidence of manipulation at any college cutoff for assignment to extended courses.

12 Re-testers constitute about eight percent of the sample.
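The McCrary (2008) density test just mentioned fits local linear regressions to a finely binned histogram of the running variable on each side of the cutoff. As a much cruder illustration of the same idea, one can compare raw counts in narrow windows around the cutoff; the function below is an invented stand-in, not the actual test.

```python
import numpy as np

def density_jump_z(score, cutoff=0.0, h=0.1):
    """Crude bunching check: under no manipulation and a locally flat
    density, the count just below the cutoff is Binomial(n, 0.5) among
    the n observations within +/- h of it, giving an approximate
    z-statistic. McCrary's (2008) test is the rigorous version."""
    below = np.sum((score >= cutoff - h) & (score < cutoff))
    above = np.sum((score >= cutoff) & (score < cutoff + h))
    n = below + above
    return (below - n / 2) / np.sqrt(n / 4)

# A smooth (evenly spaced) score distribution shows no jump...
smooth = np.linspace(-1, 1, 10001)
z_smooth = density_jump_z(smooth)
# ...while piling extra mass just above the cutoff produces a large |z|.
manipulated = np.concatenate([smooth, np.full(200, 0.05)])
z_manip = density_jump_z(manipulated)
```

Near-zero statistics are consistent with no bunching; large negative values here indicate excess mass just above the cutoff.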
Following the recommendations of Lee and Lemieux (2010), we also conducted a set of parallel RD analyses to estimate discontinuities in observable covariates (e.g., age at assessment, sex, race/ethnicity, primary language, and residence/visa status). The results, which are available in the Appendix, indicate that there are no covariate discontinuities around the placement cutoff.13 With these internal validity checks completed, we proceed with presentation of the main RD findings.

Findings

We first present graphical displays of discontinuities in select outcomes. In Figure 4, we plot the relationship between test score and these outcomes around the cutoff between extended and one-semester EA using locally weighted scatterplot smoothing (Lowess) curves.14 On the left of each cutoff are the mean outcomes of students assigned to extended EA, and on the right are the outcomes of students assigned to one-semester EA. We observe evidence of a discontinuity for each outcome at the cutoff, suggesting that assignment to extended EA resulted in lowered probabilities of achieving each outcome.

[Insert Figure 4: Outcomes]

The two-stage estimation of the fuzzy RD design complements these graphical analyses. The results of the first-stage regressions for each college are shown in Table 4. It is clear that assignment to extended EA is a strong predictor of actual enrollment in an extended EA course. The coefficient for the assignment indicator variable is .682 for the full sample (p<.001), and the estimates are similar for narrower bandwidths. Furthermore, the F-statistics reported in Table 4 indicate that we find evidence of a strong and relevant instrument for each bandwidth.

13 There is some indication of a discontinuity in the age variable. While this may pose a threat to the internal validity of the RD estimate, it is possible that the discontinuity was estimated by chance.
To test this, Lee and Lemieux (2010) suggest computing a set of Seemingly Unrelated Regressions, in which the errors are allowed to be correlated across the regression equations described above. We then conduct a joint test that the coefficients for treatment assignment (indicating any discontinuities at the cutoff) are zero. This joint test indicates that there is no discontinuity in the set of covariates at the cutoff. We therefore include all covariates in the final RD models to control for these observable differences and increase the precision of the treatment effect estimates.

14 The curves are drawn using a Lowess smoother with running means, which is a local linear regression model.

It is interesting to note that higher-scoring students, younger students, African-American students, and white students were less likely to comply with their placement assignments, while female students were more likely to comply. Individual results from the first-stage regressions for each campus are available in the Appendix.

[Insert Table 4 here: First stage]

Table 5 outlines the main results from the second-stage regressions. The dependent variables include the following main outcomes of interest: enrolling in any math course within one year of assessment, attempting and passing EA, attempting and passing IA, total units completed within a year of the assessment date, and total units completed through the study period. We present estimates for the entire range of students placed in two- and one-semester EA, along with the estimates obtained when restricting the sample to various bandwidth sizes. This is recommended for testing the sensitivity of the results to different bandwidths in the RD design (Lee & Lemieux, 2010).
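Operationally, this kind of bandwidth sensitivity check amounts to re-estimating the model on successively narrower windows around the (standardized) cutoff. A minimal sketch on simulated data, with invented names and an assumed true jump of -0.08:

```python
import numpy as np

def rd_estimate(score, outcome):
    """Sharp RD jump at 0 via OLS with a treatment-score interaction."""
    a = (score < 0).astype(float)
    X = np.column_stack([np.ones_like(score), a, score, a * score])
    return np.linalg.lstsq(X, outcome, rcond=None)[0][1]

def bandwidth_sensitivity(score, outcome, bandwidths=(1.0, 0.5, 0.33, 0.25)):
    """Re-estimate the discontinuity using only observations within
    +/- h of the cutoff (scores assumed standardized and centered at 0)."""
    return {h: rd_estimate(score[np.abs(score) <= h],
                           outcome[np.abs(score) <= h])
            for h in bandwidths}

# Simulated standardized scores with a true -0.08 jump below the cutoff.
rng = np.random.default_rng(3)
score = rng.uniform(-1, 1, 8000)
outcome = 0.3 + 0.1 * score - 0.08 * (score < 0) + rng.normal(0, 0.05, 8000)
estimates = bandwidth_sensitivity(score, outcome)
```

Stable estimates across the shrinking windows are the pattern a bandwidth sensitivity table is meant to reveal.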
We present bandwidths of 1.0, 0.5, 0.33, and 0.25 standard deviations (SD) above and below each cutoff, but most of our discussion of the results will focus on the fuzzy RD estimates within a bandwidth of 0.5 SD. We reason that a bandwidth of 0.5 SD above and below the cutoff increases statistical power relative to a bandwidth of 0.33 SD, and is more likely to maintain equality in expectation around the cutoff relative to a bandwidth of 1 SD. In addition, we calculated the optimal bandwidth for each outcome using the method outlined in Imbens and Kalyanaraman (2012), and found that these generally ranged from 0.3 to 0.6 SD, so we use 0.5 SD for consistency of interpretation. The full regression models include the covariates and fixed effects described earlier, but these estimated coefficients are not shown here for simplicity of presentation.15 Individual campus results are also available in the Appendix. While there is variation across campuses, suggesting extended EA may affect student persistence in nuanced ways on each campus, the overall pattern of results is similar to that described next.

15 The full results are available from the authors upon request.

Enrolling in Math

We previously documented in Table 3 that students assigned to extended EA were less likely to enroll in any math, regardless of level, after receiving their placement results, and this is in accordance with other findings showing that lower-ability students are less likely to enroll after initial assessment (Bailey et al., 2010). However, it is possible that students faced with the decision to enroll in the two-semester sequence are deterred by the cost of an additional semester of remediation. To investigate whether this may be attributable to the placement policy, we use the regression discontinuity setup in equation (2) to examine differences in enrollment decisions between students scoring just above and below the cutoff.
We use the “sharp” RD approach in equation (2) since the outcome of interest is initial enrollment and there is no second stage. The results are presented in Table 5 and indicate that assignment to extended EA did not affect the decision to enroll within any of the bandwidths shown.16 Thus, while enrollment rates are lower on average for the whole group, it appears that two-semester algebra students of similar ability near the one-semester cutoff did not have a different propensity to enroll compared to their one-semester counterparts. This corroborates findings from other researchers showing that assignment to remediation does not appear to discourage students from enrolling in math (Martorell, McFarlin, & Xue, 2015; Scott-Clayton & Rodriguez, 2015).

16 It does appear to do so in College A (see Appendix).

Attempting and Passing Gatekeeper Courses

We next examined completion of the EA course (either the second part of the two-semester sequence, or the one-semester course) and IA, the next course in the sequence, using the fuzzy regression discontinuity approach indicated by equations (3) and (4). In Table 5, the FRD estimates indicate that there is a significant negative effect of placement in extended courses on eventually completing the EA level. Students at the margin of the cutoff and placed in extended EA were about 8-10 percentage points less likely to eventually pass EA with a B or better. These discontinuities are demonstrated visually in Figure 4. As expected, point estimates were larger for the C or better criterion, with the likelihood of passing EA being about 20 percentage points lower for those placed in the extended courses.17
[Insert Table 5: Main FRD Results]

The RD results also indicate that students placed into a two-semester versus a one-semester EA course had a lower probability of attempting and passing IA, the next course in the sequence and a gatekeeper to transfer-level math and some introductory STEM courses. The estimates indicate that enrollment in extended EA reduced the likelihood of attempting IA by about 13 percentage points, of passing IA with a C or better by about 8 percentage points, and of passing with a B or better by about 5-6 percentage points, depending on bandwidth size.

College Persistence

Finally, the data enable us to examine the total number of units completed after taking the math assessment for each cohort. These estimates have a higher degree of variation across bandwidths, since a host of other unobservable student-level variables such as motivation or financial need may affect persistence decisions. Overall, the RD results indicate that there was no difference in total units completed within one year of the assessment between students assigned to extended EA and those assigned to one-semester EA at the margin of the cutoff. Examination of the completion of degree-applicable credit produced a slightly different result. It appears that students in extended EA completed fewer degree-applicable units than their counterparts in one-semester EA, ranging from about 3 to 8 units depending on the bandwidth. The estimates are not all statistically significant, but their similar magnitude and negative direction suggest that enrollment in the extended EA course has deleterious long-term effects on persistence and progress towards obtaining a college degree.

17 These results are available from the authors upon request.
Generalizability, Robustness, & Falsification Tests

The RD design produces causal estimates of the impact of extending time in math remediation, but it is important to emphasize that the estimated treatment effect only applies to students at the margin of the cutoff and does not extrapolate down to students with lower test scores. While our estimates show a significant negative effect for students with scores within narrow bandwidths around the cutoff, it is possible that these effects may be different for students even further below the cutoff. The extended algebra sequence, by providing additional time to master algebra skills and progress through content at a slower pace, may be more beneficial to students who obtained lower placement scores. However, Scott-Clayton et al. (2014) find relatively weak correlations between placement test scores and student outcomes, lending credence to the idea that RD estimates can be extrapolated further down the continuum of scores. Indeed, some of the estimates we obtained (e.g., for passing EA) are fairly robust across bandwidth sizes, suggesting that the results are internally valid and generalizable to a broader range of students around each cutoff.

Sensitivity & Robustness

We conducted several sensitivity analyses to check the robustness of the estimates discussed above and present these in Table 6. In column 1, we show the estimated effect from our preferred specification (0.5 SD, with a linear interaction of the test score and treatment variable, with covariates, and with campus and cohort fixed effects). In column 2 we show the model without any covariates and excluding cohort and campus fixed effects. The main concern with larger bandwidth sizes in the RD design is that the data are less likely to be best modeled by a purely linear specification.
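Concretely, the higher-order specifications expand the design matrix with polynomial terms of the running variable and their treatment interactions. The sketch below is illustrative only (simulated data, invented names); it fits linear and quadratic specifications side by side.

```python
import numpy as np

def rd_jump(score, outcome, degree=1):
    """Sharp RD jump at 0 allowing polynomial trends of `degree` on each
    side: columns are the intercept, the treatment indicator, and
    score**k plus its treatment interaction for k = 1..degree."""
    a = (score < 0).astype(float)
    cols = [np.ones_like(score), a]
    for k in range(1, degree + 1):
        cols += [score**k, a * score**k]
    return np.linalg.lstsq(np.column_stack(cols), outcome, rcond=None)[0][1]

# Simulated data with a curved trend and a true -0.08 jump at the cutoff.
rng = np.random.default_rng(4)
score = rng.uniform(-1, 1, 8000)
outcome = (0.3 + 0.1 * score + 0.2 * score**2
           - 0.08 * (score < 0) + rng.normal(0, 0.05, 8000))
linear = rd_jump(score, outcome, degree=1)
quadratic = rd_jump(score, outcome, degree=2)
```

Because every score term vanishes at the cutoff, the coefficient on the treatment indicator remains the estimated jump regardless of the polynomial degree.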
In columns 3-4 we show the inclusion of quadratic and cubic forms of the running variable and their interactions with the treatment indicator. Overall, these sensitivity analyses indicate that the results we obtained are robust across model specifications.

[Insert Table 6: Sensitivity Analyses]

Finally, to ensure that the RD estimates are local to the placement cutoffs and not just an artifact of the RD design, we also performed falsification tests by running sharp RD analyses at placebo cutoffs 0.25 SD above and 0.25 SD below the actual cutoff. These are shown in columns 5 and 6 of Table 6. We cannot use the fuzzy RD approach here since we do not observe compliance under these hypothetical scenarios. These tests show no significant treatment effects at either placebo cutoff for any of the outcomes. We are therefore confident that the estimates we obtained are specific to the extended algebra course placement policy.

Discussion & Conclusion

This study provides insight into the effectiveness of an alternative model of delivering developmental math in community colleges, one for which there is little empirical evidence. Intuition suggests that extended math sequences can provide students with the time and learning context they may need to develop math skills they might be lacking. This extra preparation may enable students to be more successful in subsequent STEM courses or their postsecondary careers. At the same time, extending math courses increases the amount of time students are expected to be in math remediation, and increases the number of opportunities a student has to exit the developmental math sequence.
This is a concern because having more exit points is associated with higher rates of attrition (Bailey et al., 2010; Fong et al., 2015), and additional time in remediation can incur substantial costs in time and money that students may not be able or willing to pay (Melguizo et al., 2008). According to the colleges’ fee schedules for spring 2013, the actual cost of the additional course would be about $184 ($46/unit × 4 units). However, this estimate does not include any additional costs associated with attending college (e.g., books, materials, transportation), or the opportunity cost associated with time spent in the additional remedial course.

Overall, the RD results show that placement in extended math courses in these four colleges resulted in unfavorable student outcomes. Although students at the margin of assignment to extended algebra courses were mostly just as likely to enroll in math as students assigned to the one-semester course, they were less likely over time to complete subsequent courses compared to their counterparts just above the cutoff. Enrollment in extended algebra reduced the probability that students at the margin of the cutoff completed elementary algebra with a B or better by 8-10 percentage points, and the probability that they progressed on to pass intermediate algebra by about 5-6 percentage points. While there were largely no differences in credit attainment one year after assessment, students in extended algebra appeared to have completed fewer degree-applicable credits than students placed in the one-semester course. This corroborates the “diversion” hypothesis outlined by Scott-Clayton and Rodriguez (2015): students in remediation earn credits at about the same rate as students in higher-level courses, but not ones that count towards degree progress.
Although we cannot specifically tease out whether this is directly attributable to increasing the number of exit points, to increasing the overall duration of the sequence, or to a combination of both, the practice of extending math courses in this way reduces the likelihood that students will complete the developmental math sequence and persist in college towards credential attainment.

This study contributes to the math education literature because increasing time in math has been found to be beneficial for student achievement in K-12 settings (Cortes et al., 2015; Taylor, 2014). We provide evidence from California community colleges suggesting that this may not necessarily hold in postsecondary contexts, where students have more autonomy in making enrollment and persistence decisions. Attrition from developmental math sequences, for example, may largely be the result of the freedom afforded within them. Models that add exit points where students must make additional enrollment decisions may therefore not be helpful for community college students, since they can exacerbate the unstructured nature of community college experiences. Our findings suggest that guided pathway models that are more structured may be beneficial for developmental math students (Bailey, Jaggars, & Jenkins, 2015). Similarly, models that extend time and increase costs can deter or discourage students from continuing on in their college careers. Reforms in developmental education should thus consider the influence of time and cost on students’ enrollment decisions as well as the structure of academic experiences; the opportunity costs to students associated with any reform should not outweigh the reform’s potential academic benefits. A promising approach may be acceleration models, which combine courses and shorten the overall time spent in remediation (Hern, 2012; Jaggars, Hodara, Cho, & Xu, 2013; Moltz, 2010; Rutschow & Schneider, 2011).
However, aside from a few studies highlighting the promise of acceleration models, there is still limited evidence of the effectiveness of this approach. Although our research design accounts for unobservable factors related to student success, we note that faculty, instruction, and other institutional factors may play a large role in EXTENDING TIME IN DEVELOPMENTAL MATH 154 explaining differences between the outcomes of students placed in alternative models of delivery such as extended courses. Further research should examine the nature of classroom and institutional environments, the stigma of extended time in remediation, and how these may influence developmental math outcomes. Nevertheless, the results of the study provide rigorous and important evidence on the impact of extending time in math remediation. Despite the potential benefits to learning of increasing time in math, extending math over two semesters inherently increases the need for students to make persistence decisions at exit points and increases the costs associated with completing math remediation. Delivering developmental math in this way may ultimately offset the benefits that additional time spent studying math may yield and, as our study shows, prevent students from attaining their educational goals. EXTENDING TIME IN DEVELOPMENTAL MATH 155 References Aronson, J., Zimmerman, J., & Carlos, L. (1999). Improving student achievement by extending school: Is it just a matter of time? Retrieved from http://files.eric.ed.gov /fulltext/ED435127.pdf Bahr, P. R. (2012). Deconstructing remediation in community colleges: Exploring associations between course-taking patterns, course outcomes, and attrition from the remedial math and remedial writing sequences. Research in Higher Education, 53, 661–693. Bailey, T. R., Jaggars, S. S., & Jenkins, D. (2015). Redesigning America's community colleges: A clearer path to student success. Cambridge, MA: Harvard University Press. Bailey, T., Jeong, D. W., & Cho, S. 
W. (2010). Referral, enrollment, and completion in developmental education sequences in community colleges. Economics of Education Review, 29, 255-270. Bandura, A. (1995). Self-efficacy in changing societies. New York: Cambridge University Press. Barnett, E. A. (2011). Validation experiences and persistence among community college students. The Review of Higher Education, 34(2), 193-230. Becker, G. S. (1964). Human capital: A theoretical analysis with special reference to education. National Bureau for Economic Research. Bettinger, E., Boatman, A., & Long, B. (2013). Student supports: Developmental education and other academic programs. The Future of Children, 23, 93–115. Bourdieu, P. (1986). The forms of capital. In J. G. Richardson (Ed.), Handbook of Theory and Research for the Sociology of Education (pp. 241–258). New York: Greenwood Press. Burdman, P. (2012). Where to begin? The evolving role of placement exams for students starting college. Boston, MA: Jobs for the Future. Burdman, P. (2015). Degrees of freedom: Probing math placement policies at California colleges and universities (No. 3). Stanford, CA: Policy Analysis for California Education. Cortes, K. E., Goodman, J., & Nomi, T. (2015). Intensive math instruction and educational attainment: Long-run impacts of double-dose algebra. Journal of Human Resources, 50(1), 108-158. Crisp, G., & Delgado, C. (2014). The impact of developmental education on community college persistence and vertical transfer. Community College Review, 42(2), 99-117. Crisp, G., & Nora, A. (2009). Hispanic student success: Factors influencing the persistence and transfer decisions of Latino community college students enrolled in developmental education. Research in Higher Education, 51, 175–194. Deil-Amen, R., & Tevis, T. L. (2010). Circumscribed agency: The relevance of standardized college entrance exams for low SES high school students. The Review of Higher Education, 33(2), 141-175. Edgecombe, N. D. (2011). 
Accelerating the academic achievement of students referred to developmental education. Retrieved from http://academiccommons.columbia.edu.libproxy.usc.edu/catalog/ac:146646
Fields, R., & Parsad, B. (2012). Tests and cut scores used for student placement in postsecondary education: Fall 2011. Washington, DC: National Assessment Governing Board.
Fong, K. E., Melguizo, T., & Prather, G. (2015). Increasing success rates in developmental math: The complementary role of individual and institutional characteristics. Research in Higher Education, 56(7), 719-749.
Grubb, N. (1999). Honored but invisible: An inside look at America's community colleges. New York: Routledge.
Hern, K. (2012). Acceleration across California: Shorter pathways in developmental English and math. Change: The Magazine of Higher Learning, 44(3), 60-68.
Hughes, K. L., & Scott-Clayton, J. (2011). Assessing developmental assessment in community colleges. Community College Review, 39(4), 327-351.
Imbens, G., & Kalyanaraman, K. (2011). Optimal bandwidth choice for the regression discontinuity estimator. The Review of Economic Studies, 79(3), 933-959.
Jaggars, S. S., Hodara, M., Cho, S. W., & Xu, D. (2014). Three accelerated developmental education programs: Features, student outcomes, and implications. Community College Review, 43(1), 3-26.
Kosiewicz, H., Ngo, F., & Fong, K. (2016). Alternative models to deliver developmental math: Issues of use and student access. Community College Review, 44(3), 205-231.
Lee, D. S., & Lemieux, T. (2010). Regression discontinuity designs in economics. Journal of Economic Literature, 48(2), 281-355.
Manski, C. F., & Wise, D. (1983). College choice in America. Cambridge, MA: Harvard University Press.
Martorell, P., McFarlin Jr., I., & Xue, Y. (2015). Does failing a placement exam discourage underprepared students from going to college? Education Finance and Policy, 10(1), 46-80.
Mattern, K. D., & Packman, S. (2009).
Predictive validity of ACCUPLACER scores for course placement: A meta-analysis (Research Report No. 2009-2). New York, NY: College Board.
McCrary, J. (2008). Manipulation of the running variable in the regression discontinuity design: A density test. Journal of Econometrics, 142(2), 698-714.
Melguizo, T. (2011). A review of the theories developed to describe the process of college persistence and attainment. In Higher education: Handbook of theory and research (pp. 395-424). Springer Netherlands.
Melguizo, T., Hagedorn, L. S., & Cypers, S. (2008). Remedial/developmental education and the cost of community college transfer: A Los Angeles County sample. The Review of Higher Education, 31(4), 401-431.
Melguizo, T., Kosiewicz, H., Prather, G., & Bos, H. (2014). How are community college students assessed and placed in developmental math? Grounding our understanding in reality. Journal of Higher Education, 85(5), 691-722.
Moltz, D. (2010, July 6). Picking up the pace. Inside Higher Ed. Retrieved from http://www.insidehighered.com/news/2010/07/06/accelerated
Murnane, R. J., & Willett, J. B. (2010). Methods matter: Improving causal inference in educational and social science research. New York, NY: Oxford University Press.
National Center for Education Statistics. (2014). Digest of education statistics 2014. Washington, DC: U.S. Department of Education.
Nomi, T., & Allensworth, E. (2009). "Double-dose" algebra as an alternative strategy to remediation: Effects on students' academic outcomes. Journal of Research on Educational Effectiveness, 2(2), 111-148.
Oreopoulos, P., & Petronijevic, U. (2013). Making college worth it: A review of research on the returns to higher education. The Future of Children, 23(1), 41-65.
Pascarella, E. T., Salisbury, M. H., & Blaich, C. (2011). Exposure to effective instruction and college student persistence: A multi-institutional replication and extension. Journal of College Student Development, 52(1), 4-19.
Patall, E. A., Cooper, H., & Allen, A. B. (2010). Extending the school day or school year: A systematic review of research (1985-2009). Review of Educational Research, 80(3), 401-436.
Paulsen, M. B. (2001). The economics of human capital and investment in higher education. In M. B. Paulsen (Ed.), The finance of higher education: Theory, research, policy, and practice (pp. 55-94). Agathon Press.
Perry, M., Bahr, P. M., Rosin, M., & Woodward, K. M. (2010). Course-taking patterns, policies, and practices in developmental education in the California Community Colleges. Mountain View, CA: EdSource. Retrieved from http://www.edsource.org/assets/files/ccstudy/FULL-CC-DevelopmentalCoursetaking.pdf
Rendón, L. I. (1994). Validating culturally diverse students: Toward a new model of learning and student development. Innovative Higher Education, 19(1), 33-51.
Rutschow, E. Z., & Schneider, E. (2011). Unlocking the gate: What we know about improving developmental education. New York, NY: MDRC.
Scott-Clayton, J., Crosta, P., & Belfield, C. (2014). Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371-393.
Scott-Clayton, J., & Rodriguez, O. (2015). Development, discouragement, or diversion? New evidence on the effects of college remediation. Education Finance and Policy, 10(1), 4-45.
Stein, M. K., Kaufman, J. H., Sherman, M., & Hillen, A. F. (2011). Algebra: A challenge at the crossroads of policy and practice. Review of Educational Research, 81(4), 453-492.
Taylor, E. (2014). Spending more of the school day in math class: Evidence from a regression discontinuity in middle school. Journal of Public Economics, 117, 162-181.
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition. Chicago: University of Chicago Press.
van der Klaauw, W. (2002). Estimating the effect of financial aid offers on college enrollment: A regression-discontinuity approach.
International Economic Review, 43(4), 1249-1287.

FIGURES & TABLES

Figure 1. Delivery of developmental math in LUCCD.
Figure 2. Progression through algebra sequence.
Figure 3. Enrollment in extended elementary algebra courses, 4 colleges.
Figure 4. Discontinuities in persistence and completion outcomes: (a) Pass EA with a B or better; (b) Attempt IA; (c) Pass IA with a B or better; (d) Total degree-applicable units completed.

Table 1. Cut scores differentiating placement into two versus one semester math courses

ACCUPLACER AR subtest (two semesters / one semester):
  College A: 50<=PS<65 / 65<=PS<76
  College B: PS>73.5 / -
  College C: 100<=PS<113 / 113<=PS<=120
ACCUPLACER EA subtest (two semesters / one semester):
  College B: 35.5<=PS<45.5 / 45.5<=PS<60.5
  College C: 20<=PS<34 / 34<=PS<77
COMPASS PA subtest (two semesters / one semester):
  College D: 40<=PS<49 / PS>=49
COMPASS Algebra subtest (two semesters / one semester):
  College D: 1<=PS<=19 / 20<=PS<=40
Note: PS denotes the student's adjusted placement score.

Table 2. Summary statistics, students placed in extended versus one-semester elementary algebra
Columns: full sample (Total / Ext. / One-Sem) | bandwidth around cutoff (1SD / .5SD / .33SD / .25SD) | individual college samples (Ext. / One-Sem for Colleges A-D).

Student characteristics
  Age: 22.9 / 23.2 / 22.6 | 22.4 / 22.6 / 22.7 / 22.6 | A: 23.2 / 23.2; B: 24.0 / 22.7; C: 21.2 / 20.8; D: 25.8 / 25.5
  Female: .503 / .513 / .493 | .505 / .511 / .505 / .500 | A: .506 / .471; B: .511 / .464; C: .509 / .523; D: .530 / .488
  Asian/PI: .087 / .084 / .089 | .090 / .090 / .088 / .089 | A: .149 / .199; B: .051 / .048; C: .080 / .074; D: .060 / .064
  Black: .193 / .211 / .176 | .173 / .180 / .200 / .187 | A: .102 / .086; B: .379 / .253; C: .077 / .067; D: .422 / .385
  Latina/o: .480 / .479 / .481 | .502 / .504 / .496 / .501 | A: .517 / .471; B: .475 / .595; C: .539 / .515; D: .330 / .282
  White: .151 / .140 / .161 | .150 / .144 / .132 / .137 | A: .154 / .158; B: .038 / .035; C: .210 / .240; D: .101 / .158
  Other: .088 / .085 / .092 | .086 / .083 / .084 / .086 | A: .077 / .085; B: .057 / .069; C: .108 / .098; D: .087 / .112
  Engl. not prim. lang.: .260 / .257 / .262 | .274 / .274 / .270 / .276 | A: .322 / .343; B: .257 / .296; C: .266 / .269; D: .155 / .138
  Permanent resident: .099 / .098 / .099 | .101 / .099 / .095 / .098 | A: .132 / .144; B: .075 / .080; C: .095 / .104; D: .079 / .075
  Other visas: .080 / .074 / .084 | .084 / .083 / .085 / .084 | A: .115 / .118; B: .083 / .107; C: .070 / .075; D: .027 / .047
Outcomes
  Enrolled in any course: .854 / .839 / .869 | .855 / .850 / .846 / .845 | A: .780 / .851; B: .807 / .818; C: .894 / .904; D: .850 / .874
  Enrolled in math: .620 / .575 / .662 | .620 / .607 / .592 / .599 | A: .500 / .674; B: .416 / .517; C: .705 / .758; D: .612 / .632
  Placement compliance: .808 / .709 / .889 | .810 / .799 / .789 / .797 | A: .763 / .911; B: .463 / .872; C: .748 / .930; D: .736 / .784
  Attempted EA: .481 / .337 / .618 | .481 / .475 / .466 / .477 | A: .317 / .636; B: .266 / .479; C: .423 / .727; D: .289 / .575
  Passed EA w/C or better: .320 / .224 / .410 | .320 / .310 / .295 / .303 | A: .235 / .451; B: .133 / .310; C: .283 / .480; D: .200 / .353
  Passed EA w/B or better: .182 / .125 / .237 | .182 / .176 / .164 / .164 | A: .145 / .299; B: .076 / .175; C: .144 / .268; D: .111 / .195
  Attempted IA: .260 / .191 / .326 | .261 / .250 / .237 / .239 | A: .208 / .274; B: .088 / .205; C: .234 / .399; D: .196 / .282
  Passed IA w/C or better: .178 / .133 / .222 | .180 / .172 / .163 / .168 | A: .143 / .249; B: .061 / .141; C: .170 / .281; D: .124 / .176
  Passed IA w/B or better: .094 / .072 / .114 | .095 / .090 / .087 / .090 | A: .091 / .153; B: .037 / .081; C: .082 / .135; D: .066 / .082
  Total units w/in 1 yr.: 11.1 / 10.1 / 12.0 | 11.1 / 11.0 / 10.9 / 11.1 | A: 10.4 / 12.7; B: 8.6 / 11.2; C: 10.5 / 12.9; D: 10.4 / 10.7
  Total degree-appl. units: 21.6 / 19.4 / 23.7 | 21.8 / 21.5 / 21.3 / 21.7 | A: 19.2 / 24.0; B: 15.6 / 21.1; C: 22.1 / 26.9; D: 18.9 / 20.2
  N: 12805 / 6228 / 6577 | 11425 / 7545 / 5387 / 4114 | A: 1674 / 1089; B: 1210 / 1575; C: 2111 / 2631; D: 1233 / 1282

Table 3. Enrollment and compliance
Columns: full range around cutoff (All / Extended / One-semester / Difference) | 0.5 SD bandwidth around cutoff (All / Extended / One-semester / Difference).

  Enrolled: 0.85 / 0.84 / 0.87 / -.031*** | 0.85 / 0.84 / 0.87 / -.030***
  Enrolled in math: 0.62 / 0.58 / 0.66 / -.087*** | 0.61 / 0.56 / 0.65 / -.090***
  N: 12805 / 6228 / 6577 | 7545 / 3882 / 3663
  Complied with placement: 0.81 / 0.71 / 0.89 / -.180*** | 0.80 / 0.69 / 0.90 / -.213***
First math course (All / Extended / One-semester)
  Arithmetic: 0.02 / 0.02 / 0.01 | 0.02 / 0.02 / 0.01
  Pre-algebra: 0.05 / 0.07 / 0.03 | 0.04 / 0.06 / 0.02
  Extended algebra: 0.34 / 0.71 / 0.04 | 0.35 / 0.69 / 0.03
  One-sem. algebra: 0.57 / 0.17 / 0.89 | 0.56 / 0.19 / 0.90
  Int. algebra: 0.02 / 0.02 / 0.03 | 0.02 / 0.02 / 0.02
  Transfer-level: 0.01 / 0.01 / 0.01 | 0.01 / 0.01 / 0.01
  N: 7917 / 3570 / 4347 | 4571 / 2184 / 2387
Note: *** p<.001, ** p<.01, * p<.05

Table 4.
First stage IV regressions, enrollment in extended elementary algebra All 1 SD 0.5 SD 0.33 SD 0.25 SD Assignment to Extended Algebra .682*** .665*** .655*** .654*** .637*** .061 .070 .074 .081 .084 Centered Test Score -.009 -.058 -.143 -.246* -.235* .033 .055 .118 .117 .103 Treatment*Test Score .003 .006 .012 .021* .015*** .004 .004 .008 .005 .002 Age At Assessment -.003** -.002** -.001 -.001 -.002 .0009 .001 .001 .001 .001 Female -.004 -.002 .003 .017*** .030*** .004 .004 .003 .005 .009 Asian -.010 -.008 -.007 .016 -.002 .013 .013 .019 .023 .022 Black -.031** -.025*** -.026 -.002 -.004 .010 .007 .014 .018 .027 Hispanic .002 .005 .013 .030 .029 .008 .005 .012 .016 .017 White -.026*** -.026*** -.010** .008 .007 .007 .006 .004 .012 .021 English not primary language .001 .002 .010 .011 .027 .011 .011 .014 .024 .036 Permanent Resident -.001 -.003 -.017 -.002 -.005 .013 .013 .013 .018 .020 Other Visas .011 .009 -.001 .030 -.027 .015 .016 .022 .016 .028 Constant .170*** .162*** .202** .199** .185** .042 .048 .045 .069 .074 F-test 95.1*** 67.4** 58.6** 59.2** 43.6** 7384 7080 4579 3188 2465 Note: *** p<.001, **p<.01, *p<.05 EXTENDING TIME IN DEVELOPMENTAL MATH 164 Table 5. The effect of extended elementary algebra courses on college outcomes, fuzzy RDs Bandwidth All 1 SD 0.5 SD 0.33 SD 0.25 SD Enrolled in Any Math (Sharp RD) Extended Alg. -.049 -.031 -.019 -.011 -.024 .030 .033 .031 .029 .027 Constant .730*** .709** .605** .570*** .559*** .067 .086 .065 .044 .041 Passing EA (C or better) Extended Alg. -.279*** -.207*** -.205*** -.190*** -.218*** .030 .033 .043 .049 .058 Constant .516*** .471*** .456*** .441*** .515*** .063 .071 .096 .114 .127 Passing EA (B or better) Extended Alg. -.158*** -.081*** -.085* -.082^ -.107* .028 .030 .039 .044 .051 Constant .154*** .071 .112 .081 .105 .058 .065 .087 .102 .113 Attempting IA Extended Alg. 
-.193*** -.127*** -.128** -.130** -.173** .030 .033 .042 .049 .057 Constant .480*** .445*** .490*** .403*** .447*** .063 .071 .096 .113 .126 Passing IA (C or better) Extended Alg. -.108*** -.058^ -.080* -.101* -.124* .028 .031 .039 .045 .053 Constant .295*** .266*** .348*** .288** .313** .058 .066 .088 .104 .116 Passing IA (B or better) Extended Alg. -.056* -.028 -.062* -.050 -.060 .022 .025 .031 .036 .042 Constant .071 .039 .129 .075 .075 .047 .053 .070 .083 .093 Units Completed in One Year Extended Alg. -1.64* -.452 -.141 -.262 -1.43 .791 .878 1.12 1.30 1.53 Constant 15.1*** 14.2*** 15.7*** 14.6*** 16.8*** 1.65 1.87 2.53 3.01 3.37 Degree-Applicable Units through Fall 2013 Extended Alg. -5.25** -3.41 -2.94 -4.29 -8.92* 1.77 1.97 2.50 2.93 3.48 Constant 37.0*** 36.4*** 42.4*** 43.9*** 49.8*** 3.71 4.21 5.66 6.81 7.68 N 7384 7080 4579 3188 2465 Note: Instrumental variables regression with campus fixed effects. Covariates include: age, race, sex, language, residence status, and cohort. ***p<.001 **p<.01 *p<.05 ^p<.10 EXTENDING TIME IN DEVELOPMENTAL MATH 165 Table 6. Sensitivity analyses for outcomes (1) Full (2) No Cov. 
or Fixed Effects (3) Quadratic (4) Cubic (5) Cutoff at -0.25SD (6) Cutoff at +0.25SD Passed EA (C or better) -.205*** -.211*** -.258*** -.214*** .023 .010 .043 .045 .055 .070 .024 .024 Passed EA (B or better) -.085* -.082^ -.143** -.075 .023 .010 .039 .044 .050 .063 .024 .024 Attempted IA -.128** -.136*** -.185*** -.180** -.002 .005 .042 .030 .054 .069 .046 .013 Passed IA (C or better) -.080* -.092*** -.106* -.078 -.017 .012 .039 .024 .050 .064 .022 .009 Passed IA (B or better) -.062* -.063*** -.078^ -.074 -.017 .012 .031 .019 .040 .051 .022 .009 Total units completed w/in 1 yr of assessment -.141 .001 -.619 1.03 -.243 -.173 1.12 .886 1.44 1.83 .998 .866 Total degree-applicable units completed -2.94 -3.47 -5.35 -3.53 -.419 .017 2.50 2.14 3.22 4.10 1.73 2.90 Note: Full model is 0.5SD, with covariates, campus and cohort fixed effects, and linear interaction term of treatment and assignment variables. *** p<.001, **p<.01, *p<.0.5, ^p<.10 EXTENDING TIME IN DEVELOPMENTAL MATH 166 APPENDIX Figure A.1. Trends in covariates, local linear regression College A: Log Discontinuity Estimate: 0.092 Standard Error: 0.055 College B: Log Discontinuity Estimate: 0.12 Standard Error: 0.10 College C: Log Discontinuity Estimate: -0.052 Standard Error: 0.057 College D: Log Discontinuity Estimate: 0.22 Standard Error: 0.109 Figure A.2. McCrary Density Tests, 4 Colleges EXTENDING TIME IN DEVELOPMENTAL MATH 167 Table A.1. 
Test of discontinuities in covariates, Seemingly Unrelated Regressions All 1 SD 0.5 SD 0.33 SD 0.25 SD Age at assessment 1.38*** .473 .539 .330 .976* .203 .244 .318 .376 -.030 Female -.0002 -.023 -.035 -.050 .029 .014 .017 .022 .026 .022 Asian -.009 .003 .002 .000 -.012 .008 .010 .012 .014 .016 Black -.018 .014 -.016 -.014 .009 .010 .012 .016 .019 .021 Hispanic -.025 -.022 -.022 .001 -.005 .014 .017 .022 .026 .029 English not primary language -.007 .002 -.002 .005 -.016 .012 .015 .019 .023 .026 Permanent Resident -.002 .006 .006 .004 -.003 .008 .010 .013 .015 .017 Other Visas -.012 -.005 -.084 .012 .012 .008 .009 .039 .014 .016 Joint Significance Test Chi2 54.10 7.31 6.86 5.8 8.7 Prob>Chi2 0.00 0.5 0.552 0.67 0.369 N 11906 11425 7545 5387 4114 Note: Each covariate is the dependent variable in a Seemingly Unrelated Regression using a regression discontinuity design, and the reported estimate is of the discontinuity in the covariate at the placement cutoff. ***p<0.001, **p<0.01, *p<0.05 EXTENDING TIME IN DEVELOPMENTAL MATH 168 Table A.2a. First stage IV regressions by campus, enrollment in extended elementary algebra College A College B All 1 SD 0.5 SD 0.33 SD 0.25 SD All 1 SD 0.5 SD 0.33 SD 0.25 SD Assignment to Ext. Alg. 
.780*** 0.780*** 0.767*** 0.736*** 0.740*** 0.434*** 0.430*** 0.425*** 0.436*** 0.427*** (0.032) (0.032) (0.035) (0.046) (0.052) (0.037) (0.038) (0.047) (0.054) (0.061) Centered Test Score -0.031 -0.031 -0.035 0.027 0.065 -0.037 -0.052 -0.097 -0.020 0.444 (0.067) (0.067) (0.077) (0.152) (0.205) (0.038) (0.044) (0.126) (0.235) (0.331) Treatment*Test Score 0.007 .007 0.002 -0.015 -0.017 0.004 0.004 0.005 0.004 -0.022* (0.005) (0.005) (0.007) (0.014) (0.020) (0.004) (0.004) (0.005) (0.007) (0.011) Age At Assessment -0.001 -0.001 -0.000 0.001 -0.000 0.000 0.001 0.001 0.001 0.001 (0.001) (0.001) (0.001) (0.002) (0.002) (0.001) (0.001) (0.002) (0.002) (0.002) Female 0.007 0.007 0.003 0.023 0.041 -0.000 -0.000 -0.003 0.008 0.013 (0.016) (0.016) (0.017) (0.022) (0.024) (0.019) (0.020) (0.026) (0.029) (0.033) Asian -0.009 -0.009 -0.002 0.056 0.012 -0.079 -0.084 -0.131 -0.118 -0.140 (0.036) (0.036) (0.038) (0.046) (0.051) (0.060) (0.063) (0.084) (0.094) (0.105) Black 0.009 0.009 0.043 0.081 0.033 -0.026 -0.030 -0.050 -0.052 -0.014 (0.043) (0.043) (0.046) (0.055) (0.062) (0.045) (0.047) (0.061) (0.066) (0.076) Hispanic 0.006 0.006 -0.003 0.051 0.028 -0.016 -0.019 -0.035 -0.051 -0.017 (0.033) (0.033) (0.035) (0.042) (0.047) (0.043) (0.045) (0.058) (0.064) (0.073) White -0.043 -0.043 -0.023 -0.007 -0.040 0.055 0.051 0.048 0.061 0.094 (0.037) (0.037) (0.039) (0.049) (0.054) (0.074) (0.075) (0.092) (0.103) (0.110) English not prim. lang. 
-0.014 -0.014 -0.016 0.007 0.035 0.062* 0.062* 0.080* 0.098** 0.147*** (0.020) (0.020) (0.021) (0.025) (0.028) (0.024) (0.025) (0.033) (0.037) (0.041) Permanent Resident 0.032 0.032 0.011 0.028 0.023 -0.007 -0.005 -0.014 -0.044 -0.081 (0.027) (0.027) (0.028) (0.035) (0.040) (0.037) (0.039) (0.051) (0.057) (0.063) Other Visas 0.047 0.047 0.044 0.011 -0.003 -0.050 -0.058 -0.081 -0.100 -0.128* (0.027) (0.027) (0.028) (0.034) (0.037) (0.035) (0.036) (0.049) (0.055) (0.063) Constant 0.003 0.003 -0.023 -0.175* -0.148 0.077 0.088 0.116 0.185 0.101 (0.065) (0.065) (0.069) (0.083) (0.092) (0.106) (0.113) (0.135) (0.154) (0.162) F-test 576.1*** 576.1*** 477.5*** 260.1*** 202.1*** 135.9*** 128.2*** 81.9*** 65.7*** 49.7*** N 1571 1571 1083 693 558 1318 1282 973 845 640 Note: Regressions also include cohort (not shown). *** p<.001, **p<.01, *p<.05 EXTENDING TIME IN DEVELOPMENTAL MATH 169 Table A.2b. First stage IV regressions by campus, enrollment in extended elementary algebra College C College D All 1 SD 0.5 SD 0.33 SD 0.25 SD All 1 SD 0.5 SD 0.33 SD 0.25 SD Assignment to Ext. Alg. 
0.737*** 0.736*** 0.729*** 0.693*** 0.712*** 0.575*** 0.550*** 0.546*** 0.515*** 0.477*** (0.022) (0.022) (0.027) (0.033) (0.039) (0.043) (0.060) (0.078) (0.086) (0.092) Centered Test Score 0.023 0.021 0.090 0.059 0.031 0.013 0.033 0.336 -0.123 -0.233 (0.023) (0.023) (0.054) (0.087) (0.122) (0.021) (0.105) (0.213) (0.337) (0.484) Treatment*Test Score -0.003 -0.003 -0.012* -0.026* -0.014 -0.007 -0.015 -0.061 -0.004 -0.024 (0.003) (0.003) (0.006) (0.010) (0.016) (0.006) (0.014) (0.031) (0.051) (0.075) Age At Assessment -0.004*** -0.004*** -0.002 -0.003* -0.004** -0.006*** -0.005* -0.004 -0.007* -0.004 (0.001) (0.001) (0.001) (0.001) (0.002) (0.002) (0.002) (0.003) (0.003) (0.004) Female -0.000 -0.001 0.008 0.015 0.024 -0.014 0.009 0.010 0.049 0.098 (0.010) (0.010) (0.013) (0.016) (0.018) (0.025) (0.030) (0.041) (0.047) (0.053) Asian 0.015 0.015 0.025 0.016 -0.009 -0.021 -0.015 -0.032 0.027 0.038 (0.025) (0.025) (0.034) (0.039) (0.044) (0.073) (0.084) (0.119) (0.136) (0.144) Black -0.020 -0.020 -0.029 -0.076 -0.102* -0.088 -0.071 -0.113 0.016 0.077 (0.026) (0.026) (0.035) (0.042) (0.047) (0.047) (0.056) (0.078) (0.088) (0.100) Hispanic 0.013 0.013 0.032 0.039 0.014 -0.053 -0.043 -0.036 0.073 0.172 (0.017) (0.017) (0.023) (0.028) (0.032) (0.048) (0.058) (0.078) (0.090) (0.102) White -0.012 -0.014 -0.000 0.006 -0.008 -0.037 -0.021 0.022 0.157 0.169 (0.019) (0.019) (0.026) (0.031) (0.035) (0.061) (0.073) (0.099) (0.111) (0.126) English not prim. lang. 
-0.003 -0.003 0.000 -0.021 -0.025 -0.037 -0.038 -0.039 -0.121 -0.200* (0.013) (0.013) (0.016) (0.020) (0.022) (0.041) (0.047) (0.065) (0.076) (0.077) Permanent Resident -0.020 -0.020 -0.029 -0.009 0.014 -0.010 -0.048 -0.120 -0.086 -0.094 (0.018) (0.018) (0.024) (0.029) (0.032) (0.054) (0.064) (0.097) (0.120) (0.133) Other Visas 0.002 0.002 0.007 0.032 0.024 -0.012 -0.035 -0.165 -0.126 -0.055 (0.020) (0.020) (0.027) (0.032) (0.037) (0.083) (0.089) (0.113) (0.122) (0.127) Constant 0.011 0.011 -0.045 0.002 0.030 0.407*** 0.355*** 0.455** 0.378* 0.220 (0.030) (0.030) (0.039) (0.046) (0.052) (0.081) (0.104) (0.147) (0.177) (0.198) F-test 1143.4*** 1135.6*** 712.5*** 438.1*** 339.3*** 177.1*** 84.5*** 49.1*** 35.7*** 27.0*** N 3484 3468 2116 1367 1057 1011 759 407 283 210 Note: Regressions also include cohort (not shown). *** p<.001, **p<.01, *p<.05 EXTENDING TIME IN DEVELOPMENTAL MATH 170 Table A.3a. The effect of extended elementary algebra courses on developmental math outcomes, fuzzy RDs by campus COLLEGE A: Bandwidth COLLEGE B: Bandwidth All 1.0sd 0.50sd 0.33sd 0.25sd All 1.0sd 0.50sd 0.33sd 0.25sd Enrolled in Any Math (Sharp RD) Enrolled in Any Math (Sharp RD) Extended Alg. -0.117* -0.117* -0.123*** -0.120*** -0.123** 0.009 0.012 0.015 0.038 0.009 (0.039) (0.039) (0.026) (0.024) (0.033) (0.026) (0.026) (0.031) (0.030) (0.033) Constant 0.735*** 0.735*** 0.743*** 0.741*** 0.692*** 0.362*** 0.340*** 0.311*** 0.351*** 0.349*** (0.051) (0.051) (0.080) (0.094) (0.117) (0.055) (0.054) (0.068) (0.081) (0.078) N 2763 2763 1812 1161 935 2785 2724 2179 1913 1398 Passing EA (B or better) Passing EA (B or better) Extended Alg. -0.037 -0.037 -0.002 -0.026 0.032 0.023 0.063 -0.014 0.058 -0.054 (0.060) (0.060) (0.076) (0.101) (0.114) (0.103) (0.105) (0.111) (0.116) (0.141) Constant 0.043 0.043 0.083 -0.064 -0.135 0.113 0.121 0.204 0.190 0.238 (0.093) (0.093) (0.114) (0.133) (0.147) (0.130) (0.135) (0.137) (0.150) (0.164) Attempting IA Attempting IA Extended Alg. 
-0.115^ -0.115^ -0.119 -0.110 -0.094 0.038 0.062 0.058 -0.040 -0.102 (0.063) (0.063) (0.079) (0.105) (0.119) (0.109) (0.111) (0.121) (0.125) (0.151) Constant 0.419*** 0.419*** 0.393*** 0.291* 0.393* 0.457*** 0.432** 0.409** 0.450** 0.481** (0.099) (0.099) (0.119) (0.138) (0.153) (0.137) (0.143) (0.150) (0.161) (0.176) Passing IA (B or better) Passing IA (B or better) Extended Alg. -0.025 -0.025 -0.018 0.027 0.022 0.005 0.002 -0.103 -0.123 -0.138 (0.051) (0.051) (0.063) (0.084) (0.095) (0.082) (0.083) (0.086) (0.091) (0.111) Constant 0.075 0.075 0.058 0.028 0.132 0.197^ 0.205^ 0.321** 0.368** 0.367** (0.080) (0.080) (0.096) (0.110) (0.122) (0.103) (0.107) (0.106) (0.118) (0.130) Units Completed in One Year Units Completed in One Year Extended Alg. 1.438 1.438 2.002 0.413 -0.388 1.249 1.437 0.756 -0.810 -3.982 (1.573) (1.573) (1.976) (2.620) (2.931) (3.271) (3.310) (3.673) (3.858) (4.696) Constant 16.021*** 16.021*** 15.574*** 12.819*** 14.965*** 14.128*** 13.857** 14.571** 13.913** 15.574** (2.458) (2.458) (2.975) (3.448) (3.772) (4.096) (4.272) (4.558) (4.978) (5.471) Degree-Applicable Units through Spr. 2013 Degree-Applicable Units through Spr. 2013 Extended Alg. -1.148 -1.148 0.988 0.095 -0.459 -7.229 -7.095 -10.035 -10.604 -21.331^ (3.512) (3.512) (4.316) (5.814) (6.632) (7.221) (7.339) (8.244) (8.862) (11.109) Constant 38.186*** 38.186*** 34.029*** 34.546*** 38.597*** 35.070*** 38.029*** 41.461*** 42.585*** 47.063*** (5.489) (5.489) (6.499) (7.651) (8.535) (9.042) (9.470) (10.229) (11.434) (12.943) N 1571 1571 1083 693 558 1318 1282 973 845 640 Note: 2SLS regression with clustered standard errors (by cohort). Covariates include: age, race, sex, language, residence status, and cohort. ***p<.001 **p<.01 *p<.05 ^p<.10 EXTENDING TIME IN DEVELOPMENTAL MATH 171 Table A.3b. 
The effect of extended elementary algebra courses on developmental math outcomes, fuzzy RDs by campus COLLEGE C: Bandwidth COLLEGE D: Bandwidth All 1.0sd 0.50sd 0.33sd 0.25sd All 1.0sd 0.50sd 0.33sd 0.25sd Enrolled in Any Math (Sharp RD) Enrolled in Any Math (Sharp RD) Extended Alg. -0.028 -0.026 -0.015 0.008 -0.019 0.020 0.065 0.056 0.048 0.097 (0.025) (0.026) (0.026) (0.059) (0.053) (0.054) (0.073) (0.076) (0.072) (0.064) Constant 0.864*** 0.864*** 0.842*** 0.774*** 0.745*** 0.769*** 0.827*** 0.653*** 0.572*** 0.523* (0.030) (0.030) (0.039) (0.032) (0.043) (0.055) (0.079) (0.099) (0.134) (0.187) N 4742 4718 2916 1880 1451 1616 1220 638 433 330 Passing EA (B or better) Passing EA (B or better) Extended Alg. -0.121** -0.117** -0.141** -0.162* -0.129 -0.196** -0.077 -0.235 -0.060 0.015 (0.043) (0.043) (0.053) (0.070) (0.079) (0.075) (0.105) (0.145) (0.173) (0.198) Constant -0.026 -0.029 0.052 0.111 0.147 0.202* 0.068 0.094 -0.153 -0.204 (0.044) (0.044) (0.055) (0.068) (0.076) (0.094) (0.118) (0.182) (0.213) (0.219) Attempting IA Attempting IA Extended Alg. -0.145** -0.141** -0.179** -0.214** -0.253** -0.199* -0.134 -0.220 -0.154 -0.144 (0.048) (0.048) (0.059) (0.080) (0.092) (0.088) (0.124) (0.165) (0.204) (0.238) Constant 0.426*** 0.427*** 0.482*** 0.505*** 0.528*** 0.483*** 0.557*** 0.587** 0.284 0.205 (0.049) (0.049) (0.062) (0.077) (0.088) (0.109) (0.140) (0.208) (0.251) (0.263) Passing IA (B or better) Passing IA (B or better) Extended Alg. -0.006 -0.006 -0.067 -0.043 -0.104 -0.023 -0.021 -0.116 -0.017 0.111 (0.035) (0.035) (0.042) (0.057) (0.065) (0.058) (0.080) (0.109) (0.140) (0.175) Constant -0.005 -0.001 0.050 0.040 -0.011 0.060 0.083 0.161 -0.085 -0.143 Units Completed in One Year Units Completed in One Year Extended Alg. 
-0.906 -0.785 -0.830 -0.624 -1.043 -0.959 -2.673 -3.062 -3.827 -3.596 (1.227) (1.231) (1.509) (2.038) (2.349) (2.320) (3.319) (4.362) (5.517) (6.664) Constant 15.311*** 15.268*** 16.941*** 17.937*** 19.409*** 12.606*** 15.564*** 20.921*** 23.438*** 25.160*** (1.255) (1.258) (1.574) (1.963) (2.244) (2.877) (3.752) (5.488) (6.792) (7.359) Degree-Applicable Units through Spr. 2013 Extended Alg. -1.285 -1.153 -2.103 -6.148 -7.526 -5.063 -6.101 -5.383 -4.382 -5.054 (2.832) (2.839) (3.501) (4.724) (5.398) (4.825) (6.744) (8.944) (11.119) (13.611) Constant 39.044*** 39.029*** 43.098*** 45.322*** 49.327*** 30.778*** 33.466*** 40.522*** 36.747** 36.236* (2.896) (2.901) (3.653) (4.550) (5.157) (5.986) (7.624) (11.252) (13.690) (15.030) N 3484 3468 2116 1367 1057 1011 759 407 283 210 Note: 2SLS regression with clustered standard errors (by cohort). Covariates include: age, race, sex, language, residence status, and cohort. ***p<.001 **p<.01 *p<.05 ^p<.10

Math and the Making of College Opportunity: A Research Agenda

There is no question that math plays a profound role in determining college access and opportunity. This dissertation sought to demonstrate key ways it does so through an investigation of math placement and course-taking in the transition to college. The three studies provide both a national look at issues in math course-taking in higher education and insight from one community college system on possibilities for reform in the assessment and placement policies and organizational practices that have typically hindered student progress.

The first study moves the conversation on college math course-taking beyond the remedial/non-remedial dichotomy by focusing on redundant college mathematics. I defined this phenomenon as students repeating a high school-level course in college when they likely could have passed a higher-level course.
Using nationally representative data, I showed that redundant math courses may come at a high cost to students, since students who started college in redundant math courses were less likely to persist and earn a college degree. They were also significantly less likely to earn a STEM degree, suggesting that their first college math experiences may have influenced subsequent course and major choices. Given the differences I documented between remedial and redundant math courses, I argued that redundancy is a nuanced way by which inequality is perpetuated in higher education and therefore deserves greater research and policy attention. I then drew attention to assessment and placement policies in community colleges, which may be a key reason why students begin college math course-taking in a remedial or redundant course.

The second study explicitly highlighted how the placement tools that colleges rely on often result in mis-assignment and thereby perpetuate inequality in college opportunity. Offering a possibility for reform, the study examined a unique policy context where a more holistic approach to course placement was used. Students who were able to access higher-level math courses based on a set of academic and non-academic background indicators were just as likely to succeed as their higher-scoring peers. As such, using additional measures in the assessment and placement process may increase access to higher-level courses without compromising students' likelihood of success in those courses.

Finally, the third study investigated organizational practices in mathematics that affect college opportunity. I found that students who were subjected to taking a two-semester math course instead of a traditional one-semester course were far less likely to persist in developmental math and complete the math requirements necessary to earn an associate's degree or transfer to a four-year institution.
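The multiple-measures idea described above can be sketched in code. This is a minimal illustration only: the cutoff scores, the GPA threshold, and the near-cutoff "boost" window below are all invented for the example and do not reproduce any college's actual placement rule.

```python
# Hypothetical multiple-measures placement rule (illustration only).
# All thresholds are invented; real rules vary by college and test.

def place_student(test_score, hs_gpa, prior_math_passed):
    """Assign a math course level from a placement test score plus
    additional academic background measures."""
    # Baseline placement from the test score alone
    if test_score >= 65:
        level = "one-semester elementary algebra"
    else:
        level = "extended (two-semester) elementary algebra"

    # Multiple-measures boost: strong background indicators can raise
    # a near-cutoff student into the higher-level course
    near_cutoff = 50 <= test_score < 65
    if near_cutoff and (hs_gpa >= 3.0 or prior_math_passed):
        level = "one-semester elementary algebra"
    return level
```

Under a test-only rule, a near-cutoff student with a strong high school record would sit in the two-semester sequence; the additional measures move that student into the one-semester course, which is the kind of access gain the second study evaluates.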
Extending time in developmental math increased the direct and indirect costs of college for these students and subsequently negatively affected students' persistence decisions.

Implications & Future Research

The primary conclusion that emerges from this research is that math course-taking and the institutional policies and practices that determine students' experiences of mathematics, even when well-intentioned, have significant consequences for students' educational opportunity and achievement. This dissertation not only provided a new category of college math experience with which to examine college opportunity, but also showed, through two applied policy evaluation studies, how college math policies can impact college outcomes. Even though placement testing is generally used to help sort students into appropriate coursework, this research shows that mis-assignment and placement error can deal significant setbacks to students (see also Melguizo, Bos, Ngo, Mills, and Prather (2016); Ngo and Melguizo (2016); Scott-Clayton, Crosta, and Belfield (2014)). Similarly, extended math courses are thought to benefit students by providing more time to master math concepts, but the increased costs may deter students from persisting toward their aspirations. Given that the policy studies were conducted in one community college district, there is clearly more work to be done to identify math-related policies and practices across the nation that can foster student success rather than hinder it.

The findings of this dissertation therefore frame and forge a broader research agenda on math and the making of college opportunity. They underscore the need to more rigorously evaluate the everyday policies and practices in colleges and universities across the country that affect students' experiences of mathematics.
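The evaluation logic behind placement-cutoff studies like the third study can be illustrated with a fuzzy regression discontinuity on simulated data: assignment to the extended course jumps at the cutoff, compliance is imperfect, so the outcome discontinuity is rescaled by the first-stage discontinuity (a local Wald estimate). Every number below (cutoff, compliance rates, effect size, trend) is invented for this sketch; the dissertation's actual analyses use 2SLS with covariates and campus fixed effects.

```python
# Fuzzy-RD sketch on simulated data: the ratio of the outcome jump to the
# first-stage (compliance) jump at the cutoff recovers the treatment effect.
import random

random.seed(0)
CUTOFF = 0.0          # centered placement score
TRUE_EFFECT = -0.20   # simulated effect of the extended course on passing

students = []
for _ in range(40000):
    score = random.uniform(-1, 1)          # centered test score
    assigned_extended = score < CUTOFF     # below the cutoff -> extended course
    # imperfect compliance with the placement, as in a real first stage
    takes_extended = random.random() < (0.75 if assigned_extended else 0.10)
    base = 0.5 + 0.05 * score              # smooth trend in the outcome
    passed = random.random() < base + (TRUE_EFFECT if takes_extended else 0.0)
    students.append((score, assigned_extended, takes_extended, passed))

# Compare means inside a narrow bandwidth around the cutoff. (A real
# analysis would also fit local regressions on each side of the cutoff
# to absorb the smooth trend.)
bw = 0.25
window = [s for s in students if abs(s[0] - CUTOFF) <= bw]
below = [s for s in window if s[1]]       # assigned to extended course
above = [s for s in window if not s[1]]   # assigned to one-semester course

def mean(xs):
    return sum(xs) / len(xs)

outcome_jump = mean([s[3] for s in below]) - mean([s[3] for s in above])
first_stage = mean([s[2] for s in below]) - mean([s[2] for s in above])
wald_estimate = outcome_jump / first_stage  # fuzzy-RD treatment effect
```

The Wald ratio lands near the simulated effect; the analogous 2SLS estimates, computed across several bandwidths, are what tables like Table 5 report.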
Additional high-quality research on math course-taking, assessment and placement practices, and other math-related policies is imperative. It is essential, for example, to document the range of math requirements in higher education across the nation and to identify how these policies affect college opportunity and outcomes. The field would benefit from rigorous evaluation of such policy changes as raising math requirements, as was done in the California Community Colleges in 2009, or waiving or lowering math requirements, as enacted in states such as Michigan and Florida (Joselow, 2016; Park et al., 2016). There is an urgent need to evaluate recent placement testing reform initiatives, such as those underway in California, Texas, Florida, and North Carolina, in order to determine their impact on equity and inequality in higher education (e.g., Burdman, 2012; Park et al., 2016). Although there is growing evidence on alternative models of delivery in developmental math, results are largely mixed and based on innovations that have not yet reached scale (see Kosiewicz, Ngo, and Fong (2016) for a review). It is therefore necessary to evaluate whether, and under what conditions, alternatives such as acceleration, contextualization, and co-requisite math remediation can be effective for promoting student success.

Beyond the applied policy work, the clear link between mathematics and educational inequality behooves researchers to focus more attention on the first college math experiences that students have, investigating their nature and examining how students are assigned to them. Why is redundant mathematics course-taking more prevalent in community colleges and public colleges? Are there heterogeneous effects of redundant math course-taking by student background? How can we interrupt patterns of persistent math remediation for the many students who repeat the same lower-level courses throughout high school and college?
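The heterogeneity question above can be probed by estimating the same outcome gap separately within student subgroups. The sketch below uses simulated data with invented group labels, shares, and effect sizes; a real analysis would interact treatment with background measures inside a causal design (such as the RD/IV setup used here) rather than compare raw means.

```python
# Subgroup (heterogeneity) sketch on simulated data: compute the
# redundant-vs-non-redundant persistence gap within each group.
import random

random.seed(1)
rows = []
for _ in range(10000):
    group = random.choice(["first_gen", "not_first_gen"])
    redundant = random.random() < 0.5      # took a redundant math course
    # simulate a larger penalty for one subgroup (invented sizes)
    effect = -0.15 if group == "first_gen" else -0.05
    p_persist = 0.6 + (effect if redundant else 0.0)
    rows.append((group, redundant, random.random() < p_persist))

def persistence_gap(group):
    """Persistence rate difference, redundant minus non-redundant."""
    treated = [r[2] for r in rows if r[0] == group and r[1]]
    control = [r[2] for r in rows if r[0] == group and not r[1]]
    return sum(treated) / len(treated) - sum(control) / len(control)
```

Comparing `persistence_gap("first_gen")` with `persistence_gap("not_first_gen")` recovers the simulated pattern of a larger penalty for one group, the kind of differential effect such research questions would test for.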
Relatedly, it is pertinent to further unpack the relationship between college math course-taking and STEM participation. There is ample research on the determinants of STEM participation in higher education (e.g., Maltese & Tai, 2011; Wang, 2013, 2015, 2016), but the relationship between math course-taking in college and STEM attainment is less well researched. As the first study showed, the types of math course-taking experiences that students have at the start of college are related to subsequent STEM participation and attainment. Given the importance of bolstering and diversifying the STEM pipeline (Malcom, 2010), it is critical to examine how policies related to math prerequisites might shape STEM participation and the composition of graduates in STEM fields. It would be an important research task, for example, to examine whether and how placement in math remediation affects entry into and success in introductory STEM courses. Improving math course placement and evaluating the necessity of math prerequisites can ultimately serve to expand the STEM pipeline to include a wider range of students.

Conclusion

Mathematics is and will remain a key component of college access and undergraduate education. As such, there is a need to evaluate the policies and practices that structure students’ experiences of mathematics, seeking to identify those that create educational opportunity and those that curtail it. This research task is ultimately about tying the institutional policies and practices that determine math experiences to persistent problems and patterns of inequality in higher education. If research on mathematics in higher education can inspire reforms and innovations that improve math placement and course-taking, then mathematics may shift from being the notorious gatekeeper it has persistently been to instead being a gateway to college success.

References

Burdman, P. (2012). Where to begin?
The evolving role of placement exams for students starting college. Boston, MA: Jobs for the Future.

Hughes, K. L., & Scott-Clayton, J. (2011). Assessing developmental assessment in community colleges. Community College Review, 39(4), 327-351.

Joselow, M. (2016, June 16). No math required. Inside Higher Ed. Retrieved from https://www.insidehighered.com/news/2016/06/16/debate-over-whether-all-undergraduates-should-take-mathematics-course

Kosiewicz, H., Ngo, F., & Fong, K. E. (2016). Alternative models to deliver developmental math: Issues of use and student access. Community College Review, 44(3), 205-231.

Malcom, L. E. (2010). Charting the pathways to STEM for Latina/o students: The role of community colleges. New Directions for Institutional Research, 2010(148), 29-40.

Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational experiences with earned degrees in STEM among U.S. students. Science Education, 95(5), 877-907.

Melguizo, T., Bos, J. M., Ngo, F., Mills, N., & Prather, G. (2016). Using a regression discontinuity design to estimate the impact of placement decisions in developmental math. Research in Higher Education, 57(2), 123-151.

Ngo, F., & Kwon, W. (2015). Using multiple measures to make math placement decisions: Implications for access and success in community colleges. Research in Higher Education, 56(5), 442-470.

Ngo, F., & Melguizo, T. (2016). How can placement policy improve math remediation outcomes? Evidence from community college experimentation. Educational Evaluation and Policy Analysis, 38(1), 171-196.

Park, T., Woods, C. S., Richard, K., Tandberg, D., Hu, S., & Jones, T. B. (2016). When developmental education is optional, what will students do? A preliminary analysis of survey data on student course enrollment decisions in an environment of increased choice. Innovative Higher Education, 41(3), 221-236.

Scott-Clayton, J., Crosta, P., & Belfield, C. (2014).
Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis, 36(3), 371-393.

Wang, X. (2013). Why students choose STEM majors: Motivation, high school learning, and postsecondary context of support. American Educational Research Journal, 50(5), 1081-1121.

Wang, X. (2015). Pathway to a baccalaureate in STEM fields: Are community colleges a viable route and does early STEM momentum matter? Educational Evaluation and Policy Analysis, 37(3), 376-393.

Wang, X. (2016). Course-taking patterns of community college students beginning in STEM: Using data mining techniques to reveal viable STEM transfer pathways. Research in Higher Education, 57(5), 544-569.