Running head: UNDERSTANDING HIGH SCHOOL TYPE AS AN ADMISSION TOOL
1
UNDERSTANDING HIGH SCHOOL TYPE (PUBLIC, CATHOLIC, AND
INDEPENDENT PRIVATE) AS AN ADMISSION TOOL TO
CONTEXTUALIZE STUDENT PERFORMANCE
by
Heather Heimerl Brunold
__________________________________________________________________
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2015
Copyright 2015 Heather Heimerl Brunold
DEDICATION
To my husband and to my son. Every second of effort that was poured into this document
is dedicated to you. Thank you for your unwavering support, endless love and boundless faith in
Who I Am. And to my daddy and my Mother, for believing in me no-matter-what.
ACKNOWLEDGEMENTS
I would like to acknowledge my committee for their tireless and generous support of this
project. A special note of gratitude goes to Dennis Hocevar for his extraordinary statistical
knowledge and for his patience. I never could have accomplished this undertaking without his
substantial and bighearted help. And deep thanks to LKH for her permission to do this project.
I would also like to acknowledge the following people (in alphabetical order) for their
pivotal roles in this undertaking, but also…
Especially for their kindness, generosity, and brilliance:
Sean Early
Ray Gonzales
Bill Gordon
Bob Keim
Jeff Longmate
Matthew Marchak
Eduardo Molina
Emily Shaw
Scott Smith
Harold Urman and his team at Vital Research
Especially for their unique and gifted approaches to education, and for changing my life in so
many un-definable ways:
Sue Borrego
Nona Chiang
Rudy Crew
Avery Goldstein
Mary Immordino-Yang
Deborah Monroe
Helena Seli
Gail Sinatra
Especially for their attention to detail and exceptional editing skills:
Evelyn Felina Castillo
Ilda Jimenez y West
Annemarie Perez
Jane Rosenthal
Especially for her patience and clerical support:
Brenda Mallory
Especially for her compassion and genius:
Betsy Felser
Especially for being there from that first, nervous moment to the triumphant last:
Heather Davis
Especially for their friendship, source of laughter, and love for my son:
Debra Ono
Carrie Walker
Especially for their integrity, honesty and courage (both for different reasons):
Edward Siebert
John Siebert
Especially for her unending prayers and selfless love:
Kate Grimberg Napolitan
Especially for being such a blinding light for how and why:
David Heimerl
Especially for the sacrifice of their lives:
Each and every little tree that died on my behalf
And lastly, but certainly not least:
To all those in my Tuesday night cohort, who began this journey with such incredible
joy and affirming friendship
TABLE OF CONTENTS
Dedication 2
Acknowledgements 3
List of Tables 6
List of Figures 9
Abstract 10
Chapter 1: Overview of the Study 11
Chapter 2: Review of the Literature 31
Chapter 3: Methodology 61
Chapter 4: Results 86
Chapter 5: Summary of Findings, Discussion and Recommendations 114
References 126
Appendix 147
LIST OF TABLES
Table 1.1. Advanced Placement Courses 18
Table 1.2. Concordance Between ACT Composite Score and Sum of SAT Critical 21
Reading and Mathematics Scores
Table 1.3. Concordance Between ACT Combined English/Writing Score and SAT 22
Writing Score
Table 1.4. Grade Conversion 24
Table 1.5. Definition of Ethnic Categories 27
Table 1.6. Description of College Board Regional Definitions 28
Table 3.1. School Count vs. Number of Students by School Type 64
Table 3.2. Description of Ethnicity, Frequency and Percentage 65
Table 3.3. Ethnic Classifications by NCES Category and Research Category 67
Table 3.4. Distribution of AP Courses by Student 70
Table 3.5. College Board Advanced Placement Score Scale 71
Table 3.6. Number of Advanced Placement Exams Reported per Student 72
Table 3.7. AP Groupings by STEM Designation 74
Table 3.8. Description of Cohorts 76
Table 3.9. Description of College Board Regions and School Counts 77
Table 3.10. School Count of Institutions with 5 or More Students in the Sample 78
Population
Table 3.11. Description of Grade Conversion 80
Table 3.12. Description of Variables 81
Table 4.1. Federal Pell Grant Frequency 86
Table 4.2. Descriptive Statistics of Dependent Variables 87
Table 4.3. Description of Student’s Declared Academic Unit 91
Table 4.4. Cross-Tabulation of College Board Region by School Type 94
Table 4.5. Chi-Square Results for Region and School Type 95
Table 4.6. Number of AP Courses by School Type 96
Table 4.7. Chi-Square Results for Number of AP Courses and School Type 96
Table 4.8. Ethnicity, Academic Unit and SES by School Type 97
Table 4.9. Analysis of Variance for Ethnicity, Academic Unit, SES, and Region 99
by School Type
Table 4.10. Fisher’s LSD Post Hoc Tests for Ethnicity, Academic Unit, SES, and 100
Region by School Type
Table 4.11. AP STEM GPA, AP STEM Test Scores, AP Non-STEM GPA, and AP 102
Non-STEM Test Scores by School Type
Table 4.12. Analysis of Variance for AP STEM GPA, AP STEM Test Scores, 103
AP Non-STEM GPA, and Non-STEM Test Scores by School Type
Table 4.13. Fisher’s LSD Post Hoc Tests for AP STEM GPA, AP STEM Test Scores, 104
AP Non-STEM GPA, and Non-STEM Test Scores by School Type
Table 4.14. HSGPA, SAT/ACT, and FYGPA by School Type 105
Table 4.15. Analysis of Variance for HSGPA, SAT/ACT, and FYGPA by School Type 106
Table 4.16. Fisher’s LSD Post Hoc Tests for HSGPA, SAT/ACT, and FYGPA 107
by School Type
Table 4.17. Regression Results of FYGPA on HSGPA, AP STEM GPA, and AP 112
non-STEM GPA
Table 4.18. Residual Scores by School Type 113
LIST OF FIGURES
Figure 4.1. High School GPA 88
Figure 4.2. Highest SAT or ACT 89
Figure 4.3. Freshman year GPA 90
Figure 4.4. RQ 1: Correlations between demographic & academic indicators with 93
school type
Figure 4.5. Regressions on school type 111
Figure A.1. Number of AP courses offered, by school type 147
Figure A.2. Student ethnicity, by school type 147
Figure A.3. Students pursuing STEM majors, by school type 148
Figure A.4. Pell Grant eligible students, by school type 148
Figure A.5. Mean AP STEM course GPA, by school type 149
Figure A.6. Mean STEM AP test scores, by school type 149
Figure A.7. Mean non-STEM AP course GPA, by school type 150
Figure A.8. Mean non-STEM AP test scores, by school type 150
Figure A.9. Mean HSGPA, by school type 151
Figure A.10. Mean SAT, by school type 151
Figure A.11. Mean FYGPA, by school type 152
Figure A.12. Residuals: HSGPA 152
Figure A.13. Residuals: STEM AP GPA 153
Figure A.14. Residuals: Non-STEM AP GPA 153
ABSTRACT
An understanding of how well a high school prepares its students for college success is critical
information for admission officers; however, it is one thing for a high school to claim that it is a
rigorous college preparatory institution and quite another thing to actually be one. The factors
examined in this study were chosen based on their ability to elicit positive student preparation for
and success in college. The factors studied included: Sex, Ethnic Classification (white/Asian and
other), Declared Academic Unit (STEM and non-STEM), Pell Grant eligibility (Yes, No),
Region of U.S., HSGPA, SAT/ACT, FYGPA, Number of AP courses, AP test scores (STEM and
non-STEM), and AP grades (STEM and non-STEM), as well as school and school type. The
sample represents 17,080 students from 3,413 high schools. Of schools in the sample, 70% were
public, 20% were independent private, and roughly 10% were Catholic institutions. Thirty
percent of the student population attended independent private or Catholic high schools, making
an examination of student success by school type viable. The study employed a non-experimental,
correlational approach with an exploratory design. There was no intervention or treatment, other than the exploration of school
type. This study confirms preexisting evidence of grade inflation, but furthers the understanding
of which types of schools are most engaged in the practice of grade inflation. More compelling,
however, was the outcome of the public school examination. Overwhelming and significant
evidence points to the success of public schools, and despite lower demonstrated SES, public
institutions appear to be preparing students better than the “private school effect” might suggest.
CHAPTER 1
OVERVIEW OF THE STUDY
Each fall, public and private high schools across the country prepare their senior classes
for the college admission process. High school counselors work tirelessly to help students excel
as they organize their college applications. At the same time, college admission offices bolster
their staff for the onslaught of applications, with some universities, like the University of
California-Los Angeles, receiving more than 86,000 freshman applications in the fall (UCLA,
2014). Traditionally, assessment data, which include high school grades, standardized test scores,
and extracurricular activities, among other things, have been used to determine a student’s
admissibility (Atkinson & Geiser, 2009; Bishop, 2000; Cuseo, 2013). At most colleges and
universities, these data include quantitative, qualitative, behavioral and perceptual measures
(Cuseo, 2013). However, while qualitative measures (such as short answers, personal statements,
letters of recommendation or interviews) provide a subjective lens to “holistically” read a
student’s application, more quantitative measures, such as grades and test scores, are needed to
form objective evaluations for each application (Starkman, 2013). Perhaps the most necessary
component in the quantitative evaluation process is the high school transcript (Breland, Maxey,
Gernand, Cumming, & Trapani, 2002). This official school document provides broad and deep
information about a student and his or her high school. Official transcripts, which often include
the high school’s published school profile, provide information about the size of the senior class
sometimes student rank, the number of advanced placement courses offered by the school (and
particularly, how many the applicant chose to pursue), cumulative grade-point average (GPA),
and the school’s address, which can reveal clues about the prevailing socioeconomic status (SES)
of the student body. This overall snapshot of an applicant, and of the school he or she attended, is
a heavily relied-upon predictor of a student’s capacity to perform in college (Perfetto, 1999).
Background of the Problem
A student’s high school of origin is an important empirical measure for college admission
officers to consider, and while many colleges and universities might already have such a concept
in practice, no pragmatic methodology currently exists that allows a fair and reliable
school index to be applied appropriately. Traditionally, college admission officers focus their
attention on the achievements of individual students and the particular merits each might bring to
the university if admitted (Breland et al., 2002). However, comparing students and their
coursework from the same school brings with it one set of problems, while comparing students
from different high schools brings with it an entirely new set of issues (Attewell & Domina,
2008). Comparisons across schools are imperfect because each high school is unique, with
different cultures, objectives, goals, expectations, and teaching and grading styles. Yet
identifying and understanding the nature of each individual high school’s curriculum and
potential for rigorous study is imperative to the college admission process, and ultimately is the
focus of this study.
Statement of the Problem
An understanding of how well a high school prepares its students for college success is
critical information for admission officers as they seek to compare and analyze student
preparation and ultimately select students who will graduate from their institution. However,
ample evidence of grade inflation exists (Adelman, 2006; Arenson, 2004; Attewell & Domina,
2008; Bishop, 2000; Godfrey, 2011; Nikolakakos, Reeves, & Shuch, 2012; Zirkel, 1999; Breland
et al., 2002; Camara & Michaelides, 2005; Geiser, 2009; Ishop, 2008; Kirst & Venezia, 2001;
McMillan, 2001). This waning credibility of academic reporting has serious implications. The
transcript is the only official record of high school achievement, which serves as the principal
quantitative information source to colleges (Rigol, 2003; National Association for College
Admission Counseling [NACAC], 2012; Tam & Sukhatme, 2004). If grade inflation is affecting
the reliability of this document, the traditional cornerstone that functions as the certified
communication among parents, students, administrators, teachers and college admission officers
is in danger of being faulty, thus putting the entire structure at risk of losing value (Stanley &
Baines, 2009; Bar, Kadiyali, & Zussman, 2009; Bishop, 2000; Higher Education Research
Institute [HERI], 2003; Jost, 2002; Kirst, 1998; Willingham, Pollack, & Lewis, 2002; Zirkel,
1999; Klopfenstein, n.d.; Nett, 2009; Sadler & Tai, 2007).
Purpose of the Study
The importance of gaining an understanding of influences on college success is well
documented (Burton & Ramist, 2001; Camara & Echternacht, 2000; Geiser & Santelices, 2007;
Harackiewicz, Barron, Tauer, & Elliot, 2002; Kirst & Venezia, 2001; Wyatt, Wiley, Camara, &
Proestler, 2011). The literature, however, lacks a comprehensive understanding of how to
reliably assess which high schools are producing the kinds of scholars that they claim to produce
and which are exaggerating the facts. Essentially, it is one thing for a high school to pronounce
that it is a rigorous college preparatory institution and quite another thing to actually be one
(Willingham & Morris, 1986).
The objective of this study, therefore, is to form a pragmatic understanding of which high
schools are providing reliable transcript information and, therefore, producing bona fide scholars.
By grouping institutions by school type (Public, Catholic, and Independent Private), the research
herein provides a more reliable methodology for understanding the impact that school type
(Public, Catholic, and Independent Private) might have on college preparation.
Research Questions
Currently, education literature lacks information on the best way to test for the reliability
of a transcript for use as a tool in the admission process. Therefore, the research questions in this
study were designed to probe the quality of a student’s high school and the influence it
has on college success, as measured by freshman year grade-point average (FYGPA). Predictive
demographic and academic variables were compared to determine how high school type (Public,
Catholic, and Independent Private) might relate to first-year college success.
Specifically, the purpose of the following questions is to accurately understand high school
grade inflation, defined as the assigning of grades higher than previously assigned for similar
levels of achievement (Grade Inflation, n.d.), and its bearing on student preparation for and
success in college.
Research Question 1: To what extent, if any, are the following variables correlated with
high school type (Public, Catholic, and Independent Private): Gender, Ethnic Classification
(white/Asian versus other), Declared Academic Unit (STEM and non-STEM), Pell Grant
eligibility (Yes, No), Region of United States, HSGPA, SAT/ACT, FYGPA, Number of AP
courses, AP test scores (STEM and non-STEM), and AP grades (STEM and non-STEM)?
Research Question 2: Are there significant institutional level differences by high school
type (Public, Catholic, and Independent Private) on HSGPA grade inflation and AP course grade
inflation (STEM and non-STEM) as indicated by the residuals in the following three regressions:
1. FYGPA on HSGPA,
2. FYGPA on AP STEM GPA,
3. FYGPA on AP non-STEM GPA.
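The residual logic behind these regressions can be sketched in a few lines of code. This is a minimal illustration, not the study's actual analysis: the data values are invented, and the variable names (HSGPA, FYGPA, school type) simply follow the study's terminology. A group whose mean residual is negative earns lower FYGPAs than its HSGPAs predict, which is consistent with grade inflation at that group's schools.

```python
# Sketch of a residual analysis: regress FYGPA on HSGPA, then compare
# mean residuals across school types. All data below are hypothetical.

def ols_fit(x, y):
    """Simple least-squares fit of y on x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

def residuals_by_group(x, y, groups):
    """Fit y on x, then average the residuals within each group.
    A positive mean residual means the group outperforms its predictor;
    a negative one suggests the predictor (e.g., HSGPA) was inflated."""
    a, b = ols_fit(x, y)
    sums, counts = {}, {}
    for xi, yi, g in zip(x, y, groups):
        r = yi - (a + b * xi)
        sums[g] = sums.get(g, 0.0) + r
        counts[g] = counts.get(g, 0) + 1
    return {g: sums[g] / counts[g] for g in sums}

# Hypothetical students: (HSGPA, FYGPA, school type)
hsgpa = [3.9, 3.7, 3.8, 3.5, 3.6, 3.4]
fygpa = [3.2, 3.1, 2.7, 2.6, 3.0, 2.9]
types = ["Public", "Public", "Private", "Private", "Catholic", "Catholic"]
print(residuals_by_group(hsgpa, fygpa, types))
```

Because the regression includes an intercept, the residuals sum to zero overall; the comparison of interest is how that total is distributed across the three school types.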
The research adopted a quantitative approach (Creswell, 2012). A correlational method
with an explanatory design was employed in this non-experimental research. There was no
intervention or treatment, other than the exploration of school type. To obtain a clear
understanding of the population, descriptive statistical analyses were performed on the sample
groups. Measures of dispersion (standard deviations, ranges) and central tendency (means,
medians, and other percentiles) were computed. Lastly, in order to assess the strength and
direction of the association between school type and college success, bivariate correlational
analyses were conducted.
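The descriptive and bivariate correlational steps described above can be sketched as follows. This is a minimal illustration using Python's standard library and invented HSGPA/FYGPA values; the study's actual data and software are not reproduced here.

```python
# Descriptive statistics (central tendency, dispersion) and a bivariate
# Pearson correlation, computed on hypothetical HSGPA and FYGPA samples.
from statistics import mean, median, stdev

def pearson_r(x, y):
    """Bivariate Pearson correlation between two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    varx = sum((a - mx) ** 2 for a in x)
    vary = sum((b - my) ** 2 for b in y)
    return cov / (varx * vary) ** 0.5

hsgpa = [3.2, 3.8, 3.5, 3.9, 3.4, 3.7]   # hypothetical sample
fygpa = [2.8, 3.4, 3.0, 3.6, 2.9, 3.3]

# Measures of central tendency and dispersion
print(f"mean={mean(hsgpa):.2f} median={median(hsgpa):.2f} sd={stdev(hsgpa):.2f}")
# Strength and direction of the association
print(f"r={pearson_r(hsgpa, fygpa):.3f}")
```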
Significance of the Problem
Student success in terms of persistence and academic achievement plays an important
role in whether a student will progress towards college graduation (HERI, 2003; Cuseo, 2013),
and high school rigor plays a key role in setting this stage (Kirst, 1998). Consideration of which
high school factors produce rigor, as defined by a significant and positive influence on student
preparation, is a substantial focus in educational research (Camara & Michaelides, 2005; Cuseo,
2013; Dougherty, Mellor, & Jian, 2006; Gollub, Bertenthal, Labov, & Curtis, 2002; Stearns,
Potochnick, Moller, & Southworth, 2009). However, with the increase in college selectivity
(Dougherty et al., 2006; Stanley & Baines, 2009), the potential for exaggerating the appearance
of rigor, as it is reflected on the high school record, is certainly possible (ACT, 2007; Bar et al.,
2009; Jost, 2002; Kirst, 1998; Zirkel, 1999).
Traditional quantitative measures of student success include high school grade-point
average (HSGPA), rigor of curricular course load (ACT, 2007), and standardized test scores
(Rigol, 2003). However, due to evidence of grade inflation and exaggerated rigor, the traditional
measure of student success — the high school transcript — has become less reliable in recent
years, making student selection more difficult for admission officers (Bar et al., 2009; Jost, 2002;
Kirst, 1998; Zirkel, 1999; Moller, Stearns, Potochnick, & Southworth, 2011). Specifically, grade
inflation and weighted HSGPA are affecting the reliability of a traditional cornerstone in college
admission: the high school transcript (Bar et al., 2009; Bishop, 2000; HERI, 2003; Jost, 2002;
Kirst, 1998; Willingham et al., 2002; Zirkel, 1999; Klopfenstein, n.d.; Nett, 2009; Sadler & Tai,
2007). This record of high school achievement is the principal quantitative information source
provided to colleges by high schools (Rigol, 2003; NACAC, 2012; Tam & Sukhatme, 2004).
Knowing which high schools best prepare students for college is of critical importance to
admission offices, but if the high school transcript has become less reliable, a process for
identifying more valid indicators of student success must be considered, for improving the
thoroughness and accuracy of the admission decision could ultimately increase first-year
academic success and, subsequently, college graduation rates (Camara & Kimmel, 2005; Kuh,
2005; Ishler & Upcraft, 2005; Fairris, Peeples, & Castro, 2011).
Definition of Terms
While examining research in the area of college admission and student preparation, it
became clear that there are countless meanings for terms used by myriad authors, scholars, and
institutions. Therefore, for the purpose of clarity, the subsequent delineations were observed
throughout.
Academic Achievement. Satisfactory or superior levels of academic performance, which
avoid academic probation and/or qualify for academic honors (Cuseo, 2013).
Advanced Placement (AP) Course. An accelerated college-level course created by the
College Board and taught at the high school level using a standardized course syllabus that is
aligned with the corresponding Advanced Placement (AP) examination (Mason, 2010). The AP
Program currently offers 38 courses and exams across seven subject areas (College Board,
2014d). Table 1.1 outlines this in greater detail. Due to underenrollment, the College Board
eliminated Computer Science AB, French Literature, and Latin Literature (College Board,
2013c).
Advanced Placement (AP) Exam. The Advanced Placement program consists of
voluntary culminating examinations administered each year in May (see Table 1.1). The exam score
represents the culmination of college-level work in a given discipline. Completed AP
examinations are scored on a numeric scale from one to five, with five being the highest.
Students who earn qualifying scores of three, four, or five on AP examinations may often be
granted course credit or accelerated placement at colleges and universities. However, policies
regarding the acceptance of AP exams, or the scoring level required for course credit or
accelerated placement, are unique to each college or university. Individual colleges and
universities, not the College Board, grant course credit and/or accelerated college placement
(College Board, 2013c).
American College Testing (ACT). American College Testing is a college readiness
assessment for high school achievement and college admission in the United States. It consists of
four tests: English, Mathematics, Reading, and Science Reasoning, and a voluntary Writing test.
Each test is scored separately on a scale of 1–36, and the composite score is the whole-number
average of the four test scores (ACT, 2015).
Table 1.1
Advanced Placement Courses
AP Capstone
AP Research AP Seminar
Arts
AP Art History AP Studio Art: 3-D Design
AP Music Theory AP Studio Art: Drawing
AP Studio Art: 2-D Design
English
AP English Language and Composition AP English Literature and Composition
History & Social Science
AP Comparative Government and Politics AP Psychology
AP European History AP United States Government and Politics
AP Human Geography AP United States History
AP Macroeconomics AP World History
AP Microeconomics
Math & Computer Science
AP Calculus AB AP Computer Science A
AP Calculus BC AP Statistics
Sciences
AP Biology AP Physics C: Electricity and Magnetism
AP Chemistry AP Physics C: Mechanics
AP Environmental Science AP Physics 1
AP Physics B AP Physics 2
World Languages & Cultures
AP Chinese Language and Culture AP Japanese Language and Culture
AP French Language and Culture AP Latin
AP German Language and Culture AP Spanish Language and Culture
AP Italian Language and Culture AP Spanish Literature and Culture
Note. Adapted from College Board (2013c).
Advanced Placement Exam. The AP exam score is determined by a weighted combination
of the multiple-choice and free-response portions of the exam. Computers
calculate the score of the multiple-choice sections by scanning each answer sheet and tallying the
correct responses. The open-ended, free-response section of the exam is graded at the “AP
Reading” held yearly in the first two weeks in June. Veteran AP teachers and seasoned college
professors manually grade this section of the exam. Scores are combined from the free-response
and multiple-choice portions to form a composite score, which is translated into a 5-point scale
using statistical processes designed to ensure that scores are congruent from year to year
(College Board, 2014c).
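The composite-to-scale translation described above can be sketched as follows. This is purely illustrative: the section weights and cut points are invented for the example, since the College Board's actual values vary by exam and year and are not published in this document.

```python
# Illustrative sketch of mapping a weighted composite onto the 1-5 AP scale.
# The weights and cut points below are hypothetical, not College Board values.

def composite(mc_correct, fr_points, mc_weight=1.0, fr_weight=1.5):
    """Weighted combination of multiple-choice and free-response scores."""
    return mc_weight * mc_correct + fr_weight * fr_points

def ap_score(comp, cuts=(30, 45, 60, 75)):
    """Translate a composite onto the 1-5 scale via (hypothetical) cut points."""
    return 1 + sum(comp >= c for c in cuts)

print(ap_score(composite(40, 20)))  # composite 70.0 -> score 4
```

In the real process, the cut points are set statistically each year so that scores remain congruent across administrations, as the paragraph above notes.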
BESTSCORE. The university in this study combines standardized test scores, using either
the highest SATR+SATM+SATW or the concorded ACT, to form one reported SAT/ACT
BESTSCORE. SAT section scores may be combined across test dates, whereas the ACT score is
the highest single-test-date composite (Guo, Liu, Curley, & Dorans, 2012).
(Concordance tables for conversion can be seen in Table 1.2 and Table 1.3.)
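The BESTSCORE rule can be sketched in code. The concordance entries below are excerpted from Tables 1.2 and 1.3 (only a few rows are included), while the student test sittings are hypothetical; the university's actual implementation is not described in this document.

```python
# Sketch of the BESTSCORE rule: the higher of (a) the sum of the best SAT
# section scores across test dates and (b) the ACT concorded to the SAT scale.

# ACT composite -> SAT CR+M single score (excerpt of Table 1.2)
ACT_TO_CRM = {30: 1340, 29: 1300, 28: 1260}
# ACT English/Writing -> SAT Writing single score (excerpt of Table 1.3)
ACT_EW_TO_W = {30: 670, 29: 650, 28: 630}

def sat_superscore(attempts):
    """Best CR + best M + best W across (cr, m, w) test dates."""
    return (max(a[0] for a in attempts)
            + max(a[1] for a in attempts)
            + max(a[2] for a in attempts))

def bestscore(sat_attempts, act_composite, act_english_writing):
    """Report the higher of the SAT superscore and the concorded ACT."""
    concorded_act = ACT_TO_CRM[act_composite] + ACT_EW_TO_W[act_english_writing]
    return max(sat_superscore(sat_attempts), concorded_act)

# Two hypothetical SAT sittings vs. one ACT sitting (composite 30, E/W 29)
print(bestscore([(650, 700, 640), (680, 690, 660)], 30, 29))  # 2040
```

Here the SAT superscore (680 + 700 + 660 = 2040) beats the concorded ACT (1340 + 650 = 1990), so the SAT total is reported.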
Carnegie Unit. Standardized credits, indicated on the official transcript, which denote the
completion of one academic course that meets once every school day for a full academic school
year (National Center for Education Statistics [NCES], 2013).
Catholic Schools. Parochial schools maintained by the ministry of the Catholic Church
(NCES, 2015; Miller, 2006).
The College Board. In existence since 1955, this non-profit, membership organization
develops and maintains the Advanced Placement program and oversees the management of
yearly AP examinations (Mason, 2010). The organization is composed of more than 5,700
schools, colleges, and universities, and provides services annually to seven million students,
23,000 high schools, and 3,800 colleges. Programs include the SAT, the Preliminary
SAT/National Merit Scholarship Qualifying Test (PSAT/NMSQT) and the Advanced Placement
Program (AP) (Shaw & Mattern, 2009).
College Grade-Point Average (CGPA). All undergraduate coursework attempted by a
student is averaged into a numerical value and reported on the academic transcript as well as the
official academic verification letter. The total grade-points are calculated by multiplying the
numeric equivalent of each grade received (A=4, A-=3.7, B+=3.3, etc.) by the number of graded
Carnegie Units for that course; the CGPA is that total divided by the sum of graded units across
all graded courses (Nagaishi & Slade, 2012; UCLA College of Letters and Science, 2013).
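The CGPA calculation above can be sketched as follows. The grade-point map uses the numeric equivalents stated in the definition; the course list is hypothetical.

```python
# Minimal sketch of the CGPA calculation: each grade's numeric equivalent
# is multiplied by the course's graded Carnegie Units, and the total is
# divided by the sum of graded units.
GRADE_POINTS = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0,
                "B-": 2.7, "C+": 2.3, "C": 2.0}

def cgpa(courses):
    """courses: list of (letter_grade, graded_units) tuples."""
    total_points = sum(GRADE_POINTS[g] * units for g, units in courses)
    total_units = sum(units for _, units in courses)
    return total_points / total_units

# Hypothetical term: an A (4 units), a B+ (3 units), and an A- (4 units)
print(round(cgpa([("A", 4), ("B+", 3), ("A-", 4)]), 2))  # 3.7
```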
Concordance Tables. The College Board and the ACT collaborated in 2006 by
consolidating the scores of all the students who took both the ACT and the SAT, and created two
standardized concordance tables (see Tables 1.2 and 1.3) (College Board, 2014b).
Declared Academic Unit. A department, or unit, with a mission focused largely on
teaching and/or research in a specific field (Southern Illinois University Edwardsville, 1996).
Majors vary by academic unit.
Federal Pell Grant. An award usually granted to undergraduate students. This award is
not a loan and does not have to be repaid. The amount of aid awarded depends on financial need
and the cost of attendance at the institution the undergraduate enrolls in (U.S. Department of
Education [USDOE], 2013). Receipt of a Pell Grant while in college is a dependable gauge of
low SES (Douglass & Thomson, 2008; Wei & Horn, 2009).
First-Year College Success. Satisfactory academic progress as measured by a cumulative
freshman GPA of at least 2.0, and progression from freshman to sophomore year
after completing a full load of coursework or 32 academic units (Willingham & Morris, 1986).
Table 1.2
Concordance Between ACT Composite Score and Sum of SAT Critical Reading and Mathematics
Scores
SAT CR+M (Score Range) ACT Composite Score SAT CR+M (Single Score)
1600 36 1600
1540–1590 35 1560
1490–1530 34 1510
1440–1480 33 1460
1400–1430 32 1420
1360–1390 31 1380
1330–1350 30 1340
1290–1320 29 1300
1250–1280 28 1260
1210–1240 27 1220
1170–1200 26 1190
1130–1160 25 1150
1090–1120 24 1110
1050–1080 23 1070
1020–1040 22 1030
980–1010 21 990
940–970 20 950
900–930 19 910
860–890 18 870
820–850 17 830
770–810 16 790
720–760 15 740
670–710 14 690
620–660 13 640
560–610 12 590
510–550 11 530
Note. Adapted from ACT (2008).
Table 1.3
Concordance Between ACT Combined English/Writing Score and SAT Writing Score
SAT Writing (Score Range) ACT English/Writing Score SAT Writing (Single Score)
800 36 800
800 35 800
770–790 34 770
730–760 33 740
710–720 32 720
690–700 31 690
660–680 30 670
640–650 29 650
620–630 28 630
610 27 610
590–600 26 590
570–580 25 570
550–560 24 550
530–540 23 530
510–520 22 510
480–500 21 490
470 20 470
450–460 19 450
430–440 18 430
410–420 17 420
390–400 16 400
380 15 380
360–370 14 360
340–350 13 340
320–330 12 330
300–310 11 310
Note. Adapted from ACT (2008).
Gender. The state of being male or female understood as a social or cultural construct
rather than a biological one, comprising the shared characteristics and qualities that are linked
with a particular sex, or are determined as a result of one’s sex; this also applies to groups
characterized as male or female (Gender, n.d.). For the purposes of this study, the word gender,
rather than sex, is used when categorizing students.
Grade Inflation. The assigning of grades higher than previously assigned for similar
levels of achievement; over the last 50 years, GPAs have increased by roughly 0.1 to 0.2 grade-
points per decade (Rojstaczer, 2002).
Grade-Point Average (GPA). Standardized measurements used to assess varying levels of
achievement in an educational course are traditionally referred to as grades. This measurement
can be assigned as letters, as a numeric range, as a percentage of a total number, or as a written
descriptor (see Table 1.4). In the United States, grades are conventionally averaged to make up a
grade-point average (GPA) for the marking period. The GPA — for high school or college — is
calculated by taking the number of grade-points a student earned in a given period of time
divided by the total number of credits taken (Camara & Echternacht, 2000; Nagaishi & Slade,
2012).
Grade-weighting. Additional grade-point weight awarded to honors and AP classes,
relative to the usual grading standard for less rigorous classes, to account for their increased
rigor and challenge (Bishop, 2000).
Graduation Rate. The proportion of students who complete the academic requirements
needed to receive a diploma from a degree-granting institution (Boden, 2011).
High School Grade-Point Average (HSGPA). The numerical average of graded high
school coursework appearing on the official academic transcript.
Table 1.4
Grade Conversion

Grade-Point Scale   Letter Grade   100-Point Scale   7-Point Scale   Quality of Performance   Description
4.00                A+                                               Excellent                Exceptional achievement
4.00                A              93+               7
3.70                A-             90-92
3.43                                                 6               Good                     Extensive achievement
3.30                B+             87-89
3.00                B              83-86
2.86                                                 5
2.70                B-             80-82
2.30                C+             77-79             4               Satisfactory             Acceptable achievement
2.00                C              73-76
1.70                C-             70-72             3               Poor                     Minimal achievement
1.30                D+             67-69
1.14                                                 2
1.00                D              65-66
0.70                D-                               1
0.00                F              0-64              0               Failure                  Inadequate achievement
High School Transcript Study (HSTS). A recurring study established by the National
Center for Education Statistics to provide the U.S. Department of Education and other
educational policymakers with information about up-to-date course-taking patterns and offerings
in the nation’s public, private and religious high schools (NCES, 2013).
Independent Private High School. A non-Catholic private school that relies on a combination of tuition revenue, gifts in kind, and sometimes the investment yield of an endowment. This type of private school is not financed or operated by national or local government and is not dependent on taxpayer support; rather, an elected board of directors governs all operations (National Association of Independent Schools, 2015).
International Baccalaureate (IB). A comprehensive two-year pre-college program offered in high school, which uses a standardized liberal arts curriculum and leads to a certified IB diploma. Successful students develop enhanced critical thinking skills, participate in extracurricular activities and community service, and complete a comprehensive research paper, and they may be granted accelerated credit at colleges and universities (Byrd et al., 2007).
Introductory Courses. College freshmen are required to take general education and
prerequisite courses. Depending on the institution, it is sometimes possible for undergraduate
students to receive credit for these introductory courses through prior successful achievement in
Advanced Placement or IB coursework (Gollub et al., 2002).
Parochial School. A religious school that, in addition to conventional education, provides a Catholic education, and that is part of, and run by, a Catholic parish (Broughman, Swaim, Parmer, Zotti, & Dial, 2014).
Persistence. The extent to which students remain, re-enroll, and continue forward toward degree completion (Cuseo, 2013).
Private School. Also known as independent schools, these non-public schools are self-governed and not overseen by local, state, or national governments. Private schools select their students independent of public policy and charge tuition rather than relying on public taxation or government funding to finance their institutions (California Department of Education, 2009).
Public School. Schools founded or endowed for public use and subject to public
management or control (Public School, n.d.).
Race/Ethnicity. For the purposes of this paper, the word ethnicity is used to denote racial or ethnic identity. Specifically, ethnicity relates to the status of one’s membership in a group regarded as having common descent or sharing a common national or cultural tradition (Ethnicity, n.d.). For this reason, the word ethnicity was used to define seven mutually exclusive categories: Caucasian, black/African American, Hispanic, Asian, Native Hawaiian/Pacific Islander, International, and unknown/decline to state/other (NCES, 2014). Table 1.5 defines these categories further.
Region. One of six geographic areas used in gathering and reporting data as defined by
The College Board. See Table 1.6 for College Board Regional Definitions.
Religious School. An institution that stresses the importance of particular beliefs related to a corresponding religion (Religious Education, n.d.). School types include Catholic, Conservative Christian, Baptist, Lutheran, Jewish, Episcopal, Seventh-day Adventist, Calvinist, and Friends, among others.
Retention Rate. The percentage of students retained relative to those at risk of leaving. A high retention rate is a positive outcome for the institution (Astin, 1997).
Table 1.5

Definition of Ethnic Categories

Hispanic or Latino: A person of Cuban, Mexican, Puerto Rican, South or Central American, or other Spanish culture or origin, regardless of race.

American Indian or Alaska Native: A person having origins in any of the original peoples of North and South America (including Central America) who maintains cultural identification through tribal affiliation or community attachment.

Asian: A person having origins in any of the original peoples of the Far East, Southeast Asia, or the Indian Subcontinent, including, for example, Cambodia, China, India, Japan, Korea, Malaysia, Pakistan, the Philippine Islands, Thailand, and Vietnam.

Black or African American: A person having origins in any of the black racial groups of Africa.

Native Hawaiian or Other Pacific Islander: A person having origins in any of the original peoples of Hawaii, Guam, Samoa, or other Pacific Islands.

Caucasian: A person having origins in any of the original peoples of Europe, the Middle East, or North Africa.

Race/ethnicity unknown: The category used to report students whose race and ethnicity are not known.

Note. Adapted from NCES (2014).
Table 1.6
Description of College Board Regional Definitions
College Board Region U.S. State or Territory
Middle States DC, DE, MD, NJ, NY, PA
New England CT, ME, MA, NH, RI, VT
Southwest AR, NM, OK, TX
Midwest IL, IN, IA, KS, MI, MN, MO, NE, ND, OH, SD, WV, WI
South AL, FL, GA, KY, LA, MS, NC, SC, TN, VA, PR, VI
West AK, AZ, CA, CO, HI, ID, MT, NV, OR, UT, WA, WY
Rigor. Prolonged exposure to complex academic content involving intellectually
challenging and engaging curriculum designed to promote higher order thinking (ACT, 2007;
Blackburn, 2012; Housman, 2006).
SAT Reasoning Test. The SAT is a standardized test considered by most college admission offices in the United States as a way to assess an applicant’s readiness for college. The exam is administered by the College Board but is designed and scored by the Educational Testing Service. The current SAT Reasoning Test, introduced in 2005, costs $50 and takes three hours and 45 minutes to complete. The exam comprises three sections (Mathematics, Critical Reading, and Writing), each scored from 200 to 800 points; composite scores therefore range from 600 to 2400. Taking a standardized entrance exam such as the SAT (or its competitor, the ACT) is commonly required for freshman entry to most universities in the United States (College Board, 2013a).
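The composite scoring just described reduces to a simple sum with range checks, sketched below; the section scores used are hypothetical examples.

```python
# Sketch of SAT composite scoring: three sections, each scored
# 200-800, summed to a composite ranging from 600 to 2400.
def sat_composite(math, critical_reading, writing):
    for score in (math, critical_reading, writing):
        if not 200 <= score <= 800:
            raise ValueError("each section score must be 200-800")
    return math + critical_reading + writing

print(sat_composite(640, 590, 610))  # 1840
```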
SAT Subject Tests. Twenty optional, subject-specific, multiple-choice, standardized tests designed to improve a student’s credentials for college admission in the United States. Sometimes called the SAT II: Subject Tests, these exams are taken in addition to the SAT I; students may elect to take up to three (College Board, 2013a).
Self-reported Grades. Individual course grades reported directly by the student to the university through the admission process are considered self-reported. Kuncel, Credé, and Thomas (2005) suggest that self-reported grades closely resemble actual grades for high-achieving students but are less likely to reflect the actual scores of students with lower achievement.
SES. Socioeconomic Status (Palardy, 2013).
Sex. The division of humans into two main categories (male and female) and sorted as
such by their reproductive functions (Sex, n.d.).
STEM. An acronym denoting the academic disciplines of science, technology,
engineering, and mathematics (Gonzales & Kuenzi, 2012).
Transcript. An official school record that documents the courses taken, grades awarded, graduation status, and attendance of all students in a particular school. Scores on assessments such as the PSAT, SAT, and ACT, as well as honors, are often also included (NCES, 2013).
Transfer Student. Any student who has enrolled in college courses (after the completion
of high school) and then relocates to another college or university before graduating.
True Freshman. Any student who enrolls in college (after the completion of high school)
before completing any courses, at the college level, at another institution.
Weighted High School Grade-Point Average. A grade-point average in which designated courses carry an extra grade-point, allowing students to earn an average above the traditional perfect average of 4.0. Students who enroll in AP and IB classes receive extra grade-points; thus, their maximum GPA is 5.0, whereas the maximum un-weighted GPA is only 4.0 points (Nagaishi & Slade, 2012).
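The weighting convention described above can be sketched side by side with the un-weighted average; the courses and grade values are hypothetical examples on the conventional four-point scale.

```python
# Sketch of un-weighted vs. weighted GPA: AP/IB courses receive one
# extra grade-point, so the weighted maximum is 5.0 rather than 4.0.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0}

def gpas(courses):
    """courses: list of (letter_grade, credits, is_ap_or_ib) tuples.
    Returns (unweighted, weighted) grade-point averages."""
    total_credits = sum(c for _, c, _ in courses)
    unweighted = sum(GRADE_POINTS[g] * c for g, c, _ in courses) / total_credits
    weighted = sum((GRADE_POINTS[g] + (1.0 if ap else 0.0)) * c
                   for g, c, ap in courses) / total_credits
    return unweighted, weighted

# An A in AP Biology (4 credits) and an A in regular English (4 credits):
print(gpas([("A", 4, True), ("A", 4, False)]))  # (4.0, 4.5)
```

The un-weighted figure stays capped at 4.0 while the weighted figure rises with each AP or IB course, which is why the two averages are not directly comparable across schools with different weighting policies.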
Organization of the Dissertation
Chapter 1 of the dissertation has presented the introduction, the background of the
problem, the statement of the problem, the purpose of the study, the questions to be answered,
the significance of the study, a brief description of the methodology and the definition of terms.
Chapter 2 is a review of relevant literature. It addresses the following topics: (1) college
admission, (2) high school rigor, and (3) first-year college success. These topics will be
presented in an integrative theoretical framework, supporting the notion of admission criteria and
their relationship to first-year college success.
Chapter 3 presents the methodology used in the study, including the research design;
population and sampling procedure; and the instruments and their selection, together with
information on validity and reliability. The chapter also describes the procedures for data
collection and analysis.
Chapter 4 presents the results of the study. Chapter 5 discusses and analyzes the results,
culminating in conclusions and recommendations.
CHAPTER 2
REVIEW OF THE LITERATURE
The primary function of this review was to analyze the literature relating to school type (Public, Catholic, and Independent Private) and how it might impact student preparation for and success in college. Secondarily, the review examines the literature on which high school variables predict student success in college and which might be most helpful in the college/university admission process. A review and analysis of pertinent research is organized in the areas of college admission, demonstrated performance prior to college/high school preparation, and college persistence and success. Relating each of these topics to school type is the objective of the review.
College Admission
Ideologies that shape the admission processes, objectives of student selection, and the
prediction of academic success have been well documented (Atkinson & Geiser, 2009;
Clinedinst, Hurley, & Hawkins, 2013; Espenshade, Hale, & Chung, 2005; Geiser, 2009; Rigol,
2003). There is at least a 75-year history of intentional college admission leadership beginning in
1937, when a collection of college and university professionals gathered with high-school
counselors to establish a common professional code of ethics within admission counseling
(NACAC, 2012). This initial gathering resulted in the National Association for College
Admission Counseling (NACAC) and the membership organization survives today as the
foremost leader in the ethical recruitment and management of college admission records and
student applicants, as well as the subsequent matriculation of the admitted students into
postsecondary education (NACAC, 2012). As a result, vast research exists surrounding college
admission in the United States (Kiley & Gable, 2013), but still there are almost as many ways to
approach an admission file as there are institutions to read them (Rigol, 2003).
Selection Criteria
The selection criteria at each college are unique (Clinedinst et al., 2013). Differences
exist between public and private, large and small, and highly selective versus those with open
enrollment (NACAC, 2009). However, Perfetto (1999) distinguishes between eligibility and selection. He defines eligibility-based models as those that use public criteria, which are published and objective, to determine if a student is admissible (Perfetto, 1999). By contrast, selection models compare applicants not only against pre-established sets of admission criteria but also to one another, resulting in offers being given to some while others are denied (Perfetto, 1999). Regardless of the configuration, most schools consider the capacity by which applicants
will be able to perform, benefit from, and contribute to the academic and social quality of their
school over the long term (Perfetto, 1999). Ultimately, it is the uniqueness, regardless of
approach, that makes building a freshman class an art, as well as a science (NACAC, 2009).
Capacity to perform in higher education.
Measuring a student’s capacity to perform, and thus predict the student’s likelihood of
college success, is a major objective in college admission (Perfetto, 1999; Rigol, 2003). This
selection criterion, associated with the “meritocracy” perspective (Rigol, 2002, p. 2), centers on
an applicant’s intellectual achievements, both in and outside the classroom (Andrews, 2004;
Bausmith & Laitusis, 2012; Chajewski, Mattern, & Shaw, 2011; Cuseo, 2013).
With regard to classroom pursuits, academic excellence is traditionally gauged by
evaluating an applicant’s high school transcript (Stearns et al., 2009). However, this assessment
of school performance will often go beyond the cumulative high school grade-point average
(HSGPA). The types of courses completed and the trend of grades throughout high school traditionally deepen the admission professional’s understanding of the bigger academic picture
(Andrews, 2004; Attewell & Domina, 2008; Bishop, 2000; Rigol, 2002, 2003; Warburton,
Bugarin, & Nuñez, 2001; Willingham & Morris, 1986). Likewise, the rigor of the student’s
coursework is important, noting college preparatory coursework such as International Baccalaureate (IB) or Advanced Placement (AP) classes (Geiser, 2009; Lichten, 2000; Sadler
& Tai, 2007; Stearns et al., 2009). In addition, class rank is sometimes indicated on the transcript
demonstrating a hierarchy of HSGPA within a graduating cohort (Kuh, 2001).
In addition, understanding the distribution of grades awarded at a particular school could
also be a useful measure (Astin, 1997; Camara & Kimmel, 2005; Cuseo, 2013; California
Department of Education, 2009; Broughman et al., 2012; Espenshade et al., 2005). An intelligent
approach to the issue of grade distribution has become increasingly important in light of
evidence of grade inflation in high school over the past decade (Arenson, 2004; Attewell &
Domina, 2008; Broughman et al., 2012; Pattison, Grodsky, & Muller, 2013; Zirkel, 1999). While
particular attention has been given to the student’s high school performance, questions emerge as
to the quality and reliability of these measures.
Confirmation of student achievements beyond grades and test scores can be important
measures of capacity when looking at college performance (Clinedinst & Hawkins, 2008; Kiley
& Gable, 2013; Kilpatrick, 2010; NACAC, 2012; Rigol, 2003; Trustees of Indiana University,
2013). These factors can include noteworthy research or unique projects, a proven track record of
intellectual curiosity and interests outside of the four walls of the classroom, and purposeful
academic challenges and honors (NACAC, 2009; Perfetto, 1999; Rigol, 2002, 2003). Elements
of a student’s application, such as the interview, personal essay, resume or letter of
recommendation all help to highlight these accomplishments (Rigol, 2003).
Once verification of an applicant’s ability to successfully compete in the college
classroom has been established, personal qualities associated with “character” can offer another
perspective when evaluating an application to college (Willingham & Morris, 1986).
Scholarship, leadership, and significant accomplishments contribute to quantitative measures for
character assessment (Perfetto, 1999; Rigol, 2002, 2003). For the purpose of this study, however,
personal qualities that weigh into a student’s pre-collegiate qualifications will not be measured.
These factors might include creative accomplishments, community service, athletic
achievements, work experience, internships and leadership positions, and other numerous
channels used to demonstrate an applicant’s self-worth, like the personal statement, high school
references or interviews. Furthermore, character, as well as compassion, empathy, and social
consciousness are attributes that might reasonably influence a student’s capacity to perform
(Geiser, 2009; Kohn, 2008; Yair, 2000), but are well beyond the scope of this research.
Capacity to benefit from higher education.
Higher education and how it will affect the prospective student in the long term is an
important consideration to many educators, school administrators and admission professionals
(Camara & Kimmel, 2005). To look beyond high school achievement to predict what might be
possible if a prospective student is awarded admission is a subtle evaluative measure in the
application process (Perfetto, 1999; Rigol, 2003; Stearns et al., 2009). Attempting to identify whether a potential student can interact effectively within the institution’s culture and learning environment, as well as benefit from healthy peer competition, is an important factor in the long-term success of the college or university (Fairris et al., 2011; Shaw & Mattern, 2009; Wyatt et al., 2011).
Managing academic adversity has also been identified as a component that is important
for students to demonstrate (Schoeffel, van Steenwyk, & Kuriloff, 2011). Differences in
educational opportunities, teacher bias, socioeconomic status (SES), sex, and race are just
some of the factors many students have had to rise above in order to make the cut in the
competitive college selection process (Altonji, Blom, & Meghir, 2012; Altonji, Elder, & Taber,
2005a; Attewell & Domina, 2008; Kiley & Gable, 2013; Kuh, 2001; Randall & Engelhard, 2010;
Stearns et al., 2009). The academic qualifications accumulated by a prospective student are
undoubtedly significant in the application process, but without context such credentials are
merely static measures of what someone has achieved (Perfetto, 1999). A holistic appraisal of an
applicant’s collegiate preparation may be equally important when considering their capacity to
benefit from higher education (Cuseo, 2013; Kirst, 1998; NACAC, 2009; Rigol, 2002, 2003).
Though not without controversy, Bowen and Bok (1998) speak effectively to this issue on
capacity to benefit:
To begin with, it is not clear that students who receive higher grades and test scores have
necessarily worked harder in school. Grades and test scores are a reflection not only of
effort but of intelligence, which in turn derives from a number of factors, such as
inherited ability, family circumstances, and early upbringing, that have nothing to do with
how many hours students have labored over their homework. Test scores may also be
affected by the quality of teaching that applicants received or even by knowing the best
strategies for taking standardized tests, as coaching schools regularly remind students and
their parents. For these reasons, it is quite likely that many applicants with good but not
outstanding scores and B+ averages in high school will have worked more diligently than
many other applicants with superior academic records. (p. 227)
While already difficult to predict with any certainty, the convoluted circumstances
identified by Bowen and Bok (1998) make the job of admission officers even more arduous. The
task of identifying prospective students with a capacity to contribute and engage as a member of
the school’s student body is no small effort (Rigol, 2003). Even loftier is the struggle to identify
a future graduate who will use their entry into higher education as a way to help society as a
whole (Perfetto, 1999). This aspect of admitting students who seem likely to pay their education
forward by making esteemed and distinct contributions to a future profession or even to the good
of the greater society (Bowen & Bok, 1998) is beyond the scope of this paper, but certainly
worth noting.
Common Practices and Traditional Measures
There are nearly as many different approaches to the practice of admission as there are
institutions (Rigol, 2003). Summarizing the individual processes, delimited by school policy,
would be impossible. A discussion of internal practices, an explanation of the student search
services, an acknowledgment of the influence of college rankings, and an attempt to summarize
race conscious admission practices are included in this literature review. Each plays a significant
role in the scaffolding of college admission as a profession.
Internal practices.
Specific institutional needs often define the constructs for bringing in a new freshman
class. The institutional characteristics, as defined by the college or university’s senior leadership,
define how the admission staff will identify and collect information about prospective applicants.
Based on these criteria, the admission staff usually collects a wide-ranging index of
socioeconomic information, sex, home and school address, declared religion, racial/ethnic
orientation, etc. Many schools accomplish this by way of a student search service, which will be
discussed within this dissertation. In addition to these basic criteria, admission staff frequently consider whether a prospective student is previously affiliated with the institution (e.g., legacy), whether they have any recognition of outstanding achievement, whether there are any indications of special talents (e.g., musical, athletic, creative), what academic and career interests they might have, and how great the potential is for key donations to be made to the institution’s endowment (Perfetto, 1999; Rigol, 2003). Often, the application or documents submitted by the
applicant’s high school provide much of this information, but contacts between admission staff
and other institutional benefactors and contacts, or even the applicants themselves also help to
fill in the gaps (Perfetto, 1999; Rigol, 2003).
Balancing institutional needs while also managing delicate secondary school
relationships can be as much of an art as a science. For some institutions, observing the judicial
or institutional mandates is paramount. Public universities, for example, may be bound by the
requirements of their supporting political entity. Religious institutions, on the other hand, may
build their freshman class largely from members of the school’s religious affiliation (Grogger &
Neal, 2000). Alumni, benefactors, or certain prominent constituencies are oftentimes influential
as well.
At the same time, some colleges and universities can also be intentional about creating a
particular kind of learning community (Grogger & Neal, 2000). Liberal arts colleges, for
example, may seek a certain disciplinary balance. Another type of school might put emphasis on
broad geographic representation. Likewise, decisions may be constructed around specific gaps in
the existing student body — perhaps a distinguished opera singer is needed for a floundering
music school or a left-handed pitcher is prized for the baseball team. Sometimes a specific
academic attribute, preparation, or achievement will be given extra deliberation, in spite of other
minimal qualifications (Rigol, 2003). The myriad factors at work behind the scenes in the formation of a college or university’s freshman class are boundless, which is why college admission professionals might benefit from a more concise and constructive methodology.
Student search service.
Student search services are proprietary tools well known to admission professionals. For example, over 1,100 scholarship programs, colleges and universities use the College Board Student Search every year (College Board, 2014e). This kind of service allows colleges and universities to target marketing efforts toward students who are interested in higher education. This personal and preferential direct mail allows prospective students to hear from colleges and scholarship programs that are seeking students with their individual skill set (College Board, 2014e). When students take a standardized exam, such as the ACT, SAT, PSAT/NMSQT, or AP, they can elect to participate in the student search service during the registration process; however, they must be clearly informed in order to do so.
The service, while free to the student, costs 37 cents per name for the colleges. The
student search service fee allows admission professionals to identify students who might be a
good fit with their programs, scholarships, and special activities (College Board, 2014e). By amassing a comprehensive and targeted population, schools are able to hone their recruiting efforts. While some students may benefit from direct mail, only recognized colleges, universities, and specified nonprofit groups are permitted access to student information through the College Board’s Student Search (College Board, 2014e). This limits how schools may reach out to students and narrows the kinds of schools certain students will hear from. Furthermore, the
federal government mandates that certain information be kept confidential, such as telephone and
social security numbers (College Board, 2014e).
College Performance and Advanced Placement
With college admission becoming increasingly competitive in recent decades, one method by which students attempt to differentiate themselves is participating in the AP Program (Bound, Hershbein, & Long, 2009). Klopfenstein and Thomas (2009), however, found that AP involvement is not a good predictor of first-semester college GPA or of retention into the second year of college. State policies that direct all school constituencies to include the AP experience in their curriculum, and the common practice of university admission professionals giving preferential treatment to students with AP coursework, are therefore cause for concern (Klopfenstein & Thomas, 2009). Geiser and Santelices (2004) reported that the quantity of advanced placement and honors courses taken by a student was not a significant predictor of college success; however, AP exam scores were clearly and significantly related to college performance. Later, Camara and Michaelides (2005) criticized the methodology of Geiser and Santelices (2004), but with admission staff using the quantity of AP coursework as a barometer rather than the hypothetically more useful AP exam scores (Shaw, Marini, & Mattern, 2013), it could be valuable to reconsider ideas like those posed by Geiser and Santelices (2004).
Unfortunately, 30 to 40% of students in AP courses nationally do not take the associated AP
exam (Gollub et al., 2002). The fact that the college admission process so heavily considers
participation in the AP curriculum, but not the exams themselves, provides considerable
incentive for students to partake in as many classes as they can manage.
Demonstrated Performance Prior to College/High School Preparation
Understanding the extent to which high school rigor impacts college degree completion is a topic of interest to virtually all stakeholders, such as parents, students, administrators, and policy makers (Atkinson & Geiser, 2009; Berkowitz & Hoekstra, 2011; Cuseo, 2013). It is presumed by many in the admission profession that if a high school provides relevant and rigorous learning opportunities — from instructors who are qualified to help students achieve high standards — a student has a better chance at college success; however, this presumption lacks empirical evidence (Godfrey, 2011). It is not uncommon, when comparing schools, to see that despite apparently equal course grades, achievement test scores show great disparities in student ability (Jost, 2002), yet there is comparatively little empirical evidence that relates a high school’s quality to a student’s persistence in college (Altonji et al., 2012). Therefore, the problem of non-equivalence, paired with evidence of grade inflation, leaves admission decisions, which are meant to be based on achievement, in question (Godfrey, 2011).
Grade-Point Average, Grades and Transcripts
Research demonstrates that grades are not comparable across courses and that there are substantial differences between grades assigned by different instructors for the same papers and work (Camara, Kimmel, Scheuneman, & Sawtell, 2003; Willingham et al., 2002). Most significant research surrounding grades in the United States comes from the College Board (Breland et al., 2002; Cuseo, 2013). For example, the College Board surveyed 18,000 high schools, and 85% reported that responsibility for grade distribution rested with the teachers (Camara et al., 2003). A review of the literature on grading practices and the meaning of grades suggests that grades have different connotations in different settings and are difficult to compare school to school and classroom to classroom (Camara & Echternacht, 2000).
There is no standardized system of grading in the United States (USDOE, 2008). Based
on a biannual survey from the College Board, 91% of secondary schools (grades 10, 11, and/or
12) whose students took the SAT I reported using a traditional A–F grading system on a four-
point scale, but even though this discrete evaluation is the most widespread grading structure in
the United States, there is no common standard that makes the system congruent (Camara, 1998).
Eight percent of schools discourage discrete evaluation in favor of a broad and lengthy
written evaluation (Camara et al., 2003). Further, 74% of public school districts report using
standardized grading policies across high schools, while 90% compute individualized school
HSGPA and only 81% calculate a student rank (Camara et al., 2003). The largest disparities
concern the use of A+ (39% of schools use this grade), omitting courses from the calculation of
HSGPA (43% of schools omit courses), and the definition of the lowest grade for which credit is
given (53% say D- while 38% use a higher “lowest” grade) (Camara et al., 2003). Nearly 85% of
schools allow teachers to award any grade distribution they desire, 7% of schools issue broad
guidelines about quantities of grades given (for example, about ¼ A, ¼ B...), and 3.5% of
schools have strict grading guidelines (Camara et al., 2003). Further, Camara et al. (2003) report that teachers base grades on several factors in addition to achievement and use them for a variety of purposes, such as prerequisites for more advanced classes. Traditional grades have myriad definitions, yet teachers use individual grading systems with very little scientific combination or weighting practice to standardize the outcomes across schools (Laska & Juarez, 1992). Confounding the problem, high schools use different criteria for course titles, credit assignment, and rubrics (Nord et al., 2011).
A traditional measure of academic achievement in the United States is grade-point
average (GPA). To calculate GPA, both grade information and course credit information are
needed. This type of information, however, varies considerably among schools and between states and regions. It is, therefore, essential to regulate the grading process so that comparisons among students, as well as among schools, can be made. A course with 120 hours of instruction defines a Carnegie unit, and currently this is the only standardized credit information used (NCES, 2011).
Grade inflation.
Ziomek and Svec (1997) studied the fact of grade inflation in American high schools.
Their study examined ACT’s student history files longitudinally from over 5,000 college-
preparatory high schools and revealed grade inflation (Ziomek & Svec, 1997). Data on freshmen
at 27 public colleges in Georgia showed that while the average SAT scores of freshmen were wide-ranging, the college grade-point average (CGPA) remained stable (Kuh, Kinzie, Schuh, &
Whitt, 2010). Grade inflation has been problematic since the late sixties (Kuh et al., 2010). In
addition, widespread grade inflation in high schools over the past decade has been well
documented and discussed (Camara et al., 2003; Shaw & Mattern, 2009; Woodruff & Ziomek,
2004).
With grade inflation on the rise, the high school diploma reliably signals only credit accrual and seat time, thus limiting knowledge of what students have actually studied or what they can potentially do with what was learned (Wiggins, 1989). Stanley
and Baines (2009) ponder the value of the grade: “The grade is the cornerstone for
communication among parents, students, administrators and teachers. If that cornerstone is
faulty, the entire structure may eventually collapse” (p. 227).
Accelerated Learning
To mention a demanding high school course load without also mentioning Advanced
Placement (AP) or International Baccalaureate (IB) is almost impossible (Hechinger Institute,
2009). While the increase in students participating in the AP and IB curriculum has been
significant, evidence suggests that attention should rather be placed on how many of those
students pass the AP or IB exam (Hechinger Institute, 2009). In 2005, over half of high school graduates who received a passing grade (A or B) in physics were not prepared for college science, and the same was shown to be true for Algebra II (ACT, 2007).
The College Board.
The College Board is a not-for-profit membership organization comprising more than 5,400 schools, colleges, and universities, serving college admission and guidance, assessment, financial aid, and enrollment (College Board, 2013a). The best-known College Board programs are the SAT, the PSAT/NMSQT, and the Advanced Placement (AP) program
(College Board, 2013a).
Advanced Placement coursework.
There is an intentional and reciprocal relationship between the Advanced Placement (AP)
Program and the secondary schools and colleges (College Board, 2013b). The program first emerged on the educational scene in May of 1954, and the College Board gained control of the process in 1955 (Rothschild, 1999). The Educational Testing Service (ETS) then became the administering agency for the examinations on the College Board's behalf (Rothschild, 1999).
At the time of this study, the AP Program offered 34 courses (College Board, 2014d). In 2011, AP courses in the four core subject areas (English language arts, mathematics, science, and social studies) were offered in over half (53.8%) of American public high schools. Unfortunately, research in this area is insufficient with respect to private secondary schools.
Advanced Placement courses are designed to mimic beginning college courses and
increase the overall rigor of the high school curriculum (Hertberg-Davis & Callahan, 2008).
College faculty work with the College Board to align AP course standards through development, validation, and scoring processes, and use correlational studies to ensure that AP exam performance corresponds to similar performance in the comparable college course (Hughes, 2013). At the time of this study, however, there was very little peer-reviewed literature comparing performance in the AP course with the analogous college course.
High school transcripts play an important role in the college admission process (Rigol,
2003), but many high schools add extra points to a student’s basic HSGPA as a reward for
advanced and honors coursework (Camara et al., 2003; Clinedinst et al., 2013). Despite this extra-point advantage, the actual number of AP courses a pupil took did not predict college
success (Geiser & Santelices, 2004). Rather, there is strong evidence indicating a correlation
between passing at least one AP exam and an increased graduation rate from college (Geiser &
Santelices, 2004). Perfetto (1999) found that low-income students had a 45% increase in five-
year graduation rates after passing at least one AP exam. This positive correlation between AP
exam scores and graduation rates was demonstrated in comparison to the low-income students’
counterparts who did not pass an AP exam (Perfetto, 1999).
Similarly, Tierney, Colyar, and Corwin (2003) found that an overwhelming majority of high school students who enroll in AP courses persevere and complete a bachelor's
degree. This success rate included first generation college students (Tierney et al., 2003).
However, a substantial difference in persistence rate emerged between those who took AP
coursework and those who did not (Willingham & Morris, 1986). Twenty-six percent of the non-
AP students had dropped out of college by the fourth year, as opposed to only 15% of AP
students (Willingham & Morris, 1986). Willingham and Morris (1986) note that this discrepancy
in early graduation rates is likely associated with AP students receiving college credit, and not
necessarily with their being academically superior.
Advanced Placement benefits.
Research in support of the Advanced Placement program is abundant, positing myriad
benefits to the whole of education (Casserly, 1986; Mattern, Shaw, & Ewing, 2011; Morgan &
Klaric, 2007; Santoli, 2013; Willingham & Morris, 1986). Research supporting participation in the Advanced Placement program commonly expounds benefits surrounding exposure to rigorous curriculum and its long-term impact on college success, as well as evidence that the completion of AP courses enhances the college application (Mason, 2010). Superior academic performance by AP students over their non-AP counterparts (Willingham & Morris, 1986) is a key benefit of the AP Program. Compelling research also indicates that the use of AP credits, after
passing the AP exam, can reduce the college course load and prerequisites (Gollub et al., 2002;
Mason, 2010).
The College Board has sponsored considerable research on the AP Program since the
early 1980s (Camara, 2011). A recent publication summarizes the positive impacts of the AP
Program taught at high schools that are coupled with strong AP policies at corresponding
colleges (College Board, 2013c). Students who complete an AP course and earn college credit
(by subsequently passing the AP exam), perform well in successive college courses in that same
discipline (Morgan & Klaric, 2007). AP students who have passed the AP exam are more likely
to major in their AP subject or a related discipline (Mattern et al., 2011). AP students who pass the exam complete more college coursework in the same discipline than their non-AP counterparts (Murphy & Dodd, 2009). AP students who have passed the corresponding AP exam are more
likely to graduate from college within five years of leaving high school (Dougherty et al., 2006).
Lastly, AP students who complete an AP course (and earn credit for the corresponding
introductory college course) are more likely to develop an interest in STEM subjects that leads to
a STEM major in college (Morgan & Klaric, 2007).
Advanced Placement criticism.
Despite its popularity and influence, some have been critical of the Advanced Placement
program. Critics express uneasiness that too much emphasis is placed on facts, memorization and
the mechanics of test preparation, rather than the cultivation of critical thinking skills
(Klopfenstein, 2003; Klopfenstein & Thomas, 2005; Willingham & Morris, 1986; Gollub et al.,
2002; Klopfenstein & Thomas, 2009; Kyburg, Hertberg-Davis, & Callahan, 2007; Rigol, 2003).
Criticism includes concern that the AP Program favors privileged students with access to the best
academic preparation, and that this in turn creates a systematic bias against under-represented
minorities (Furry & Hecsh, 2001). In addition, admittance to these classes may be restricted as a
result of faculty experience (Kyburg et al., 2007), school resources (Willingham & Morris,
1986), or administrative opposition to the accelerated curriculum (Furry & Hecsh, 2001).
Critics also argue that even for students with a strong academic background, the breadth
and depth of the AP curriculum is too advanced (Klopfenstein, 2003). AP courses are designed
to cover a broad swath of curriculum in a considerably short timeframe, all with the end goal of
preparing for the AP exam (Klopfenstein & Thomas, 2005). Gollub et al. (2002) argue that the rigid and rigorous curriculum may limit the instructor's ability to respond to individual learning styles, thus inhibiting students' long-term academic success. Further, AP
teachers express concerns that too many AP courses are taken by some students at one time,
resulting in less attentive and engaged intellectual work and more focused effort on earning a
higher grade-point average (Furry & Hecsh, 2001).
Student access to AP courses varies from school to school and from teacher to
teacher (Klopfenstein, 2003). In 2002, the College Board surveyed 31,811 AP teachers and
found that over half either used previous course grades, teacher recommendations, or prerequisite
course requirements as a way to assess students for their AP course (College Board, 2002).
Some of the most compelling research raising concern about the AP Program comes from Klopfenstein and Thomas (2009). Their inquiry casts doubt on the causal relationship between AP curriculum and college success, finding no evidence that AP course taking is superior to the non-AP curriculum, regardless of race or family income (Klopfenstein & Thomas, 2009). What the AP examinations actually measure leaves reservations about the kinds
of thinking the examinations elicit (Gollub et al., 2002).
Lastly, the fact that many students fail to complete the comprehensive AP Program and to follow through with the post-course exam fuels criticism of the program (Furry & Hecsh, 2001). Moreover, the timing of test result delivery to colleges, in mid-July, well after admission decisions have been mailed, is a major point of contention (College Board, 2013c; Geiser & Santelices, 2004; Ishop, 2008; Willingham, 1985). Many students opt out of the culminating exam knowing it holds no merit in the admission process, yet researchers know that student performance on AP examinations is strongly linked to college success and that simply taking AP courses in high school does not provide a valid gauge of how well a student will perform in higher education (Geiser & Santelices, 2004). In spite of this, the College Board continually produces research in support of the AP Program, from which the association benefits, with over 20 million dollars in profits annually (Casement, 2003).
Advanced Placement and SES.
If AP students tend to come from secondary schools that place emphasis on preparation
for college (Willingham & Morris, 1986), and if high schools with large AP Programs produce
students that experience more success in college (Santoli, 2013), what is to come of students
without access to the AP curriculum? Do schools without an AP curriculum suffer in quality?
Surprisingly, a high school's academic reputation did not counteract a student's class rank or HSGPA if the student had underperformed (Furry & Hecsh, 2001), but the number of AP courses available at a school had a positive impact on the college success of students who attended that school but never took an AP course (Kirst & Venezia, 2001). Further, within the same type of
secondary school, students who participated in the AP Program tended to do better in college
than students who did not participate (Willingham & Morris, 1986). Regardless of public or
private attendance, the trend seems to be that students who are educated alongside peers who
challenge one another to work at higher academic levels, graduate with better college preparation
than those who are separated into homogeneous learning groups (Espenshade et al., 2005; Tam
& Sukhatme, 2004).
Evidence supports the idea that rigor has a positive impact on college preparation
(Espenshade et al., 2005; Tam & Sukhatme, 2004), but Advanced Placement students tend to
come from somewhat more advantaged homes and from secondary schools that place special
emphasis on college preparation (Willingham & Morris, 1986). So what does this mean for
students with less access to AP? Low-income Latino students, for example, are less likely to attend high schools with higher-level subject offerings than their white or Asian counterparts
(Adelman, 2006). There are several interesting differences between AP curriculum taught in
higher-SES schools and curriculum taught in lower-SES schools (Furry & Hecsh, 2001). Most
notably, Espenshade et al. (2005) found that in comparison to lower-SES schools, higher-SES
schools tended to have more AP teachers with higher longevity and experience, and that their AP students rarely arrived underprepared or with gaps in their education. The drawback in these high-SES environments, however, was that class sizes were large and there was pressure on AP
teachers to make sure students achieved high exam scores (Furry & Hecsh, 2001). Conversely,
teachers in lower-SES schools had strong dissatisfaction with student preparation, noting large
gaps in education, but noted that diminished pressure existed for students to achieve high test
scores (Furry & Hecsh, 2001).
Deepening the conversation of SES and class, Furry and Hecsh (2001) concluded that
student ethnicity had an extreme and disproportionate impact on participation in AP classes.
Particularly, Hispanics and African Americans were significantly under-represented in the AP
classroom, but this was only true within lower SES schools (Edwards & Duggan, 2012). In very-
high-SES schools, African American students seem to have a greater rate of participation than
their low SES counterparts (Edwards & Duggan, 2012). Caucasian students, on the contrary,
participated in AP coursework at higher rates in low (and very-low) SES schools, more so than
in very-high-SES schools (Furry & Hecsh, 2001). Interestingly, racial diversity in the Advanced
Placement program has not changed since its inception (Furry & Hecsh, 2001). Based on rates of
AP exam taking, whites and Asians participate at much higher rates than African Americans,
Native Americans, and Hispanics (Furry & Hecsh, 2001; Edwards & Duggan, 2012).
Advanced Placement examinations.
A unique aspect of the AP curriculum is that students are not required to enroll in or pass
an AP course in order to take a corresponding AP exam (College Board, 2013b). However, over
90% of American four-year colleges and universities give college credit on the basis of
qualifying AP exam scores (College Board, 2013b), so the process of exam completion has its
benefits. Taking both the AP class and the related exam increases the likelihood of attendance at a four-year institution and of matriculation into an equivalent college major. It also increases GPA, both in the corresponding college course and in the FYGPA, and it raises the chance of retention into the second year of college and, ultimately, graduation (Shaw et al., 2013). Taking at least one AP course, however, as opposed to none, does little to help predict first-year college GPA or to forecast retention into the second year of college, whereas the AP exam score plays a large and significant part in predicting college outcomes (Shaw et al., 2013).
AP examinations are scored on a 1-to-5 scale (with 5 as the highest), and the College Board considers a student "successful" on an exam if he or she receives a 3 or higher (College
Board, 2013b). At some colleges and universities, a passing score on an AP exam provides an
opportunity for students to satisfy college requirements thus enabling an opportunity to complete
college coursework in a shorter amount of time (College Board, 2013b). It is individual colleges
and universities that set their own guidelines surrounding introductory-level course-credit and
placement (College Board, 2013b). Most schools consider the score of 3 adequate enough to
grant credit, but many states have gone so far as to mandate all public universities to award
introductory-level credit if a student has a 3 or higher on an AP exam (Johnson, 2005). However,
some private colleges, like Stanford, will only accept a perfect score of 5 in exchange for credit
to an introductory course (Stanford University, 2014).
Research shows, however, that AP students who achieve a 3 or better on an AP test earn higher college grades and graduate from college at a significantly higher rate than peers in very similar circumstances who achieved lower test scores (Ackerman, Kanfer, & Beier, 2013). Students who submitted AP grades of 4 or 5 earned much higher college grade averages than predicted on the basis of school rank and SAT scores, but that pattern was not true for students who made AP grades of 1 or 2 (Willingham & Morris, 1986).
In 2010, the mean score across all AP exams was 2.84, and 58% of all tests scored received a 3 or higher (NCES, 2013). Sixty-one percent of males and 54% of
females received a 3 or higher in the same study (NCES, 2013). Also, in 2010, 56% of the test
takers were female (NCES, 2013).
In the national graduating class of 2011, 30.2% of public high school students took an AP Exam, and 18.1% scored a 3 or better on at least one AP Exam (Hughes, 2013). Over 1.8 million American students took at least one AP Exam in 2010, an
increase from 0.6 million students in 1997 (Fithian, 2003). The overall total number of
administered AP exams also increased to over 3.1 million in 2010, from 0.9 million in 1997
(Fithian, 2003). This steady growth has been maintained by continuing national reforms in K-12
education, unwavering college admission practices, and sensationalized media attention (Fithian,
2003).
The AP exams are administered each year in May, and exam limits are determined by
individual school policies (College Board, 2013b). Homeschooled students, for example, or
students who attend a high school that does not offer AP curriculum, are still able to take an AP
exam by taking the test at a participating school (College Board, 2013b). Further, students with recognized disabilities can arrange to receive special accommodations, including extended test time, large-print exams, and accommodations for the visually and hearing impaired (College Board, 2013b).
Each AP exam currently costs $89 per test, although the College Board will provide up to a $28
discount to students who demonstrate financial need (College Board, 2013b). Students designate
which colleges or universities they wish to receive their scores, and scores are mailed to schools
in the middle of July (College Board, 2013c; Willingham & Morris, 1986).
Science, Technology, Engineering and Math (STEM)
Research repeatedly shows that mathematics participation is a strong predictor of college
success (Adelman, 1999, 2006; Sadler & Tai, 2007). Evidence suggests that continuous
enrollment in sequential high school math, throughout all four years of school, has a positive
influence on student completion of bachelor's degrees (Zelkowski, 2008). Those with the highest likelihood of bachelor's degree completion took higher-level math, such as precalculus and calculus, during their high school years (Zelkowski, 2008). The typical high school mathematics sequence runs through Algebra I, geometry, Algebra II, trigonometry, and precalculus to calculus (Zelkowski, 2008). This sequence is crucial, since the odds that a student will earn a bachelor's degree increase significantly upon completing calculus in high school (Lee & Burkam, 2003).
Students who achieved "proficient"-level scores in both science and mathematics in college were high school graduates who had completed an AP or IB science or mathematics course, taken a higher-level science or mathematics course in ninth grade, or completed an overall rigorous science or math curriculum throughout high school (Nord et al., 2011). State
high school graduation requirements usually include two years of science, but students with
intentions of college often take biology, chemistry, as well as physics (Gollub et al., 2002).
Standardized Test Scores
While not without criticism (Attewell & Domina, 2008; Bishop, 2000; College Board,
2013c; Palardy, 2013), the expansion of national standardized testing programs such as the ACT and the SAT emerged out of a necessity to compare student achievement across broadly
different curricula and grading standards (Augustine, 2005; Cuseo, 2013; Mason, 2010).
Standardized tests are exams that are implemented and scored through consistent and verified
measures, and they attempt to create a reliable comparison across all test takers (Chajewski et al.,
2011; Stearns et al., 2009). This effort can serve as a complementary measure to HSGPA and
further establishes a value surrounding a student’s capacity to perform (Rigol, 2003). Further,
subject-area testing is sometimes used to validate course grades by measuring students’ strengths
and weaknesses in a particular subject (Augustine, 2005; Gollub et al., 2002; Randall &
Engelhard, 2010).
Standardized tests are used by the public, as well as policy makers, to measure educational progress (Burke, 2009), and substantial weight is placed on standardized tests when selective
colleges seek students with the strongest academic background (Moller et al., 2011). Research
suggests a significant relationship exists between cumulative HSGPA and standardized test
scores, and that this relationship has a strong predictive quality for future college success
(Mattson, 2007; Richmond, 2012; Trusty, 2004). Critics argue that standardized tests do not
always assess what students are learning, however, and that too much emphasis is placed on
factual knowledge rather than higher-order thinking and application (Burke, 2009). In fact, more
than 800 colleges and universities no longer use standardized tests in their college admission
process (FairTest, 2015).
In spite of these concerns, it is still very common for American colleges and universities
to use the SAT in their admission and evaluation process (College Board, 2015). The SAT is an
exam designed to measure college readiness. The College Board owns and manages the SAT, but the Educational Testing Service (ETS) administers the actual exam. The SAT, as it exists today, was introduced in 2005 and takes three hours and
forty-five minutes to complete. It currently costs $50 per attempt to take the test. Scores are
computed by combining three 800-point sections (Mathematics, Critical Reading, and Writing),
resulting in a scoring range from 600 to 2400. The competitor to the SAT is the ACT. Taking
either exam is required for freshman entry to many, but not all, American colleges and
universities (College Board, 2015).
High School Type
A major shortcoming in education literature is that few studies have examined private
secondary education with large enough sample sizes. Cookson and Persell (1985) lament that "few
researchers have had access to these schools” (p. 114). Existing research examines parochial
schools, specifically Catholic (Altonji, Elder, & Taber, 2005b; Evans & Schwab, 1995; Greeley,
1982; Jeynes, 2002; Lee & Holland, 1995; Neal, 1997), charter schools (Western Interstate
Commission for Higher Education, 2006), and/or private schools in relationship to public school,
but most of these studies involve the discussion of school choice or vouchers (Couch, Shughart,
& Williams, 1993; Stecher, Kirby, Barney, Pearson, & Chow, 2004). What is lacking is a comparison of quality between school types. Altonji et al. (2005a) have done considerable research in the area
of school choice and vouchers, but draw attention to the fact that questions about whether or not
private schools (including Catholic) offer superior instruction to public schools are currently at
the center of a national debate. They assert, however, that the breadth and depth of empirical
research has been stifled by an obsession with the role of vouchers, charter schools, and other
reforms, which might increase educational choice (Altonji et al., 2005a). In fact, the majority of
research involving school quality is specifically devoted to public institutions (Godfrey, 2011).
Characteristics of public education.
The Elementary and Secondary Education Act of 1965 (ESEA) makes high-quality
education available to every child in the United States as a civic right. Since education in the
U.S. is also compulsory, the government provides it for free, subsidized by the taxpayer. Public institutions in the United States totaled 98,328 in the 2011-12 school year, a count that has risen steadily over the last decade from 92,012 public schools in 1999-2000 (Kena et al., 2014). In 61% of traditional public institutions, more than half of the student population identifies as white (Kena et al., 2014). In 2011-12, more than 75% of students qualified for free or reduced-price lunches in 21% of all public schools, a rate significantly higher than in 1999-2000 (Kena et al., 2014).
Most educational policy is directed at public schools (Ehrenberg & Brewer, 1994). No
Child Left Behind and the Common Core are two of the better-known policies that have taken center stage. These policies are designed around improving graduation rates. In 2011-12, 3.1 million public high school students (81%) graduated in the mandated amount of time (Kena et al., 2014). The dropout rate has been decreasing since 1990, falling to 7% in 2012, but with 21% of school-age children living below the poverty line, this statistic is closely watched (Kena et al., 2014).
Characteristics of Catholic education.
Much of the private school research surrounds Catholic high school education. This is
perhaps because Catholic schools, operated by the Roman Catholic Church, make up the largest
system of private schools in the United States (NCES, 2015). The greatest difference between
Catholic education and that of public or independent private is the central, goal-oriented
emphasis on religious development (NCES, 2015). Catholic education generally offers lower tuition than its independent private counterparts; however, four of the top 50 most expensive private high schools (non-boarding) in the country are Catholic (Stanger, 2014).
Evidence shows that students educated within the Catholic school system drop out of
high school less often than their public school counterparts and are two times as likely to enroll
in a four-year college (Altonji et al., 2005b). Furthermore, attending a Catholic high school
considerably increases the likelihood of urban minorities graduating from high school (Altonji et
al., 2005b).
Characteristics of independent private education.
Private school enrollment from pre-K through twelfth grade swelled from 5.9 million in 1996 to 6.3 million in 2002 but then, following the market crash in 2008, decreased to 5.5 million in 2010 (NCES, 2015). In 2008, it was estimated that the United States had 28,220 private schools
(NCES, 2015), but by 2010 that number had increased to 33,400 private schools (Broughman &
Swaim, 2013).
It has been noted that the growth rate of income per capita is higher within private
education (Sackett, Hardison, & Cullen, 2004). Witte (1992) found that, compared with public schools, private schools offered more advanced-level classes, and their students took them. Evidence of differences in culture also exists. Students and administrators in the
private environment, compared with those in public schools, experienced markedly higher
expectations and more homework, as well as more stringent, but fair-minded discipline (Witte,
1992). Private school teachers also reported less fighting, less truancy, less verbal abuse, and less
drug and alcohol abuse among students, as well as more school spirit and deeper
involvement in school activities (Witte, 1992). Private school is expensive, however, so access to
these benefits is limited to those who can afford it. The average private high school tuition
(Catholic, other religious, and non-sectarian) was $10,549 in 2008 (NCES, 2012). Currently, the
most expensive private high school in the nation is the Lawrenceville School, in Lawrenceville,
New Jersey. Based on daytime-only (non-boarding) tuition and fees, the cost to attend
Lawrenceville's high school in 2013-2014 was $46,989 for the year. It boasts a 700-acre campus with "Harry Potter-style inter-house competitions," a nine-hole golf course, and an indoor ice hockey rink (Stanger, 2014). Presumably, if private school students have access to these
abundant benefits, it is not surprising that the growth rate of income per capita is higher within
private education.
College Persistence and Success
The fundamental importance of the college experience is that students persist and
complete their undergraduate degree (HERI, 2003). This desire for persistence and degree
completion is true for all stakeholders: students, parents, and school administration (HERI,
2003). Institutional performance is often measured by graduation rates, with the success of an institution, as well as its students, indicated by high completion rates (HERI, 2003). These measurements also serve as a strong form of accountability (HERI, 2003).
A Theoretical Framework
Trusty (2004) established a long-term educational development model that applies well to
this study. Specifically, he identified that several variables had significant effects on degree
completion. For example, early math aptitude and course-taking in math and science mattered
most to degree completion across all racial-ethnic groups (Trusty, 2004). Most importantly,
however, he identified that the most powerful effects on college graduation came from rigorous
and demanding course taking (Trusty, 2004). Further, Trusty (2004) identified several important
forms of student engagement: rigorous academic engagement, good school attendance,
participation in extracurricular activities, and high expectations from parents.
First-Year College Success and Graduation Rates
High involvement and engagement in high school assignments is significantly correlated
to continued motivation, commitment, and overall college performance (Shernoff & Hoogstra,
2001). Adelman (2006) studied the outcomes of high school and college performance on
bachelor's degree attainment. The foremost attention of Adelman's study, however, was on college variables. In longitudinal data from 1980 to 1993, two variables emerged as having
the greatest impact on graduation: the strength of students’ high school curriculum (i.e., rigor),
and unceasing college enrollment through college graduation (Adelman, 2006). Adelman (2006)
concluded that taking demanding mathematics curriculum in high school had the most impact on
college graduation. He further asserted that accomplishment in rigorous high school math had a
stronger impact, as a whole, than HSGPA or standardized test scores (Adelman, 2006).
A student who positively anticipates their future education, has a high class rank, is
successful with standardized tests, achieves the highest level of high school math available,
maintains momentum throughout high school in science and math, succeeds in consecutive
foreign language classes, has completed more than one AP course, exhibits overall academic
intensity throughout high school, and has access to sufficient academic resources, will likely
succeed in college (Adelman, 1999). Each of these variables has direct consequences for high
school planning, and close attention should be paid to them when assisting students in their
future pursuits (Adelman, 1999).
Strong first-year college grades are indicative of post-secondary course completion
(Gifford, Briceno-Perriott, & Mianzo, 2006; Reason, 2003). Research, however, has often
approached an institution’s degree completion rate by examining differences in the effectiveness
of undergraduate retention programs (HERI, 2003). Universities might instead indirectly help to
improve the school’s degree completion rate by computing an expected degree completion rate
based on the characteristics of the students in the entering freshman class, observing which
students have been most successful, and why (Willingham & Morris, 1986). Further, by
understanding the reasons for persistence in a student population, colleges and universities can
possibly apply that knowledge to the definition of success at the high school level (Willingham
& Morris, 1986). This awareness could help an admission staff to know where to seek students
that, if given the chance, would graduate successfully from their institution (Willingham &
Morris, 1986). Surprisingly, limited research has challenged this broad but straightforward idea
(Willingham & Morris, 1986).
Summary
With multiple benefits and concerns, the AP Program is a foremost contributor to high
school rigor and must be looked at closely (Furry & Hecsh, 2001). The AP Program plays a more
dynamic role than ever before in the college admission process; therefore, it is critical to study
the program in relation to college graduation rates (Furry & Hecsh, 2001). Broadening the
scope of admission deliberations to include a way to verify high school rigor could enable
admission offices to have a more complete understanding of student preparation (Kuh, 2005;
Ishler & Upcraft, 2005; Fairris et al., 2011). Verifying high school rigor could help to improve
the quality of admission decisions and ultimately increase first-year academic success (Kuh,
2005; Ishler & Upcraft, 2005; Fairris et al., 2011). Approaching the admission process in this
holistic manner could subsequently improve the graduation rate, a potential long-term
gain (Kuh, 2005; Ishler & Upcraft, 2005; Fairris et al., 2011). Trusty (2004) proposes an
effective framework for targeting student engagement as a measure for college persistence and
overall academic success. Identifying high school measures of engagement and genuine
academic rigor could benefit college admission offices that intend to select students with the
capacity to perform well in, benefit from, and graduate from college.
CHAPTER 3
METHODOLOGY
The factors examined in this study were chosen for their bearing on student preparation
for, and success in, college. This correlational study adopted a quantitative approach (Creswell,
2012) to assess the best way to quantify high school grade inflation so that it can be factored
into the college admission process. Utilizing admission data, current student
records and demographic data, the study examined the ability to use high school grades and
school type (Public, Catholic, and Independent Private) as a way to predict college success.
The current chapter outlines the methodology that was used in the study, including a
description of the research design, population and sample, variables and instrumentation, and
procedures for data collection and analysis.
Research Questions
Research Question 1: To what extent, if any, are the following variables correlated with
institutional type (Public, Catholic, and Independent Private): Sex, Ethnic Classification
(white/Asian versus other), Declared Academic Unit (STEM and non-STEM), Pell Grant
eligibility (Yes, No), Region of U.S., HSGPA, SAT/ACT, FYGPA, Number of AP courses, AP
test scores (STEM and non-STEM), and AP grades (STEM and non-STEM)?
Research Question 2: Are there significant institutional level differences by institutional
type (Public, Catholic, and Independent Private) on HSGPA grade inflation and AP course grade
inflation (STEM and non-STEM) as indicated by the residuals in the following three regressions:
1. FYGPA on HSGPA,
2. FYGPA on AP STEM GPA,
3. FYGPA on AP non-STEM GPA.
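The residual logic behind these regressions can be sketched as follows. This is a minimal illustration with made-up values (the study's analyses were run in SPSS, and the variable names here are hypothetical): a school type whose students consistently earn lower FYGPAs than their high school grades predict would show negative mean residuals, one possible signature of grade inflation.

```python
import numpy as np

# Illustrative data only; the study's analyses were run in SPSS.
hsgpa = np.array([3.9, 3.7, 4.0, 3.5, 3.8, 3.6])
fygpa = np.array([3.4, 3.1, 3.6, 3.3, 3.0, 3.5])
school_type = np.array(["Public", "Public", "Catholic",
                        "Catholic", "Independent", "Independent"])

# Regression 1: FYGPA on HSGPA (simple ordinary least squares).
slope, intercept = np.polyfit(hsgpa, fygpa, deg=1)
residuals = fygpa - (slope * hsgpa + intercept)

# A consistently negative mean residual for a school type would suggest
# that its high school grades overstate later performance.
for t in ("Public", "Catholic", "Independent"):
    print(t, float(residuals[school_type == t].mean()))
```

The same pattern would repeat for regressions 2 and 3, substituting AP STEM GPA and AP non-STEM GPA as the predictor.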
Research Design
The Institutional Review Board (IRB) of the participating university approved the study,
and all necessary clearances were obtained from the administrative offices involved in providing
data. This quantitative study utilized data collected at a single point in the fall of 2012, although
the data themselves spanned fall 2006 through fall 2012. Initially, admission data
were collected from the Office of Admission, which included records of all incoming first-year
students from this time period. Incomplete records, international students, and athletes were
excluded. The Office of Registrar was then approached for academic data relating to all students
in the study. Finally, the Office of Financial Aid provided Pell Grant data, which were used
to roughly determine SES. In addition, due to the large scope of the analyses, a private research
consultant was retained to run the software under the guidance of the researcher’s dissertation
committee.
The variables in the research questions were Sex, Ethnic Classification (white/Asian and
other), Declared Academic Unit (STEM and non-STEM), Pell Grant eligibility (Yes, No),
Region of U.S., HSGPA, SAT/ACT, FYGPA, Number of AP courses, AP test scores (STEM and
non-STEM), and AP grades (STEM and non-STEM), as well as school and school type. The
level of significance used to accept or reject the hypotheses was set at the .05 level. Following
the Central Limit Theorem (CLT), the large sample size in this study reduced the standard error
of the mean to the point that the sample and population means are functionally equivalent;
therefore, skewness and/or kurtosis of the sample data were not a concern for interpreting the
means and measures of dispersion.
The study employed a non-experimental research design utilizing a correlational
approach with an exploratory design. There was no intervention or treatment, other than the
exploration of school type. Descriptive statistical analyses were performed on the sample groups
to obtain a clear understanding of the population. Measures of central tendency (means, medians,
and other percentiles) and dispersion (standard deviations, ranges) were computed. Bivariate
correlational analyses were conducted to assess the strength and direction of the relationship
between school type and college success.
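As a sketch of such a bivariate analysis (illustrative values only; the study computed its statistics in SPSS), a categorical school-type variable can be dummy-coded into a binary contrast, in which case the Pearson correlation with FYGPA is a point-biserial correlation:

```python
import numpy as np

# Illustrative values only; the study computed its statistics in SPSS.
fygpa = np.array([3.4, 3.1, 3.6, 3.3, 3.0, 3.5, 2.9, 3.7])
# Dummy-code one binary school-type contrast: 1 = private, 0 = public.
private = np.array([0, 0, 1, 1, 0, 1, 0, 1])

# Pearson correlation between the dummy code and FYGPA (point-biserial).
r = np.corrcoef(private, fygpa)[0, 1]
print(round(r, 3))
```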
Population and Sample
The population observed in this study consisted of matriculated undergraduate students at
a highly selective research university in the United States. In 2012, just over 47,000 students
applied to the university in this study, competing for only 2,950 places in the incoming first-year
class. Approximately 20% were offered admission, and one-third of that group enrolled. Men
and women applied in nearly equal numbers, coming mostly from the top 10% of their high
school classes. Applications represented every state in the United States and almost 100
countries, resulting in a well-balanced and diverse student body.
The university in this study enrolls more underrepresented minority undergraduates than
most private research universities in the country: 3,398 as of fall 2012, or 19% of all
undergraduate students. Outside California, the leading U.S. metropolitan areas for students
admitted are, in order: New York City, Chicago, Seattle, Dallas, Boston and Washington, D.C.
The most represented foreign countries are China, South Korea, India, Canada and Singapore.
This institution maintains a strong commitment to financial aid for undergraduate
students and continues to increase the amount of funding available every year. The school offers
what it believes to be the largest pool of university-funded financial aid of any private institution
in the country. Two-thirds of the undergraduates receive some form of financial aid and students
are admitted without regard for ability to pay. The university in this study meets the full
demonstrated financial need for all admitted students. In fall 2012, 23% of those enrolled were
low-income undergraduates, as defined by Federal Pell Grant eligibility.
The vast majority of students at this institution rank in the top 10% of their high school’s
graduating class, 75% have standardized test scores at or above the 95th percentile, and their
average un-weighted HSGPA is 3.71 (on a 4-point scale). The average admitted student
completed five or more AP or IB courses in high school. The sample used from this population
included 17,080 student records from 3,413 different public, Catholic and independent private
high schools (see Table 3.1).
Table 3.1
School Count vs. Number of Students by School Type
School Type No. of Schools Percent No. of Students Percent
Public 2,392 70.085 11,016 64.500
Catholic 337 9.874 1,992 11.700
Independent Private 684 20.041 4,072 23.800
Total 3,413 100.000 17,080 100.000
The racial composition of the sample is shown in Table 3.2. Thirty percent of students are
Asian, 13.5% are Hispanic, 6.5% are black/African American, and less than 1% are Native
American/Pacific Islander. Overall, 20% are from underrepresented minority populations (black,
Hispanic or Native American). Twenty-three percent of the population in this study received a
Pell Grant at least once while in college. A student having received a Pell Grant while in college
is a dependable gauge of low SES (Douglass & Thomson, 2008; Wei & Horn, 2009).
Table 3.2
Description of Ethnicity, Frequency and Percentage
Ethnicity Frequency Percent
Native Hawaiian/Pacific Islander 57 0.300
Unknown/Decline to State/Other 111 0.600
International 142 0.800
Black/African American 1,106 6.500
Hispanic 2,307 13.500
Asian 5,197 30.400
Caucasian 8,154 47.700
Total 17,074 100.000
Missing 6 0.000
Total 17,080 100.000
Variables
Independent Variables
School type (Public, Catholic, and Independent Private). School type, as defined by
Public, Catholic or Independent Private, was used as the main independent variable. There were
not enough charter or home schools in the sample to make a viable comparison, so they were
omitted. Private and non-Catholic/religious schools were combined to form the category of
independent private institutions. This grouping was created after understanding that Catholic
schools made up the largest school type within the private school category (NCES, 2015;
Broughman & Swaim, 2013). This subsequently allowed for the comparison between Catholic
and non-Catholic institutions.
Sex/gender. Sex/gender, as defined by the terms male and female, was used as an
independent variable.
Ethnic classification (White/Asian and other). The impact of ethnicity on school type
was deemed outside the scope of this paper. Therefore, for the purposes of categorization,
ethnic classification was collapsed into “white/Asian” and “other,” and was used as an
independent variable. Ethnicity was grouped into these two broad classifications in order to
isolate the majority (Shrestha, 2011). The categories used do not denote scientific definitions of
anthropological origins. Table 3.3 illustrates how the ethnic categorizations were delimited.
Declared academic unit (STEM and non-STEM). Declared academic unit, as defined
by STEM (Science, Technology, Engineering, and Math) and non-STEM, was used as an
independent variable.
Pell Grant eligibility (yes, no). Pell Grant eligibility, as defined by yes or no, was used
as an independent variable. For the purposes of this study, Pell Grant eligibility was used to
establish the socioeconomic status (SES) of each student.
Region of the United States. Region of the U.S., as defined by the seven College Board
regional categorizations, was used as an independent variable. The groupings used were West,
Southwest, Midwest, South, Middle States, New England, and outside the U.S. (U.S. overseas
territories).
Table 3.3
Ethnic Classifications by NCES Category and Research Category

Ethnic Categories on Admission Application | NCES Ethnic Category | Research Category
Arab/Arab American; Caucasian | Caucasian | White/Asian
Other Asian/Asian American; Chinese/Chinese American; Filipino/Filipina/Filipino American; Asian Indian/Indian American; Japanese/Japanese American; Korean/Korean American; Central Asian; Vietnamese/Vietnamese American | Other Asian/Asian American | White/Asian
Black/African American | Black/African American | Other
Cuban; Puerto Rican; Other Spanish American; Central American; Mexican/Mexican American; South American | Hispanic/Latino | Other
International Student | International | Omitted
Unknown | Unknown | Omitted
Native American/Alaskan Native | Native American/Alaskan Native | Omitted
Native Hawaiian/Pacific Islander | Native Hawaiian/Pacific Islander | Omitted
Dependent Variables
High School Grade-Point Average (HSGPA).
Although the university in this study collects student-reported grades for use in the initial
evaluation process, no decisions are made until official transcripts have been received and grades
and courses have been verified. Kuncel, Credé and Thomas (2005) report that school
performance tends to moderate the reliability of student-reported grades, such that grades
reported directly by the student are most often honest reflections of actual grades, but this is only
true for students with high ability and elevated grade-point averages. Given the high selectivity
of the university in this study, it is accepted that the student-reported grades used in the
calculation of high school GPAs by the Admission Office are reliable.
The HSGPA, as measured by a traditional four-point scale with a theoretical range of 0.0
to 4.0, is a dependent variable. High schools utilize various approaches for calculating and
reporting grade-point averages. Sometimes only academic courses are included in the GPA;
other times weighting is applied, based on the pattern of honors or advanced coursework taken
by the student. Many high schools forgo the traditional four-point scale altogether, in favor
of percentages or five- or six-point systems. To ensure that every student is considered
for admission using a common scale, it is the practice of the university in this study to recalculate
each student’s grade-point average using an internally developed GPA calculator. Only academic
coursework is included in this evaluation. The basic GPA equation consists of adding the number
of grade-points a student earned in academic courses over a given period of time and dividing
that number by the total number of credits taken overall (Camara & Echternacht, 2000).
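That computation reduces to a few lines. The following is a minimal sketch, assuming one credit per course and the standard letter-to-point mapping (the university's internal GPA calculator is not public, so the details here are illustrative):

```python
# Standard letter-to-point mapping on the four-point scale.
POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def gpa(grades):
    # Grade-points earned divided by credits attempted (one credit per course here).
    return sum(POINTS[g] for g in grades) / len(grades)

print(gpa(["A", "A", "B", "B", "A"]))  # 3.6
```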
Standardized test scores (SAT/ACT).
The SAT/ACT, as normalized by using a concordance table supplied by the College
Board (College Board, 2014b), is a dependent variable. The concordance between the ACT
composite score and sum of the SAT critical reading and mathematics scores and the
concordance between the ACT combined English/writing score and the SAT writing score are
used to calculate the SAT/ACT BESTSCORE in this study.
The university in this study requires applicants to submit results from either the SAT
Reasoning Test or the ACT. The university has no preference and applies equal consideration to
both tests. If the student presents multiple sittings of the SAT, only the highest scores of each
section are considered, even if each score is derived from different test dates. If the student
presents multiple sittings of the ACT, only the highest composite score is considered. Once all
scores from the ACT have been converted to the SAT scale, the admission committee only
considers the student’s highest score between the two exams.
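The "highest section across sittings" rule described above (often called superscoring) can be sketched with hypothetical scores; the concorded ACT value is likewise a made-up stand-in for a College Board concordance-table lookup:

```python
# Hypothetical SAT sittings as (critical reading, math, writing) tuples.
sittings = [(650, 700, 640), (700, 660, 680)]

# Take the highest score in each section across all sittings.
best_sections = [max(s[i] for s in sittings) for i in range(3)]
sat_best = sum(best_sections)

# A hypothetical concorded ACT total (from the College Board concordance
# table); the higher of the two totals is kept as the BESTSCORE.
act_concorded = 2040
bestscore = max(sat_best, act_concorded)
print(best_sections, bestscore)  # [700, 700, 680] 2080
```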
College Freshman Year Grade-Point Average (FYGPA).
The FYGPA, as measured by a traditional four-point scale with a theoretical range of 0.0
to 4.0, is a dependent variable. It is calculated by dividing the number of grade-points a student
earned in a given period of time by the total number of credits taken (Camara & Echternacht,
2000).
Number of Advanced Placement (AP) courses.
The number of AP courses a student took while in high school is a dependent variable.
Due to the timing of when college applications are evaluated during the senior year, twelfth
grade coursework is often incomplete. While the university in this study verifies all final
transcripts, the admission protocol requires original transcript data to be input prior to the posting
of senior year grades. As a result, some senior year AP coursework was not available to be
included in this study (see Table 3.4).
Table 3.4
Distribution of AP Courses by Student
AP Courses Reported Number of Students Percentage
0 9,102 53.290
1 1,053 6.165
2 1,608 9.415
3 1,918 11.230
4 1,563 9.151
5 823 4.819
6 412 2.412
7 230 1.347
8 174 1.019
9 91 0.533
10 57 0.334
11 30 0.176
12 15 0.088
13 1 0.006
14 1 0.006
15 0 0.000
16 2 0.012
Total 17,080 100.000
Advanced Placement test scores.
Another dependent variable in this study is the AP test score, which was measured by a
score expressed on a 1-5 scale (see Table 3.5) supplied by the College Board (2014c). AP exams
are scored by the College Board independently of the high school, which supports the validity of
these scores.
Table 3.5
College Board Advanced Placement Score Scale
5 Extremely Well Qualified
4 Well Qualified
3 Qualified
2 Possibly Qualified
1 No Recommendation
Table 3.6 illustrates the number of reported AP exams. Because the College Board
permits students to take an AP exam without having completed the corresponding AP course,
and because of the self-reported grade practices of the university in this study, the AP exam
count does not equal the AP course count (reported in Table 3.4 above).
Table 3.6
Number of Advanced Placement Exams Reported per Student
Number of Exams Frequency Percent
0 2,398 14.000
1 456 2.700
2 770 4.500
3 1,319 7.700
4 1,709 10.000
5 2,179 12.800
6 2,084 12.200
7 1,749 10.200
8 1,480 8.700
9 1,184 6.900
10 752 4.400
11 452 2.600
12 371 2.200
13 88 0.500
14 55 0.300
15 13 0.100
16 12 0.100
17 7 0.000
18 1 0.000
21 1 0.000
Total 17080 100.000
Advanced Placement Course Grade.
The AP course grade, as measured by a letter grade expressed on an un-weighted
four-point scale, is a dependent variable. Thirty-seven AP courses were reported in this study,
including AP Latin (Vergil), AP Latin (Catullus – Horace), AP French Literature, and AP
Computer Science AB. These four courses have since been discontinued from the AP Program, but were
originally available to the cohorts included in this study. AP Research, AP Seminar, AP Latin,
AP Physics 1 and AP Physics 2, courses not yet available to students in the cohorts examined,
were, therefore, omitted. For the purposes of this study, AP courses were grouped into STEM
and non-STEM categories. A complete list of the AP courses examined in this study follows in
Table 3.7 below. The University of California’s A-G subject requirements were found to be
representative of many practices throughout the country and were therefore used as the
foundation with which to sort the STEM/non-STEM groupings (University of California, 2014).
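In practice, this sorting amounts to a simple lookup. A sketch using a subset of the courses in Table 3.7 (the full STEM set would include every course listed there):

```python
# Subset of the STEM designations from Table 3.7; every other AP course
# falls into the non-STEM bucket.
STEM_COURSES = {"AP Calculus AB", "AP Calculus BC", "AP Statistics",
                "AP Computer Science A", "AP Biology", "AP Chemistry"}

def stem_bucket(course):
    return "STEM" if course in STEM_COURSES else "non-STEM"

print(stem_bucket("AP Biology"))           # STEM
print(stem_bucket("AP European History"))  # non-STEM
```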
Table 3.7
AP Groupings by STEM Designation
non-STEM
Arts
AP Art History AP Studio Art: 3-D Design
AP Music Theory AP Studio Art: Drawing
AP Studio Art: 2-D Design
English
AP English Language and Composition AP English Literature and Composition
History & Social Science
AP Comparative Government and Politics AP Psychology
AP European History AP United States Government and Politics
AP Human Geography AP United States History
AP Macroeconomics AP World History
AP Microeconomics
World Languages & Cultures
AP Chinese Language and Culture AP Japanese Language and Culture
AP French Language and Culture AP Latin
AP German Language and Culture AP Spanish Language and Culture
AP Italian Language and Culture AP Spanish Literature and Culture
STEM
Math & Computer Science
AP Calculus AB AP Computer Science A
AP Calculus BC AP Statistics
Sciences
AP Biology AP Physics C: Electricity and Magnetism
AP Chemistry AP Physics C: Mechanics
AP Environmental Science AP Physics 1
AP Physics B AP Physics 2
Instrumentation and Procedures
In accordance with the procedures approved by an institutional review board (IRB), this
study meets the requirements outlined in 45 CFR 46.101(b)(4) and qualifies for exemption from
IRB review. Permission was granted on September 26, 2013. The software used in this study was
IBM SPSS Statistics, Version 22 (IBM, 2013). A private consultant was retained to run the
software and worked under the guidance of the researcher’s dissertation committee.
In the fall of 2012, records for 34,340 students were obtained from three sources at the
university in this study: student records from the Office of the Registrar, admission records from
Office of Admission, and demographic and Federal Pell Grant data from the Office of Financial
Aid. Personal identifiers such as names, social security numbers and addresses were omitted
from the dataset prior to the analysis, and individual students were identified using a unique
number assigned to each student by the university at the time of admission.
As the three separate data sets were merged, several variables with incomplete and
missing data were identified and subsequently eliminated. During merging of the data sets, it was
decided that athletes would be removed from the population. The rationale for this was that some
athletes are admitted to the university under different justifications and less stringent admission
criteria than non-athletes. While not all athletes are brought to the university under these
circumstances, it was determined that as a population, they did not make up a significant
percentage of the data set, and were therefore eliminated to avoid potential bias. It was also
decided that only true freshmen would be considered for the study and, therefore, transfer
students were purged from the population as well. Transfer students are admitted to the
university with experience beyond a high school education and, therefore, cannot be compared as
equals to true freshmen. These efforts resulted in a final sample size of 17,080 student records
from 3,413 different high schools. The large sample size in this study diminished chance
variation, as well as the influence of outliers or extreme observations.
Ultimately, seven freshman cohorts spanning from 2006 to 2012 were included. Table 3.8
describes these cohorts, all of which entered the university in the fall term of the year of their
admission. A total of 7,983 or 46.7% were males, and 9,097 or 53.3% were females. Due to the
nature of this study, demographic data have been combined for all seven freshman classes.
Table 3.8
Description of Cohorts
Year of Entry Frequency Percent
2006 2,288 13.4
2007 2,560 15.0
2008 2,418 14.2
2009 2,431 14.2
2010 2,496 14.6
2011 2,377 13.9
2012 2,510 14.7
Total 17,080 100.0
The sample represents 3,413 American high schools. To protect confidentiality for the
high schools and their students, the names of individual schools have been withheld from the
study. International schools were excluded from the study because these myriad educational
systems fall outside the jurisdiction of the United States Department of Education (DoE) and,
therefore, cannot be compared equally to U.S. institutions. American schools, however, which are operated in U.S.
overseas territories but are regulated by the DoE, were included. For the purposes of the study,
the schools were sorted into regions of the United States as defined by the College Board (see
Table 3.9).
Table 3.9
Description of College Board Regions and School Counts
Region State No. of Schools Percent
West AK, AZ, CA, CO, HI, ID, MT,
NV, OR, UT, WA, WY
1412 41.371
Southwest AR, NM, OK, TX 276 8.087
Midwest IL, IN, IA, KS, MI, MN, MO,
NE, ND, OH, SD, WV, WI
540 15.822
South AL, FL, GA, KY, LA, MS, NC,
SC, TN, VA, PR, VI
399 11.691
Middle States DC, DE, MD, NJ, NY, PA 534 15.646
New England CT, ME, MA, NH, RI, VT 233 6.827
Outside U.S. (U.S.
overseas territories)
PR, VI, AE, AO, AP 19 0.557
Total 3413 100.000
After the population was narrowed and the schools were identified, one further cut was
made. To increase the reliability of the analysis, only schools with 5 or more students in the
population were used in the analysis involving FYGPA (see Table 3.10). This threshold was
chosen because using a larger criterion for inclusion led to a preponderance of California
schools. Lower reliability due to smaller counts was not a problem, because the results remained
the same for minimum sample sizes ranging from 6 to 12.
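The five-student inclusion cut can be sketched as follows (hypothetical school identifiers; the actual filtering was done within the study's SPSS workflow):

```python
from collections import Counter

# Hypothetical school label per student record; only schools with at
# least 5 students in the sample are retained for the FYGPA analysis.
school_per_student = ["S1"] * 6 + ["S2"] * 4 + ["S3"] * 7

counts = Counter(school_per_student)
kept_records = [s for s in school_per_student if counts[s] >= 5]
print(sorted(set(kept_records)))  # ['S1', 'S3']
```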
Table 3.10
School Count of Institutions with 5 or More Students in the Sample Population
N Mean Std. Deviation
Public 501 15.940 19.172
Catholic 94 16.830 17.010
Independent Private 180 17.880 21.211
Total 775 16.500 19.415
Note. Non-Catholic religious schools = 41.
For the purposes of this study, all non-academic high school courses/grades that did not
satisfy the University of California A-G subject requirements (University of California, 2014)
were deleted from the original data set (e.g., physical education). All “in-progress” high school
courses/grades were deleted since a final course grade was not available. Any high school course
that was taken at a college or university was deleted. The rationale was that courses taken
outside the jurisdiction of the high school listed on the transcript would not be a valid measure
of the quality of the high school itself.
High school grades were often reported to the university in the study on a semester-by-
semester basis, so a course was frequently listed twice (once for each semester). When this was
the case, the grades were averaged so that only one yearlong grade was listed in the dataset. In
the case of trimesters or quarters, all grades were likewise averaged into one grade. Some
courses, however, are traditionally one semester only, most notably U.S. Government and
Economics; in these cases, only the reported semester grade was used. All letter and number
grades (regardless of the scale used, e.g., 4-point, 7-point, 100-point, percentile) were converted
to a four-point scale prior to analysis using the College Board conversion scales (College Board,
2014a). The most common scale conversions for the university in this study are shown in Table 3.11.
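The term-averaging rule can be sketched as follows (illustrative mapping and grades; the study's conversions followed the College Board scales summarized in Table 3.11):

```python
# Illustrative letter-to-point mapping on the four-point scale.
POINTS = {"A": 4.0, "A-": 3.7, "B+": 3.3, "B": 3.0}

def yearlong_grade(term_grades):
    # Average the semester (or trimester/quarter) grades into one grade.
    converted = [POINTS[g] for g in term_grades]
    return sum(converted) / len(converted)

# A course graded A in the fall semester and B in the spring.
print(yearlong_grade(["A", "B"]))  # 3.5
```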
Once the data were merged into a viable sample, the process of analyzing the variables
began. After agreeing on the definition of terms, the researcher’s dissertation committee worked
to limit the variables to twelve, shown in Table 3.12.
Table 3.11
Description of Grade Conversion

Grade-Point Scale   Letter Grade   100-Point Scale   7-Point Scale   Quality of Performance   Description
4.00                A+             93+               7               Excellent                Exceptional achievement
4.00                A
3.70                A-             90-92
3.43, 3.30          B+             87-89             6               Good                     Extensive achievement
3.00, 2.86          B              83-86             5
2.70                B-             80-82             4               Satisfactory             Acceptable achievement
2.30                C+             77-79
2.00                C              73-76
1.70                C-             70-72             3               Poor                     Minimal achievement
1.30, 1.14          D+             67-69             2
1.00                D              65-66             1
0.70                D-
0.00                F              0-64              0               Failure                  Inadequate achievement
Table 3.12
Description of Variables
Variables SPSS Variable Label SPSS Variable Name SPSS Value Labels
Independent
Variables
School Type Type of institution type_inst
1 = Public
2 = Catholic
3 = Independent Private
Sex/Gender Gender gender
1 = Female
0 = Male
Ethnic
Classification
(White/Asian
and other)
Ethnic
ethnic1
1 = Asian
2 = Black/African American
3 = Hispanic
4 = Caucasian
5 = Hawaiian/ Pacific
Islander
6 = Unknown
7 = International
ethnic2
1 = Caucasian/Asian
2 = Other
Declared
Academic Unit
Academic Unit acad_unit2
1 = STEM
0 = NOT STEM
Pell Grant
Eligibility
PELL Grant
(YES/NO)
pell_tot
1 = Yes
0 = No
Region of U.S.
College Board
Region
region
1 = Middle States
2 = New England
3 = Southwest
4 = Midwest
5 = South
6 = West
Table 3.12, continued
Variables SPSS Variable Label SPSS Variable Name SPSS Value Labels
Dependent
Variables
HSGPA
The unadjusted
academic high school
GPA calculated by the
admission office
hs_gpa
4 = A
3 = B
2 = C
1 = D
0 = F
SAT/ACT
The highest of either
SATR+SATM+SATW
or the concorded ACT
bstscore
College Board
Concordance Table
FYGPA
The student’s
cumulative college GPA
for the freshman year
FY_GPA
4 = A
3 = B
2 = C
1 = D
0 = F
Number of
Advanced
Placement
(AP) Courses
Total number of AP
courses taken
num_ap_crs2
1 = None
2 = 1-3
3 = 4-5
4 = 6-7
5 = 8-9
6 = 10 or more
AP test scores
(STEM, non-
STEM)
AP STEM Test Scores ap_stem_test_score
5 = Extremely Well
Qualified
4 = Well Qualified
3 = Qualified
2 = Possibly Qualified
1 = No Recommendation
AP NON-STEM Test
Scores
ap_nonstem_test_score
5 = Extremely Well
Qualified
4 = Well Qualified
3 = Qualified
2 = Possibly Qualified
1 = No Recommendation
Table 3.12, continued
Variables SPSS Variable Label SPSS Variable Name SPSS Value Labels
Dependent
Variables
AP course
grades
(STEM, non-
STEM)
AP STEM Course
Grades
ap_stem_course_grade
4 = A
3 = B
2 = C
1 = D
0 = F
AP NON-STEM Course
Grades
ap_nonstem_course_grade
4 = A
3 = B
2 = C
1 = D
0 = F
Delimitations
For the purpose of this study, student athletes, transfer students and international students
were removed from the population due to their uniquely different admission criteria. The
findings, therefore, should not be generalized to these populations. In addition, there were too
few charter or home schools in the sample to make a viable comparison, so they
were omitted. Further, independent private and non-Catholic/religious schools were combined to
form one category of independent private institutions. This grouping was created after
understanding that Catholic schools made up the largest school type within the private school
category (Broughman & Swaim, 2013). This subsequently allowed for the comparison between
Catholic and non-Catholic institutions.
Limitations
This study is limited to one large metropolitan, highly selective, private university in the
western region of the United States. Students and measures in the admission data sampled may
not be representative of the general college student population. Additionally, this study is limited
by not having information on the possible presence of unmeasured confounding variables such as
learning disabilities, psychological conditions, the impact of life circumstances, or the mere
stress of adjusting to college. Further, myriad family and school factors, working simultaneously
at varying levels, influence student achievement and could not be controlled for in this
study.
This study is correlational and although the relationship between college success and high
school type may be substantiated by current literature, there is a lack of empirical data to suggest
a causal relationship between the individual level predictors chosen for this study and first-year
success in college. Consequently, the proposed model can only be interpreted from a
correlational perspective. Studies of this type are limited to identifying significant differences
among groups. These findings cannot determine the cause of the differences between school
types, nor are they designed to consider the nature and content of learning experiences.
Threats to Statistical Conclusion Validity
There were two statistical threats to this study. The first was the diminished power caused
by the restriction-of-range phenomenon, which affected one or more variables. This is
particularly true due to the high admission standards of the university in the study: students are
carefully selected for high performance and achievement, thus negatively skewing all of the
study’s variables.
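The attenuating effect of restriction of range can be illustrated with a short sketch. The data below are hypothetical, not drawn from the study; they simply show that truncating a predictor to its upper tail (as selective admission does) shrinks the Pearson correlation.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical applicant pool: predictor (x) and outcome (y).
x_full = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
y_full = [1, 2, 3, 4, 5, 6, 5, 7, 6, 8]
r_full = pearson_r(x_full, y_full)

# "Admit" only the top half of the predictor distribution (x >= 6),
# mimicking a highly selective admission process.
admitted = [(x, y) for x, y in zip(x_full, y_full) if x >= 6]
x_adm = [x for x, _ in admitted]
y_adm = [y for _, y in admitted]
r_adm = pearson_r(x_adm, y_adm)

print(round(r_full, 3), round(r_adm, 3))  # correlation shrinks after selection
```

The correlation in the full pool is strong, but the same relationship looks much weaker among the admitted subset, which is why predictor-criterion correlations at a selective university understate the population relationship.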
The second statistical threat was caused by the diminished power due to the low
reliability of residualized gain scores (coefficient alpha < .70). Most findings in the study,
however, showed significance in spite of this innate low reliability.
Threats to Internal Validity
The threats to internal validity occurred because the researcher chose a correlational
method, and correlation does not imply or prove causation. Based on the research design, it
would be impossible to conclude that school type, for example, has a causal influence on GPA.
Students are not randomly assigned to independent private and Catholic schools, and this leaves
the study vulnerable to rival hypotheses.
Threats to External Validity
Finally, the present study’s results cannot be generalized to other samples, settings,
measurements or treatments. The types of students that made up the sample in this research were
predominantly white or Asian, and largely came from higher-SES backgrounds (only about 23%
of the sample received Pell Grants).
As a large, private, urban, research institution, the setting of the university chosen for the
research limits generalizability to other college or university settings. Lastly, it would also not be
appropriate to generalize beyond the measurements used in this specific research. The
methodology used to calculate GPA may vary from one institution to another, for example.
CHAPTER 4
RESULTS
Descriptive statistics for the data are presented first, followed by chi-square and ANOVA
results, which address Research Question 1. The final section presents the results of the
regressions used to analyze Research Question 2, and a discussion about grade inflation.
Descriptive Statistics
To gain an understanding of this extremely broad and deep set of data, demographic
variables were first established through a descriptive analysis of the 17,080 student subjects.
Demographic Variables
Spanning from 2006 to 2012, seven freshman classes composed the sample to include
17,080 undergraduate students in total. A total of 7,983 or 46.7% were males and 9,097 or 53.3%
were females. Due to the nature of the research questions, the size of the sample, and the finding
that Caucasians and Asians are overrepresented in colleges (Krogstad & Fry, 2014),
ethnicity was truncated to white/Asian (78%) and other (22%). Further, a significant number of
students in the sample never received a Pell Grant while enrolled in college (N=13,206) (see
Table 4.1).
Table 4.1
Federal Pell Grant Frequency
Pell Grant Recipient Frequency Percent
No 13,206 77.3
Yes 3,874 22.7
Total 17,080 100
The sample represents 3,413 high schools. Of schools in the sample, 70% were public,
20% were independent private, and roughly 10% were Catholic institutions. The largest share of
schools is located in the Western region (N=1,412). The smallest grouping of schools exists
outside of the U.S., in overseas territories (N=19).
Descriptive results for grades and test scores are shown in Table 4.2. The average student
in this study ranks in the top 10% of their high school’s graduating class, with a mean un-
weighted HSGPA of 3.71 (on a 4-point scale), and an average SAT score of 2061 out of 2400.
The mean number of AP courses completed per student is 5.3. Lastly, the average AP test scores,
and GPA, for both STEM and non-STEM, are also shown in Table 4.2. As a group, there is no
significant difference between the AP test scores (STEM and non-STEM), but AP non-STEM
GPA is slightly higher than AP STEM GPA.
Table 4.2
Descriptive Statistics of Dependent Variables
N Mean Std. Deviation
HSGPA 17,074 3.713 0.255
SAT/ACT 17,067 2061.633 166.112
FYGPA 17,080 3.289 0.503
# of AP: courses reported 17,080 5.289 3.323
Test Scores: AP STEM 12,973 3.745 1.066
Test Scores: AP non-STEM 13,734 3.758 0.796
GPA: AP STEM 6,560 3.578 0.512
GPA: AP non-STEM 6,566 3.646 0.437
The distribution of the HSGPA (M=3.71) in this study is negatively skewed, as shown in
Figure 4.1. This is due to the highly selective admission process performed by the university in
this study.
Figure 4.1. High School GPA
The distribution of the SAT/ACT (M=2062) in this study is also skewed to the left, as
shown in Figure 4.2. This is likewise due to the highly selective admission process performed by
the university in this study.
Figure 4.2. Highest SAT or ACT
The distribution of FYGPA (M=3.29) in this study is negatively skewed, as shown in
Figure 4.3. With the high selectivity of the freshman class, this sample naturally contains high
achieving students during the freshman year.
Figure 4.3. Freshman year GPA
Students in the sample enrolled in one of 17 academic units (shown in Table 4.3). These
17 units, sorted by STEM and non-STEM, are also shown in Table 4.3. Overall, 57% of students
enrolled in non-STEM academic units, while 29% enrolled in STEM majors. Of the
students in the study, 14% entered freshman year undeclared.
Table 4.3
Description of Student’s Declared Academic Unit
STEM Categorization Academic Unit Frequency Percent
non-STEM
Occupational Therapy 11 0.100
Gerontology 36 0.200
Public Policy 82 0.500
Health Promotion 148 0.900
Accounting 262 1.500
Art and Design 334 2.000
Architecture 632 3.700
Humanities 721 4.200
Dramatic Arts 741 4.300
Music 804 4.700
Communication and Journalism 926 5.400
Cinematic Arts 1036 6.100
Social Sciences 1672 9.800
Business 2348 13.700
Undeclared 2398 14.000
STEM
Natural Sciences 2234 13.100
Engineering 2695 15.800
Total 17080 100.000
Overall, the sample of 17,080 students from 3,413 different high schools represents
mostly whites and Asians, from mostly public institutions, with above average academic
achievement and high SES. However, 30% of the population attended independent private or
Catholic high schools, thus making the examination of student success based on school type
feasible.
Analysis of Research Questions
Research Question 1
To what extent, if any, are the following variables correlated with institutional type
(Public, Catholic, and Independent Private): Sex, Ethnic Classification (white/Asian versus
other), Declared Academic Unit (STEM and non-STEM), Pell Grant (Yes, No), Region of U.S.,
HSGPA, SAT/ACT, FYGPA, Number of AP courses, AP test scores (STEM and non-STEM),
and AP grades (STEM and non-STEM)?
A graphic organizer (shown in Figure 4.4) illustrates the first research question.
The cross-tabulation of region by institutional type is shown in Table 4.4. Most schools
are located in the West (69%), the same region as the university in this study. The next most
sizable regions, in order, are the Midwest (8.3%) and the Middle States (8%). Two medium-sized
regions follow: the Southwest (5.9%) and the South (5.2%). Last, the smallest grouping of
schools overall is New England (3.5%).
In the sample used by this study, the only region with proportionally more students from
Catholic schools is the West (78%). The Middle States (9.3%), South (7.9%), Southwest
(7.6%), and New England (6.9%) have proportionately more students from independent schools.
The only region with proportionately more students from public schools is the Midwest
(9.2%).
Figure 4.4. RQ 1: Correlations between demographic & academic indicators with school type
[Figure: a diagram linking school type to sex/gender, ethnicity, Pell Grant, region, major,
HSGPA, SAT/ACT, FYGPA, number of AP courses, AP GPA, and AP test scores]
Table 4.4
Cross-Tabulation of College Board Region by School Type

College Board Region            Public    Catholic   Independent Private   Total
Middle States   Count              892        93          376              1,361
                % within type     8.1%      4.7%         9.3%               8.0%
New England     Count              275        38          280                593
                % within type     2.5%      1.9%         6.9%               3.5%
Southwest       Count              612        81          309              1,002
                % within type     5.6%      4.1%         7.6%               5.9%
Midwest         Count            1,013       176          219              1,408
                % within type     9.2%      8.8%         5.4%               8.3%
South           Count              512        50          320                882
                % within type     4.7%      2.5%         7.9%               5.2%
West            Count            7,706     1,551        2,559             11,816
                % within type    70.0%     78.0%        63.0%              69.3%
Total           Count           11,010     1,989        4,063             17,062
                % within type     100%      100%         100%               100%
A chi-square test, shown in Table 4.5, was used to test independence and indicated a
significant association between region and school type, χ²(10) = 439.444, p = .001.
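The chi-square test of independence reported here compares observed cell counts against the counts expected if region and school type were unrelated. A minimal sketch follows; the 2 × 2 table is hypothetical (the real analysis was run in SPSS on the full 6 × 3 cross-tabulation), and the p-value lookup is omitted to keep the sketch dependency-free.

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic and df for an r x c table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            # Expected count under independence: row total * column total / grand total
            expected = row_totals[i] * col_totals[j] / grand
            chi2 += (obs - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, df

# Hypothetical 2 x 2 cross-tabulation for illustration.
chi2, df = chi_square_statistic([[10, 20], [20, 10]])
print(round(chi2, 3), df)
```

With all expected counts equal to 15 in this toy table, the statistic works out by hand to 4 × 25/15 ≈ 6.667 on 1 degree of freedom.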
Table 4.5
Chi-Square Results for Region and School Type

                               Value      df   Asymp. Sig. (2-sided)
Pearson Chi-Square             439.444a   10   .001
Likelihood Ratio               422.524    10   .001
Linear-by-Linear Association    46.634     1   .001
N of Valid Cases            17,062

a. 0 cells (.0%) have expected count less than 5. The minimum expected count is 69.13.
The cross-tabulation of number of AP courses by institutional type is shown in Table 4.6.
Students from public schools took the most AP courses, with 75.2% completing four or more.
Independent private schools, however, had the largest proportion of students who took no AP
courses at all (19.9%).
A chi-square test, shown in Table 4.7, was used to test independence and indicated a
significant association between number of AP courses and school type, χ²(4) = 317.861, p = .001.
Table 4.6
Number of AP Courses by School Type

Number of AP Courses Taken       Public    Catholic   Independent Private   Total
No AP Courses    Count            1,321       265          812              2,398
                 % within type    12.0%     13.3%        19.9%              14.0%
1-3 AP Courses   Count            1,409       352          784              2,545
                 % within type    12.8%     17.7%        19.3%              14.9%
4 or more        Count            8,286     1,375        2,476             12,137
                 % within type    75.2%     69.0%        60.8%              71.1%
Total            Count           11,016     1,992        4,072             17,080
                 % within type     100%      100%         100%               100%
Table 4.7
Chi-Square Results for Number of AP Courses and School Type

                               Value      df   Asymp. Sig. (2-sided)
Pearson Chi-Square             317.861a    4   .001
Likelihood Ratio               307.146     4   .001
Linear-by-Linear Association   279.023     1   .001
N of Valid Cases            17,080

a. 0 cells (.0%) have expected count less than 5. The minimum expected count is 279.67.
Table 4.8 shows the proportions for ethnicity, academic unit and Pell grant. Gender was
not significant and was, therefore, omitted from Table 4.8. The differences shown in Table 4.8
were tested using a one-way analysis of variance (ANOVA). ANOVA results are shown in Table
4.9. One-way between groups analyses of variance were conducted to explore the impact of
school type on demographic indicators, as measured by ethnicity (white/Asian and other),
Academic Unit, Pell Grant distribution and College Board region. There is a statistically
significant difference (p = .001) for each of the demographic indicators.
Table 4.8
Ethnicity, Academic Unit and SES by School Type

                              N        Mean    Std. Deviation   Std. Error
White or Asian
  Public                      8,664    0.787   0.410            0.004
  Catholic                    1,380    0.693   0.461            0.010
  Independent Private         3,307    0.812   0.391            0.006
  Total                      13,351    0.782   0.413            0.003
Academic Unit (STEM)
  Public                      3,403    0.309   0.462            0.004
  Catholic                      568    0.286   0.452            0.010
  Independent Private           956    0.235   0.424            0.007
  Total                       4,927    0.289   0.453            0.003
Pell Grant (YES)
  Public                      2,975    0.270   0.444            0.004
  Catholic                      326    0.164   0.370            0.008
  Independent Private           572    0.141   0.348            0.005
  Total                       3,873    0.227   0.419            0.003
At the time of admission, students self-selected their ethnic category from 22 racial
identifiers provided by the university in this study. Due to the large sample size, the researcher
sorted these into seven widely accepted racial groups (NCES, 2014), and then further into
white/Asian and other. The rationale for truncating the data was based on the understanding that
whites and Asians are overrepresented in colleges (Krogstad & Fry, 2014), and the remaining
ethnic groups are underrepresented. The one-way ANOVA indicated that there are proportionally
more white/Asian students (81%) at independent private schools than at public (79%) or
Catholic schools (70%).
Academic unit, or the central administrative authority over individual majors, was used to
group students by area of academic interest. The data was further truncated into STEM and non-
STEM to provide for a more parsimonious analysis. More public school students (30%) opted to
enroll in a STEM major than students in Catholic (29%) or independent private (24%) schools.
Pell Grant distribution was used to indicate socio-economic status based on the fact that
students who receive Pell Grants are from low-income families. More public school students
(27%) received Pell Grants at least one time while in college compared to students in Catholic
(16%) or independent private (14%) schools.
ANOVA was used to test the difference in Table 4.9. The ANOVA results are significant
(p = .001) for all demographic indicators, with the exception of gender (not shown).
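The one-way ANOVAs reported throughout this chapter partition variability into between-groups and within-groups components and form an F ratio from their mean squares. A minimal sketch of that computation follows; the three groups are hypothetical toy data, not the study's.

```python
def one_way_anova_f(groups):
    """F statistic and degrees of freedom for a one-way ANOVA over sample lists."""
    all_vals = [v for g in groups for v in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k, n = len(groups), len(all_vals)
    # Between-groups sum of squares: group sizes times squared mean deviations
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-groups sum of squares: deviations of each value from its group mean
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Three hypothetical "school type" groups on some outcome.
f, df_b, df_w = one_way_anova_f([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
print(round(f, 3), df_b, df_w)
```

For these toy groups the group means are 2, 3, and 4 around a grand mean of 3, giving F = 3.0 on (2, 6) degrees of freedom; SPSS additionally supplies the p-value against the F distribution.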
Table 4.9
Analysis of Variance for Ethnicity, Academic Unit, SES and Region by School Type

                        Sum of Squares   df       Mean Square    F         Sig.
White or Asian
  Between Groups             19.777           2      9.889        58.330   0.001
  Within Groups            2895.087      17,077      0.170
  Total                    2914.864      17,079
Academic Unit (STEM)
  Between Groups             16.402           2      8.201        40.126   0.001
  Within Groups            3490.172      17,077      0.204
  Total                    3506.574      17,079
PELL grant (YES)
  Between Groups             58.735           2     29.367       170.779   0.001
  Within Groups            2936.584      17,077      0.172
  Total                    2995.319      17,079
College Board Region
  Between Groups            384.357           2    192.179        74.714   0.001
  Within Groups           43878.896      17,059      2.572
  Total                   44263.253      17,061
Post-hoc comparisons using Fisher’s LSD test, shown in Table 4.10, indicate that all of
the mean contrasts are statistically significant.
Table 4.10
Fisher’s LSD Post Hoc Tests for Ethnicity, Academic Unit, SES and Region by School Type
Multiple Comparisons

Dependent Variable    (I) School Type       (J) School Type       Mean Difference (I-J)   Std. Error   Sig.
White or Asian        Public                Catholic                .09372*                0.010        0.001
                      Public                Independent Private    -.02564*                0.008        0.001
                      Catholic              Public                 -.09372*                0.010        0.001
                      Catholic              Independent Private    -.11936*                0.011        0.001
                      Independent Private   Public                  .02564*                0.008        0.001
                      Independent Private   Catholic                .11936*                0.011        0.001
Academic Unit (STEM)  Public                Catholic                .02336*                0.011        0.034
                      Public                Independent Private     .07423*                0.008        0.001
                      Catholic              Public                 -.02336*                0.011        0.034
                      Catholic              Independent Private     .05087*                0.012        0.001
                      Independent Private   Public                 -.07423*                0.008        0.001
                      Independent Private   Catholic               -.05087*                0.012        0.001
PELL grant (YES)      Public                Catholic                .10641*                0.010        0.001
                      Public                Independent Private     .12934*                0.008        0.001
                      Catholic              Public                 -.10641*                0.010        0.001
                      Catholic              Independent Private     .02294*                0.011        0.043
                      Independent Private   Public                 -.12934*                0.008        0.001
                      Independent Private   Catholic               -.02294*                0.011        0.043
College Board Region  Public                Catholic               -.26778*                0.039        0.001
                      Public                Independent Private     .25082*                0.029        0.001
                      Catholic              Public                  .26778*                0.039        0.001
                      Catholic              Independent Private     .51860*                0.044        0.001
                      Independent Private   Public                 -.25082*                0.029        0.001
                      Independent Private   Catholic               -.51860*                0.044        0.001

* The mean difference is significant at the .050 level.
Table 4.11 shows the AP grades and test scores, broken down for STEM and non-STEM
courses. Table 4.11 demonstrates that public schools have the lowest AP STEM GPA (M=3.573)
and Catholic schools have the highest (M=3.596), but independent private schools have the
highest AP STEM test scores (M=3.858) and Catholic schools have the lowest (M=3.405).
Public schools have the highest AP non-STEM GPA (M=3.669) and independent private schools
have the lowest (M=3.571), however, independent private schools have the highest AP non-
STEM test scores (M=3.815) while Catholic institutions have the lowest (M=3.704). Overall,
independent private schools have significantly higher AP test scores than the other two school
types.
ANOVA was used to test the difference in Table 4.11. One-way between groups analysis
of variance was conducted to explore the impact of school type on AP performance, as measured
by GPA and test scores (see Table 4.12). There is a statistically significant difference (p = .001)
in all ANOVAs with the exception of AP STEM GPA (p = .499), where the mean differences
between groups are quite small.
Table 4.11
AP STEM GPA, AP STEM Test Scores, AP Non-STEM GPA, and AP Non-STEM Test Scores by
School Type

                              N        Mean    Std. Deviation   Std. Error
AP STEM GPA
  Public                      4,792    3.573   0.521            0.008
  Catholic                      656    3.596   0.502            0.020
  Independent Private         1,112    3.585   0.475            0.014
  Total                       6,560    3.578   0.512            0.006
AP STEM Test Scores
  Public                      8,734    3.764   1.062            0.011
  Catholic                    1,434    3.405   1.155            0.031
  Independent Private         2,805    3.858   0.996            0.019
  Total                      12,973    3.745   1.066            0.009
AP non-STEM GPA
  Public                      4,705    3.669   0.431            0.006
  Catholic                      774    3.618   0.435            0.016
  Independent Private         1,087    3.571   0.458            0.014
  Total                       6,566    3.646   0.437            0.005
AP non-STEM Test Scores
  Public                      9,226    3.750   0.794            0.008
  Catholic                    1,637    3.704   0.786            0.019
  Independent Private         2,871    3.815   0.804            0.015
  Total                      13,734    3.758   0.796            0.007
Table 4.12
Analysis of Variance for AP STEM GPA, AP STEM Test Scores, AP Non-STEM GPA, and Non-
STEM Test Scores by School Type

                        Sum of Squares   df       Mean Square    F        Sig.
AP STEM GPA
  Between Groups              0.365           2      0.182        0.696   0.499
  Within Groups            1718.170       6,557      0.262
  Total                    1718.534       6,559
AP STEM Test Scores
  Between Groups            204.333           2    102.166       91.105   0.001
  Within Groups           14544.662      12,970      1.121
  Total                   14748.994      12,972
AP non-STEM GPA
  Between Groups              9.156           2      4.578       24.093   0.001
  Within Groups            1247.101       6,563      0.190
  Total                    1256.257       6,565
AP non-STEM Test Scores
  Between Groups             14.598           2      7.299       11.551   0.001
  Within Groups            8676.423      13,731      0.632
  Total                    8691.021      13,733
Post-hoc comparisons using Fisher’s LSD test, shown in Table 4.13, indicate that all of
the mean contrasts are statistically significant, except in the case of AP STEM GPA.
Table 4.13
Fisher’s LSD Post Hoc Tests for AP STEM GPA, AP STEM Test Scores, AP Non-STEM GPA,
and Non-STEM Test Scores by School Type
Multiple Comparisons

Dependent Variable       (I) School Type       (J) School Type       Mean Difference (I-J)   Std. Error   Sig.
AP STEM GPA              Public                Catholic               -0.022                  0.021        0.291
                         Public                Independent Private    -0.012                  0.017        0.496
                         Catholic              Public                  0.022                  0.021        0.291
                         Catholic              Independent Private     0.011                  0.025        0.666
                         Independent Private   Public                  0.012                  0.017        0.496
                         Independent Private   Catholic               -0.011                  0.025        0.666
AP STEM Test Scores      Public                Catholic                .35855*                0.030        0.001
                         Public                Independent Private    -.09405*                0.023        0.001
                         Catholic              Public                 -.35855*                0.030        0.001
                         Catholic              Independent Private    -.45260*                0.034        0.001
                         Independent Private   Public                  .09405*                0.023        0.001
                         Independent Private   Catholic                .45260*                0.034        0.001
AP non-STEM GPA          Public                Catholic                .05088*                0.017        0.003
                         Public                Independent Private     .09773*                0.015        0.001
                         Catholic              Public                 -.05088*                0.017        0.003
                         Catholic              Independent Private     .04684*                0.021        0.022
                         Independent Private   Public                 -.09773*                0.015        0.001
                         Independent Private   Catholic               -.04684*                0.021        0.022
AP non-STEM Test Scores  Public                Catholic                .04573*                0.021        0.032
                         Public                Independent Private    -.06486*                0.017        0.001
                         Catholic              Public                 -.04573*                0.021        0.032
                         Catholic              Independent Private    -.11059*                0.025        0.001
                         Independent Private   Public                  .06486*                0.017        0.001
                         Independent Private   Catholic                .11059*                0.025        0.001

* The mean difference is significant at the .050 level.
Table 4.14 displays the HSGPA, SAT/ACT, and FYGPA broken down by school type.
Public schools have the highest overall HSGPA (M=3.75), with Catholic schools closely behind
(M=3.74). Independent privates have the lowest HSGPA (M=3.59). Public schools have the
highest SAT/ACT (M=2065), independent private schools fall in the middle (M=2060) and
Catholic schools have the lowest (M=2041). Public schools also have the highest FYGPA
(M=3.31), independent private schools fall between the two (M=3.27) and Catholic schools have
the lowest FYGPA (M=3.24).
Table 4.14
HSGPA, SAT/ACT, and FYGPA by School Type
N Mean Std. Deviation Std. Error
HSGPA
Public 11,014 3.750 0.233 0.002
Catholic 1,992 3.736 0.232 0.005
Independent Private 4,068 3.599 0.287 0.005
Total 17,074 3.712 0.255 0.002
SAT/ACT
Public 11,003 2065.754 169.725 1.618
Catholic 1,992 2041.165 157.905 3.538
Independent Private 4,072 2060.511 159.298 2.496
Total 17,067 2061.633 166.112 1.272
FYGPA
Public 11,016 3.307 0.487 0.005
Catholic 1,992 3.243 0.513 0.011
Independent Private 4,072 3.266 0.538 0.008
Total 17,080 3.290 0.503 0.004
The differences in Table 4.14 were tested using a one-way ANOVA. ANOVA results are
shown in Table 4.15. There is a statistically significant difference (p = .001) in all categories,
HSGPA, SAT/ACT, and FYGPA.
Table 4.15
Analysis of Variance for HSGPA, SAT/ACT, and FYGPA by School Type

                        Sum of Squares   df        Mean Square    F         Sig.
HSGPA
  Between Groups             69.461           2        34.731     570.699   0.001
  Within Groups            1038.880      17,071         0.061
  Total                    1108.341      17,073
SAT/ACT
  Between Groups        1026531.923           2    513265.962      18.640   0.001
  Within Groups            4.70E+08      17,064     27536.349
  Total                    4.71E+08      17,066
FYGPA
  Between Groups              9.927           2         4.963      19.665   0.001
  Within Groups            4310.126      17,077         0.252
  Total                    4320.053      17,079
Post-hoc comparisons using Fisher’s LSD test, shown in Table 4.16, indicate that nearly
all mean contrasts were statistically significant across the three categories (HSGPA, SAT/ACT,
and FYGPA); the exceptions were the public versus independent private contrast on SAT/ACT
(p = .085) and the Catholic versus independent private contrast on FYGPA (p = .096).
Table 4.16
Fisher’s LSD Post Hoc Tests for HSGPA, SAT/ACT, and FYGPA by School Type
Multiple Comparisons

Dependent Variable    (I) School Type       (J) School Type       Mean Difference (I-J)   Std. Error   Sig.
HSGPA                 Public                Catholic                 .01419*                0.006        0.018
                      Public                Independent Private      .15153*                0.005        0.001
                      Catholic              Public                  -.01419*                0.006        0.018
                      Catholic              Independent Private      .13733*                0.007        0.001
                      Independent Private   Public                  -.15153*                0.005        0.001
                      Independent Private   Catholic                -.13733*                0.007        0.001
SAT/ACT               Public                Catholic                24.5892*                4.041        0.001
                      Public                Independent Private      5.243                  3.044        0.085
                      Catholic              Public                 -24.5892*                4.041        0.001
                      Catholic              Independent Private    -19.3461*                4.537        0.001
                      Independent Private   Public                  -5.243                  3.044        0.085
                      Independent Private   Catholic                19.3461*                4.537        0.001
FYGPA                 Public                Catholic                 .06391*                0.012        0.001
                      Public                Independent Private      .04107*                0.009        0.001
                      Catholic              Public                  -.06391*                0.012        0.001
                      Catholic              Independent Private     -0.023                  0.014        0.096
                      Independent Private   Public                  -.04107*                0.009        0.001
                      Independent Private   Catholic                 0.023                  0.014        0.096

* The mean difference is significant at the .050 level.
Summary of Research
Research Question 1
To what extent, if any, are the following variables correlated with institutional type
(Public, Catholic, and Independent Private): Sex, Ethnic Classification (white/Asian versus
other), Declared Academic Unit (STEM and non-STEM), Pell Grant (Yes, No), Region of U.S.,
HSGPA, SAT/ACT, FYGPA, Number of AP courses, AP test scores (STEM and non-STEM),
and AP grades (STEM and non-STEM)?
Public versus independent private schools.
Fisher post hoc LSD tests indicated that public schools:
• offered significantly more AP courses than private schools
• did not differ significantly from private schools on AP STEM GPA
• had significantly lower AP STEM test scores
• had significantly higher AP non-STEM GPAs
• had significantly lower AP non-STEM test scores
• had significantly higher HSGPA
• did not differ significantly from private schools on SAT/ACT scores
• had significantly higher FYGPA.
In terms of student characteristics, public schools differed from private schools in a
number of significant ways: (1) public schools had a smaller percentage of white/Asian students;
(2) students from public schools were more likely to major in STEM; (3) students from public
schools were almost twice as likely to receive a Pell grant.
Public versus Catholic schools.
Fisher post hoc LSD tests indicated that public schools:
• offered significantly more AP courses than Catholic schools
• did not differ significantly from Catholic schools on AP STEM GPA
• had significantly higher AP STEM test scores
• had significantly higher AP non-STEM GPAs
• had significantly higher AP non-STEM test scores
• had significantly higher HSGPA
• had significantly higher ACT/SAT scores
• had significantly higher FYGPA.
In terms of student characteristics, public schools differed from Catholic schools in a
number of statistically significant ways: (1) public schools had a larger percentage of
white/Asian students; (2) students from public schools were more likely to major in STEM; (3)
students from public schools were more likely to receive a Pell grant.
Independent private schools versus Catholic schools.
Fisher post hoc LSD tests indicated that independent private schools:
• offered significantly fewer AP courses than Catholic schools
• did not differ significantly from Catholic schools on AP STEM GPA
• had significantly higher AP STEM test scores
• had significantly lower AP non-STEM GPAs
• had significantly higher AP non-STEM test scores
• had significantly lower HSGPA
• had significantly higher ACT/SAT scores
• did not differ significantly from Catholic schools on FYGPA.
In terms of student characteristics, independent private schools differed from Catholic
schools in a number of statistically significant ways: (1) independent private schools had a larger
percentage of white/Asian students; (2) students from independent private schools were less
likely to major in STEM; (3) students from independent private schools were less likely to receive
a Pell grant.
Research Question 2
Are there significant institutional level differences by institutional type (Public, Catholic,
and Independent Private) on HSGPA grade inflation and AP course grade inflation (STEM and
non-STEM) as indicated by the residuals in the following three regressions:
1. FYGPA on HSGPA,
2. FYGPA on AP STEM GPA,
3. FYGPA on AP non-STEM GPA?
A graphic organizer (shown in Figure 4.5) illustrates the second research question.
In contrast to Research Question 1, Research Question 2 was analyzed at the institutional
level. Only high schools with five or more students attending the university in this study were
analyzed. This criterion resulted in a final sample size of 775 institutions (501 public schools, 94
Catholic schools, and 180 independent private schools). While requiring a larger number of
students per high school would yield more reliable results, five or more students was selected as
the criterion to allow for geographic diversity.
Figure 4.5. Regressions on school type
For Research Question 2, grade inflation was measured using the residuals in a regression
analysis. Residuals (errors of prediction) are the difference between an actual (observed) and an
expected (predicted) score. For example, using FYGPA as the predicted variable and HSGPA as
the predictor variable, for some students HSGPA would over-predict FYGPA and for other
students HSGPA would under-predict FYGPA. Stated another way, some students do better than
expected in the freshman year, based on their high school grades, while other students do worse
than expected. In this study, when data are aggregated at the institutional level, a large number of
students had a lower than expected FYGPA given their HSGPA.
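The computation described above amounts to an ordinary least-squares fit of FYGPA on HSGPA, followed by averaging the residuals within each school type. A minimal sketch follows; the six student records are hypothetical, not drawn from the study's data.

```python
def ols_fit(xs, ys):
    """Least-squares intercept and slope for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical (school_type, hsgpa, fygpa) records.
records = [
    ("public", 3.6, 3.2), ("public", 3.8, 3.5), ("catholic", 3.9, 3.1),
    ("catholic", 3.7, 3.0), ("independent", 3.5, 3.3), ("independent", 3.9, 3.6),
]
a, b = ols_fit([r[1] for r in records], [r[2] for r in records])

# Residual = observed FYGPA minus FYGPA predicted from HSGPA; a negative mean
# residual for a school type suggests its high school grades over-predicted
# freshman performance (one operational sign of grade inflation).
by_type = {}
for school_type, hsgpa, fygpa in records:
    by_type.setdefault(school_type, []).append(fygpa - (a + b * hsgpa))
mean_residuals = {t: sum(v) / len(v) for t, v in by_type.items()}
print(mean_residuals)
```

In this toy data the Catholic group's mean residual comes out negative while the other two are positive, mirroring the pattern reported in Table 4.18 (though the magnitudes here are arbitrary).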
As shown in Table 4.17, when using FYGPA as the dependent (predicted) variable, HSGPA has
the largest standardized beta-weight (0.268). The smallest beta-weight is AP non-STEM GPA
(0.132). All three regressions on FYGPA are statistically significant (p=.001).
Another interesting finding from the regression analysis is the small size of
the beta weights in Table 4.17. In a univariate regression, the standardized beta weight equals
the correlation between the predictor and the dependent variable. In other words, the correlations
between FYGPA and HSGPA, AP STEM GPA, and AP non-STEM GPA were .268, .150, and
.132, respectively. These correlations are lower than one would expect; in particular, the
correlations between AP grades and FYGPA are negligible.
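The equivalence between the standardized beta weight and the Pearson correlation in a single-predictor regression can be checked numerically. The sketch below, on hypothetical GPA-like data, z-scores both variables and fits the slope through the standardized values.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient for paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def standardized_beta(xs, ys):
    """OLS slope of y on x after both variables are z-scored."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    zx = [(x - mx) / sx for x in xs]
    zy = [(y - my) / sy for y in ys]
    # Slope through the z-scores (the intercept is zero by construction)
    return sum(a * b for a, b in zip(zx, zy)) / sum(a * a for a in zx)

xs = [3.5, 3.6, 3.7, 3.8, 3.9, 4.0]  # hypothetical HSGPA values
ys = [3.0, 3.4, 3.2, 3.5, 3.3, 3.7]  # hypothetical FYGPA values
print(round(pearson_r(xs, ys), 6) == round(standardized_beta(xs, ys), 6))
```

The two quantities agree to floating-point precision, which is why Table 4.17's beta weights can be read directly as predictor-criterion correlations.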
Table 4.17
Regression Results of FYGPA on HSGPA, AP STEM GPA, and AP non-STEM GPA

Predicted (Y)   Predictor (X)       β       N     F        Observed Probability
FYGPA           HSGPA               0.268   775   59.690   0.001
FYGPA           AP STEM GPA         0.150   710   16.304   0.001
FYGPA           AP non-STEM GPA     0.132   704   12.417   0.001
Table 4.18 shows the residuals by school type. Negative numbers indicate the FYGPA
was lower than expected given HSGPA. A large number of students at Catholic institutions have
a FYGPA that is lower than expected (given their HSGPA). The average residual in Catholic
schools is -.358 and post hoc Fisher LSD tests indicate that Catholic schools had significantly
(p=.001) lower residuals than public schools or independent schools.
When FYGPA is predicted using AP STEM and AP non-STEM GPA, the degree of
grade inflation is much higher in Catholic schools for both predictors (AP STEM GPA, p=.001
and AP non-STEM GPA, p=.003) than for public and independent private schools.
Table 4.18
Residual Scores by School Type

Predictor              Public      Catholic    Independent Private   F       Observed Probability
HSGPA            M      0.049      -0.358       0.052                6.992   0.001
                 N        501          94         180
AP STEM GPA      M      0.095      -0.309      -0.114                7.578   0.001
                 N        473          91         146
AP non-STEM GPA  M      0.081      -0.301      -0.079                6.153   0.002
                 N        473          90         141
CHAPTER 5
SUMMARY OF FINDINGS, DISCUSSION AND RECOMMENDATIONS
Scholars have long debated the “private school effect” (Coleman, Hoffer, & Kilgore,
1982). This assumption, that significant achievement advantages exist for private school students
due to their prevailing affluence, also implies that private schools are better at educating students
than public schools. Fair or not, this assumption maintains a firm grasp on many people’s
opinions surrounding the American educational system (Berliner & Biddle, 1995; Rotberg,
1990). However, when a student applies to college, how does this conjecture play out? If three
college applicants vie for one coveted freshman spot, and the only difference between them is
where each went to high school (one public, one independent private, and one Catholic), who
wins out?
Understanding the type of influence a high school has over a student’s preparation for
college was the goal of this study. Many college and university admission offices have
internally developed practices for assessing school quality, but while these assessments are
sometimes empirically based, they are just as likely to be based on hunches and subjective data.
Currently, no pragmatic methodology exists that allows a fair and reliable school
index to be used in the evaluation of college admission applications. This void leaves applicants
vulnerable to being overlooked due to bias towards (or against) their high school, and colleges
vulnerable to enrolling a freshman class that is underprepared for the rigor that lies ahead.
Summary of Methodology
The research adopted a quantitative approach (Creswell, 2012). A correlational method
with an explanatory design was employed in this non-experimental research. There was no
intervention or treatment, other than the exploration of school type. To obtain a clear
understanding of the population, descriptive statistical analyses were performed on the sample
groups. Dispersion (standard deviations, ranges) and measures of central tendency (means,
medians, and other percentiles) were computed. Lastly, in order to assess the strength of the
direction of the association between school type and college success, bivariate correlational
analyses were conducted.
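The descriptive step described above, dispersion plus central tendency, amounts to a handful of standard computations. A minimal sketch on a toy FYGPA sample follows; the values are invented and stand in for the study's actual records.

```python
import statistics as st

# Toy FYGPA sample; the values are invented for illustration.
fygpa = [3.1, 3.4, 2.9, 3.7, 3.3, 3.0, 3.6, 3.2]

mean = st.mean(fygpa)                 # central tendency
median = st.median(fygpa)
stdev = st.stdev(fygpa)               # dispersion: sample standard deviation
spread = max(fygpa) - min(fygpa)      # dispersion: range
quartiles = st.quantiles(fygpa, n=4)  # 25th/50th/75th percentiles

print(round(mean, 3), round(median, 2), round(stdev, 3), round(spread, 1))
```

The same summaries were computed per sample group in the study; the bivariate correlational step then pairs each predictor with the outcome, as illustrated in the regression sketches elsewhere in this chapter.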
For the purpose of this study, student athletes, transfer students, and international students
were removed from the population due to their uniquely different admission criteria; the findings
of this study, therefore, should not be generalized to these populations. There were also not enough
charter or home schools in the sample to make a viable comparison, so they were omitted.
Further, independent private and non-Catholic/religious schools were combined to form one
category of independent private institutions. This grouping was created after understanding that
Catholic schools make up the largest school type within the private school category (Broughman
& Swaim, 2013). This subsequently allowed for the comparison between Catholic and non-
Catholic institutions.
This study is limited since the data are specific to one large metropolitan, highly
selective, private university in the western region of the United States. Students and measures in
the admission data sampled may not be representative of the general college student population.
Additionally, this study is limited by not having information on the possible presence of
unmeasured confounding variables such as learning disabilities, psychological conditions, the
impact of life circumstances, or the mere stress of adjusting to college. Further, myriad family
and school factors, working simultaneously at varying levels, influence student
achievement and could not be controlled for in this study.
This study is correlational and although the relationship between college success and high
school type may be substantiated by current literature, there is a lack of empirical data to suggest
a causal relationship between the individual level predictors chosen for this study and first-year
success in college. Consequently, the proposed model can only be interpreted from a
correlational perspective. Studies of this type are limited to identifying significant differences
among groups. These findings cannot determine the cause of the differences between school
types, nor are they designed to consider the nature and content of learning experiences.
The research outlined herein is the first step toward establishing a school index
methodology that could be used by college admission offices as a tool to reliably assess which
high schools are producing the kinds of scholars they claim to produce, and which are
exaggerating the facts. In an effort to form a pragmatic understanding about the reliability of
high school grade and transcript information, 17,080 students were assessed based on myriad
demographic and academic factors. Then students were grouped by high school, and high
schools were grouped by school type (Public, Catholic, and Independent Private). High schools
were ultimately analyzed to establish an understanding of how school type (Public, Catholic, and
Independent Private) might impact college preparation. This chapter summarizes the results of
the analyses from Chapter 4, discusses the conclusions and implications, and offers
recommendations for further research.
Research Questions
Research Question 1
To what extent, if any, are the following variables correlated with high school type
(Public, Catholic, and Independent Private): Gender, Ethnic Classification (white/Asian versus
other), Declared Academic Unit (STEM and non-STEM), Pell Grant eligibility (Yes, No),
Region of United States, HSGPA, SAT/ACT, FYGPA, Number of AP courses, AP test scores
(STEM and non-STEM), and AP grades (STEM and non-STEM)?
Research Question 2
Are there significant institutional level differences by high school type (Public, Catholic,
and Independent Private) on HSGPA grade inflation and AP course grade inflation (STEM and
non-STEM), as indicated by the residuals in the following three regressions:
1. FYGPA on HSGPA,
2. FYGPA on AP STEM GPA,
3. FYGPA on AP non-STEM GPA?
Discussion of Results
This study confirms ample research on the existence of grade inflation (Adelman, 2006;
Arenson, 2004; Attewell & Domina, 2008; Bishop, 2000; Godfrey, 2011; Nikolakakos et al.,
2012; Zirkel, 1999; Breland et al., 2002; Camara & Michaelides, 2005; Geiser, 2009; Ishop,
2008; Kirst & Venezia, 2001; McMillan, 2001). What is new as a result of the current study is
an understanding of which types of schools are most engaged in the practice of grade inflation,
and the results are straightforward: there is more evidence of grade inflation in Catholic
schools than in public or independent private schools. This result was interesting, but more
compelling was the outcome of the public school examination. Overwhelming and significant
evidence points to the success of public schools; despite their lower demonstrated SES (see Figure
A.4), public institutions appear to be preparing students better than the “private school effect”
might suggest (Coleman et al., 1982).
Of the three school types, public institutions offered the highest number of AP courses
overall (see Figure A.1), and 30% of students from public schools enrolled in a STEM major once in
college. This was more than Catholic (29%) or independent private (24%) schools (see Figure
A.3). Further, public schools had the highest overall HSGPA (M=3.75) (see Figure A.9), as well
as the highest SAT/ACT (M=2065) (see Figure A.10).
What was most compelling by far, however, was that public schools had the
highest FYGPA (M=3.31) (see Figure A.11). Research Question 2 used three regressions to
express residual performance on first-year college success (measured by FYGPA).
Interestingly, a large number of students from public schools had a FYGPA that was higher than
expected (given the HSGPA): public schools significantly outperformed Catholic schools (see
Figure A.12) and nearly matched the independent privates. Two further regressions confirmed
the superior performance of public schools. When using AP STEM GPA to predict FYGPA,
public institutions fared the best (see Figure A.13), and the same was true when using AP
non-STEM GPA (see Figure A.14). Public institutions significantly outperformed Catholic
schools and marginally outperformed independent private schools, which is not the sort of
headline one reads in today’s newspapers.
Lastly, while studies are divided on the AP Program and its impact on
college preparation, it is agreed that consistency in the standard of the program is of paramount
concern (Klopfenstein & Thomas, 2005; Geiser & Santelices, 2004). Along these lines, the
current study showed interesting results when correlating AP grades and test scores. Though
little difference existed between school types in AP STEM GPA, significant differences existed
between school types in mean AP STEM test scores. For independent private schools, AP
STEM test scores were high relative to AP STEM GPA (see Figures A.5 and A.6), and the same
held for AP non-STEM test scores relative to the corresponding AP non-STEM GPA.
Indeed, the independent private schools demonstrated
the lowest AP non-STEM GPA, but the highest AP non-STEM test achievement (see Figures
A.7 and A.8). Although there is clearly a need for further research beyond these initial
correlations, these preliminary findings are worth noting.
Implications for Practice
This research generated a number of pertinent implications as it relates to the practice of
college admissions. The subsequent sections will delineate the implications, which are supported
by the findings of this study, into three classifications: implications for high school practices,
college admissions and the College Board.
High School Practices
The findings of this study have practical implications for high schools, which can initiate
actionable change at the classroom and school level.
First, as is apparent from the outcomes discussed above, accountability indicators are
reasonable given the significant implications of grade inflation for certain school types. Although
this study identified grade inflation most prominently in Catholic schools, it is
important to remember that research evidence of grade inflation, regardless of
school type, is abundant (Adelman, 2006; Arenson, 2004; Attewell & Domina, 2008; Bishop,
2000; Godfrey, 2011; Nikolakakos et al., 2012; Zirkel, 1999; Breland et al., 2002; Camara &
Michaelides, 2005; Geiser, 2009; Ishop, 2008; Kirst & Venezia, 2001; McMillan, 2001).
However, student success in terms of persistence and academic achievement plays an important
role in whether a student will progress towards college graduation (HERI, 2003; Cuseo, 2013),
and high school rigor plays a key role in setting this stage (Kirst, 1998). Specifically, if the
purpose of the high school transcript is to provide factual and accurate academic reports, the
potential for misuse of this document is alarming. This official school record, which is relied
upon by students, parents, and colleges for truthful communication and assessment, should not be
used to place some students (or some schools) at a relative advantage over others.
With the college admission process as competitive as it is, schools that inflate grades and tinker
with transcripts on behalf of their students not only rob students in less fortunate situations of
college opportunities, but also jeopardize the long-term success of their own scholars.
Second, AP courses are designed to mimic introductory college courses and increase the
overall rigor of the high school curriculum (Hertberg-Davis & Callahan, 2008). High schools are
entrusted by the College Board to implement the AP curriculum accurately (University of
Oregon, 2015); however, evidence in this study suggests grade inflation within the AP GPA
(STEM and non-STEM) as compared to the corresponding AP test scores. If high schools abuse the
privilege of teaching the AP curriculum by inflating grades and exaggerating competency, two
key implications come to light: high schools that are sought out as “more prestigious” based on
the number of AP courses offered, and students who are sought out as “more competent” based
on the number of AP courses completed, should be scrutinized carefully.
College Admission Practices
The findings of this study also identified several cogent implications for college
admissions.
First, academic excellence is traditionally gauged by evaluating an applicant’s high
school transcript (Stearns et al., 2009), but evidence in this study, as well as in others,
demonstrates that exaggerating the appearance of rigor, as it is reflected on the
high school record, is certainly possible (ACT, 2007; Bar et al., 2009; Jost, 2002; Kirst, 1998;
Zirkel, 1999). Measures should be taken to ensure reliable and valid transcript information
before trusting the HSGPA as a measure of success. Knowing which high school best prepares
students for college is of critical importance, and a process for identifying more valid indicators
of student success must be considered in order to improve the thoroughness and accuracy of
admission decisions, which, ultimately, will increase first-year academic success and,
subsequently, college graduation rates (Camara & Kimmel, 2005; Kuh, 2005; Ishler & Upcraft,
2005; Fairris et al., 2011).
Second, the purpose behind this study was to take preliminary steps toward a
methodology that can serve as a school index. The hope is that these initial steps will aid
college admission offices in strengthening their decision processes. Certainly, then, validating
the existence of grade inflation is critical. More crucial, however, is the message about the risk a
college takes when admitting a student with inflated grades: the input might look good, but the
output is a gamble. When colleges and universities are held accountable for their retention efforts
and graduation rates, relying on high school grades as evidence of performance is ill advised. A
school index that accurately and empirically measures whether a high school is producing
what it claims to produce needs to be put into practice, thus holding high schools accountable
for the performance of their matriculated student body. Relying on subjective data or conjecture
to make presumptions about school quality is risky. Ideally, this vetting would be
established through a consortium of colleges to eliminate bias and promote fairness.
Lastly, with college admission becoming increasingly competitive, one way students (and
schools) attempt to differentiate themselves is by participating in the AP Program (Bound et al.,
2009). The mere fact that the college admission process so heavily considers participation in the
AP curriculum, but not the exams themselves, provides considerable incentive for students to
partake in as many classes as they can manage. Klopfenstein and Thomas (2009), however,
found that AP involvement is not a reliable predictor of first semester college GPA or of
retention into the second year of college. Nevertheless, it is almost impossible to mention a
demanding high school course load without also mentioning AP; while the increase in students
participating in the AP curriculum has been significant, evidence suggests that attention would
be better placed on how many of those students pass the AP exam (Hechinger Institute, 2009). In
2005, over half of high school students who received an A or B grade in physics were not
prepared for college science, and the same was shown to be true for Algebra II (ACT, 2007).
Further, the actual number of AP courses a pupil took did not predict college success (Geiser &
Santelices, 2004). Rather, there is strong evidence showing a correlation between passing at least
one AP exam and an increased graduation rate from college (Geiser & Santelices, 2004). This
study supports these findings and asserts that without a secondary verification such as the AP test
score, AP grades should be considered with caution. Likewise, students with lower AP grades
should not be overlooked until competency can be verified with a secondary measure, such as the
AP test score.
The College Board Practices
State policies that direct all school constituencies to include the AP experience in their
curriculum, and the common practice of university admission professionals giving preferential
treatment to students with AP coursework, are cause for question (Klopfenstein & Thomas, 2009).
Geiser and Santelices (2004) report that the quantity of advanced placement and honors courses
taken by a student was not a significant predictor of college success; AP exam scores, however,
were a worthwhile measure. While Camara and Michaelides (2005)
criticized the methodology of Geiser and Santelices (2004), it is the assertion of this study that
their hypothesis is valuable and deserves further exploration. AP course grades simply must be
audited against AP test scores for those grades to be taken seriously. Further, admission staff who rely
on the quantity or quality of AP coursework as a barometer for admission criteria, rather than the
more useful AP exam scores, are jeopardizing the quality of their freshman class. Unfortunately,
the timing of AP test result delivery to colleges, in mid-July, well after admission decisions have
been mailed, is a major flaw in the College Board system. It is not within the scope of this study
to suggest an alternative, but given the strong evidence suggesting grade inflation and abuse of
the high school AP curriculum, an urgent and expedited audit of the entire AP framework and a
reevaluation of the timing of test delivery are highly encouraged.
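One hypothetical shape such an audit could take, offered here as a sketch of the idea rather than a procedure the study or the College Board prescribes, is to standardize each school's mean AP course grade and mean AP exam score, then flag schools whose grades run well ahead of their exam results. All school names and figures below are invented.

```python
import statistics as st

# Hypothetical per-school means: (school, mean AP course grade on a 4.0
# scale, mean AP exam score on the 1-5 scale). All figures are invented.
schools = [
    ("A", 3.90, 3.1),
    ("B", 3.40, 3.6),
    ("C", 3.80, 4.2),
    ("D", 3.95, 2.8),
]

def zscores(values):
    mean, sd = st.mean(values), st.stdev(values)
    return [(v - mean) / sd for v in values]

grade_z = zscores([g for _, g, _ in schools])
score_z = zscores([s for _, _, s in schools])

# Flag schools whose standardized grades sit more than one SD above their
# standardized exam scores: grades running well ahead of demonstrated skill.
flagged = [name for (name, _, _), gz, sz in zip(schools, grade_z, score_z)
           if gz - sz > 1.0]
print(flagged)  # -> ['A', 'D']
```

The one-standard-deviation threshold is arbitrary; in practice it would need to be calibrated against the distribution of grade-score gaps across many schools and years before being used in any admission decision.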
Future Research
Evidence shows that the strongest predictor for college graduation is the intensity and
quality of a high school’s curriculum (Adelman, 1999, 2006). However, if the representation of
that intensity and quality is exaggerated, what results? As an outcome of this study, copious and
salient implications were generated for application in three venues: the establishment of
accountability practices within high schools, and reform within college admission and
the College Board. However, further research is necessary in order to create a body of
knowledge that can corroborate and strengthen the understanding of the impact school
type has on academic achievement and of how students persist across different types
and subtypes of school settings. More scrutiny of the validity of the high school transcript, and
of the role the AP curriculum plays in college selectivity, will greatly enhance colleges’ potential for
providing equity and access to those most qualified. In addition, deeper investigation
into more diverse populations would be beneficial, as the students who comprised the
sample in this research were predominantly white or Asian and came from a higher SES group
than 23% of the overall research population. Beyond this, the research analyzed
data specific to one large metropolitan, highly selective, private university in the
western region of the United States. Benefit could be garnered from similar research performed on
different types of college admission offices and on different populations of college applicants.
Lastly, a more thorough course-by-course analysis, including senior-year AP STEM
test scores and course grades as well as senior-year AP non-STEM test scores and course
grades, conducted at a greater scale than was possible in this study, would be of great value to
education research. To effectively establish a school index methodology for use in college
admission, an annually conducted individual course-by-course analysis against summary score
data, compared to each matriculated student’s FYGPA (and possibly also the four-year
graduation rate), is recommended. In addition, a methodology that evaluates high school type
by assessing the school’s four-year college enrollment and graduation rates, AP/IB offerings,
annual tuition or free-lunch programs, and each student’s family socioeconomic status would
contribute greatly not only to the overall body of knowledge surrounding high school
effectiveness, but also to AP Exam accessibility for low-income students.
Conclusion
It was clear in the literature review and throughout the data analysis that real families
with real hopes of success for their children lie at the heart of the college admission process.
Grading, testing, and evaluation are, for many, viewed as arbitrary and stress-inducing; however,
each can actually create order, fairness, and access for deserving candidates, if appropriately
formulated and interpreted. Understanding that no one school type will ever be “the best,” it is
imperative that parents, school administrators, teachers, and college admission officers insist on
transparency, honesty, and accuracy when it comes to student evaluation. Ultimately, to gain
access to a particular college or university through inflated grades or lax standards serves no
one in the long run.
The findings produced by this study are pertinent and timely, and they warrant serious
consideration from education researchers, policymakers, and practitioners. When it comes to
college admission, context matters. Understanding the type of school students have attended, as
well as the preparation those students have received, is an important prerequisite for the successful
screening of admission candidates. The transcript is the principal quantitative information source
provided to colleges by high schools (Rigol, 2003; NACAC, 2012; Tam & Sukhatme, 2004). As
such, we must determine ways to better understand and ensure the efficacy of what is being
reported. Not only will this assist colleges in making the right decisions, but it will also
enable students and families to gain a more realistic understanding of their level of preparation
for college.
REFERENCES
Ackerman, P. L., Kanfer, R., & Beier, M. E. (2013). Trait complex, cognitive ability, and domain
knowledge predictors of Baccalaureate success, STEM persistence, and gender
differences. Journal of Educational Psychology, 105(3), 911-927. doi:10.1037/a0032338
ACT. (2007). Rigor at risk: Reaffirming quality in the high school core curriculum. Iowa City,
IA: Author.
ACT. (2008, June). Compare ACT and SAT scores. Retrieved from
http://www.act.org/solutions/college-career-readiness/compare-act-sat/
ACT. (2015). ACT website. Retrieved from http://act.org
Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and
bachelor’s degree attainment. Washington, D.C.: U.S. Department of Education.
Adelman, C. (2006). The toolbox revisited: Paths to degree completion from high school through
college. Washington, D.C.: U.S. Department of Education.
Altonji, J. G., Blom, E., & Meghir, C. (2012). Heterogeneity in human capital investments: High
school curriculum, college major, and careers (NBER Working Papers 17985).
Cambridge, MA: National Bureau of Economic Research.
Altonji, J. G., Elder, T. E., & Taber, C. R. (2005a). An evaluation of instrumental variable
strategies for estimating the effects of Catholic schools. The Journal of Human
Resources, 40(4), 791-821.
Altonji, J. G., Elder, T. E., & Taber, C. R. (2005b). Selection on observed and unobserved
variables: Assessing the effectiveness of Catholic schools. Journal of Political Economy,
113(1), 151-184. doi:10.1086/426036
Andrews, H. A. (2004). Dual credit research outcomes for students. Community College Journal
of Research and Practice, 28(5), 415-422. doi:10.1080/1066892049044445
Arenson, K. W. (2004, April 18). Is it grade inflation, or are students just smarter? The New York
Times, A1.
Astin, A. W. (1997). How “good” is your institution’s retention rate? Research in Higher
Education, 38(6), 647-658.
Atkinson, R. C., & Geiser, S. (2009). Reflections on a century of college admissions tests.
Educational Researcher, 38(9), 665-676. doi:10.3102/0013189X09351981
Attewell, P., & Domina, T. (2008). Raising the bar: Curricular intensity and academic
performance. Educational Evaluation and Policy Analysis, 30(1), 51-71.
Augustine, N. R. (2005). Rising above the gathering storm: Energizing and employing America
for a brighter economic future. Washington, D.C.: National Academies Press.
Bar, T., Kadiyali, V., & Zussman, A. (2009). Grade information and grade inflation: The Cornell
experiment. Journal of Economic Perspectives, 23(3), 93-108.
Bausmith, J. M., & Laitusis, V. (2012). The impact of AP Achievement Institute I on students’ AP
performance (Research Report No. 2012-7). New York, NY: The College Board.
Berkowitz, D., & Hoekstra, M. (2011). Does high school quality matter? Evidence from
admissions data. Economics of Education Review, 30(2), 280-288.
doi:10.1016/j.econedurev.2010.10.001
Berliner, D. C., & Biddle, B. J. (1995). The manufactured crisis: Myths, fraud, and the attack on
America’s public schools. Reading, MA: Addison-Wesley.
Bishop, J. H. (2000). Nerd harassment and grade inflation: Are college admissions policies
partly responsible? (CHERI Working Paper #8). Retrieved from Cornell University, ILR
School site: http://digitalcommons.ilr.cornell.edu/cheri/8
Blackburn, B. R. (2012). Rigor is not a four letter word. New York, NY: Eye On Education.
Boden, G. T. (2011). Retention and graduation rates: Insights from an extended longitudinal
view. Journal of College Student Retention: Research, Theory and Practice, 13(2), 179-
203.
Bound, J., Hershbein, B., & Long, B. T. (2009). Playing the admissions game: Student reactions
to increasing college competition. Journal of Economic Perspectives, 23(4), 119-146.
doi:10.1257/jep.23.4.119
Bowen, W. G., & Bok, D. (1998). The shape of the river. Princeton, NJ: Princeton University
Press.
Breland, H., Maxey, J., Gernand, R., Cumming, T., & Trapani, C. (2002). Trends in college
admission 2000: A report of a survey of undergraduate admissions policies, practices,
and procedures. Tallahassee, FL: Association for Institutional Research.
Broughman, S. P., & Swaim, N. L. (2013). Characteristics of private schools in the United
States: Results from the 2011-2012 Private School Universe Survey (NCES 2013-316).
Washington, D.C.: U.S. Department of Education, National Center for Education
Statistics.
Broughman, S. P., Swaim, N. L., Parmer, R., Zotti, A., & Dial, S. (2014). Private School
Universe Survey (PSS): Public-use data file user’s manual for school year 2011-12
(NCES 2014-351). Washington, D.C.: U.S. Department of Education, National Center for
Education Statistics.
Broughman, S. P., Tourkin, S., Swaim, N., Peterson, J., Parmer, R., Zotti, A., & Andriani, S.
(2012). Private School Universe Survey (PSS): Survey documentation for school year
2009-10 (NCES 2012-323). Washington, D.C.: U.S. Department of Education, National
Center for Education Statistics.
Burke, K. (2009). How to assess authentic learning. Thousand Oaks, CA: Corwin.
Burton, N. W., & Ramist, L. (2001). Predicting success in college: SAT studies of classes
graduating since 1980 (Research Report No. 2001-2). New York, NY: The College
Board.
Byrd, S., Ellington, L., Gross, P., Jago, C., Stern, S., Finn, C. E., Jr., & Davis, M. A., Jr. (2007).
Advanced Placement and International Baccalaureate: Do they deserve gold star status?
Washington, D.C.: Thomas B. Fordham Institute.
California Department of Education. (2009). California private schools, kindergarten through
grade twelve, enrollment and staff report 2008-09. Sacramento, CA: Author.
Camara, W. J. (1998, May). High school grading policies (RN-04). New York, NY: The College
Board, Office of Research and Development.
Camara, W. J. (2011). Catalog of research reports. New York, NY: The College Board, Office
of Research and Development.
Camara, W. J., & Echternacht, G. (2000). The SAT[R] I and high school grades: Utility in
predicting success in college (Research Report No. RN-10). New York, NY: The College
Board.
Camara, W. J., & Kimmel, E. W. (2005). Choosing students: Higher education admissions tools
for the 21st century. Mahwah, NJ: Lawrence Erlbaum.
Camara, W. J., Kimmel, E. W., Scheuneman, J., & Sawtell, E. A. (2003). Whose grades are
inflated? (College Board Research Report No. 2003-4). New York, NY: The College
Board.
Camara, W. J., & Michaelides, M. (2005). AP use in admissions: A response to Geiser and
Santelices. New York, NY: The College Board.
Casement, W. (2003). Declining credibility for the AP program. Academic Questions, 16(4), 11-
25.
Casserly, P. L. (1986). Advanced placement revisited (College Board Report No. 86-6, ETS RR
No. 86-35). New York, NY: College Entrance Examination Board.
Chajewski, M., Mattern, K. D., & Shaw, E. J. (2011). Examining the role of Advanced
Placement® exam participation in 4-year college enrollment. Educational Measurement:
Issues and Practice, 30(4), 16-27.
Clinedinst, M. E., & Hawkins, D. A. (2008). State of college admission. Arlington, VA: National
Association for College Admission Counseling.
Clinedinst, M. E., Hurley, S. F., & Hawkins, D. A. (2013). 2012 state of college admission.
Arlington, VA: National Association for College Admission Counseling.
Coleman, J., Hoffer, T., & Kilgore, S. (1982). Cognitive outcomes in public and private schools.
Sociology of Education, 55, 65-76.
College Board. (2002). Opening classroom doors. New York, NY: The College Board.
College Board. (2013a). College Board 2012-2013 educator’s handbook for the SAT and the
SAT subject tests. New York, NY: The College Board.
College Board. (2013b). AP: A foundation for academic success. New York, NY: The College
Board.
College Board. (2013c). Guidelines on the uses of College Board test scores and related data.
New York, NY: The College Board.
College Board. (2014a, July 2). How to convert your GPA to a 4.0 scale. Retrieved from
http://www.collegeboard.com/html/academicTracker-howtoconvert.html
College Board. (2014b, June 11). SAT-ACT concordance tables. Retrieved from
http://research.collegeboard.org/programs/sat/data/concordance
College Board. (2014c, June 11). What is an AP score and what does it mean? Retrieved from
https://apscore.collegeboard.org/scores/about-ap-scores
College Board. (2014d, July 20). AP courses. Retrieved from
https://apstudent.collegeboard.org/apcourse
College Board. (2014e). Student search service. Retrieved from
http://professionals.collegeboard.com/k-12/prepare/sss
College Board. (2015). SAT homepage. Retrieved from http://sat.collegeboard.org
Cookson, P. W., & Persell, C. H. (1985). English and American residential secondary schools: A
comparative study of the reproduction of social elites. Comparative Education Review,
29, 283-298.
Couch, J. F., Shughart, W. F., & Williams, A. L. (1993). Private school enrollment and public
school performance. Public Choice, 76(4), 301-312.
Creswell, J. W. (2012). Qualitative inquiry and research design: Choosing among five
approaches. Thousand Oaks, CA: Sage.
Cuseo, J. (2013). Student success: Definition, outcomes, principles and practices. Retrieved from
http://www.indstate.edu/studentsuccess/pdf/Defining%20Student%20Success.pdf
Dougherty, C., Mellor, L., & Jian, S. (2006). The relationship between advanced placement and
college graduation (2005 AP Study Series, Report 1). Austin, TX: National Center for
Educational Accountability.
Douglass, J. A., & Thomson, G. (2008). The poor and the rich: A look at the economic
stratification and academic performance among undergraduate students in the United
States (Research & Occasional Paper Series: CSHE.15.08). Berkeley: Center for Studies
in Higher Education, University of California.
Edwards, K., & Duggan, O. (2012). Data-based decision making: The road to AP equity (APAC
2012 presentation). New York, NY: The College Board.
Ehrenberg, R., & Brewer, D. J. (1994). Do school and teacher characteristics matter? Evidence
from high school and beyond. Economics of Education Review, 13(1), 1-17.
Espenshade, T. J., Hale, L. E., & Chung, C. Y. (2005). The frog pond revisited: High school
academic context, class rank, and elite college admission. Sociology of Education, 78(4),
269-293.
Ethnicity. (n.d.). In Oxford English Dictionary Online. Retrieved from
http://www.oed.com/view/Entry/64791
Evans, W. N., & Schwab, R. M. (1995). Finishing high school and starting college: Do Catholic
schools make a difference? Quarterly Journal of Economics, 110(4), 941-974.
Fairris, D., Peeples, J., & Castro, M. (2011, March). First to second year retention and six-year
graduation rates: An analysis by social identity groups at the campus and college level
(Undergraduate Education Institutional Research Report). Riverside, CA: University of
California, Riverside.
FairTest. (2015, Winter). Colleges and universities that do not use SAT/ACT scores for
admitting substantial numbers of students into bachelor degree programs. Retrieved from
http://www.fairtest.org/university/optional
Fithian, E. C. (2003). Rate of advanced placement (AP) exam taking among AP-enrolled
students: A study of New Jersey high schools (Unpublished doctoral dissertation). College
of William and Mary, Williamsburg, VA.
Furry, W. S., & Hecsh, J. (2001). Characteristics and performance of advanced placement classes in California. Sacramento: The California State University Institute for Education Reform.
Geiser, S. (2009). Back to the basics: In defense of achievement (and achievement tests) in
college admissions. Change, 41(1), 16-23. doi:10.3200/CHNG.41.1.16-23
Geiser, S., & Santelices, M. V. (2004). The role of advanced placement and honors courses in
college admissions. In P. Gandara, G. Orfield, & C. Horn (Eds.), Expanding opportunity
in higher education: Leveraging promise (pp. 75-113). Albany: State University of New
York.
Geiser, S., & Santelices, M. V. (2007). Validity of high-school grades in predicting student
success beyond the freshman year: High-school record vs. standardized tests as
indicators of four-year college outcomes. Berkeley, CA: Center for Studies in Higher
Education, University of California, Berkeley.
Gender. (n.d.). In Oxford English Dictionary Online. Retrieved from
http://www.oed.com/view/Entry/77468
Gifford, D. D., Briceno-Perriott, J., & Mianzo, F. (2006). Locus of control: Academic
achievement and retention in a sample of university first-year students. Journal of
College Admission, 191, 18-25.
Godfrey, K. E. (2011). Investigating grade inflation and non-equivalence (Research Report
2011-2). New York, NY: The College Board.
Gollub, J. P., Bertenthal, M. W., Labov, J. B., & Curtis, P. C. (2002). Learning and
understanding: Improving advanced study of mathematics and science in U.S. high
schools. Washington, D.C.: National Academies Press.
Gonzalez, H. B., & Kuenzi, J. J. (2012, August). Science, technology, engineering, and
mathematics (STEM) education: A primer. Washington, D.C.: Congressional Research
Service, Library of Congress.
Grade Inflation. (n.d.). In Merriam-Webster Online. Retrieved from http://www.merriam-
webster.com/dictionary/grade%20inflation
Greeley, A. M. (1982). Catholic high schools and minority students. New Brunswick, NJ:
Transaction.
Grogger, J., & Neal, D. (2000). Further evidence on the benefits of Catholic secondary
schooling. In W. G. Gale & J. R. Pack (Eds.), Brookings-Wharton papers on urban
affairs (pp. 151-193). Washington, D.C.: Brookings Institute.
Guo, H., Liu, J., Curley, E., & Dorans, N. (2012). The stability of the score scales for the SAT
Reasoning Test from 2005 to 2010 (Research Report ETS RR-12-15). Princeton, NJ:
Educational Testing Service.
Harackiewicz, J. M., Barron, K. E., Tauer, J. M., & Elliot, A. J. (2002). Predicting success in
college: A longitudinal study of achievement goals and ability measures as predictors of
interest and performance from freshman year through graduation. Journal of Educational
Psychology, 94(3), 562-575. doi:10.1037//0022-0663.94.3.562
Hechinger Institute. (2009). Understanding and reporting on academic rigor. New York, NY:
The Hechinger Institute on Education and the Media.
Hertberg-Davis, H., & Callahan, C. M. (2008). A narrow escape: Gifted students’ perceptions of
advanced placement and international baccalaureate programs. Gifted Child Quarterly,
52(3), 199-216.
Higher Education Research Institute. (2003). How “good” is your retention rate? Using the
CIRP freshman survey to evaluate undergraduate persistence. Los Angeles, CA: Higher
Education Research Institute, University of California, Los Angeles.
Housman, N. (2006). Defining rigor in high school: Framework and assessment tool.
Washington, D.C.: National High School Alliance.
Hughes, K. (2013). The college completion agenda: 2012 progress report. New York, NY: The
College Board.
IBM. (2013). IBM SPSS Statistics for Windows, Version 22.0. Armonk, NY: IBM Corp.
Ishler, J. L. C., & Upcraft, M. L. (2005). The keys to first-year student persistence. In M. L.
Upcraft, J. N. Gardner, & B. O. Barefoot (Eds.), Challenging & supporting the first-year
student: A handbook for improving the first year of college (pp. 27-46). San Francisco,
CA: Jossey-Bass.
Ishop, K. B. (2008). The college application essay: Just tell me what to write and I’ll write it
(Unpublished doctoral dissertation). University of Texas at Austin, Austin, TX.
Jeynes, W. H. (2002). Educational policy and the effects of attending a religious school on the
academic achievement of children. Educational Policy, 16(3), 406-424.
Johnson, L. (2005). Why is it important to take challenging classes? Retrieved May 9, 2013,
from http://www.davidsongifted.org/db/Articles_id_10330.aspx
Jost, K. (2002, June 7). Grade inflation. CQ Researcher, 12(22), 505-520.
Kena, G., Aud, S., Johnson, F., Wang, X., Zhang, J., Rathbun, A., . . . Rosario, V. (2014). The
condition of education 2014 (NCES 2014-083). Washington, D.C.: U.S. Department of
Education, National Center for Education Statistics.
Kiley, M. L., & Gable, R. K. (2013, April). Admissions counselors’ perceptions of cognitive,
affective, and behavioral correlates of student success at an independent high school: A
mixed methods study. Presented at the 45th annual meeting of the New England
Educational Research Organization, Portsmouth, NH.
Kilpatrick, J. (2010, September 6). Schools are using High School Survey of Student
Engagement results to find solutions for their students. EducationNews.org. Retrieved
from http://ceep.indiana.edu/hssse/pdfs/EducationNews.org%20-%20Schools...pdf
Kirst, M. W. (1998). Improving and aligning K-16 standards, admissions, and freshmen
placement policies. Stanford, CA: National Center for Postsecondary Improvement,
Stanford University, School of Education.
Kirst, M. W., & Venezia, A. (2001). Bridging the great divide between secondary schools and
postsecondary education. The Phi Delta Kappan, 83(1), 92-97.
Klopfenstein, K. (n.d.). The AP arms race: Is grade-weighting to blame? Dallas: University of
Texas at Dallas, Texas Schools Project.
Klopfenstein, K. (2003). Recommendations for maintaining the quality of advanced placement
programs. American Secondary Education, 32(1), 39-48.
Klopfenstein, K., & Thomas, M. K. (2005). The advanced placement performance advantage:
Fact or fiction? Nashville, TN: American Economic Association. Retrieved from
http://www.aeaweb.org/annual_mtg_papers/2005/0108_1015_0302.pdf
Klopfenstein, K., & Thomas, M. K. (2009). The link between Advanced Placement experience
and early college success. Southern Economic Journal, 75(3), 873-891.
Kohn, A. (2008, Spring). Progressive education: Why it’s hard to beat but also why it’s hard to
find. Independent School, 19-28.
Krogstad, J. M., & Fry, R. (2014, April 24). More Hispanics, blacks enrolling in college, but lag
in bachelor’s degrees. Retrieved from http://www.pewresearch.org/fact-
tank/2014/04/24/more-hispanics-blacks-enrolling-in-college-but-lag-in-bachelors-degrees
Kuh, G. D. (2001). Assessing what really matters to student learning. Change, 33(3), 10-17.
doi:10.1080/00091380109601795
Kuh, G. D. (2005). Student engagement in the first year of college. In M. L. Upcraft, J. N. Gardner, & B. O. Barefoot (Eds.), Challenging and supporting the first-year student: A handbook for improving the first year of college. San Francisco, CA: Jossey-Bass.
Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J. (2010). Student success in college: Creating
conditions that matter. San Francisco, CA: Jossey-Bass.
Kuncel, N. R., Credé, M., & Thomas, L. L. (2005). The validity of self-reported grade point
averages, class ranks, and test scores: A meta-analysis and review of the literature.
Review of Educational Research, 75(1), 63-82.
Kyburg, R. M., Hertberg-Davis, H., & Callahan, C. M. (2007). Advanced placement and
international baccalaureate programs: Optimal learning environments for talented
minorities? Journal of Advanced Academics, 18(2), 172-215.
Laska, J. A., & Juarez, T. (1992). Grading and marking in American schools: Two centuries of
debate. Springfield, IL: Charles C Thomas.
Lee, V. E., & Burkam, D. T. (2003). Dropping out of high school: The role of school
organization and structure. American Educational Research Journal, 40(2), 353-393.
Lee, V. E., & Holland, P. B. (1995). Catholic schools and the common good. Cambridge, MA:
Harvard University Press.
Lichten, W. (2000). Whither Advanced Placement? Education Policy Analysis Archives, 8, 1-19.
Mason, J. C. (2010). The role of teacher in Advanced Placement (AP) access (Unpublished
doctoral dissertation). California State University, Sacramento, CA.
Mattern, K. D., Shaw, E. J., & Ewing, M. (2011). Is AP exam participation and performance
related to choice of college major? New York, NY: The College Board.
Mattson, C. E. (2007). Beyond admission: Understanding pre-college variables and the success
of at-risk students. Journal of College Admission, 196, 8-13.
McMillan, J. H. (2001). Secondary teachers’ classroom assessment and grading practices.
Educational Measurement: Issues and Practice, 20(1), 20-32.
Miller, J. M. (2006). The Holy See’s teaching on Catholic schools. Manchester, NH: Sophia
Institute Press.
Moller, S., Stearns, E., Potochnick, S. R., & Southworth, S. (2011). Student achievement and
college selectivity: How changes in achievement during high school affect the selectivity
of college attended. Youth & Society, 43(2), 656-680. doi:10.1177/0044118X10365629
Morgan, R., & Klaric, J. (2007). AP students in college: An analysis of five-year academic
careers (Research Report No. 2007-4). New York, NY: The College Board.
Murphy, D., & Dodd, B. G. (2009). A comparison of college performance of matched AP and
non-AP student groups (Research Report No. 2009-6). New York, NY: The College
Board.
Nagaishi, C., & Slade, M. K. (2012). Are weighted or unweighted high school grade point
averages better indicators of college success? In 2012 Proceedings of the National
Conference on Undergraduate Research (pp. 627-635). Asheville: University of North
Carolina at Asheville.
National Association for College Admission Counseling (NACAC). (2009). Factors in the
admission decision. Arlington, VA: Author.
National Association for College Admission Counseling (NACAC). (2012). Statement of
principles of good practice. Arlington, VA: Author.
National Association of Independent Schools. (2015). NAIS website. Retrieved from
http://www.nais.org
National Center for Education Statistics. (2011, April 7). How is Grade Point Average
calculated? Retrieved from https://nces.ed.gov/nationsreportcard/hsts/howgpa.aspx
National Center for Education Statistics. (2012, January). Private elementary and secondary
enrollment, number of schools, and average tuition, by school level, orientation, and
tuition: 1999–2000, 2003–04, and 2007-08. Retrieved from
http://nces.ed.gov/programs/digest/d11/tables/dt11_064.asp
National Center for Education Statistics. (2013). National Assessment of Educational Progress
glossary of terms. Retrieved from https://nces.ed.gov/nationsreportcard/glossary.aspx
National Center for Education Statistics. (2014). Definitions for new race and ethnicity
categories. Retrieved from http://nces.ed.gov/ipeds/reic/definitions.asp
National Center for Education Statistics. (2015). Private schools in the United States: A
statistical profile, 1993-94 / Catholic-parochial schools. Retrieved from
http://nces.ed.gov/pubs/ps/97459ch2.asp
Neal, D. (1997). The effects of Catholic secondary schooling on educational achievement.
Journal of Labor Economics, 15, 98-123.
Nett, D., Jr. (2009). Weighting G.P.A. to assist in improving student achievement: Choosing an
appropriate system for a rural school district (Unpublished doctoral dissertation).
Northern Michigan University, Marquette, MI.
Nikolakakos, E., Reeves, J. L., & Shuch, S. (2012). An examination of the causes of grade
inflation in a teacher education program and implications for practice. College and
University, 87(3), 2-13.
Nord, C., Roey, S., Perkins, R., Lyons, M., Lemanski, N., Brown, J., & Shuknecht, J. (2011).
The nation’s report card: America’s high school graduates (NCES 2011-462).
Washington, D.C.: U.S. Department of Education, National Center for Education
Statistics.
Palardy, G. J. (2013). High school socioeconomic segregation and student attainment. American
Educational Research Journal, 50(4), 714-754. doi:10.3102/0002831213481240
Pattison, E., Grodsky, E., & Muller, C. (2013). Is the sky falling? Grade inflation and the
signaling power of grades. Educational Researcher, 42(5), 259-265.
doi:10.3102/0013189X13481382
Perfetto, G. (1999). Toward a taxonomy of the admissions decision-making process. New York,
NY: The College Board.
Public School. (n.d.). In Oxford English Dictionary Online. Retrieved from
http://www.oed.com/view/Entry/154070
Randall, J., & Engelhard, G. (2010). Examining the grading practices of teachers. Teaching and
Teacher Education, 26(7), 1372-1380. doi:10.1016/j.tate.2010.03.008
Reason, R. D. (2003). Student variables that predict retention: Recent research and new
developments. NASPA Journal, 40(4), 172-191.
Religious Education. (n.d.). In Collins English Dictionary Online. Retrieved from
http://www.collinsdictionary.com/dictionary/english/religious-education
Richmond, L. M. (2012). Students’ college preparation level based on quality factors of the high
school attended (Unpublished doctoral dissertation). Indiana State University, Terre
Haute, IN.
Rigol, G. W. (2002). Best practices in admissions decisions. New York, NY: The College Board.
Rigol, G. W. (2003). Admissions decision-making models: How U.S. institutions of higher
education select undergraduate students. New York, NY: The College Board.
Rojstaczer, S. (2002). Grade inflation at American colleges and universities. Retrieved from
http://www.gradeinflation.com
Rotberg, I. C. (1990). I never promised you first place. Phi Delta Kappan, 72(4), 296-303.
Rothschild, E. (1999). Four decades of the advanced placement program. The History Teacher,
32(2), 175-206.
Sackett, P. R., Hardison, C. M., & Cullen, M. J. (2004). On interpreting stereotype threat as
accounting for African American-White differences on cognitive tests. American
Psychologist, 59(1), 7-13. doi:10.1037/0003-066X.59.1.7
Sadler, P. M., & Tai, R. H. (2007). Advanced Placement exam scores as a predictor of
performance in introductory college biology, chemistry and physics courses. Science
Educator, 16(2), 1-19.
Santoli, S. P. (2013). Is there an advanced placement advantage? American Secondary
Education, 30(3), 23-35.
Schoeffel, M., van Steenwyk, M., & Kuriloff, P. (2011). How do you define success? An action
research project leads to curricular change. Independent School, 70(4), n.p.
Sex [n.1]. (n.d.). In Oxford English Dictionary Online. Retrieved from
http://www.oed.com/view/Entry/176989
Shaw, E. J., Marini, J. P., & Mattern, K. D. (2013). Exploring the utility of Advanced Placement
participation and performance in college admission decisions. Educational and
Psychological Measurement, 73(2), 229-253. doi:10.1177/0013164412454291
Shaw, E. J., & Mattern, K. D. (2009). Examining the accuracy of self-reported high school grade
point average (Research Report No. 2009-5). New York, NY: The College Board.
Shernoff, D. J., & Hoogstra, L. (2001). Continuing motivation beyond the high school
classroom. New Directions for Child and Adolescent Development, 2001(93), 73-88.
Shrestha, L. B. (2011). The changing demographic profile of the United States. Washington,
D.C.: DIANE Publishing.
Southern Illinois University Edwardsville (SIUE). (1996, February 1). Definition of academic
unit. Retrieved from http://www.siue.edu/policies/1a1.shtml
Stanford University. (2014). Advanced Placement. Retrieved from
https://studentaffairs.stanford.edu/registrar/students/ap
Stanger, M. (2014, October 1). The 50 most expensive private high schools in America. Business
Insider. Retrieved from http://www.businessinsider.com/most-expensive-private-schools-
in-the-us-2014-8
Stanley, G., & Baines, L. (2009). No more shopping for grades at B-mart: Re-establishing grades
as indicators of academic performance. The Clearing House, 74(4), 227-230.
doi:10.1080/00098650109599197
Starkman, R. (2013, August 1). Confessions of an application reader: Lifting the veil on the
holistic process at the University of California, Berkeley. New York Times. Retrieved
from http://www.nytimes.com/2013/08/04/education/edlife/lifting-the-veil-on-the-
holistic-process-at-the-university-of-california-berkeley.html
Stearns, E., Potochnick, S., Moller, S., & Southworth, S. (2009). High school course-taking and
post-secondary institutional selectivity. Research in Higher Education, 51(4), 366-395.
doi:10.1007/s11162-009-9161-8
Stecher, B. M., Kirby, S. N., Barney, H., Pearson, M. L., & Chow, M. (2004). Organizational
improvement and accountability. Santa Monica, CA: Rand Corporation.
Tam, M.-Y. S., & Sukhatme, U. (2004). How to make better college admission decisions:
Considering high school quality and other factors. Journal of College Admission, 183,
12-16.
Tierney, W. G., Colyar, J. E., & Corwin, Z. B. (2003). Preparing for college: Building
expectations, changing realities. Los Angeles, CA: Center for Higher Education Policy
Analysis.
Trustees of Indiana University. (2013). National Survey of Student Engagement (NSSE) 2013.
Retrieved from http://nsse.iub.edu/links/surveys
Trusty, J. (2004). Effects of students’ middle-school and high-school experiences on completion
of the bachelor’s degree. Professional School Counseling, 7, 99-107.
UCLA. (2014). Freshman selection overview, Fall 2015. Undergraduate admissions and
relations with schools. Retrieved December 28, 2014, from
http://www.admissions.ucla.edu/prospect/adm_fr/FrSel.pdf
UCLA College of Letters and Science. (2013). Calculating your GPA. Retrieved July 15, 2013,
from http://www.ugeducation.ucla.edu/counseling/handouts/GPACalculation.pdf
University of California. (2014). A-G subject requirements. Retrieved July 20, 2014, from
http://www.ucop.edu/agguide/a-g-requirements/
University of Oregon. (2015). AP course ledger. Retrieved from
https://apcourseaudit.epiconline.org/ledger
U.S. Department of Education. (2008). Structure of the U.S. education system: U.S. grading
systems. Washington, D.C.: U.S. Department of Education, International Affairs Office.
U.S. Department of Education. (2013). Federal student grant programs. Retrieved from
https://studentaid.ed.gov/sites/default/files/federal-grant-programs.pdf
Warburton, E. C., Bugarin, R., & Nuñez, A.-M. (2001). Bridging the gap: Academic preparation
and postsecondary success of first-generation students (NCES 2001-153). Washington,
D.C.: U.S. Department of Education, National Center for Education Statistics.
Wei, C. C., & Horn, L. (2009). A profile of successful Pell Grant recipients: Time to bachelor’s
degree and early graduate school enrollment (NCES 2009-156). Washington, D.C.:
National Center for Education Statistics, Institute of Education Sciences, U.S.
Department of Education.
Western Interstate Commission for Higher Education (WICHE). (2006). Accelerated learning
options: Moving the needle on access and success (No. 2A358). Boulder, CO: Author.
Wiggins, G. (1989). Teaching to the (authentic) test. Educational Leadership, 46(7), 41-47.
Willingham, W. W. (1985). Success in college: The role of personal qualities and academic
ability. New York, NY: The College Board.
Willingham, W. W., & Morris, M. (1986). Four years later: A longitudinal study of Advanced
Placement students in college (CB Report No. 86-2; ETS RR-85-46). New York, NY:
College Entrance Examination Board.
Willingham, W. W., Pollack, J. M., & Lewis, C. (2002). Grades and test scores: Accounting for
observed differences. Journal of Educational Measurement, 39(1), 1-37.
Witte, J. F. (1992). Private school versus public school achievement: Are there findings that
should affect the educational choice debate? Economics of Education Review, 11(4), 371-
394.
Woodruff, D. J., & Ziomek, R. L. (2004). High school grade inflation from 1991 to 2003 (ACT
Research Report Series 2004-4). Iowa City, IA: ACT.
Wyatt, J. N., Wiley, A., Camara, W. J., & Proestler, N. (2011). The development of an index of
academic rigor for college readiness (Research Report No. 2011-11). New York, NY:
The College Board.
Yair, G. (2000). Reforming motivation: How the structure of instruction affects students’ learning experiences. British Educational Research Journal, 26(2), 191-210.
Zelkowski, J. S. (2008). Important secondary mathematics enrollment factors that influence the
completion of a bachelor’s degree (Unpublished doctoral dissertation). Ohio University,
Athens, OH.
Ziomek, R. L., & Svec, J. C. (1997). High school grades and achievement: Evidence of grade
inflation. NASSP Bulletin, 81(587), 105-113. doi:10.1177/019263659708158716
Zirkel, P. A. (1999). Grade inflation: A leadership opportunity for schools of education?
Teachers College Record, 101(2), 247-260.
APPENDIX
Figure A.1. Number of AP courses offered, by school type

School type      0 AP offered    1-3 AP offered    4+ AP offered
Public           12.0%           12.8%             75.2%
Catholic         13.3%           17.7%             69.0%
Independent      19.9%           19.3%             60.8%

Figure A.2. Student ethnicity, by school type

School type      White/Asian    Other
Public           78.6%          21.4%
Catholic         69.3%          30.7%
Independent      81.2%          18.8%
Figure A.3. Students pursuing STEM majors, by school type

School type      Pursuing STEM majors
Public           30.9%
Catholic         28.6%
Independent      23.5%

Figure A.4. Pell Grant eligible students, by school type

School type      Pell Grant eligible
Public           27.0%
Catholic         16.4%
Independent      14.1%
Figure A.5. Mean AP STEM course GPA, by school type (4.0 scale)

School type      Mean AP STEM course GPA
Public           3.573
Catholic         3.596
Independent      3.585

Figure A.6. Mean STEM AP test scores, by school type (5-point scale)

School type      Mean STEM AP test score
Public           3.764
Catholic         3.405
Independent      3.858
Figure A.7. Mean non-STEM AP course GPA, by school type (4.0 scale)

School type      Mean non-STEM AP course GPA
Public           3.669
Catholic         3.618
Independent      3.571

Figure A.8. Mean non-STEM AP test scores, by school type (5-point scale)

School type      Mean non-STEM AP test score
Public           3.750
Catholic         3.704
Independent      3.815
Figure A.9. Mean HSGPA, by school type (4.0 scale)

School type      Mean HSGPA
Public           3.750
Catholic         3.736
Independent      3.599

Figure A.10. Mean SAT, by school type (2400 scale)

School type      Mean SAT
Public           2065.754
Catholic         2041.165
Independent      2060.511
Figure A.11. Mean FYGPA, by school type (4.0 scale)

School type      Mean FYGPA
Public           3.307
Catholic         3.243
Independent      3.266

Figure A.12. Residuals: HSGPA

School type      Residual
Public            0.049
Catholic         -0.358
Independent       0.052
Figure A.13. Residuals: STEM AP GPA

School type      Residual
Public            0.095
Catholic         -0.309
Independent      -0.114

Figure A.14. Residuals: Non-STEM AP GPA

School type      Residual
Public            0.081
Catholic         -0.301
Independent      -0.079
ABSTRACT

An understanding of how well a high school prepares its students for college success is critical information for admission officers.