ACHIEVING EQUITY IN EDUCATIONAL OUTCOMES THROUGH
ORGANIZATIONAL LEARNING: ENHANCING THE INSTITUTIONAL
EFFECTIVENESS OF COMMUNITY COLLEGES
by
Roberto Gonzalez
__________________________________________________________________
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
December 2009
Copyright 2009 Roberto Gonzalez
DEDICATION
To my daughter, you are my inspiration and my strength.
ACKNOWLEDGMENTS
This dissertation would not have been possible without the support, patience,
and guidance of my dissertation committee, family, and friends. I owe my deepest
gratitude to my family who have endured much throughout this process.
To my mother, Victoria, who instilled in me the value of obtaining an
education. You have been the driving force behind my educational success. I still
remember all the times you helped me with my homework as a child. Although you
did not have the chance to continue your education, you always pushed me to dream
and work towards achieving my dreams. This dream of becoming a Dr. would not
have been possible without your nurturing love and undying faith in me. You have
supported me through this journey and helped me to keep things in perspective.
Your enduring strength and love have made me into the man I am today. I hope I
have the opportunity to repay you for all you have given me.
To my wife, Nora, you have stood by me in the most difficult times. You
understood and encouraged me when I felt like giving up. You will always have my
love and gratitude. Without your support, this achievement would not have been
possible.
To my friends, Martha, Seema and Cristina, words cannot describe how
appreciative I am of your friendship. Our collegial exchanges helped me to put
everything in perspective during the times when I felt the most discouraged and was
ready to give up. I am proud to call you Drs., colleagues, and friends. I look
forward to being a part of your lives for many years to come. To my Santa Monica
College colleagues, words cannot begin to describe how lucky I have been to have
you in my corner. Not a day goes by that I do not thank God for putting you in my
life. I am thankful for your enduring love and encouragement.
Last but not least, I would like to express my sincerest appreciation to my
committee chair, Alicia Dowd, for helping me throughout the process. You are a
first-rate scholar from whom I have learned a lot. I am truly grateful for your
guidance and unwavering support. To Drs. Robert Rueda and Raul Cardoza, thank
you for your input and leadership. I have learned much from both of you.
TABLE OF CONTENTS
Dedication ……………………………………………………………………… ii
Acknowledgments …………………………………………………………….. iii
List of Tables ………………………………………………………………….. vi
Abstract ……………………………………………………………………….. vii
Chapter 1: Introduction ………………………………………………………… 1
Chapter 2: Conceptual Framework …………………………………………… 30
Chapter 3: Methods …………………………………………………………… 60
Chapter 4: Results …………………………………………………………….. 82
Figure 1: Steps to Awareness ………………………………………………… 84
Chapter 5: Summary and Conclusions ………………………………………. 133
References …………………………………………………………………… 161
Appendices:
Appendix A: Informed Consent for Non-Medical Research ……………….. 168
Appendix B: Interview Guide #1 …………………………………………… 172
Appendix C: Interview Guide #2 …………………………………………… 173
LIST OF TABLES
Table 1: Institutional Self-Assessment Project Participants …………………. 65
Table 2: Participation in Inquiry Activities and Team Meetings ……………. 107
Table 3: Successful Course Completion Rates of Basic Skills Students in Math at Hills ……………. 109
ABSTRACT
This manuscript details the learning that took place among four administrators
involved in a collaborative Institutional Self-Assessment Project and the extent to
which institutional changes were informed by that learning. The purpose of the
overall project was to enhance the effectiveness of educational institutions at
addressing the diverse learning needs of students entering the community college at
the basic skills level. The intent of the study was to determine how data contributed
to administrators' awareness and learning regarding inequitable educational
outcomes among student groups. It was believed that by using current data sources,
administrators would be better able to gauge institutional performance and adjust
institutional policies to increase student success. A qualitative case study approach
was used, and the findings show that administrators' awareness and learning
regarding the existence of inequitable educational outcomes increased, and that the
changes which took place were grounded in that learning.
CHAPTER 1: INTRODUCTION
Background of the Problem
The United States system of education aspires to provide people, regardless of
background, creed, status, or ethnicity, the opportunity to participate in and
contribute to American society. The message communicated to all children in schools in the
United States is that education is power and that the knowledge gained through
schooling can lead to greater opportunities in the personal and professional realms.
However, many youth do not obtain the education needed to tap into the
opportunities available for various reasons. This lack of educational attainment is
especially prevalent among historically underrepresented students such as Latinos
and African Americans. The lack of access to higher education by underrepresented
populations is of particular importance due to the benefits afforded to those with an
education. Education allows people to participate and contribute to society by giving
more in the form of taxes, using fewer resources, fueling the economy, and assisting
in decisions made locally and nationally (Institute for Higher Education Policy,
2004).
Furthermore, earning potential increases for those who obtain a higher
education. In fact, an individual who earns a baccalaureate will earn twice as much
over their lifetime when compared to someone earning only a high school diploma
(US DOE, 2006). Although the importance of obtaining a higher education is clear,
a gap exists in the attainment of a higher education among different ethnic groups in
the United States. The gap in attainment among groups in American society is not a
new phenomenon. However, with growing competitiveness in the world of
workforce preparation, and scarce resources in the form of social services, the
educational achievement gap becomes extremely important. It is projected that 90%
of the jobs experiencing the most growth in the “knowledge-driven economy” will
necessitate some form of higher education (US DOE, 2006, p.1). If the achievement
gap is not addressed, the economic livelihood of the United States and social welfare
of the American populace are likely to suffer.
In this chapter, I describe the prevalence of inequitable learning outcomes
within the secondary and higher education systems in the United States, discuss
ramifications for groups lagging in educational attainment, look at accountability
measures attempted, and end with an overview of the rest of the dissertation. My
review of the organizational learning, socio-cultural, and leadership literature
indicates that administrators at institutions of higher education can reduce the
existence of differential educational outcomes by learning to identify and respond to
institutional factors that deter student success. However, administrators do not
always use data to inform decision making on a consistent basis. Knowledge gained
through the interpretation of data will allow administrators to increase their
institutionally based funds of knowledge to effectively address institutional barriers
hindering student success. Institutionally based funds of knowledge is the
understanding a practitioner has about the rules and structures that govern the
institutional environment (Stanton-Salazar, 1997).
The United States system of postsecondary education has been touted for the
accessibility to people seeking a higher education. However, not all students have
been successful in completing a higher education. The gap in educational attainment
among groups is often referred to as a lack of equity in the attainment of educational
outcomes. Equitable outcomes are defined as the attainment of similar outcomes for
all ethnic groups at the same educational institution (Bensimon, 2005). For example,
an inequitable outcome is a lack of educational attainment in the form of graduation
rates among one group when compared to others. One group may be graduating at a
higher rate than another. The lack of equity in outcomes is also referred to as
unequal outcomes, differential learning outcomes, or inequities in educational
outcomes.
The two ethnic groups lagging behind in educational attainment when
compared to other groups are Latinos and African Americans. For example,
high school diploma attainment rates among Latinos are low nationwide. In
2007 only 65% of Latino students graduated from high school compared to 93% of
whites (NCES, 2008). The presence of inequitable learning outcomes among
historically underrepresented student groups such as Latinos and African Americans
is of particular importance due to the percentage of the school going population they
represent. Nationally students of color currently represent 43% of all public school
enrollments (Knapp, Kelly-Reidd, Ginder, & Miller, 2008).
Moreover, the lack of educational attainment that is present at the secondary
and postsecondary levels points to a deficiency on the part of institutions in the
identification and institutionalization of policies and practices that promote student
success among all groups. One of the major characteristics of the postsecondary
system is that, in reality, not all students have the opportunity to access higher
education. In fact, students of color are not attending higher education
institutions at the same rates as other ethnic groups. In 2006, Hispanics and African
Americans represented only 4% and 5.5% of enrollments at four-year public and
private higher education institutions (Knapp et al., 2008). The factors that have been
used to explain differential learning outcomes among students at the secondary and
post-secondary levels include: inadequate preparation, lack of information about
college opportunities, and persistent financial barriers (US DOE, 2006). Perhaps the
most common reason advanced by practitioners and scholars to explain differential
learning outcomes is the lack of preparation on the part of students.
The student deficit model places the burden of educational outcomes on the
student and leaves the institution's responsibility out of the equation. In fact, the
preparation issue impacts almost all students. Kirst (2007) estimates that at least 60
percent of students aged 17 to 20 entering higher education
institutions need remedial courses. Remedial courses are courses in math and
English for students who do not have the skills or knowledge to complete college
level work in these areas (Merisotis & Phipps, 2000). Underprepared students of
color attending institutions of higher education have a difficult time
adjusting to the academic rigors of university life, which results in a high attrition
rate. According to Kezar and Eckel (2007), the dropout rate for Latinos and African
Americans pursuing a higher education was 29% and 30% respectively in 2003.
The educational attainment among ethnic groups is of particular importance
due to the size of the populations studied, resources expended by institutions, and
potential costs associated with providing services to a less educated populace.
Higher education institutions have had a particularly difficult time addressing the
existence of differential learning outcomes because no standard set of outcomes
exists to guide assessment to gauge institutional effectiveness. If institutions do not
find ways to assess and address the problem of differential outcomes, the problem is
not likely to disappear but is more likely to grow. Institutions have not been able to
curb the high attrition rates among Latinos and African Americans at both the
secondary and post-secondary levels. Institutions have devoted resources in the form
of time, money, and expertise to serve all students. The students who do not
graduate represent a lost investment for the institution and society. This investment
without a return drains the already scarce resources available to secondary and
postsecondary institutions to educate others. Lastly, students who drop out are less
prepared to enter the world of work and more likely to rely on state and federal
resources to obtain assistance with basic social services. The median earnings of
people with a high school diploma are 37% less than a person with a bachelor’s
degree (US DOE, 2006). Some form of higher education is required for people to
thrive and survive in American society. It is the institution’s responsibility to create
equal educational outcomes for all students.
The existence of differential educational outcomes is a pressing educational
concern that has attracted national attention and sparked national and state initiatives
at the secondary and postsecondary levels. Differential learning outcomes have
existed at K-12 and higher education institutions for decades. Moreover, data on
baccalaureate achievement rates from 1971 to 2007 shows that the gap in B.A.
attainment between whites and both Latinos and African Americans has widened
(NCES, 2008).
One of the reasons for the existence of inequitable learning outcomes is that
institutions lack coherent and practical assessment protocols. Assessment is the
method used at the institutional level to collect information to evaluate institutional
effectiveness (ACCJC, 2007). At the secondary level, the No Child Left Behind
(NCLB) initiative has attempted to address the lack of equity in educational
outcomes. In this way, the national government has through assessment attempted
to redress the lack of consistency in educating America’s youth by holding
institutions accountable for learning outcomes. The accountability movement in the
United States is exemplified by the nationally mandated NCLB initiative.
Established in 2002, the intent of NCLB was to hold institutions accountable for
raising student achievement on academic performance benchmarks in English and
mathematics for all students. Institutions were provided with standardized
performance measures that schools needed to meet in order to ensure student
success.
Although the standards based education movement has attempted to make
primary and secondary schools more accountable by reporting progress on indicators
to measure student performance, progress has been slow. Critics of NCLB stated
that it was difficult to ensure equal performance among students based on regional
and local differences in resources available to school districts and lack of incentives
to motivate schools to improve. Additionally, there was no consequence for
institutions which did not meet the progress standards required by NCLB, and
additional funding provided to schools through Title I could not be used
by schools to develop assessments (Carey, 2005). A contradictory message was sent
to primary and secondary institutions about the purpose of accountability. On one
hand institutions were told that they must comply with performance requirements,
but policies and procedures communicated the idea that they must accomplish this
without additional resources. Postsecondary institutions have not been required to
meet performance requirements, but the lessons learned at the primary and secondary
levels will shape any future accountability systems in higher education.
Higher education institutions have not been immune to accountability
measures aimed at ameliorating the existence of differential learning outcomes. No
consensus exists about how best to measure institutional effectiveness at the
postsecondary level. Several measures have been proposed. The U.S. Department
of Education has attempted to establish a national accountability system with some
features similar to NCLB for postsecondary institutions. A feature of the proposed
accountability system calls for better measurement of the way institutions of higher
education impact student success by using a value added approach. The value added
approach measures growth at the beginning and at the end of postsecondary
attendance based on indicators of growth (US DOE, 2006). Another recommendation
was made by the Educational Testing Service (ETS). ETS made recommendations
to policy makers to create an accountability system that measures student learning
while in college along four dimensions. The four dimensions were workforce
readiness and general education skills, domain specific knowledge, soft skills, and
student engagement (Dwyer, Millett, & Payne, 2006). Similar to the U.S.
Department of Education, Dwyer, Millett, and Payne (2006) argued that the focus of
postsecondary institutions tended to be either on input or output characteristics of
students, while little attention has been paid to how the institution (environment) is
impacting student outcomes. Postsecondary institutions are likely to continue
to face pressure from federal and state authorities to prove the effectiveness of their
institutions. This accountability movement represents an opportunity for higher
education institutions to assist in the development of an accountability system that
will take into account the variability among higher education institutions.
The postsecondary system of most importance in regard to access,
remediation and equity of student outcomes is the community college system. The
community college system in California is of particular interest because of the sheer
size and impact of the system. California in many ways is a microcosm of what is
occurring in the rest of the nation and can serve as a model for other community
college systems in the United States. The community college system in California
with 110 community colleges is the largest postsecondary system in the nation
serving the majority of the college going population in the state. It is estimated that
about 73% of all students in California enter higher education through the
community college system (Moore, Shulock, Ceja, & Lang, 2007).
The community college system in California has multiple missions which
include transferring students to four-year institutions, remedial education, and
economic and workforce development (California Community Colleges, 2007). All
three missions contribute to the economic and social vitality of the state. The
existence of unequal outcomes represents a loss in resources for the individual and
the state. According to Shulock, Moore, Offenstein, and Kirlin (2008), in order for
California to compete in a global economy the number of students earning associate
degrees and bachelor’s degrees needs to increase by more than 50%. On a national
level, California is ranked 46th in the number of baccalaureate degrees it grants for
every 100 students enrolled at the undergraduate level (Shulock & Moore, 2007).
This means that the majority of the younger working-class population is
underprepared to compete in the local and global marketplace, uses more state and
local resources, and is more likely to be unemployed. The student populations most
impacted by inequitable outcomes are Latinos and African Americans. It is
estimated that California’s Latino population will increase by 26% by 2030 (Myers,
2004). With a demographic shift on the horizon and the economic futures of
residents of California, and the state itself, at risk, action must be taken to address
and ameliorate unequal outcomes at the community college level.
Next, I will describe how community colleges in California are attempting to
address the existence of differential outcomes. After that, I will explain the purpose
of the project under study. In the following section, I will explain the research
questions guiding this study. Lastly, I will explain why the study is needed and the
importance of the study.
Statement of the Problem
The community colleges in California have not redressed the differential
learning outcomes among different ethnic groups within the system. Moreover, the
effectiveness of the system in educating students has been questioned. It is estimated
that only about 25% of students whose intent was to obtain a degree, certificate, or to
transfer will accomplish their goal within six years of first enrolling at a community
college in California (Moore et al., 2007). The problem of student achievement at
the community college level is further complicated when one compares achievement
rates among various ethnic groups. Disaggregating data based on ethnicity shows
that the two ethnic groups with the lowest course completion, associate degree
completion, and transfer rates in California are Latinos and African Americans
(Moore et al., 2007). The existence of differential learning outcomes can partially be
explained by the level of proficiency students exhibit in the areas of math and
English in the college placement exam. The majority of students starting off at the
community college level in California are not prepared to do college level work.
This group of students, who are commonly referred to as basic skills students,
represents a large percentage of community college students. These students are of
particular interest due to the low success rates among this population to attain a
degree or certificate. Data shows that students placing at a reading fundamentals
level have a 25% chance, over a three-year period, of eventually taking a
transfer-level English class (as cited in Research and Planning Group for California
Community Colleges, 2005).
The existence of differential learning outcomes signals a problem on the part
of administrators at the institutional level. Administrators have the power and
influence to create changes which impact the type of learning environment students
will experience. Policies and procedures implemented at the institutional level can
help or hinder student success. However, administrators may or may not have all
the information they need to make the best decisions to increase student success.
According to Shulock et al. (2008) practices that promote student success are known,
but community colleges do not use data to inform decisions on how to best help
students succeed.
The nature of the job of an administrator is partly to blame for the lack of use
of data to inform decision-making. Most administrators such as Deans and Vice
Presidents oversee the operational and fiscal aspects of multiple units. It is easy for
an administrator to focus most of their attention on daily operational concerns of the
department or unit and not departmental effectiveness. Administrators at the
community college level are also responsible for compiling reports for the President
of their college. Aside from the oversight of all aspects of units within their division
and reporting requirements, administrators also serve on various institutional and
state committees. Therefore, little time is left for long term planning or data
analysis. Administrators rely on their knowledge and experience as administrators to
make decisions.
Therefore, the types of knowledge and experience administrators rely on to
make sense of situations are important to the resolution of institutional problems.
There are different types of knowledge which administrators bring to bear on the
resolution of institutional problems. Polkinghorne (2004) notes that practitioners
have in most instances mastered technical knowledge, which he defines as the
procedures taken to address common or recurring problems faced by practitioners of
care. However, practitioners often lack the training to solve complex problems
because they require more than procedural knowledge to identify causes of problems
and create solutions aimed at solving the root of the problem. Perhaps the most
significant element to the problem solving aspect of an administrator’s job is
information. Often administrators are asked to make important decisions which
impact employees and students in a short timeframe. Previous experiences can assist
administrators in the development of their “practical perception,” which is having the
ability to look at a situation and recognize key elements to be able to construct
effective solutions (Polkinghorne, 2004, p.117).
Once a problem is identified, information is collected on the situation in
question and then a decision is made to resolve the problem. For Polkinghorne
(2004) the implementation of the solution is based on the concept of phronetic
deliberation or ability to decide which problem to tackle first by looking at a
situation with clarity and openness. Phronesis is a type of knowledge that allows
practitioners to adapt to different situations which helps in the identification of
important elements of a situation to guide action (Polkinghorne, 2004). The purpose
of phronetic deliberation is to assist practitioners in deciding which concerns to
address first.
Polkinghorne believes that practitioners possess the technical knowledge
learned through their education and work experience in their professions that guide
action to resolve certain types of problems. However, he advocates for the
development of phronetic deliberation among practitioners because it involves the
development of a knowledge which allows practitioners to tackle problems which do
not have simple solutions or which are constantly evolving. A
practitioner’s knowledge and background are important to the resolution of current
and future situations but not all prior knowledge or experiences relate to each
situation faced by practitioners (Polkinghorne, 2004).
The Educational Testing Service (ETS) advanced the notion of creating a
culture of evidence in higher education. ETS introduced a seven step approach
which includes the following: (1) communicating learning outcomes; (2) assessment
audit; (3) assessment augmentation; (4) refining assessment system; (5) learning
from efforts; (6) ensuring student learning success; and (7) maintaining a culture of
evidence (Miller, Payne, Dwyer, Stickler, & Alexiou, 2008). The first step is
determining the student learning outcome goals the institution is willing to work
towards. Next, the institution determines what information is available on
student learning and what information is yet to be collected to demonstrate student
learning. The following step is for the institution to create new forms or structures to
collect data needed to provide accurate assessment of institutions’ progress towards
meeting objectives. The next process is to assess the progress the institution is
making towards accomplishing the goal on an on-going basis. The following step is
to make adjustments to programs and interventions in place to facilitate student
learning. Lastly, the institution institutionalizes the culture of evidence by allocating
the appropriate time and resources to efforts to improve student-learning outcomes.
Additionally, born out of increasing accountability efforts in higher education
scholars have advanced the notion of creating a culture of inquiry in which the
practitioner uses his or her experience to learn more about the institution through the
collection of data (Dowd & Tong, 2007). The distinction between the culture of
evidence and culture of inquiry is the importance given to the usefulness of data. In
a culture of evidence, the data is the most important while in a culture of inquiry the
practitioner is the center of attention. It is believed that administrators and faculty
engaged in dialogue around the effectiveness of programs and services provided to
students can help practitioners gain greater insight on ways to improve institutional
functioning (Dowd, 2005).
Furthermore, the development of this phronetic decision-making knowledge
is essential to adapt to ever changing conditions. An administrator’s ability to adapt
to evolving situations is dependent on the information he or she has at his or her disposal.
The nature of an administrator's job involves making decisions based on a limited amount
of information. The collection and interpretation of information can give an
administrator an accurate picture of the situation at hand. Administrators may
possess the technical knowledge to address certain types of problems which are
common but may not possess the ability to phronetically deliberate on situations
which are more complex. Moreover, all institutions of higher education collect data
but very few consistently review data and make decisions based on lessons learned
from data. Data and on-going analysis of data by practitioners is seen by scholars as
a way to help practitioners create usable knowledge to solve problems (Dowd, 2005;
Dowd & Tong, 2004; Boudett, City, & Murnane, 2005). Both techne and phronetic
knowledge are strengthened by data because each type of knowledge is informed by
a greater understanding of the institutional environment. The practitioner can
build their knowledge of institutional processes and gain skills to adapt to changing
institutional situations through data. However, competent practitioners are needed
to guide less experienced practitioners in building their techne and phronetic
knowledge.
In order for administrators to effectively meet the needs of a diverse student
population, a systemic and integrated way of assessing what works at the
institutional level is needed. Administrators’ decision-making role requires that they
be able to effectively gauge situations to make informed decisions. The knowledge,
experience, and resources an administrator relies upon will influence the impact on
students of decisions made. Institutions of higher education will continue to face
growing pressure at the local, state, and national level to effectively use the resources
at their disposal. Specifically, administrators will bear the greatest burden in coming
up with systems and structures to increase institutional efficiency. Equity and
efficiency are synonymous terms because essentially the goal is to educate as many
students as possible. If certain student groups are achieving at higher rates, the
institution is not doing all it can do to assist students who are underperforming.
Essentially, scholars like Polkinghorne suggest that the knowledge a
practitioner has about their institution and how they develop this knowledge may
hold the key to the resolution of complex problems. Unfortunately, little scholarly
research examines how practitioners or communities of practitioners develop this
expertise. Kruse and Louis (1997) found that schools could benefit from the
development of teams of practitioners who constantly dialogue about ways to
improve student performance. However, teams of practitioners must be managed
well and demands placed on the time of instructors should be considered. Perhaps
the most significant finding of the Kruse and Louis (1997) study is that changing a
practitioner’s mindset from a student deficit perspective to institutional responsibility
for student learning takes time and effort. Everyone in an institutional environment
plays a role in increasing student success. The methods institutional leaders and
practitioners use to learn and construct knowledge may hold the key to creating more
efficient educational institutions.
The most effective institutions use data to inform decision-making. The
Education Trust reported that the colleges and universities across the country with
the best undergraduate graduation rates shared several common factors:
administrators and faculty used data to understand students and change policies;
data were used constantly to identify problems and gauge the effectiveness of
changes implemented; and the institutions demonstrated a strong commitment to
student success (Carey, 2005).
Consequently, the community college system in California is taking steps to redress
the existence of differential learning outcomes among basic skills students by
offering funding to institutions to implement research-based strategies and
assessment aimed at increasing student success. The California Community College
Chancellor’s Office Basic Skills Initiative provides individual institutions with
additional funding to improve the effectiveness of the services offered to basic skills
students. Each institution is required to submit a plan that references current
research on the steps practitioners will take to address the diverse learning needs of
basic skills students. Within the institution, practitioners may be required to submit
separate proposals that use research-based strategies and on-going assessment to
measure the effectiveness of interventions implemented with Basic Skills monies.
There is a gap in learning on the part of administrators, due in large part to the
lack of on-going assessment at the institutional level and of training to help leaders
learn about the factors influencing student success. The system has made several
attempts to create an accountability structure to redress inequitable learning
outcomes, but all have been compulsory reporting systems. Compulsory reporting
has probably not assisted administrators’ learning to the extent desired because
numbers alone do not teach administrators why students are not performing
academically. In most instances, administrators do not know or fully understand
how to make sense of the performance indicators reported on their institutions.
Many administrators do not have a research background that equips them to
decipher large data sets or to correlate pieces of information to show program
effectiveness. Additionally, data may or may not be used on a consistent basis at the
institutional level to impact student success. One reason data are not used
consistently to make decisions is the lack of “research capacity” at the
institutional level (Shulock et al., 2008). Very few community colleges have the
resources to create or sustain an institutional research department devoted to the
collection, interpretation, and dissemination of data to assist administrators in
making more informed decisions. A study of community colleges in the United
States conducted by Morest and Jenkins (2007) identified four barriers to developing
“research capacity”: staffing, funding, time spent on compulsory reporting, and the
low priority given to research. Of the participating institutions surveyed and
interviewed, about 75% had two or fewer staff devoted to the institutional research
function. As the survey results showed, most colleges did not consider the
institutional research department instrumental to institutional functioning, and thus
very few resources were devoted to it. Lastly, institutional researchers spent the
majority of their time on compliance reporting, such as federal, state, accreditation,
and grant reporting.
Furthermore, the few community colleges that do have staff devoted to the
collection, interpretation, and dissemination of data do not always present it in
terms practitioners can understand. Institutional researchers may or may not present
information to groups of practitioners in a clear and understandable format. The
importance of being able to make sense of data is profound because data can guide
action on the part of the practitioner. Some argue that, although it is not organized to
do so, one of the main functions of the institutional research department is to teach
and impact organizational change (Dowd, Malcolm, Nakamoto, & Bensimon, 2007).
Institutions without an institutional research department leave administrators without
a valued resource for making sense of data. We know very little about how
administrators in higher education use their knowledge and institutional data to make
informed decisions. Administrators risk creating solutions for situations that they do
not fully understand. Actions administrators take that are informed by data,
therefore, can provide insight for addressing the root of a problem. Student success
data that graphically show practitioners the disparities in achievement can
effectively generate insight into institutional efficiency. Data presented in an
understandable, digestible format by a knowledgeable researcher can help illuminate
for the non-expert practitioner a situation that might not have been visible
previously. Depending on the practitioners involved in the analysis process, data
may need to be presented in a written format. Essentially, the data has to be tailored
to the sophistication of the practitioners making decisions based on the data.
Accountability in the California Community College System
The California community college system has made attempts to address the
existence of differential outcomes through the implementation of accountability
systems. Accountability is defined as policies and procedures designed to hold
institutions responsible for performance outcomes. However, what is needed is a
clear assessment system to inform practice tied to accountability reporting to enable
leaders to improve outcomes. Assessment is the organized set of activities designed
to collect information on a situation under investigation that uses all the resources
and expertise in order to inform decisions about how to improve the situation under
investigation (Walvoord, 2004). A few of the accountability efforts attempted at the
community college level include the following: Partnership for Excellence (PFE);
State Report Card; Vocational and Technical Education Act (VTEA); and Student
Learning Outcomes (SLOs) (Grubb & Badway, 2005). Each of these accountability
systems was concerned with reporting information on student outcomes, but each
has fallen short of making substantive changes to reduce the differential outcomes
present within the system. Part of the reason these accountability measures have not
been effective has to do with the limited learning taking place on the part of
administrators. Practitioner inquiry and learning are visibly absent from these
compulsory reporting systems. Leaders have been trained to collect and report
numbers, but they have not been taught to inquire into what the data mean or how
the data can inform institutional practice.
Moreover, the latest attempts at accountability, the Accountability Reporting for
Community Colleges (ARCC) system and the Basic Skills Initiative (BSI), continue
the trend of compulsory reporting with little training for institutional leaders to make sense of
data. ARCC, also known as California State Assembly Bill AB 1417, requires all
community colleges to report data that can be used to determine institutional
performance and effectiveness in meeting state and local priorities. ARCC requires
all 110 community colleges in California to report on “college performance
indicators,” which include: student progress and achievement; percentage of students
earning 30 units; persistence rates; annual course completion rates for vocational
courses; course completion rates for basic skills and ESL students; improvement
rates for ESL and credit basic skills courses; demographic information; and a college
self-assessment (California Community College Chancellor’s Office, 2005).
However, ARCC does not report data disaggregated by race, which leaves the
existence of unequal outcomes hidden. Furthermore, the multiple missions of the
community college make it difficult for institutions to make substantive change due
to scarce resources distributed among different divisions. Administrators may once
again be left with little knowledge on how the institutional environment impacts
student outcomes.
Furthermore, the Basic Skills Initiative seeks to improve the success of
students who start at the basic skills or pre-college level within the California
Community College system. The initiative provides the 110 California Community
Colleges with resources to increase the success of students coming in at the remedial
level. The implementation of the initiative came in three phases: (1) communicating
effective practices; (2) providing professional development to aid in the assessment
and evaluation of basic skills and ESL efforts; and (3) implementing plans to
improve basic skills student success (Adams & Illowsky, 2008). Community
colleges were required to assess the success of basic skills students at their institution
and create plans to improve student success. However, the form which was required
by the state did not ask institutions to report how the plan addressed student success
or how data would be collected to demonstrate progress for continued funding.
While BSI funding provides institutions with much needed resources, the connection
between professional development and student success remains unclear. Moreover,
the lack of resources at the California Community College Chancellor’s Office results
in little support in the form of training for practitioners to ensure student outcomes
are being improved. The initiative promotes fiscal accountability through the
reporting of expenditures on a biannual basis, but no reporting is required to
demonstrate the effectiveness of measures implemented at the institutional level. Current
accountability initiatives are meant to improve student learning outcomes, but
practitioners are given little training on how to best accomplish this.
Purpose of the Study
The purpose of this study was to investigate the extent to which inquiry
activities at the institutional level performed by practitioners participating in the
Institutional Self-Assessment Project facilitated learning. The purpose behind the
activities was to facilitate learning among administrators and other professional staff
around the existence of differential learning outcomes. Learning was facilitated
through the use of data and inquiry activities. Inquiry activities were designed to
assist practitioners, including administrators, in the identification and resolution of
obstacles to student success. All inquiry activities revolved around data. Inquiry
activities included the collection and interpretation of baseline student performance
data, interviews, and surveys. Formal leaders were the focus of this study because of
the pivotal role they play in influencing student success. Formal leaders such as
administrators have the most potential to implement policies and procedures which
will help the institution to deal with unequal student outcomes. Leaders have
specialized skill sets, knowledge, influence, and power which they can use to create
sustainable change around the creation of equitable student outcomes. Structured
activities were used in the project to bring about learning among leaders and other
practitioners around the existence of inequitable learning outcomes at their
institution.
The educational outcome with which the Institutional Self-Assessment Project
was most concerned was the success of community college students in completing
courses related to preparation for eventual transfer to four-year
institutions. Transfer is the completion of the first two years of college-level
coursework on the part of a student at the community college level with the intent of
going to a four-year university to attain a baccalaureate degree. Data illuminating
student progress through the core English and math courses helped practitioners
identify how students were doing in these courses. Any activities that shed light into
how students were progressing, which students were succeeding, which students
were not performing well, and what the success rates were overall for students, have
the potential to build administrators’ knowledge of institutional functioning.
The main concepts used to frame the analysis of this study were
organizational learning and institutionally based funds of knowledge. Organizational
learning is the process of using data to assist in the identification and resolution of
institutional problems (Kezar and Eckel, 2007). Institutionally based funds of
knowledge refers to the notion that practitioners possess institutional knowledge
about the policies and procedures that govern people within the institution (Stanton-
Salazar, 1997). Knowledge is what a person knows about a subject or topic while a
fund of knowledge is information which can be used to accomplish a task or goal.
The literature suggests that organizational learning provides administrators with a
mechanism to increase their knowledge of the institutional conditions impacting
student success. In essence, the processes of organizational learning were expected
to assist administrators in the development of their institutionally based funds of
knowledge. This insight into how the organization functions assists administrators in
effectively addressing institutional problems like differential learning outcomes. The
existence of differential learning outcomes was of particular importance due to the
reluctance of higher education institutions to measure equity as an indicator of
institutional effectiveness. Furthermore, Bensimon (2007) suggests that institutions
do not look at equity as a measure of effectiveness among students because they do
not have the capacity to recognize the presence of inequality. Equity mindedness is
needed to bring about greater awareness among practitioners of how policies and
procedures impact students. Bensimon (2007) defined equity mindedness as the
awareness of how practices, racism, and power asymmetries impact student
outcomes for students of color.
The focus of this study revolved around the processes practitioners used to
collect, interpret, and apply learning from data to inform decision making.
Practitioners in community colleges do not use data on an on-going basis to inform
practice. Scholars have noted that in order for change to be substantial and effective,
it must be driven by practitioners within institutions. Furthermore, an equity minded
perspective was needed among practitioners to see how different institutional
characteristics were impacting the success of all student groups. Dowd and
Tong (2007) suggest that institutions use evidence-based inquiry councils (EBICs),
practitioner-driven inquiry teams that use data to create greater awareness among
the campus community. Implicit in this design is an emphasis on the unique level of
understanding of their campus environment which a practitioner relies on when
facing institutional problems. Due to the importance of context in data
interpretation, data is not an end in itself, but serves as evidence for the existence of
conditions which need to be addressed by administrators. Moreover, this study
focused on the role formal leaders played in addressing the institutional problem of
unequal learning outcomes.
As mentioned above, the setting for the study was the Institutional Self-
Assessment Project which was one of several action inquiry projects conducted by
the Center for Urban Education. The project was a collaborative venture between
the University of Southern California and, in its pilot phase, three community
colleges in the Los Angeles area to develop strategies to increase the number of
African Americans and Latinos starting out at the basic skills level transferring to
four-year universities. Hills Community College, a pseudonym, was the primary
location of the study and a participant in the Institutional Self-Assessment Project.
The three types of inquiry processes used to stimulate learning of institutional
effectiveness among practitioners were: performance benchmarking; effective
practices benchmarking; and process benchmarking. These processes were used to
assist practitioners in inquiring into how well their institution was performing in
helping all students achieve their educational goals. In performance benchmarking,
an organizational team establishes baseline data on which to construct goals for
improvement and to track progress. Performance benchmarking was used after the
site teams collected and discussed student success data such as course completion
rates and graduation rates. The second process used to stimulate learning was
effective practices benchmarking, in which an institution identifies practices at other
institutions that might facilitate achievement of its performance goals. Study
participants on the site teams identified colleges with similar characteristics and
exemplary programs in the areas they sought to improve, and visited them. The
purpose of the visits was to learn from the institutional practices of a similar
institution. Lastly, process
benchmarking was the process of identifying and understanding how institutional
practices helped or hindered student achievement (Dowd, 2008). Through the above
mentioned processes, the institutional team at Hills Community College was asked to
use data to determine changes to institutional policies and procedures which needed
to be made to increase the educational outcomes for all students. Site teams were
asked to examine data to further understand what was occurring at the institutional
level which impacted students.
Research Questions
The purpose was to discover how administrators associated with the
Institutional Self-Assessment Project at Hills Community College used data to
inform their decision making to impact differential learning outcomes. Specifically,
I examined the following: (1) What types of learning took place among leaders
facilitated through collaborative interpretation of data? (2) How did the learning
contribute to leaders’ institutionally based funds of knowledge? (3) What proposed
changes were planned or made to existing institutional structures, policies,
procedures as a result of insights gained through the institutional self-assessment
process? Essentially, I sought to understand how administrators’ learning through
data-informed inquiry increases their knowledge of their environment to impact
student learning outcomes. In short, the study aimed to provide a greater
understanding of how administrators’ views of their environment were guided by
data.
Significance of the Study
The accountability of higher education institutions in the area of student
outcomes has drawn national attention. The U.S. Department of Education has
criticized higher education institutions, including community colleges, for not
providing “useful data” and for not being accountable to the public by reporting
success outcomes (U.S. DOE, 2006). However, the collection and reporting of data
to show success may not be enough to help practitioners understand how they can
create more equitable learning outcomes for all students. During the administration
of George W. Bush, the U.S. Department of Education announced its intent to
create a No Child Left Behind-style accountability system for postsecondary
institutions. In the current economic crisis that state and federal governments are
facing, pressure on institutions of higher education to prove their effectiveness is
unlikely to subside. In this new accountability environment, institutions are expected
to perform at a high level without the adequate resources to do so. This
accountability movement represents an opportunity for higher education institutions
to shape the direction of how accountability will be measured and analyzed in higher
education. More specifically, administrators at higher education institutions can help
lead the direction the accountability movement will take in the United States.
Currently, politicians at the state and federal levels decide which accountability
measures will be implemented to assist them in determining how to fund institutions.
Yet, administrators’ knowledge and experience are likely to be primary factors in the
impact that an accountability framework will have on the assessment of practices at
the institutional level. Practitioners have a specialized institutional knowledge which
can contribute to the creation of accurate accountability measures to gauge
institutional effectiveness, but a strategy for incorporating that kind of information
into accountability is needed.
Rising tuition and increasing competition for admission have exacerbated the
debate regarding institutional accountability surrounding educational outcomes.
According to Dowd (2008), the assessment movement has forced community
colleges to turn their attention to the achievement of “outcome equity” (p. 5). The
community college system is dominated by multiple tensions which impact students’
educational outcomes. With greater pressure being placed on community colleges to
produce student success outcomes using limited resources, many are hoping
that assessment will guide improvements. However, it may be unrealistic to expect
administrators to do more with the limited resources they have. The
current economic situation which the country faces will increase the pressure placed
on administrators to do more with less. Administrators are key figures who decide
the direction that institutions will take on issues impacting students. The community
college system and four-year universities in the United States educate the youth of
today and thus play a vital role in ensuring the economic vitality of this country.
Many professions and technical industries are requiring that employees have more
education to enable employers to compete in the local and global marketplace.
Much is at stake and administrators are at the forefront of finding ways to effectively
reach parity in educational outcomes among all students.
CHAPTER 2: CONCEPTUAL FRAMEWORK
Overview
The purpose of my dissertation was to understand how administrators within
a team setting used data to learn about problems students faced in the attainment of
their educational goals. I also sought to understand how learning based on data leads
to informed decision making on the part of administrators to influence learning
outcomes for students. Essentially, the purpose of the study was to obtain a greater
understanding of how administrators’ views of their environment were guided by
data.
The administrators who were examined in this study participated in the
Institutional Self-Assessment Project. The underlying philosophy of the project was
that awareness of inequitable outcomes is facilitated by inquiry on the part of the
practitioner through the collection of data. Inquiry was defined as the analysis of
information in order to understand the nature of the situation under investigation.
The aim of the project was to enhance institutional effectiveness to create equity in
educational outcomes. The project was based on the reasoning that in order for
practitioners to influence student outcomes, assessment systems are needed to assist
in the development of insight to guide decision making. In the current political and
economic environment, institutions of higher education are being held accountable
for the educational outcomes they produce. I begin the chapter with an explanation
of the main conceptual frameworks or lenses which illuminated my study of how
individuals in organizations learn. I then examine studies of organizational learning
and leadership in higher education conducted through these lenses. Lastly, I
conclude with a summary of the main points mentioned in the chapter.
The Institutional Self-Assessment Project involved teams of leaders and
practitioners at the community college level engaged in inquiry activities designed to
increase their understanding of the problem of differential student outcomes. The
premise of the project was that the knowledge that develops out of these
collaborative inquiry activities can equip community college practitioners with
knowledge to make informed decisions to improve learning outcomes. In other
words, when these practitioners inquire into and discuss the causes of inequitable
outcomes through structured inquiry activities, they will be more likely to use the
new knowledge that results from their participation to better meet students’
educational needs. The project sought to provide practitioners with sustainable
assessment procedures which can be used to guide institutional action. The
following sections in this chapter detail the theories of learning that underpin the
project, including organizational learning and funds of knowledge.
As discussed below, theoretical propositions which drove this study were:
1. Organizational learning is a process that organizations use to learn
about how they understand and solve problems.
2. Practitioners have specialized institutionally based funds of
knowledge about their organization which they use to understand and
solve organizational problems.
Organizational Learning
One way in which institutions of higher education may be able to solve
institutional problems, such as inequitable educational outcomes, is through
organizational learning. In this section, I examine organizational learning and what
leaders can learn about their institutional settings to create change. An organization
is defined as a “government, agency, or task system” (Argyris & Schon, 1978,
p.356). Organizational learning has been defined by scholars in a number of
different ways. The most common definitions of organizational learning encompass
cognitive and behavioral perspectives where the goal of learning is to bring about
greater awareness of the institutional environment to prompt or influence informed
action on the part of the practitioner (Fiol & Lyles, 1985; Daft & Huber, 1987;
Argyris & Schon, 1978; Levitt & March, 1988; Kezar & Eckel, 2007; Garvin, 1993).
For example, organizational learning is defined by Kezar & Eckel (2007) as “the
process of intentionally acquiring and reflecting upon information and changing
organizational practices based on that information” (p.20). Similarly, Fiol and Lyles
(1985) define organizational learning as “the process of improving actions through
better knowledge and understanding” (p.803). Key elements of the organizational
learning process as identified in the literature are: data collection, interpretation of
data, conditions influencing learning, and types of learning.
Data is viewed as an important component of the organizational learning
process because it provides leaders with evidence of the existence of a problem.
Thus, within the organizational learning framework, data is the catalyst that
facilitates greater awareness of a situation or problem. According to Garvin (1993),
data is used to confirm the existence of a problem, make meaning of situations,
create solutions to address problems, demonstrate effectiveness, and change
institutional practices.
Similar to Garvin’s model, members of the Institutional Self-Assessment Project
were guided by an inquiry paradigm which contained the following steps: (1) collect
data, (2) identify gaps in performance, (3) identify causes for gaps, (4) construct
informed solutions, and (5) evaluate implemented solutions. Often practitioners may
identify organizational problems using anecdotal evidence which may or may not
present an accurate picture of the situation being examined. Data in the form of
institutional indicators such as success rates in courses, graduation rates, time to
degree, and attrition rates can help provide a context for leaders in which to view the
problem. The aim of using data is to allow practitioners to understand the intricacies of
problems and predict how institutional policies may impact outcomes. Data
increases the understanding leaders have about a situation to begin to think
differently, create new insights to guide action, and establish actions to improve
organizational effectiveness. Administrators are sometimes so far removed from
students that they may focus more of their attention on supervising and managing
resources. While there is no distinction made about the types of data which are better
suited to assist leaders to understand the nature of the problem, it is implied that any
data which illuminates the nature of the phenomenon is helpful to the institution.
However, looking at data collected around a problem does not by itself
encourage organizational learning. The interpretation of data is important to the
generation of new knowledge about the institution. Moreover, Garvin (1993) viewed
organizational learning as having three concurrent processes which are meaning
making, management of knowledge, and measurement. Meaning making refers to
the process of creating opportunities for members to be exposed to new concepts in
the form of insights in a group setting. The management of knowledge means that
new knowledge would be disseminated to other departments so they too could learn.
Lastly, measurement is important because it is an indicator of progress towards
organizational goals. Garvin’s (1993) conceptualization places the burden on the
organization to create and sanction exchanges where the interpretation of data takes
place. There is no specific mention in
Garvin’s (1993) work about the types of inquiry activities to use, but he does
recommend that activities should be based on the scientific method. Within Garvin’s
model, the facilitator for activities that promote learning is the leader within the
department or division he or she leads. Similarly, Huber (1991) contended that the
creation of new knowledge was generated by “acquiring, distributing or interpreting
information” (p.89). The interpretation of data serves as a mechanism to create
knowledge. Once again, no mention is made of the specific activities which facilitate
the interpretation of information. On the other hand, Kezar and
Eckel (2007) believe that interpretation of data should be done collectively with the
involvement of groups of individuals. Bensimon and Neumann (1993) found that
teams facilitated cognitive functions for leaders which included: (1) understanding
problems from diverse viewpoints, (2) encouraging discussion, and (3) offering
thoughts and recommendations. Each member of the team was as important as the next in
assisting the team to reach a consensus on the issues or problems being analyzed.
Teams have members who play different roles, identified as the
following: definer, analyst, interpreter, critic, synthesizer, disparity monitor, task
monitor, and emotional monitor. They argue that the diverse experience and
knowledge contained within a group assist in the development of knowledge.
Marsick and Watkins (1999) also held that group inquiry activities could
be used to create knowledge because learning is facilitated through
group discussions. Group activities facilitate the interpretation of information
because of the diversity of expertise each member brings to the group.
Furthermore, Daft and Huber (1987) believed that the interpretation of
information was dependent on the richness of the media used to process information.
For example, rich activities led to more rapid processing of information. Rich media were described as modes of communication which gave a practitioner immediate feedback to assist in the processing of information; face-to-face and telephone conversations, for example, were seen as rich activities. The awareness that data provides comes about by interpreting and making sense of the data; through this learning process, the data become a form of usable knowledge.
There are different types of insights gained from interpreting data when using
an organizational learning framework. The types of learning identified in the literature include single-loop, double-loop, deutero, past, lower-level, higher-level, and mutual learning. The most important aspect of analyzing data is the
creation of insights. The generation of new insights may influence how leaders and
practitioners implement solutions. One common type of insight created is termed single-loop learning. In single-loop learning, an organization gains insight about a problem caused by an institutional policy or assumption and takes action to correct the policy but not the assumption which created the problem (Argyris & Schon, 1978). The insight gained from analyzing data assists in identifying the problem on the surface, but at a deeper level the problem still exists. On the other hand, when double-loop learning occurs, an organization changes the assumption or belief which caused the problem (Argyris & Schon, 1978). This type of learning addresses a problem at a deeper level and helps ensure that the problem does not recur.
Next, deutero-learning occurs when individuals from an organization learn about previous learning which took place, or failed to take place, within the institution (Argyris & Schon, 1978). In this instance, the individual or practitioner becomes a historian and learns from the experiences of others in the recent past.
Similarly, Levitt and March (1988) stated that organizations engage in “learning by
encoding inferences from history into routines that guide behavior” (p.319). Past
actions provide the information aspect which allows an institution to learn and then
act. Another type of learning organizations can engage in is lower and higher level
learning. According to Fiol and Lyles (1985), lower-level learning occurs when an organization makes basic adjustments to specific behaviors and outcomes. The focus is on resolving a problem in a fast and efficient manner. Higher-level learning, on the other hand, is aimed at changing institutional rules and norms. Lower- and higher-level learning are similar to Argyris and Schon’s concepts of single- and double-loop learning.
Lastly, mutual learning is based on the thought that organizations have their
own specific knowledge and ideals which drive an institution. The knowledge and
ideals are transmitted to individuals who form the institution and impact the way that
individuals make sense of situations. March (1991) described the model of mutual learning as having four key features: reality and beliefs about that reality, beliefs about reality that are unique to each person, beliefs that individuals modify continually, and organizational knowledge and beliefs that mirror the reality of the institution. The “organizational code” will adapt to the beliefs of the group in power.
organizations impact how individuals learn and individuals impact how
organizations learn. A symbiotic relationship exists where institutions change
individuals as much as individuals change institutions.
The understanding that leaders have about their institution is key to the
establishment of new policies and procedures which impact student outcomes.
Institutional contexts have their own unique cultures which impact the type of
interpretation which will take place based on the information collected. For
example, Fiol and Lyles (1985) mention that the accepted norms or culture at the
institution will influence the cognitive and behavioral development among
practitioners. Moreover, Argyris and Schon (1978) advanced a similar notion: institutional actors act based on their conceptualization of the institution’s theory of action, the set of accepted assumptions guiding the institution. The theory-in-use is the theory of action a leader or practitioner actually enacts. Key in this conceptualization is the knowledge the practitioner has about the institution. If practitioners do not have an accurate picture of the assumptions guiding the institution, they may not act according to established norms. Furthermore, leaders and practitioners use their theory of action or theory-in-use to interpret information.
Essential to organizational learning is the establishment of policies and procedures within the organization to encourage learning. The establishment of such policies and procedures depends on a
number of different institutional factors. For example, Levitt and March (1988)
believed that organizational learning was dependent on the routine or rules,
procedures, beliefs, or culture established at the institution. Routines were based on previous learning that had been institutionalized. The organization had to believe that the routine was worth implementing
based on the effectiveness of previous learning. This implementation was largely
dependent on the institutional culture or accepted beliefs and traits of the
organization.
Similarly, Marsick and Watkins (1999) suggested that organizations could
develop their capacity to learn by building a “learning infrastructure” which
contained policies and procedures whose aim was to encourage the development of
knowledge among individuals and groups (p. 69). At the institutional level, this means reinforcing the importance of learning along three dimensions: systems-level structures for continuous learning, structures to assist in the management of knowledge outcomes, and improvement in the organization’s performance and value (Marsick & Watkins, 1999). The rationale for structures to promote continuous learning is that individuals are seen as an integral part of the system and function on its behalf. As such, they are guided by systems in place at the organizational level. Individuals, in turn, bring what they learn back to the system.
There is a symbiotic relationship between the system and the individual. Learning
takes place at the individual, team, and organizational level. In order for learning to
take place on a consistent basis, it has to be part of the system, supported by the
system and sustained by the system. The management of knowledge is important
because facilitating and managing learning is a process. A structure needs to be in
place to help individuals make meaning or construct knowledge. The focus is on
how people make meaning of a situation and build knowledge through action. The
creation and use of knowledge is the most important factor in the management of
outcomes.
Additionally, Fiol and Lyles (1985) described four factors which increase an organization’s capacity to engage in learning: culture, strategy, structure, and environments. These four factors affect the probability that learning will occur within an organization. First, culture refers to the accepted norms which have the potential to influence a person’s behavioral and cognitive development.
Essentially, the assertion is that learning will be influenced by an institutional culture which places value on learning by individuals. Individuals’ actions will be determined by their knowledge of the institutional culture. Second, the strategy
in use at the institution also has the potential to impact learning. Strategy refers to
the goals and objectives that the institution sees as the main priorities of the
institution. The goals and objectives shape how problems are identified and acted
upon. Any individual action which is not congruent with the strategy in place will
influence learning. Third, structure influences the type of learning taking place.
Structures in place at the institutional level determine how the organization will
learn. Some structures encourage more learning while others encourage less
learning. Practitioners are guided by informal and formal practices which encourage
or discourage learning. Lastly, environments are important because the
organizational environment must be stable enough to allow for learning to take place.
For example, an organization that is constantly hiring new personnel may experience inconsistency: new staff must be trained in the responsibilities of the job, so production may fall below the optimal level. Constant
change within the organization discourages learning because conditions are in a state
of flux. An organization cannot afford to be at either end of the spectrum because of
the impact the institutional environment has on learning.
Institutionally Based Funds of Knowledge
The knowledge which an individual accumulates through the organizational learning process enables practitioners to understand their environment at a deeper level. One could argue that the aim of the individual is to develop an understanding of the environment sufficient to navigate it efficiently. This fund of knowledge regarding the
organizational environment is essential to accessing resources at the institutional
level. There is a distinct difference between knowledge and a fund of knowledge.
Knowledge is the accumulation of information one knows about a subject, situation,
or topic. A fund of knowledge is generally described as information which can be
used to achieve an outcome on the part of the individual. For example, Velez-
Ibanez and Greenberg (1985) described a fund of knowledge as a cultural or strategic
resource available to practitioners. Under this definition, the individual knows something of use that can be exchanged to obtain something desired. Another
definition for the funds of knowledge concept is that of strategic knowledge and
related activities which are necessary to function within an environment (Moll, Amanti, Neff, & Gonzalez, 1992). Similarly, the description of the concept was expanded to include the ability of an individual to accumulate a body of knowledge which can be used to function and succeed in the environment (Velez-Ibañez & Greenberg, 2005). Additionally, funds of knowledge have been defined to include
knowledge of the values, attitudes, and beliefs which govern the functioning of the
institutional environment (Stanton-Salazar, 1993). Essentially a fund of knowledge
is defined as information which can be used to create change. A fund of knowledge
is useable information. To contribute to a fund of knowledge, information must both add to a cognitive repository and be useful, resulting in the attainment of some kind of tangible benefit.
Stanton-Salazar (1997) theorized that there are seven forms of institutionally based knowledge at the disposal of practitioners and leaders: institutionally sanctioned discourses, subject-specific knowledge, organizational knowledge, knowledge of network development, technical knowledge, career knowledge, and problem-solving knowledge.
Furthermore, Stanton-Salazar contends that social and institutional structures in
place perpetuate a cycle of oppression which leads to differential learning outcomes
among minority youth. He argues that in order for educational outcomes to change
for underrepresented populations, practitioners and leaders must connect with
minority youth to help them understand the social and institutional systems.
Practitioners who make a commitment to reaching out to minority youth to impact
learning outcomes are defined by Stanton-Salazar as institutional agents. The institutional agent concept is important in our discussion of the role leaders play in the achievement of equitable learning outcomes because such agents can serve as advocates for all students. Institutional agents are
concerned with connecting students to resources and people which will assist them in
being successful. Agents go out of their way to use their knowledge of institutional structures to assist students in achieving their educational goals. Additionally, Stanton-Salazar contends that the relationships minority youth form with institutional agents depend on the “social structure of the school and student’s help seeking orientation” (2001, p. 191).
Moreover, institutionally based funds of knowledge are the key form of
knowledge which impacts a student’s success. This first form of knowledge is
important because it determines the level of familiarity a student will develop with
the structures in the system. The more connected or socialized a student is, the more likely he or she will become a part of the system. The level of socialization in turn
will determine the student’s progress along educational pathways. Administrators
and other institutional agents can assist students in this socialization process by the
types of programs and services they offer to students. However, the leader or
practitioner must have a certain level of familiarity with the structures of the
institution in order to impact change for minority youth. Administrators identify
which student populations need more resources to ensure their success at the
institutional level. This identification of student populations is not possible without the use of organizational learning. Administrators cannot recognize the
existence of inequitable outcomes if measurable outcomes are not collected and
interpreted to shed light on the prevalence of inequity.
The notion of institutionally sanctioned discourse refers to the appropriate
use of cultural symbols and modes of communication within majority culture
institutional structures. A student’s experience with institutionally sanctioned
discourses helps students communicate, interact, and exchange ideas within the
institution. This decoding of the system leads to a student being able to not only
make sense of the messages being sent to him or her but also empowers the student
to maneuver within the system. Administrators help guide students by exposing
them to the common communication methods and accepted language to use when
communicating. While all professional staff has a role in serving as guides, formal
leaders determine the vision and direction the institution takes in serving students.
Obtaining this key form of knowledge is not a simple task for minority
students. A student has two major obstacles to overcome in obtaining the knowledge
of institutionally sanctioned discourses. The first obstacle is gaining awareness of
the importance of information presented within the system and the second is finding
an institutional agent to help them navigate the institutional network. In essence, a minority youth needs both access to the information network and a guide. The guide serves as a resource to contact when there are problems
with the information accessed. Minority students are at a disadvantage from the
beginning because the knowledge constructed within educational institutions is
unfamiliar to them. Knowledge is constructed for students from the majority culture
through discourse in the familial, public, and institutional spheres. Children from the dominant (majority) culture have an advantage over minority youth because they learn the system at home through their parents from birth. Minority youth have trouble forming social networks in large part because they cannot recognize the information presented to them as useful and worthwhile.
The fund of knowledge essential for an administrator to function within an institution is the organizational fund of knowledge. Stanton-Salazar (2001) defined organizational knowledge as the “knowledge of how
bureaucracies operate—chains of command, resource competition among various
branches of bureaucracy” (p. 269). While this definition does not name specific elements, the implied elements which make up this fund of knowledge are knowledge of the values, attitudes, or beliefs which inform the processes in place within the institutional environment. The rationale and rules which guide action are important to understand if an administrator wants to function
and succeed within the institutional environment. Consistent with the funds of
knowledge definitions, this fund of knowledge provides the practitioner with useable
information which will help one achieve a task or ensure the ability to function
within the institutional environment.
An essential part of Stanton-Salazar’s institutionally based funds of
knowledge schema is the role that practitioners play in the achievement of equitable
student outcomes. Institutional agents may not be aware of the disparities which exist
in the educational attainment among different ethnic groups. A leader’s experience
with minority youth might or might not yield accurate information about the barriers
minority youth face in the attainment of their educational goals. Additionally, a
leader will not have direct contact with students and thus will rely on the knowledge
communicated by his or her staff members. It is the responsibility of the leader to facilitate organizational learning to assist others in gaining new insight and understanding. Therefore, the knowledge a leader possesses of his or her
institution and information of a variety of student populations can help in the
identification of institutional barriers which hinder student success. Most importantly, knowledge of the social structure at the institutional level, and of the social relationships which impact student success, is a necessary element in bringing about changes that create equitable learning outcomes. The role of the leader is emphasized in a study conducted by Bensimon
and Neumann (1993). Bensimon and Neumann conceptualized the main function of
a leader as helping individuals construct meaning by bringing out what they “already
know, believe, and value” (p. xv). In this definition of leadership, the leader uses the
collective knowledge and experiences of others to create new knowledge and
solutions to problems.
Organizational learning helps develop an administrator’s understanding of
the barriers students face. The administrator can then implement policies and
procedures to ameliorate the problems. The process that administrators use to
recognize and solve problems has the potential to impact some or all student learning
outcomes. Recognition of institutional problems may not be an easy process
especially in institutions of higher education with limited resources. Ongoing and
sustainable organizational learning through assessment is key to the development of leaders’ and practitioners’ funds of knowledge. After all,
institutional leaders have the knowledge and influence to change, implement, and
modify policies and procedures that have the potential to impact student success.
Increasing an administrator’s funds of knowledge becomes the single most important
factor in increasing equity in educational outcomes.
In summary, organizational learning occurs through an assessment process
which can serve to guide higher education leaders to resolve the problem of
differential learning outcomes. Organizational learning increases the knowledge
practitioners and leaders have about their environment as it relates to student
populations and organizational processes. Organizational learning relies on data to create an accurate picture of problems, uses the collective expertise of practitioners to interpret data, encourages learning through deliberate facilitation, and is shaped by the organizational environment. Data is converted to usable knowledge through the use
of inquiry activities which practitioners can use to increase their understanding of
their institution. This understanding then impacts practitioners’ funds of knowledge
regarding what attitudes, values, and beliefs present in the institutional environment
hinder or increase student success. By values I mean what stakeholders hold as important in educating students. Attitudes are perceptions held by practitioners regarding what helps or hinders student success. Lastly, beliefs are assumptions about factors thought to influence student success; they may or may not be based on knowledge.
Organizational Learning in Higher Education
The decisions administrators make have the power of determining the
direction an institution will take in addressing organizational problems. The
knowledge and expertise an administrator has developed guide decisions to obtain
resources for existing programs or to create new initiatives. Some administrators
may have a deep understanding of the appropriate and effective mechanisms to use
to promote or defend initiatives within the organization. Within the organizational learning framework, practitioners play a key role in identifying and constructing solutions to organizational problems. Other administrators need assistance in
developing their knowledge of the mechanisms which they can use to influence
change at the institutional level. The knowledge they possess of the institution will
help administrators identify problems and appropriate courses of action to solve
problems. Several scholars have noted a lack of empirical studies on organizational learning in higher education (Bauman, 2005; Kezar, 2005; Huber, 1991). Of the few studies that exist, very few explore how leaders use data to inform decision making.
Ramaley and Holland (2005) described how leaders at Portland State
University (PSU) used organizational learning strategies to create transformational
change. The authors found that change could occur when a leader and campus
community engaged in change as a “scholarly exercise” embedded in the
organizational learning literature (p.75). The scholarly approach taken by PSU
included the following steps: (1) building a case for change; (2) creating clarity of
purpose; (3) working in a scholarly mode at a significant scale; (4) developing a
conducive campus environment; (5) understanding change itself. Central to the
process of understanding the current and changing conditions of the institution was
the consistent collection of data as evidence. Once the institution had a sense
of the conditions, stakeholders identified institutional priorities based on the
collective mission, values, and beliefs of the institution. Next, institutional leaders
implemented change on an initiative that people could identify and support as
needing attention. Systems were revised to support the change process including the
institution’s capacity to collect and process data related to change. Overall, the
major change was that the institution grounded its decision making in data.
Ramaley and Holland’s (2005) study points out several factors necessary for
organizational learning to take place which include problem identification, creation
of new perspectives on the nature of the problem and implementation of solutions to
ameliorate the problem. Consistent with the organizational learning literature, data
was important to understanding the prevalence of the problem and used to support
decisions made by administrators to create change. The context of the problem was defined by practitioners through the collaborative interpretation of data. A “warrant” was then used to drive the implementation of action to correct the problem (Ramaley & Holland, 2005, p. 77).
While Ramaley and Holland emphasized the role of the leaders in a process
of change involving organizational learning, Bauman (2005) emphasized the role of
teams in the change process. Bauman sought to examine how practitioners involved
in teams recognized organizational problems specific to differential learning
outcomes. One of Bauman’s findings was that many practitioners were not aware
that inequities existed. Bauman (2005) also found that learning occurred at different rates among practitioners: there were low- and high-learning groups. Learning was
high in groups when members engaged more naturally in an organizational learning
process. According to Bauman (2005), organizational learning in groups can take
place if certain elements are present in a group. Environmental factors which
facilitate organizational learning in groups include the introduction of new ideas,
presence and acceptance of doubt, and creation and transmission of knowledge
among group members. Bauman (2005) stressed that high learning groups, those
with the group characteristics mentioned above, process information related to
problems in a distinct way. New knowledge is developed by practitioners in a
collective setting and data is the catalyst for better understanding of an institutional
problem.
The most important finding from Bauman’s (2005) study is the role data
played in the creation of knowledge. Once again, consistent with the literature, data-based discussions among group members led to the questioning of previously accepted values, beliefs, attitudes, and institutional characteristics.
The discussions began with the presentation of data, followed by reflection and group discussion of the repercussions of what the data suggested. Bauman (2005)
demonstrated the importance of practitioner recognition of differential learning
outcomes. The recognition of the problem is the first step towards the establishment
of change mechanisms at the institutional level to create equitable educational
outcomes.
Bensimon (2005) studied how practitioners can solve organizational
problems pertaining to unequal student outcomes among various ethnic/racial
groups. She framed the main problem as not an institutional problem but as a
problem among “institutional actors” such as administrators, faculty, counselors and
others. Bensimon found that a practitioner’s cognitive perspective helped him or her
make sense of a situation and guided decision making. The main issue for
practitioners was how awareness informed practice. Awareness on the part of institutional actors, in combination with their values and beliefs, influences their interactions with students of color. Furthermore, data on the educational
outcomes of students was disaggregated by race to expose practitioners to the
differences in educational outcomes. Bensimon analyzed practitioners’ cognitive
frames through observations and more specifically through the use of language. The
language of practitioners changed in noticeable ways through the process of
collecting data and reflecting on data to create more equitable educational outcomes
for students. Bensimon shows that learning did take place in practitioners’ cognitive frameworks but notes that it is not clear how this change in thinking would influence the change undertaken by individuals at the institutional level. The most significant
finding from this study is that in order for individuals to create change a greater
understanding of the true nature of the problem must be reached by practitioners. If
people do not see something as a problem, they are less likely to act.
Leadership and Equitable Outcomes
The role leaders play in the process to create more equitable learning
outcomes for students has not been examined according to several scholars in higher
education (Bensimon, 2007; Birnbaum, 1988; Kezar, 2003). Few empirical studies
explore the role of leaders in creating more equitable learning outcomes. Leadership
has been defined in many different ways. The definitions can be categorized into
realms such as influence over groups, influence of an individual, and influencing
behavior changes. Leadership is “a process whereby an individual influences a
group of individuals to achieve a common goal” (Northouse, 2007, p. 3). Similarly,
Birnbaum (1988) viewed the role of administrators of a college or university as helping institutional members make sense of the reality contained in the institution and driving action to achieve outcomes. The administrator serves as a facilitator to
assist others to understand the nature of institutional problems and assist with
defining the range of solutions. The leader’s knowledge of the institution influences
the outcomes they will be able to achieve.
Moreover, the power or influence that a leader has over those he or she
supervises is important as well. The influence that leaders have is due largely to how
they use their power. There are five bases of power which a leader uses to influence
others (Northouse, 2007); those five bases of power are: referent, expert, legitimate,
reward, and coercive power. Referent power is the association that the follower has with the leader: the more liked the leader is, the more influence he or she has over the follower. Expert power is the perception the follower has of the leader’s knowledge: the more knowledgeable a follower thinks the leader is, the more influence the leader has over the follower. Legitimate power is the power associated with the status of the leader within an organization: the higher the leader’s title, the more influence he or she has. Reward power is the ability the leader has to provide
incentives to others. Lastly, coercive power is the ability a leader has to punish
others. Implied in this theory is that the more power a leader has, the more able he or she will be to influence changes in the institution.
Leaders use the knowledge and power they possess to make decisions about
which changes to make within the institution. Eckel and Kezar (2003) studied how
leaders created transformational change in institutions of higher education.
Transformational change is a process which “alters organizational structures and
processes, leads to reorganized priorities, affects organizational assumptions, and
ideologies, and is a collective, institution-wide undertaking” (Eckel & Kezar, 2003,
p.53). The primary role of leaders in the change process was to help stakeholders
view things differently. Leaders facilitated new understandings by finding distinct ways to make meaning of previously understood structures and by changing the way ideas were communicated. In essence, campus leaders facilitated change by challenging existing values and beliefs, creating new conceptions of existing structures, using language to create collective meaning of change, and involving stakeholders in the change process.
The key component for a leader to be able to facilitate the change process was a firm
understanding of the institution and the institutional culture.
A leader’s understanding of the institution is guided primarily by the way
they come to view and interpret situations. Bensimon (1989) found that 24 (75%) of the presidents studied were guided by one or two cognitive frames or mental schemas, with the majority guided by a single frame. These mental schemas were viewpoints which leaders used to make sense of situations. The four frames that leaders used were the following: bureaucratic, collegial, political, and symbolic. A person who values the bureaucratic frame views organizations
with clearly established divisions of power and leaders have the influence to look at
a problem, decide the solution, and implement it. In the collegial frame, the
development of the individuals in the organization is the focus of the leader. The
political frame guides leaders to view the institution as a group of people
maneuvering for control of policies and resources. Lastly, in the symbolic frame the
leaders create and encourage organizational norms which lead to the development of
a distinct organizational culture. The implication of the findings from Bensimon’s
(1989) study is that a leader’s mindset will help in the identification of institutional
priorities as well as the actions taken to address these priorities.
Furthermore, Carey (2005) found that colleges and universities with the best
graduation rates among historically underrepresented students were successful due to
two factors: leadership and the use of data to monitor student success.
Leaders played a pivotal role in promoting and sustaining an institution’s focus on
the performance goal of increasing graduation rates. Leaders identified the problem
and took action to correct it. Administrators ensured that the priority at
the institutional level was the resolution of the problem. The institution’s effort
stayed consistent for several years to sustain support for
change. Additionally, leaders used data on a consistent basis to obtain
snapshots on student performance to make informed decisions on policies which
could be changed to impact student outcomes.
Moreover, Bensimon and Neumann (1993) examined the role presidents play
in higher education and found that presidents are team builders who use teams in
three distinct ways: utilitarian, expressive, and cognitive. The utilitarian team
helps the president of the institution to stay informed of the issues impacting the
institution while helping him or her stay in control of situations (Bensimon &
Neumann, 1993). The expressive function of
the team helps the president by giving him or her advice and support. The president
also provides advice and support for the team members. Lastly, the cognitive
function refers to the team which confronts the president by “questioning,
challenging, and arguing,” which in turn shapes the way the team looks at a problem
(Bensimon & Neumann, 1993, p. 42). Leaders have the influence and resources at
their disposal to create change at the institutional level.
The knowledge and leadership development of current community college
leaders is of great importance given the influence leaders have in creating more
equitable outcomes. The research in this area of higher education contains a gap
in knowledge, and closing it could benefit all community college leaders.
What knowledge must a leader possess to effectively reach parity in educational
56
outcomes at the community college level? While countless articles exist on
leadership and what it takes to be a leader, there is little to no research on what
knowledge and skills a leader should possess to impact student outcomes. The
American Association of Community Colleges (AACC) published core
competencies which an effective community college leader must possess based on a
survey of practitioners. The core competencies are organizational strategy, resource
management, communication, collaboration, community college advocacy, and
professionalism (Ottenritter, 2006). The competency most relevant to this
study is organizational strategy, which involves a leader
continuously assessing, developing, implementing, and evaluating strategies to
“monitor and improve the quality of education” (Ottenritter, 2006, p. 16). A key
element of assessment and evaluation is the use of data to help identify institutional
practices which positively impact student success.
Summary
The organizational learning literature guided the structure of the present
study. According to the literature, organizational learning involves acquiring data,
interpreting data in a collective setting, and learning through data. Conditions which
impact organizational learning all contribute to increasing a leader’s understanding
of their environment. The two constructs of organizational learning and
institutionally based funds of knowledge are two related theoretical constructs which
were examined to provide insight into how practitioners increased their expertise. I
believed that through organizational learning practitioners can increase their
knowledge and awareness of their institutional setting by increasing their
institutionally based funds of knowledge. Within the organizational learning
framework, I focused on determining how data brought about awareness of
institutional factors which impact student success. Awareness was seen as the first
step to help practitioners identify gaps in institutional performance. Once
practitioners were aware of existing conditions then they could identify the problem
or gap they would like to learn about. Learning was defined as the discovery of
factors contributing to the gaps in performance. In this study I focused on the
development of awareness or identification of gaps in performance among students.
The learning aspect I was most interested in was how practitioners determined the
institutional factors which contributed to student under-performance. I
then focused on how the awareness and learning produced through organizational
learning shaped practitioners’ understanding of the environment, particularly their
funds of organizational knowledge: the understanding a practitioner has of the
attitudes, values, or beliefs present in the institutional environment that
govern institutional functioning. The rationale for looking at how data informed the
attitudes, values, and beliefs impacting student success was that administrators must
know their institutional environment to create changes to impact student success.
Additionally, I looked at how administrators would use information about the
attitudes, values, or beliefs to create changes based on what they knew about how the
organization functioned. Administrators must have this organizational knowledge in
order to make changes within the organizational environment. With the information
needed to understand how attitudes, values, or beliefs govern institutional processes,
a practitioner will be better equipped to create more equitable learning outcomes.
Furthermore, the
literature provides some key insights for the development of knowledge based on
data and facilitated through collaborative activities (Garvin, 1993; Kezar & Eckel,
2007; Marsick & Watkins, 1999). In fact, collaborative interpretation of data
through the use of team meetings has been shown to bring about greater
understanding of the nature of a problem because individuals come to the groups
with varying levels of expertise (Bauman, 2005; Bensimon, 2005). The learning on
the part of institutional leaders leads to more informed action (Carey, 2005; Ramaley
& Holland, 2005). Learning is then used by leaders to implement changes based on
existing knowledge of the nature of the institution. Some scholars believe that the
overall aim of organizational learning is for the leader or practitioner to expand their
awareness as to how institutional policies impact student success. Leaders and
practitioners are at different levels of experience, which impacts the way they view
and act on conditions present within the institution (Bensimon & Neumann, 1993).
It is my belief that by establishing structures which facilitate and encourage
organizational learning, leaders will be better equipped to build their institutionally
based funds of knowledge. Organizational learning can be an important way to
redress the inequitable educational outcomes of students at the community college
level. By developing their funds of knowledge, leaders can make changes based on
insight gained through organizational learning. The empirical literature on
organizational learning and leadership in higher education contains gaps in the area
of how leaders use processes of organizational learning to increase their funds of
knowledge to impact student learning outcomes. This kind of learning can
presumably be facilitated by using data and inquiry activities, but little empirical
evidence demonstrates how. This study examined how practitioners and leaders
used data to learn in order to make organizational changes to improve student
outcomes.
CHAPTER 3: METHODS
I begin the chapter by providing a brief overview of the Institutional Self-
Assessment Project and how participants were selected for the larger
project. Next, I describe the purpose of this research study and delineate
the methodology used to collect, analyze, and interpret data. I conclude the chapter
by describing the relationships I was interested in understanding further.
Overview of Project
The Institutional Self-Assessment Project, funded by the William and Flora
Hewlett Foundation and the Ford Foundation, sought to enhance institutional
effectiveness and equity in community colleges in California. The aim of the project
was to increase the number of underrepresented students, African Americans and
Latinos, succeeding in courses starting at the basic skills level and transferring to
four-year universities. The underlying philosophy of the project was that awareness
of inequitable outcomes was facilitated by inquiry on the part of the practitioner
through the collection of data. The approach taken with the teams was that data
analyzed in group settings would allow practitioners to gain a greater understanding
of the barriers students face in the pursuit of their educational goals. Data was
collected and reviewed by practitioner teams to confirm the existence of the problem
of differential learning outcomes and to construct solutions to redress outcomes. The
project was a partnership between the Center for Urban Education (CUE) at the
University of Southern California and California community colleges. Practitioner
teams were convened at each institution. The teams consisted of no more than ten
members from the community college and the Center for Urban Education. The
teams were a method of implementing participatory action research
which involved practitioners and experts working to understand and create solutions
to solve a real world problem. Involving practitioners in the problem solving process
allowed those most knowledgeable about the problem a greater understanding of
how actions and perspectives present at the institutional level impacted the problem
(Greenwood & Levin, 2002). In most instances, college presidents appointed
practitioners to the team to represent the college. The teams represented different
constituencies on campus, such as faculty, staff, administrators, and CUE researchers.
The teams met on a monthly basis at their institutions. Participants engaged
in a practitioner-driven cycle of inquiry. The cycle included the following steps:
building teams, looking at student success rates, examining current data and
institutional practices, setting goals, benchmarking, implementing solutions,
evaluating, and communicating findings to the campus community. The larger goal
for institutions was the development of institutional capacity to conduct assessment,
evaluation, and inquiry for increased equity and effectiveness. The purpose of each
meeting was to have practitioners participate in the cycle of inquiry to create
effective organizational strategies that would impact student success.
The Setting
Hills Community College (a pseudonym) was located in Los Angeles County
and enrolled about 20,000 students each semester. Hills was
fully accredited by the Accrediting Commission for Community and Junior Colleges
of the Western Association of Schools and Colleges. The institution graduated about
600 students each year, who received Associate of Arts/Science degrees.
Additionally, the institution awarded about 500 certificates in vocational areas. The
ethnic composition of the student population at the college was as follows: 68%
Hispanic, 12% Asian, 10% White, 2% Black, and 8% other or undeclared. Hills was
a Hispanic-Serving Institution (HSI), a federal designation given to institutions of
higher education whose student populations are at least 25% Hispanic. There were
five members of the board of trustees, seven senior managers, and twelve Academic
and Student Affairs deans.
Purpose
The purpose of the study was to examine how administrators from Hills
College participated in the inquiry activities of the
Institutional Self-Assessment Project and how they applied what they learned at their
institution. I had three research questions: (1) What types of learning
took place among leaders facilitated through collaborative interpretation of data? (2)
How did the learning contribute to leaders’ institutionally based funds of knowledge?
(3) What proposed changes were planned or made to existing institutional structures,
policies, procedures as a result of insights gained through the institutional self-
assessment process?
I used the concepts of organizational learning and institutionally based funds
of knowledge as lenses with which to answer my research questions. Organizational
learning is the process by which organizations identify, interpret, and solve
institutional problems. Institutionally based funds of knowledge is a concept which
states that practitioners have specialized knowledge of the educational setting they
work in, which they may use to become institutional agents for students to impact
change.
Design Strategy for Qualitative Inquiry
This was a qualitative study which used a blend of naturalistic and
purposeful design strategies. Naturalistic inquiry refers to the study of actual
situations as they occur in the world and the findings emerge naturally from the data
collected (Patton, 2002). The data collection was characterized as qualitative for the
purposes of this study. Qualitative data are described as “quotations, observations
and excerpts from documents” (Patton, 2002, p.47). The purpose of qualitative data
is to illuminate a situation, person, or phenomenon so that others may understand the
object being studied. Additionally, Stake (1995) describes the goal of a qualitative
researcher as gaining knowledge of the relationships that exist in the case.
According to Stake (1995) qualitative research is holistic, empirical, interpretive, and
empathic.
Similarly, a case study approach was used in the examination of each
participant’s experience within the study. According to Stake (1995), “case study
researchers use the method of specimens as their primary method to come to know
extensively and intensively about the single case” (p. 36). The specimen is the
individual under examination who assists the researcher in gaining a better
understanding of how and what impacts the individual. As such, only administrators
who would illuminate the role of formal leaders in creating institutional change were
selected to participate in this aspect of the study.
Participant Selection
Hills Community College was selected as a site for this study for several
reasons. First, the institution had a significant number of team members who
participated in the Equity for All project, based out of the University of Southern
California. Institutions that participated in this previous study were asked to
participate in the Institutional Self-Assessment Project. Members who chose to
participate in the Institutional Self-Assessment Project from Hills Community
College were selected to participate in this study. Purposeful sampling was used in
selecting administrators to participate in the research study. Purposeful sampling is
the selection of participants who will provide an “in-depth understanding” of the
situation being studied (Patton, 2002, p. 46). The administrators were selected based
on two criteria: experience as practitioners at the community college
level and participation in the Institutional Self-Assessment Project. Table 1 shows
the administrators participating in the study as well as other participants in the
overall Institutional Self-Assessment Project.
Table 1: Institutional Self-Assessment Project Participants
Name Position
John Marquez Vice President of Student Affairs
Melody Sky Dean of Academic Affairs
Carl White Dean of Counseling
Samantha Brown Dean of Institutional Research
Joy Ortiz Math Professor
Emily Nightengale Communications Professor
Robert O’Brien Research Analyst
Stefanie Hernandez Math Professor
Juan Martinez ESL Professor
Martha Romano Social Science Professor
Elizabeth Azure Counselor
Mark Gordon Research Associate
Maria Freedman Research Associate
The administrators who were asked to participate in the study had at least ten
years of administrative experience and held key leadership positions throughout the
Hills college campus. This was significant because they were most likely to have
had specialized knowledge of the campus which they could use to impact change at
the institutional level. The candidates who participated in the study had the
potential to illuminate how leaders use data to inform decision making to impact
differential learning outcomes. The administrators’ experiences, coupled with their
participation in inquiry activities designed to promote awareness around differential
learning outcomes helped the researcher understand how organizational learning
increased the funds of knowledge on the part of leaders.
In addition to their professional expertise, the participants were active in
the Institutional Self-Assessment Project at Hills Community College. Active
participation was demonstrated through practitioners’
contributions to discussions at the monthly team meetings, participation in various
team activities, and consistent attendance at meetings during the span of
the project. Lastly, the researcher developed a rapport with participants which aided
the data collection process. The researcher’s positionality was
perceived as that of a colleague present during discussions to improve institutional
effectiveness and equity. This relationship with participants made it
easier to solicit participation from the administrators and collect data through
interviews. The researcher had access to institutional leaders who were able to
provide levels of insight which might not have emerged had the researcher
not been familiar with the participants.
Data Collection
Data collection was done using a three-pronged approach which included the
following: observations, document analysis, and interviews. Academic
administrators from Hills community college who were involved in the Institutional
Self-Assessment Project were asked via e-mail, phone, or in person to participate in
this study. They were told that the research study would focus on the role of leaders
in creating organizational change. The study specifically aimed to look at
administrators at the community college level.
Observations
Observations were conducted of participants during Institutional Self-
Assessment meetings at Hills Community College. Patton’s (2002) six dimensions of
fieldwork were used to guide attention to aspects of the design of the present study.
The six dimensions are role of the observer, perspective used, who observes,
disclosure, duration, and focus. The researcher served strictly as an observer,
refraining from participating in discussions during team meetings and limiting
participation to taking notes and observing team member reactions whenever
possible. The researcher maintained an etic, or outsider, approach to data collection
to ensure objectivity (Patton, 2002). One
researcher was responsible for the data collection to ensure consistency in the way
that information was collected. Team members were informed that researchers
associated with the project were taking notes and were going to compile reports
based on team members’ work. Observations were conducted over a nine-month
period during team meetings which occurred once a month. Meetings lasted from
one and a half to two hours.
The purpose of the observations was to identify the extent to which
administrators used data to learn and inform their decision making. The researcher
expected to obtain information about how administrators understood the nature of
problems using data, how awareness and learning were facilitated through collective
interpretation of data during team meetings, the types of learning taking place, how
the organizational learning process increased an administrator’s knowledge of the
values, attitudes, or beliefs present at the institution impacting student success, and
how learning was used by leaders to guide decision making. Overall, I was looking
for how the increased understanding of values, attitudes, or beliefs increased
administrators’ funds of knowledge of institutional functioning. The institutional
culture was also noted during the observations in meetings.
Document Analysis
The researcher analyzed documents created by team members at Hills
College with the input of administrators associated with the Institutional Self-
Assessment Project. Documents included in the analysis were the following: the Hills
College Institutional Self-Assessment midterm and final reports and the Hills
Community College Basic Skills Initiative plan. The Institutional Self-Assessment midterm and
final reports were part of the inquiry activities which occurred during team meetings.
These documents represented the efforts on the part of the group to collect, analyze,
and interpret different types of institutional data. For example, the team initially
reviewed baseline data which included student demographic data, transfer rates,
graduation rates, and course completion rates for students. After the initial review of
the data, the team decided which direction to take future inquiry activities. Further
inquiry centered on the success students had in the gateway mathematics course
which all students need to pass in order to be eligible to transfer.
The team at Hills collected and interpreted qualitative and quantitative data to
understand why differential learning outcomes existed for students in this gateway
math course. The team collected qualitative data in the form of observations and
interviews. Quantitative data in the form of student success in developmental/pre-
college math courses was presented to the team. The data was separated by race and
ethnicity to reveal patterns in student performance among groups. Other quantitative
data included course migration data which demonstrated course-taking patterns of
student groups. The reports summarized the learning which took place during the
course of the Institutional Self-Assessment Project. The team expressed their
intention to use funding provided by the Basic Skills Initiative to address the
problems in student success rates discussed in the team meetings. These documents
were selected by the researcher for analysis for two reasons: they were
produced by members of the committee, and they served to inform the
campus community of the direction the college was heading to address
institutional problems around differential learning outcomes.
Moreover, the documents gave the researcher an indication of how the
learning would be used by those participating in the Institutional Self-
Project. These documents would also serve as a way to facilitate the triangulation of
data from the various data collection phases. Additionally, the researcher would
follow up on the changes which were actually made after the conclusion of the
project. The recommendations from these documents represented the commitment
on the part of the administration to change policies and procedures to increase student
success. The previously mentioned themes based on the organizational learning and
institutionally based funds of knowledge research would be used to determine the
extent to which organizational learning impacted practitioners’ funds of knowledge.
Interviews
Once identified, four administrators were asked to consent to participate in
the interviews, and a mutually convenient date and time was selected for each. The
four administrators were interviewed twice: once during the nine-month period when
the project was in operation and once after its conclusion.
The interviews were conducted in the administrators’ offices during the
work week. Before the interview, the researcher sent the interviewee the
Informed Consent for Non-Medical Research form via e-mail (Appendix A). The
interviewee was instructed to read the consent form before the interview date and
told that the researcher would explain the form in depth the day of the interview.
The interviewee was told in person before the interview that participation in the
study was strictly voluntary and that any information given during the interview
would be kept confidential. The participant was informed that the interviews were to
be audio recorded with their consent and that the audio recording would be used for
data collection purposes only. The audio recording would be heard only by the
research team and would be destroyed once transcription was complete.
On the day of the interview, the participant was encouraged to read all of the
information contained in the Informed Consent for Non-Medical Research form.
The form solicited the interviewee’s consent before the commencement of interview
activities and provided the participant with information on confidentiality. If the
participant agreed to participate, he or she was asked to sign two copies of the
consent form. One form was retained by the participant and the other was kept by
the researcher.
An interview protocol served as a guide to structure the interviews and to help the
researcher stay on track (Appendices B and C). An interview guide is a good tool for
ensuring that similar
information is gathered for all participants who have been interviewed (Patton,
2002). General questions were included in the protocol, and deeper probing
questions were asked as follow-ups to answers given to the
broad questions. Administrators were interviewed for approximately one hour. At
the end of the interviews, the researcher asked administrators for permission to
contact them at a later date to ask any questions to clarify responses given during the
interviews. The researcher asked the interviewees if they had any questions
regarding the interview.
Interview Questions
The purpose of the first interview was to understand how administrators
made sense of problems through the use of data. The probing interview questions
centered on the concepts of awareness, learning, and funds of knowledge. The
researcher expected to obtain information about how administrators understood the
nature of problems using data, how learning was facilitated through collective
interpretation of data during team meetings, how the organizational learning process
increased an administrator’s knowledge of the institution, and how learning would be
used by leaders to guide decision making. Specifically, I was interested in
determining the extent to which administrators became more aware about the nature
of problems and learned about the causes. Essentially, I wanted to determine if
administrators engaged in single- or double-loop learning. Additionally, I determined
how administrators’ knowledge of the institution increased due to their engagement
in the organizational learning assessment process. I also determined if
administrators’ knowledge of the attitudes, values, or beliefs impacting student
success increased. Knowledge is what one knows about a subject. A fund of
knowledge is the accumulation of what one knows about a subject that can be used to
inform and transact changes. Funds of knowledge have been defined as the cultural
and strategic resources available to practitioners (Velez-Ibanez and Greenberg,
1985). Lastly, I wished to obtain information on how administrators used data
gained from the organizational learning process to change institutional conditions.
The first interview guide was meant to help the researcher develop rapport
with the participants in the study as well as provide information relevant to the
research questions driving the study. The first three questions of the interview guide
were aimed at understanding the administrators’ role at the institution. These three
background questions elicited information about the roles and responsibilities of the
administrator, length of time at the institution, and motivation for entering into the
field of education. The second set of questions was meant to provide the researcher
with information about how data brought about awareness of inequitable educational
outcomes among students and factors impacting student success in the area of
mathematics. Next, information was obtained about knowledge that the participants
had about how the institution functioned. The third section was concerned with
obtaining information about how administrators made decisions on a day-to-day
basis. Within the section, the researcher wished to obtain information about
structures at the institutional level which encouraged the creation of awareness and
learning on a consistent basis. Lastly, respondents were asked how the awareness
and learning taking place was applied at their institution.
The purpose of the second interview was to determine how an administrator’s
participation in the organizational learning process led to changes at the institutional
level. My aim was to identify the types of learning which took place, confirm that
the use of data led to awareness and learning on the part of the administrators, and
understand how the learning taking place increased an administrator’s knowledge of
their institutional environment. An administrator’s level of knowledge of the
institution was gauged throughout the interview, but the first set of questions asked
administrators to share what types of insights they gained through the organizational
learning process. Questions were asked to determine if administrators engaged in
single- or double-loop learning as described by Argyris and Schon (1978).
Additionally, questions were asked to identify other factors which contributed to the
institutional problem to gauge how the administrator identified factors causing
problems.
Furthermore, questions were asked to determine the extent to which
administrators applied what they learned to change the institutional environment to
create more equitable learning outcomes for all students. All of these questions were
used to determine an administrator’s institutionally based funds of knowledge, or
awareness of the structures and information needed to navigate and succeed in the
organizational environment. Sub-questions were asked to have administrators
identify changes in processes or structures intended to increase institutional
performance in the attainment of student success outcomes. The proposed policy
and procedural changes showed what administrators had learned about institutional
factors which were hindering student success.
The second interview was used to confirm the learning which took place as a
result of administrators’ participation in the Institutional Self-Assessment Project.
Questions 1, 2, 4, 6, 7, 8, and 9 of the second interview guide were meant to help
the researcher determine what learning took place among participants in regards to
factors impacting student success, how the project impacted their awareness of
institutional factors impacting student success, or how the project increased their
knowledge of how the institution operates. Next, questions 3, 10, 11, and 13 were
meant to determine how the awareness and learning taking place impacted
participants' funds of knowledge of the institution. Lastly, question 14 was meant to
determine what changes were made at the institution based on the learning which
took place during the Institutional Self-Assessment Project.
Ethical Considerations
During the data collection process the researcher took steps to ensure that
data was collected in a professional and ethical manner. Before interviews,
participants were briefed on informed consent and confidentiality. An explanation of
their rights as participants was given which included relevant contact information in
the case they had any questions after the interview. The researcher was available to
answer any questions regarding the study after the interview.
Additionally, pseudonyms were used to ensure confidentiality. The actual
name of the community college and participants were not used to protect the identity
of those participating in the study. The participants were informed that data was not
to be shared with people outside of the research study. Audio recordings were used
only by the researcher associated with the study and were destroyed once
transcriptions had been completed. Lastly, the researcher made no false statements
regarding the study and no false promises to the participants. The researcher was
honest and upfront with participants about any relevant and appropriate
information. The interviewer acted professionally and ethically during the
interview, recording only what he saw and heard, and did not guide or prompt
participants as to the types of responses that should be given.
Data Analysis
In this section, I begin with a description of the process I used to analyze
data. I conclude by discussing the concepts which guided my analysis of the data.
An inductive approach to data analysis was used; such an approach entails
having the researcher look at all the information available to him or her to identify
important relationships and ideas (Patton, 2002). In order for results to be useful and
trustworthy, several protocols and procedures were followed. The researcher used
Creswell’s six-step process of analyzing and interpreting data. The six-step
process included the following: organize and prepare the data, read through all the
data, analyze the data in a detailed manner and begin coding, generate themes or
categories, draft a plan to describe the themes of interest, and interpret the data
(Creswell, 2003).
The first step involved organizing and preparing data. All interview and
observation data was transcribed and prepared for analysis and interpretation.
Interview and observation data was transcribed verbatim to ensure the entire content
of interviews and observations were examined. Documents were gathered to begin
the document analysis process. Next, the researcher read through all the transcripts
from interviews, observations, and notes taken from readings of institutional
documents. The transcripts were read from beginning to end to assist the researcher
in the analysis of the content available from interviews. The reading of transcripts
and institutional documents assisted the researcher in identifying preliminary
categories to use to make sense of the data.
Thirdly, data was analyzed and coded for ease of interpretation. Once the
patterns and ideas were identified, the researcher created an accurate picture
describing the importance of the situation being studied. Interview transcripts and
documents were coded to assist in the identification of emergent themes or patterns
related to the frameworks used in the study. These patterns were recurring thoughts
or ideas mentioned by administrators on a regular basis during the course of data
collection. The patterns were placed in categories and then analyzed to ensure the
data fit into the category using internal homogeneity and external heterogeneity.
Internal homogeneity refers to the extent to which the data within a category hold
together in a meaningful way, and external heterogeneity refers to the extent to
which the differences between categories are clear (Patton, 2002).
The fourth step involved generating themes or categories to aid in the
identification of major themes for the findings section. Next, a schematic including
possible examples to describe the most recurrent themes was created. Lastly, data was
interpreted. Keeping in mind that the goal of a case study is to understand the
situation being examined and the conditions in which it occurs, cases were examined
with a keen eye on the situations which caused the phenomenon. The researcher
made meaning of occurrences by using what Stake (1995) calls looking for
“correspondence and patterns” (p. 78). Stake (1995) describes correspondence as the
analysis of occurrences and other evidence in relation to what is being studied in
order to form patterns which explain why things occurred. The conceptual
frameworks of organizational learning and institutionally based funds of knowledge
were used to explain why cases unfolded as they did.
The specific elements of the frameworks driving this study are described in
further detail in this section. I contended that there were different types of learning
taking place and that the learning infrastructure would play an important role in
creating an environment conducive to organizational learning. Next, I contended
that the learning which took place among administrators participating in the
Institutional Self-Assessment Project increased their institutionally based funds of
knowledge. The funds of knowledge were then used by the administrators to make
changes or adjustments to policies and procedures which they had learned were
detrimental to student success.
Primarily, I was focused on determining what type of learning took place
among leaders during Institutional Self-Assessment team meetings. Learning in a
collaborative setting is of particular importance because of the role of collective
sense-making, as pointed out in the literature. Researchers note that practitioners
involved in group settings gain more awareness and understanding of a situation by
collectively discussing and deciphering their insights (Moll, Amanti, Neff &
Gonzalez, 1992; Bensimon & Neumann, 1993; Bauman, 2005). First, I looked at
attendance records and team meeting notes to determine participation patterns among
administrators in inquiry activities. Inquiry activities included team meetings,
interviews, observations, conferences, and site visits. I examined activity patterns to
determine if administrators’ level of participation demonstrated that they were
actively involved in inquiry activities. The literature on organizational learning
emphasizes the importance of inquiry activities to bring about knowledge creation
although no specific activities are mentioned (Garvin, 1993; Huber, 1991). Many
insights were generated on the part of administrators through the inquiry activities
they participated in during the span of the project.
Next, I sought to determine if the data brought about awareness among
administrators. I examined team meeting notes and interview transcripts to determine
how inquiry activities influenced the creation of new awareness and learning among
administrators. Data is believed to bring about awareness that translates into
recognition of the existence of institutional conditions (Huber, 1987; Levitt &
March, 1988; Bensimon, 2005). According to Argyris and Schon (1996), awareness
is an essential component of organizational learning because practitioners may be so
accustomed to their environment that they are unable to recognize patterns.
Additionally, I analyzed team meeting notes and interview transcripts to determine
the extent to which administrators were able to learn about the causes of student
underperformance in mathematics. Specifically, I was interested in determining if
administrators had engaged in single loop learning, in which the identification of a
problem caused by an institutional policy leads to a change in the policy but not in
the underlying assumptions or values (Argyris & Schon, 1978). Given the limited
duration of the project, single and double loop learning were the only types of
learning which could be observed. I sought to obtain a greater insight into how
administrators learned more about the existence of differential learning outcomes
and what some of the causes of these were.
Lastly, I determined how the learning taking place among administrators
impacted their funds of knowledge regarding student inequity in mathematics and,
further, the institutional factors contributing to that inequity. I
determined this by examining the extent to which administrators expressed that what
they had learned added to their previous knowledge of student performance and
institutional factors impacting student performance. Specifically, I looked for
expressions which stated a new knowledge of values, attitudes, and beliefs which
impacted student success which were brought about through the organizational
learning process. As mentioned previously, knowledge is what one knows about a
subject. A fund of knowledge is the accumulation of what you know about a subject
that can be used to inform and transact changes. Funds of knowledge have been
defined as the cultural and strategic resources available to practitioners
(Velez-Ibanez & Greenberg, 1985).
Findings
The findings were organized into major and minor themes using the frameworks
mentioned above. Once the themes were identified, descriptions of individual cases
and thick description from participants were used to further illustrate important
findings. Summaries of minor cases were used to provide the reader with a clearer
understanding of how people’s experiences were similar or dissimilar to one another.
Those themes which appeared on a consistent basis throughout participant
interviews, observations, and documents were emphasized more in the findings
section of the report. Recurrent themes were classified as major themes and
reported as major findings. Major findings were illustrated with detailed
descriptions which included specific events, quotations, and related text from
documents. The goal of describing the setting was to provide the reader with
“vicarious experiences” to give them a notion of the context in which the study took
place (Stake, 1995, p. 63).
After both major and minor themes were presented, interpretations of the
findings were given. The interpretations were centered on the research questions and
theoretical frameworks. The implications of findings were framed around the
present state and national accountability context. The findings were geared towards
informing practice for practitioners at the community college level.
In Chapter 4, I will present and discuss the findings from the interviews,
observations, and document analysis.
CHAPTER 4: RESULTS
In this chapter I will examine how the data from my research study answered
the research questions. As previously mentioned, my research questions were: (1)
What types of learning took place among administrators facilitated through
collaborative interpretation of data? (2) How did the learning contribute to
administrators’ institutionally based funds of knowledge? (3) What proposed
changes were planned or made to existing institutional structures, policies,
procedures as a result of the insight gained through the institutional self-assessment
process?
First, I present the changes administrators planned and performed to existing
institutional structures, policies, or procedures, which were informed by the learning
that took place during the collaborative interpretation of data. I answer the third
research question first to illustrate the changes that came about from the insights
gained through this self-assessment project. The changes are taken as a sign that
administrators learned something through the Institutional Self-Assessment Project.
The theme that emerged from the interview data is that administrators learned to
take more institutional responsibility for facilitating student success among all
students. The findings from the institutional self-assessment process served to
inform the changes administrators implemented. All of the changes were aimed at
enhancing support services provided to students.
Next, I focus on presenting data gathered through this study to answer the
first research question. I examine the types of learning facilitated through the
collaborative interpretation of data in the form of student success statistics,
observational data, interview data, and information gathered during site visits. I
found that administrators were initially hesitant to believe the data because it did not
conform to what they previously knew about student underperformance. However,
the collaborative interpretation of data facilitated
acceptance of what the data meant in regards to student success. This led to an
increase in awareness of inequitable learning outcomes among students. First,
administrators went through a cycle of awareness which resulted in single loop, or
surface-level, learning of factors influencing student performance in math. Second,
administrators went through a cycle of awareness where their previous knowledge of
student outcomes was confirmed, expanded or changed to bring about a greater
awareness of the gap in student underperformance in math. Overall, data showed
administrators that the college was not successful in graduating students.
Additionally, administrators demonstrated surface level learning of the institutional
factors contributing to student success in the area of mathematics. Administrators
became more aware by taking one of two distinct tracks that began with participation
and attendance at team meetings. The collaborative interpretation of data during
meetings led either to doubt about the findings or to acceptance. Next, acceptance gave
way to an understanding of the extent of student success. At this last stage, administrators
became aware of the extent of inequity among student groups present in the area of
mathematics. I will provide more detail about how awareness came about among
administrators in the next section. This awareness laid the foundation for the group
to learn about factors that contribute to student success in mathematics. Figure 1
represents the process taken by administrators to become cognizant of the extent of
inequitable educational outcomes for students in mathematics.
Figure 1: Steps to Awareness
Administrators did not learn of the specific institutional factors directly
contributing to student underperformance in mathematics but instead learned of
some factors that seemed to contribute to student performance in mathematics. The
result is what I call surface level learning of the causes of inequitable student
outcomes in mathematics.
Lastly, I explore how the learning contributed to leaders’ funds of
knowledge. The knowledge gained through the project was knowledge that was
turned into useable knowledge by practitioners to create changes to institutional
structures impacting student success. The new knowledge expanded administrators’
previous knowledge of the values, attitudes, and beliefs present at the institution
impacting student performance. The data showed that the administrators involved in
this study used organizational learning as a way to build their knowledge of
institutional underperformance. Administrators showed intuitive knowledge of
student underperformance that was confirmed, challenged, and expanded through
the self-assessment process. As a result, administrators moved from a student-
deficit explanation of student performance in mathematics to an institutional-
underperformance mindset that acknowledged the institution's responsibility for the
lack of equity in educational outcomes among students. Administrators used their
knowledge to create changes like leveraging resources and changing institutional
policies and procedures to address gaps in student performance as determined by the
self-assessment process.
Informed Changes
In the sections that follow I describe the changes that each administrator
planned or made to existing institutional structures, policies, or procedures which
were informed by the learning that took place during the collaborative interpretation
of data. I describe on a case-by-case basis what changes were planned or performed
and how the changes were related to the awareness or learning which took place
during the self-assessment process. I rely on self-reported data obtained through
interviews, observations, team meeting notes, and reports to understand how the self-
assessment process informed changes. The changes that occurred were based on the
awareness and learning which took place during team meetings. The pattern which
emerges from the changes that took place is that three out of four administrators
who participated in the self-assessment project took what they learned and
applied the knowledge they gained to the areas they had the influence to change.
Administrators implemented what they learned based on their role at the institution.
In the following sections I will explain how the changes, which came about during
the period of the self-assessment project or shortly afterward, were influenced by the
organizational learning process and were related to the single loop learning which
took place among administrators.
First, I begin with a brief description of the role and responsibilities of each
of the participants in this study. The Vice President of Student Affairs at Hills
College was responsible for eight departments on campus including: Financial Aid,
Outreach and Recruitment, Extended Opportunities Programs and Services
(EOP&S), Counseling, Admissions, Campus Safety, Community Services and
Disabled Student Programs and Services. The Dean of Counseling at Hills College
was responsible for providing counseling services to all students on campus. He
supervised all full-time and part-time counselors within the department. The Dean of
Humanities at Hills College supervised faculty in the English, Social Science and
Communications departments. She was also responsible for two academic support
centers on campus. The Dean of Institutional Research was responsible for compiling,
analyzing, and reporting institutional data to local, state, and federal agencies for
Hills College. She also provided data for institutional planning efforts, program
review and accreditation.
The major insight gained by administrators on the committee was that
students were not aware of the support services offered to them through the tutorial
learning centers. This insight, which essentially indicated that the institution could
be doing more to support students in the area of mathematics, prompted
administrators to examine what they were doing within their areas of responsibility
to better support students. Data gathered by team members during interviews and
observations made it clear that students were not utilizing tutorial centers. Faculty
and administrators reported that they had seen empty tutorial centers. Based on
interviews, observations, and comments expressed during team meetings all team
members agreed that more could be done to support student success in math through
the tutorial learning centers. John explained what he learned by stating the
following:
It is about marketing and conveying the message that college is accessible.
We do not give students enough information when they get here…And so
that information has informed us on how we need to focus better. Retention
and persistence are two key areas that our college has not paid a whole lot of
attention to.
John acknowledged that the institution could do a much better job of informing
students, helping them acclimate to college by providing mechanisms that
communicate the resources available to them to succeed. This includes letting
students know that they can access tutorial learning centers to help them be
successful in math. Additionally, he noted that the institution could do a better job of
focusing more on ways to improve student retention and persistence. The quote
above represents what John knew towards the end of the self-assessment process. It
was at the end of the project that John seemed energized about making changes at
the institutional level to address the insights gained by the team. He had seen
various examples of what other colleges were doing to disseminate information to
students and how colleges were helping students persist.
Influenced by what he had learned, John started to use the organizational
learning process to understand what was occurring within the division of Student
Affairs. The first thing that he did was to obtain information on the number of first
time college students coming to Hills straight out of high school. He stated that he
found that the numbers of graduating seniors from the high schools that Hills
recruited from were going to collectively make up the largest graduating senior class
in the upcoming year. He said that he also collected information on the number of
first time college students through the student application and
determined that the number of first time college students had steadily been
increasing. He used the information he learned during the organizational learning
process and other data to create changes where they would make the most impact.
John stated that:
Part of the [self assessment process] was not just to come up with solutions,
but to find out what the root problems were, and then from that, using the
data to help create some of the solutions.
He stated that with the information about students not being connected to resources
and the additional data he collected, he had the evidence he needed to address the
problem of communicating information to students.
Furthermore, the data helped him to decide where to start, which was with
first time college students. The large numbers of potential high school students
starting college from feeder high schools as well as the pattern of increasing first
time college students coming to Hills provided John with the evidence needed to
create changes. The first change he created was to institute an orientation program to
expose students to resources available for them to use at Hills. John communicated
his rationale by stating the following:
Signage is important, but the issue is the lack of information during
orientations; because we don’t have mandatory orientations, which is one of
the things that we felt is necessary… So we are looking to do that by 2010 or
2011. So by the time we go live in January 2010, we want to have that spring
for any new high school students coming in for the following year a
mandatory orientation.
He expressed the thought that by providing students with orientations as they got to
Hills College, the institution could begin to address the problem of lack of
information provided to students. The orientations will be mandatory for incoming
students and are currently being piloted. An online orientation website is also being
piloted as another way to deliver information about Hills College to incoming
students. This change created by John was directly related to the insight learned by
administrators related to the information communicated to students regarding support
services. John stated that he took what he learned and decided to find out the best
time to provide students with information. He figured out that the most beneficial
time is when students are entering the institution.
Additionally, related to the insight that students were not receiving the
appropriate information to help them acclimate to college, success workshops were
implemented. These student success workshops were meant to prepare students to
take the placement exam. One barrier identified by John was that students did not
take the placement test seriously because they did not understand the importance of
the test in relation to accomplishing their educational goals. He kept receiving
reports from counselors indicating that students were not placing at the proper skill
level in math and English. John stated:
So [we] kept hearing high school students coming back [to counselors]
saying they put me in the wrong class, I shouldn’t be in this class. So it is
about doing a better job with assessment workshops ahead of time; trying to
really help – because see the one thing that we identified, especially with
high school students, they’re so used to the CAHSEE exam, and the CSTs
that they take, and they come take a community assessment test and this isn’t
like anything they’ve ever seen in their lives and they don’t understand the
importance of it.
With the assistance of his staff, he identified that new students were not given
information about the importance of the placement test or about how to prepare for
it. He started to pilot high school informational workshops that would expose
students to sample problems and communicate how serious the test was for their
future success in college. The math and English departments created the curriculum,
which would be used at the high schools to prepare students to take the placement
test. The intent was to help students adequately prepare for the exam to ensure they
are placing at the proper level of ability.
Moreover, John believed that any effort to increase student retention and
persistence needed to involve a comprehensive plan, which would involve different
constituency groups on campus. The student success plan was created to link the
services offered through Student Affairs and Academic Affairs. The intent of the
student success plan was to create a cohesive and coordinated approach involving
Student Affairs and Academic Affairs to promote student success. This coordinated
approach was a recommendation mentioned by the self-assessment team in the Final
Report (2008). John explained that the self-assessment project made student success
more visible and that it needed to stay visible. John said, “We have much more
awareness on our campus now about student success, and a whole equity aspect to
it.” He noted that the institutional self-assessment process made him and others
aware of how to increase student success. In fact, he credited the self-assessment
process as the inspiration for the student success plan. He went on to say:
The reason that I came up with that model was because of [the self
assessment process], because I realized it’s not just about taking the stinking
test, it’s not just about passing Math 7 so you can get into Math 13, or 15, or
17, it’s about where is student success? How do you initiate student success?
How do you maintain it, and then how do you sustain it to keep going? And I
think that’s what [the self assessment process] shows us is that; because you
want it to be sustainable.
As was noted previously in the final report, there was a disconnect between
the services offered to students from the Student Affairs and Academic Affairs
divisions. John mentioned that he wanted to implement learning communities at
Hills College but that he needed to collaborate with the Academic Affairs division
to make it happen. John thought that the creation of learning communities could
benefit the student population at Hills. He came up with the notion of creating
learning communities based partly on the site visit he participated in through the self-
assessment process. During a site visit to a local community college he learned of
the potential for increased student retention and persistence among basic skills
students. The college he visited presented student success information that he found
compelling in terms of course completion rates and progression through math and
English classes. He saw learning communities as a way to address what the team
learned during the self-assessment process about lack of success among students in
regards to mathematics. A major component to the student success plan is the
creation of learning communities, which have not been tried at Hills College.
John stated that student services could not do all the work and that a
partnership with Academic Affairs was going to be necessary to promote students’
success. One way that John believed that student success could be accomplished
was through the creation of learning communities. After all, part of the trend seen in
the student success data in the area of mathematics was that students were not
completing math classes nor were they continuing to take the next higher-level math
courses. Through the self-assessment process he determined that the most effective
way to increase student success was through learning communities. He went on to
state:
And the other aspect too that we realized in visiting other colleges is a huge
need for us to have introduction of learning communities on campus. To see
the success over a 12 or 13-year period at my previous college with learning
communities and the persistence rate of student success, and not to even
consider doing that here [at Hills College] is distressing. But I think it’s
slowly changing.
John believed that a direct way the institution could impact student success was
by providing students with connections to professors and fellow students. This program with
demonstrated effectiveness at a number of institutions served as evidence he could
use to convince Academic Affairs of the utility of such a program. John went on to
state that the partnership was key by saying the following:
You’ve got to have the academic VP buy-in too. It can’t be the student
services VP pushing for it because it falls on deaf ears because you know
we’re just - one we’re administrators, so we’re already from the dark side,
and if you’re from student services, then you’re like on the super dark side –
you’re like from down south dark side.
John was so taken by what he learned during the self-assessment process that he
not only made changes to the services provided through his division but also began
to intentionally create partnerships to make changes extending across campus. The
insights gained and the process itself helped John build on what he already knew
about the institution's beliefs, culture, and norms, which he would need to
navigate to make changes that would impact student success. The self-assessment process
helped him be better informed on ways the institution could improve communication
with students.
In conclusion, John attributed the changes he made to his participation in the
Institutional Self-Assessment Project. The orientation, student success workshops,
and student success plan had the common strand of helping students be successful at
Hills College through information and support. The orientation was viewed by John
as a way to provide all incoming students with a connection to campus and resources
available to them to be successful. The student success workshops were seen as a
way to facilitate students’ transition to college by preparing them for the placement
test which would determine where they would start in the math and English course
sequence. Lastly, from John’s perspective, the student success plan was a way to
unite common efforts among divisions to increase student success while
strengthening communication.
Change in Placement Cut Scores
Similarly, Samantha applied what she learned by hiring a part-time researcher to
examine the validity of the placement cut scores.
These cut scores determined the level of math and English at which students would
start. A cut score set either too low or too high could mean that a
student would need to take more or fewer classes to reach their educational goal.
Placement was pivotal because the majority of students were
placing at the pre-college level. Samantha stated her rationale for examining the
placement test by saying the following:
So it [self-assessment process] made us look at doing more research to really
figure out what the underlying problems were in terms of success. Yeah, we
went back to square one. We said everything starts with base one, let’s go
back to square one. If they’re not placed right they can’t be successful, so
let’s just go backwards now and let’s start a whole new area of research so
that we really figure out what’s going on with these students.
Samantha, influenced by what she learned and the self-assessment process, examined
the entry point for students to ensure that the placement test was being used
accurately and that the cut scores matched what the test measured. Data was
collected to understand what was occurring before making changes, a process
emphasized during the institutional self-assessment. She found
that one of the problems was that student assessment scores were not normed
correctly, which directly affected how long students took to get through the
developmental education sequence in math and English. As a result of further data
collection and interpretation, placement cut scores were changed to more accurately
coincide with what the test was measuring. The placement cut scores determined
how many points were needed for a student to be placed at the different levels in
English and math.
During interviews, Samantha did not explicitly indicate whether others
were involved in the decision to change the cut scores. What is known is that
her department collected, compiled, and analyzed the results of the inquiry
into placement score accuracy. While there is no mention of who else was involved
in the analysis of data, it is likely that the results were shared with the English and
Math Department chairs. Additionally, results might have been presented to the
administrator in charge of the assessment department. Furthermore, it is likely that
the administrator then decided to implement changes to the cut scores. Samantha’s
interviews and actions suggest that she followed some procedures which were part of
the Institutional Self-Assessment Project. It is unclear if she followed the
collaborative interpretation structure of having larger committee meetings to analyze
data. Given the scope of the inquiry into placement scores, it would seem unlikely
that a large group of campus stakeholders would be involved in the collaborative
interpretation of data. From the data which was collected, it seems as if Samantha,
while impacted by the Institutional Self-Assessment process, did not involve others
in the collaborative interpretation of data in the same scope as the project. Factors
such as time constraints, scheduling, and campus policies could all have played a
role which limited this type of collaborative interpretation. Finally, the short
duration of the Institutional Self-Assessment project could have impacted processes
learned by participants including the use of collaborative interpretation of data.
Student Success in Academic Affairs
The institutional self-assessment process influenced the hiring of an
administrator to coordinate the Basic Skills Initiative (BSI) and the expansion of
services in academic support centers overseen by the Academic Affairs division. Under the
direction of Melody, who was a participant in the self-assessment process and
involved in the Basic Skills Initiative, an administrator was hired and new tutorial
software was piloted in the writing lab for developmental education students. As
was noted in the President’s report, there was a disconnect between Student Affairs
and Academic Affairs divisions. The team realized towards the end of the project
that an administrator was needed to provide leadership for the Basic Skills Initiative
and to coordinate services offered to students from the Student Affairs and Academic
Affairs divisions. This recommendation came about largely due to the overlap
between the goals of the self-assessment project and BSI as well as the availability of
resources given by the state. Melody stated during her interview that the
administrator was needed because, while student success was the goal of all programs
on campus, “working in silos” was not going to help students attain their
educational goals. A coordinated, multidisciplinary approach to student success
was deemed the most effective way to create sustainable institutional
change, based on the insights gained through the institutional self-assessment
process. According to Melody, the rationale for hiring an administrator was to have
one person be able to “make our students support services more available, more
accessible to students, and find out maybe better ways to deliver more.” Similarly
John stated:
We realized that you know, again, you can’t have too many cooks in the
kitchen, because it gets splintered and everyone’s doing their own thing, and
you have a complete mess. And so I think it’s a lot faster this time because
of the [self-assessment process], now we have chosen to hire an administrator
of basic skills.
The team discovered that the services offered to students needed to be better
communicated and coordinated. One of the institutional solutions created was the
establishment of this position. Drawing on their knowledge of the institutional
environment, administrators believed that one way to create change would be to
establish an administrative position with the responsibility of keeping student
success as a constant focus and working collaboratively with faculty. Given the separation at the
institutional level between administration and faculty, administrators believed that
having one person with the fiscal oversight and responsibility to move the college
towards greater effectiveness with students was needed. However, administrators
knew that the changes with the most potential to influence student outcomes would
have to be implemented in the classroom. Moreover, the new administrator is a
former faculty member who participated actively in the institutional self-assessment
process. The experience of this person in the classroom gives him credibility among
other faculty that could prove to make the difference in uniting the Student Affairs
and Academic Affairs divisions.
Lastly, the writing lab piloted new on-line tutorial software for
developmental education students. Melody mentioned that she involved her staff in
the decision to implement an on-line tutoring program. Additionally, Melody stated
that the insights learned through the Institutional Self-Assessment Project were used
during the process. No specific mention was made, however, of whether data was
interpreted collaboratively following the process used in the
Institutional Self-Assessment Project. The absence of a team setting
involving the larger campus community could be related to the nature of the decision
being made. Departments at community colleges have a sphere of responsibility
which is related to the department or personnel duties. In Melody’s case this means
that she is in charge of the instructional support or tutoring services provided to
students in developmental English. Therefore, Melody would consult only with
people involved in the day-to-day operational aspects of providing tutorial services
to students in the developmental English center. As the administrator over the area,
she would have the final say as to what programs and services to offer.
Barriers to Change
One administrator, Carl, did not implement any changes based on what he
learned, which could be partly attributed to the barriers to change he identified.
During team meetings and interviews, Carl expressed a desire to change the
contracts governing the work of the counseling faculty and classified staff he
supervised. Time and again he noted that the union contracts for both
faculty and classified staff limited the services provided to students in his
department. He firmly believed that to make any changes, he would
first need to change the contracts for both groups. His
focus on the contracts suggested that no other solution existed for Carl.
Specifically, Carl wanted to have counselors change their curriculum in the
student success course which was offered to new students at Hills College. He felt
that the curriculum was outdated and not culturally appropriate to the student
population at Hills College. He wanted counselors to be more flexible in their
scheduling, especially at the start and end of semesters, when students
experienced long lines and extended wait times. Carl wanted to ameliorate the
situation by having more counselors available during these peak times, but the
union contract limited the time counselors
could spend seeing students. He also wanted to have the classified staff in his area
be able to counsel students on basic educational information, but changes would
need to occur on the faculty union contract to allow classified staff to perform
faculty related work. Changes would have also been necessary on the classified
contract to allow work on the part of classified staff which went beyond the scope of
their role at the institution. Overall the intent was to provide more services to
students.
Moreover, while Carl participated in team meetings and site visits, he was
visibly displeased with the progress being made over the span of the self-assessment
project. He wanted things to change and thought change was not coming fast
enough. Towards the end of the project this frustration became more apparent in
the tone of comments made like “at some point stop dialoguing and start doing
something” and “no one wants to drag people kicking and screaming to create
change.” Eventually he left the final team meeting and did not participate in the
follow up interview informing this study. Additionally, Carl left his previous
position and became the administrator for a smaller program that was categorically
funded shortly after the conclusion of the institutional self-assessment project. This
frustration with the pace of change was related to Carl’s intent to change things in his
department and the culture of the institution. He stated that people at Hills liked to
meet and discuss things, but that very little was done at the end of the day.
A variety of factors influenced Carl's inaction. Perhaps the
most direct was that he came in with a firm intent to find
evidence for changing the services provided within his department. He seemed to have
learned about the need to provide students with more information, but he was set on
the idea of legislating change. By changing the union contracts he hoped to force
compliance from counselors and classified staff, yet he lacked evidence
that could be used to create change within his department. These contracts, which
represented the two largest groups on campus, faculty and classified staff, were not
going to be changed easily. He felt that if they could not be changed, there
was no use in continuing the conversation about creating change. In effect, what
he believed needed to change were the norms of the institutional
environment: the union contracts established the norms of behavior for the faculty
and classified staff, which he perceived to be the biggest barriers to student success.
Summary of Informed Changes
John, Samantha, and Melody applied what they learned by making changes in
the areas they were responsible for. The organizational learning process prompted
administrators to analyze institutional structures they were already familiar with,
namely those they were responsible for overseeing. The self-assessment process
encouraged administrators to examine the structures for which they were responsible
to see if they were providing the services that students needed to be successful at the
institution. This examination of existing services provided to students followed a
similar pattern exhibited in the institutional self-assessment project.
Administrators started from a hunch about how effective they were at
helping students succeed in college and then collected information to determine
if they could do more to ensure student success. They interpreted that
data to get a sense of what was occurring, and solutions were generated to help
students succeed in college. The data collection process was mentioned by
two administrators, John and Samantha, and both stated that they made decisions
based on what they learned from the data. Melody did not identify what data she
used to justify providing more services to students, but she stated that she applied
what she learned from the Institutional Self-Assessment project in regards to
providing students services. Carl did not mention which changes he implemented as
a result of what he learned through the organizational learning process.
Additionally, the collaborative interpretation of data process emphasized
within the Institutional Self Assessment Project was not replicated among
participants to the same extent. John, Samantha, and Melody mentioned involving
others in the decision-making process, which suggests some level of collaboration
occurred. However, not enough data was collected to determine
the structure and extent of collaboration taking place among the participants in the
study.
Moreover, it is important to note that administrators used the insights gained
through the organizational learning process but they did not use the entire
organizational learning process when identifying and resolving problems after the
project. The administrators involved in the study chose not to follow all the
organizational learning processes, but not enough information exists as to why.
Time constraints and the fiscal crisis could have affected how
decisions were made. What is known is that the campus Institutional Researcher left
Hills College about six months after the conclusion of the project, meaning
administrators might not have had access to all of the information they needed.
They would have had to decide how to proceed with the information they had, which
would have meant accelerating the decision-making process. Additionally, the
involvement of campus stakeholders might not have been possible due to
scheduling difficulties. The coordination needed to involve faculty, administrators,
and institutional researchers might have played a role in the structure followed by
administrators. Lastly, the short duration of the Institutional Self-Assessment
Project could have limited the familiarity among participants of the steps to take in
the organizational learning process.
What is important to point out is that administrators made changes based on
their experience with the structures within the institution. For John, the awareness
and insights gained through the self-assessment process, while not directly
associated with his division, influenced the services he provided to students.
For example, student affairs was not responsible for providing
tutoring services, but John felt it was the role of student affairs
professionals to acquaint new students with the resources available to help them
succeed. John essentially took a hard look at what he could do within his division
to improve services, given the lack of information students were receiving.
One of the main improvements within student affairs to address this lack of
information provided to students was the creation of a mandatory student orientation
for all new students. The main reason for having a mandatory orientation was to
ensure students were getting the same information. John wanted all students to
succeed in college. Although the team discovered deficiencies in the services
provided to students through the tutorial centers that were supervised by the
Academic Affairs division, John took the information to also mean that Student
Services could be doing a better job of communicating to students about the on-
campus resources and student services available to them.
Similarly, Melody implemented few changes due to her role in the Academic
Affairs division. Her role was primarily to work with faculty by scheduling classes
and evaluating instructors. Additionally, she oversaw two tutorial centers. The changes
she implemented were limited to her areas of responsibility: the tutorial services
offered to students in English and ESL, and her role as
the BSI co-chair. Samantha also created a change that impacted all students, based
on her role as the institutional researcher. Lastly, Carl did not implement
any changes, due largely to the types of changes he wished to implement.
In the end, three administrators implemented changes based on the insights
gained from the organizational learning process. The changes were directly linked to
the findings of the Institutional Self-Assessment project. While administrators
followed some of the steps taken during the organizational learning process, others
were omitted. These findings point to the utility of having researchers
work with institutional stakeholders to address institutional problems. This approach
had limitations as was seen at the end of the project. The continuance of
organizational learning processes was hindered by staffing and budgetary issues.
The relatively short duration of the Institutional Self Assessment Project could have
also been a factor. A longer project might have provided institutional stakeholders
with more time to learn how to implement the organizational learning process using
existing institutional structures. Lastly, not all participants implemented changes.
Carl's case demonstrates that active participation does not
guarantee changes will be implemented, and it raises the question of how to
increase the likelihood that participants involved in an
organizational learning project will make changes.
The next section will provide greater detail on how awareness and learning
came about as a result of the institutional self-assessment process.
Awareness and Learning
In the following sections, I focus on how learning occurred
among administrators through the collaborative interpretation of data.
First, I describe the inquiry activities administrators engaged in during
the institutional self-assessment project. Next, I examine how
administrators' participation in the collaborative interpretation of data during
team meetings affected their awareness of barriers influencing
student success in mathematics, focusing on behaviors and thoughts expressed
during team meetings, conferences, site visits, and interviews that shed
light on administrators' awareness of inequity in student outcomes and
institutional factors influencing student success. Finally, I
explore the surface-level/single-loop learning that took place for each
administrator, as expressed during team meetings and interviews.
As mentioned previously, the team of administrators and faculty at Hills
College engaged in the organizational learning process to identify problems and
examine the causes of problems. The problem that was being investigated by the
team was the differential success rates of basic skills students. The team wanted to
determine what steps the institution could take to increase the number of students
transferring who began at the basic skills or pre-college level in the areas of math
and English.
The inquiry activities administrators engaged in were team
meetings, observations, interviews, conferences, and site visits. As can be seen
in Table 2, administrators attended nearly all meetings and participated in
observations, interviews, conferences, and site visits. Participation in activities
attendance at team events are important factors to consider when examining the
impact of data on the mindset of administrators within a collaborative setting. An
essential element to facilitate organizational learning is the active engagement of
individuals within a group setting to investigate the nature of institutional
performance (Bauman, 2005; Kezar & Eckel, 2007; Marsick & Watkins, 1999).
Therefore, by examining team meeting notes, the level of engagement and
participation in the collaborative interpretation of data was determined. The level of
engagement and participation are illustrated in Table 2. Codes were developed to
help differentiate participation patterns among administrators. For example, under
the month and year columns on the table the “P” designation refers to the
administrator being present at a meeting and “NP” means not present at a team
meeting. The “L” means an administrator came to a meeting more than 20 minutes
late and the “E” designation means that an administrator left a meeting at least 30
minutes early. The letter next to the administrator's name denotes
participation in other inquiry activities. The “S” designation means they attended a
site visit, “I” means they conducted an interview, and “C” means they attended the
project based symposium. The pattern which emerges is that John was the most
active administrator in the group as can be seen by his presence at each of the team
meetings, conferences, and site visits. Melody was the least active administrator,
as seen in her attendance at self-assessment activities: she attended the fewest
meetings and other events. Melody did not attend the conference or site visits but
was the only administrator to conduct an interview and observation. Samantha and
Carl attended all but one team meeting and participated in at least one site visit.
Overall, all administrators demonstrated engagement through their attendance and
participation in self-assessment activities.
Table 2: Participation in Inquiry Activities and Team Meetings

Administrator   Sept 07  Oct 07  Nov 07  Dec 07  Jan 08  Feb 08  Mar 08  Apr 08  May 08
Carl (S)        P        P       P       P       P       P       NP      P       P(L/E)
Melody (I,O)    P        P       P       NP      P       P       NP      NP      P
Samantha (S)    P        P       P       P       P       P       NP      P       P
John (S,C)      P        P       P       P       P       P       P       P       P
Moreover, an analysis of comments made during team meetings showed that
administrators did not participate in discussions at the same level as faculty
members, though they contributed to all discussions. The two most vocal
administrators were Samantha and John.
Collaborative Interpretation of Student Success Statistics
Initial data collection and presentation were performed through
the office of Institutional Research. The baseline data collected by Samantha
and presented to the team of faculty and administrators on the self-assessment
project took the form of student success statistics, which
demonstrated numerically how students from different ethnic groups performed in
math courses. An excerpt of the data, adjusted to provide anonymity, is shown in
Table 3: the number of students enrolled in math courses and the number
successfully completing them, disaggregated by race and ethnicity. Successful
course completion was defined as receiving a grade
of C or better or completing a course for credit.
Table 3: Successful Course Completion Rates of Basic Skills Students in Math at
Hills

Racial-Ethnic                      3&4 Levels Below   2 Levels Below   1 Level Below
Group               Statistics     Transfer           Transfer         Transfer
                                   (Math 2 and 3)     (Math 5)         (Math 7)
Hispanic/Latina/o   # enrolled     1913               1426             970
                    SCC #          940                671              438
                    SCC rate       49%                47%              45%
White               # enrolled     115                131              97
                    SCC #          59                 78               49
                    SCC rate       51%                60%              51%
After the initial presentation of the data in the format shown above in Table
3, the team discussed what the numbers were showing about student progression
through the developmental education sequence in the areas of mathematics and
English. For example, data showed that there was a gap in successful course
completion among students who were two levels below transfer level mathematics.
Data showed that 47% of Hispanic/Latina/o students successfully completed courses
two levels below transfer while 60% of White students completed these courses. Not
all student groups are represented in the table above, but it serves as a sample of the
data reviewed by the team. All student groups at Hills College were represented in
the table presented to the team. The team as a whole seemed not to fully understand
what the numbers meant. Comments made by administrators during this meeting
tended to express doubt in the findings, which essentially showed the gaps in
student performance in mathematics.
The overall sentiment communicated was disbelief at the level of
underperformance among student groups. The initial presentation of data
by Samantha, the institutional researcher, elicited many responses of disbelief,
conveying a sense that the team did not believe the institution could be
doing so poorly with students. The comments seemed to place the blame for
inequitable educational outcomes on student-related factors such as academic
preparation. One faculty member asked, “Are there bias in the data?” Another
stated, “there are things that could influence numbers like where students are coming
from will make a difference…” Another faculty member asked, “Do we have
students who don’t want to transfer?” These expressions of disbelief also
appeared in administrators’ comments. Melody expressed disbelief of the
student success data as well, stating, “sure various ways it can be interpreted.
Some people may say that we have low expectations. I am not accustomed to
reading statistics…” John and Carl stayed silent during this exchange. These
examples show that most members of the team were not entirely sure the data
accurately captured the success of students, and that their explanations for
the numbers seemed to put the onus on student-related factors or
statistical miscalculation.
However, a shift came after a remark by a faculty member who understood
the implications of the data. The faculty member said, “We
are losing a lot of students…wow…” Similar reactions came from the rest of the
team members, who had been silent previously. The team became aware of what
the information presented to them really meant and the implications to students
enrolled in the pre-college math and English courses at Hills. It was at this moment
when the team became aware that a problem existed in the progress of students
through the mathematics course sequence. Recognition of the problem generated a
faculty-sparked discussion of why students were not succeeding in math. Different
faculty members chimed in with reasons, offering several hunches based on personal
experience. After this discussion of factors influencing student success, a
researcher asked the team whether they wanted to investigate student
underperformance in math or English. Team members overwhelmingly favored math,
and at that moment the team unanimously decided to take mathematics as the
inquiry area. The team decided to focus on support services that were assumed to
assist students’ progress through the mathematics sequence.
In this exchange we see that data created a situation among faculty and
administrators that challenged what they previously knew about the institution.
Team members had a sense of the lack of success students faced in mathematics and
English courses but did not know the extent until student success data was
presented: what administrators and faculty members thought they knew did not
coincide with what was presented numerically. The analysis of data
within the group setting brought about recognition on the part of team members of
the extent to which students were succeeding in math. Additionally, the
collaborative interpretation of data allowed team members to share their experience
and knowledge that resulted in a greater understanding of how students were
performing academically.
Administrator Awareness
In this section I will explore the concept of awareness through the individual
experiences of the administrators involved in the study. Administrators recalled the
importance of inquiry activities on the formation of their awareness of the extent of
student success at Hills.
Recognition of Inequitable Outcomes
The presentation of baseline data and its collaborative interpretation
increased administrators' awareness of student underperformance and differing
success rates in mathematics. The critical incident described above exemplifies
the impact of data on the team, and on an individual basis administrators all
referred to what they discovered as a result of that meeting.
In this section, I introduce more
data to demonstrate administrators’ reactions to data and how conversations
involving data helped to bring about awareness among administrators. Melody
recalled the presentation of student success data and what became clear to her was
that not all student groups were performing the same. She stated that, “the success
rates in the math classes were not as high as the success rates in the language classes
in English and reading classes.” She recalled that the visual presentation of the
student success data impacted her because:
The idea that it might take a person four or five levels of math, of semesters
of math, stands out in my mind because people get discouraged…visually
[differential student success] kind of clicked in.
Melody communicated several times that she had not realized how difficult it could
be for a student to finish the mathematics requirements for transfer, and she
seemed to connect deeply with the commitment required of students to get through
the math sequence in order to transfer.
Similarly, Samantha recalled that the most important fact for her was that she
was made more aware of the differing success rates of student groups in math
courses. She stated that,
The useful part of the data was aggregating it by race, because it’s very easy
to say, well we have a 60% success rate in this class but what does that mean
for students…so the important thing was to look at those groups separately.
She became more aware of student performance among different groups
and was able to identify inequities in student success. The gaps in performance
determined the direction the group took. She recalled that she “realized that our
math students were the ones that really were in the most dire need so we decided to
concentrate on that.” Another factor that she was more aware of was the student
departure point. Samantha said, “I think one of the most important things was we
couldn’t delineate why some students quit at the Math 5 level and didn’t pursue
Math 7 to go towards transfer.” She was able to determine the key transition point
by looking at the data and this was something she communicated to the rest of the
CBP team. Essentially, she became more aware of students’ course-taking patterns
and behaviors in mathematics courses.
Similar to Melody and Samantha, John became more aware of the pattern of
inequity and persistence of students in math courses that emerged from looking at the
student success data. John stated that,
The data we saw was a little discouraging. The percentage of students that
are being successful from Math 2 to Math 3, from 3 to 5, from 5-7, the
numbers kept shrinking every section and then when we get to 7 you finally
see the percentage that finished.
John became more aware of how students were progressing through the math
sequence and recognized that not all students were making it to the transfer level
math course. Moreover, he expressed his awareness of student persistence
and equity by stating,
If students are starting at the bottom, that’s three levels of remediation before
they can get to Math 7 and at every level there’s a persistence rate that isn’t
always 100%. We have much more awareness of the data and what it shows.
We have much more awareness on our campus now about student success
and a whole equity aspect of it.
John noticed an emerging pattern demonstrating that students were departing at
various points during the math sequence and that some groups were departing at
higher rates. He also indicated the usefulness of looking at data to determine what is
occurring with students in terms of success.
Lastly, Carl became more aware of the diverse learning needs of students in
the area of mathematics. He stated:
People [students] are not at the same level so preparation is different.
Everyone is at different levels. We need to facilitate learning better for
students because a one size fits all approach does not work.
He became more cognizant of the fact that students were coming in from different
preparation levels. Although not explicitly stated above, Carl expressed that he
became more aware that some groups were performing differently from others and
that the institution did not always address the learning needs of students effectively.
Role of Data in Creation of Awareness
The awareness that was created among all administrators was the recognition
of the existence of inequitable learning outcomes for students. Student groups were
performing differently in the area of mathematics. What facilitated administrators’
awareness of differential learning outcomes was the presentation of student success
data and course migration data. The data were presented disaggregated by race and
ethnicity to reveal gaps in performance
among groups. As stated by the respondents, administrators were not aware of the
extent of inequity among student groups until data was presented comparing the
overall success of students in pre-college level math classes. While not all
respondents used the word equity, all referred to the notion of students performing at
different levels either during interviews or team meetings. Consistent with
organizational learning literature, data created greater awareness of the extent of
student performance and of institutional effectiveness in educating students that
would otherwise have gone unnoticed by practitioners. Argyris and Schon (1996)
stated that data brings about awareness among practitioners because it gives them a
true sense of what is occurring. The development of awareness is seen as the first
step towards the creation of insights or learning because “biases” blind practitioners
to the implications of the data being presented (Argyris & Schon, 1996, p. 47).
Learning
In this section, I explore the learning which took place for each
administrator in regard to student performance in the area of mathematics. I focus
primarily on thoughts expressed during team meetings and interviews that convey
understanding of the factors contributing to student underperformance in math.
Lack of Support Services for Students
One insight all administrators mentioned learning was that the lack of
support services available to students contributed to the lack of student success in
the area of mathematics. Melody expressed that she learned student support services
in the area of mathematics were not readily available to all students. She learned the
importance of making support services more available for students. Melody stated
that the data “solidified a lot of things that I already knew.” However, she expressed
that what she learned most about was the specific factors impacting student
underperformance in math.
Similarly, Samantha expressed learning that the institution could be doing
more to provide support services for students in the area of mathematics.
Specifically, she remembered discovering the following:
Actually going into the tutoring center and looking at students’ experiences,
looking at accessibility for students, it was no longer hearsay anymore; we
actually understood what the barriers were for students… we had so many
different tutoring centers, that students didn’t know which one they were
supposed to access for which reasons.
Instructional support services in the form of tutoring were available, but students
were not using them because they did not know how to access them. The data which
showed this to Samantha was the qualitative data
collected and presented by faculty involved in the Institutional Self-Assessment
Project. The qualitative data was collected by faculty visiting the tutorial centers at
Hills College and presented during team meetings. The faculty observed student use
of the centers, staff friendliness, resources, tracking of students, and tutor
interactions with students. The presentation of data and discussion of data in a group
setting allowed Samantha an opportunity to find out what was occurring in the
tutorial learning centers. She learned that students did not receive all the information
they needed to be successful.
John learned, like Melody and Samantha, that the institution had to educate
students differently especially in the area of math. He stated:
…by looking at Math 7 it really gives us a better idea of how our students are
doing, and they have made a clear indication to us. Now, if we can do a good
job of intervening at the lower levels-2,3,5, that will mean that there will be
more students taking Math 7.
He felt as if the institution as a whole could do a better job of educating students who
were taking math classes. More specifically, John learned that the campus had to do
a better job of disseminating information to students. He learned that the signage on
campus was a concern in communicating to students the location of the tutorial
centers but he recognized that the problem was a deeper one. He said the following:
Yeah, we still have to have better signage, but part of that was not so much
just the signage, because signage is important, but the lack of information
during orientations; because we don’t have mandatory orientations, which is
one of the things we felt is necessary.
John took what he learned and looked at what his division did to prepare students to
acclimate to college. Applying what he learned during the self-assessment process,
he knew students were not getting information about the institution so he reflected on
the best way to provide students the information they needed to be successful. He
felt that staff in his division, Student Services, could do a better job at providing
students with the necessary information to be successful. Lastly, Carl noted that he
learned not enough support services were being provided to students. He stated that
the institution could be doing a better job of supporting students to get through the
mathematics sequence.
Value of External Researchers
Samantha also recalled learning about the benefit of having external
researchers lending support to Hills College to identify performance gaps among
students in a team setting. She went on to say:
That’s why the project was so important, because it got people to work
together for a common cause. Would they do that without that incentive
there-without that external motivation? Like I said, without external or
extrinsic type of motivation being imposed upon them, very few changes
would have taken place.
The resources and added assistance provided by the researchers from the Center for
Urban Education were important in moving the conversation forward regarding
inequity in student performance. Additionally, she noted that she would not have
been able to sustain the type of learning facilitated through the organizational
learning process due to the lack of resources available to the research office.
Samantha stated:
Yeah, I think for once – because the expectation is that the research office
will do all that. There’s no way. There’s no way that we could have done all
that work ever. Maybe if we had a staff of five or ten like Emory. But with
two people here, we’re so involved in day-to-day operations stuff that the
only way to do something like that is that kind of a task force. And that’s
why we have the basic skills committee, and I hope their work – that people
realize that it takes everybody who’s a practitioner to do that kind of work.
Lastly, she learned that in her role as an institutional researcher she needed to do
a better job of explaining data, because faculty and administrators are not
accustomed to making sense of data. She said the following:
One of the issues in dealing with faculty is that they didn’t understand what
the data meant; so it’s a matter of going and really explaining to faculty what
data is. And you saw that with the self-assessment project at our campus too.
She noticed that there were times when the data needed to be broken down further to
ensure all team members understood what it meant. Faculty may doubt data at
first because they might not fully understand what the data is saying in regard to the
situation being investigated.
Type of Learning Taking Place among Administrators
The factor respondents used to explain student performance was the
institution’s inability to disseminate information regarding the student services
available. All administrators also reported learning about the institution’s
performance in providing support services to students. While all administrators
noted that the services are available, all realized that the services were not being
utilized by the students who needed them the most.
Administrators were able to identify some factors contributing to student
underperformance that were based on institutional factors. Based on the
organizational learning literature, the learning taking place can be classified as single
loop learning. The learning taking place identified institutional factors which could
be changed to resolve the immediate concern of information dissemination, but the
actual causes of the underuse of student services were not yet known. I also refer to
this type of learning as surface level learning because not enough information is
known to address the true causes of students’ lack of use of support services in
mathematics. Double loop learning would require practitioners not only to
understand the root causes of problems in institutional procedures but also to
understand the biases causing those procedures.
Funds of Knowledge
In this section I explore how the learning taking place during the self-
assessment process impacted administrators’ funds of knowledge relating to
organizational functioning. Specifically, I examine how knowledge among
administrators was built up as a result of participating in the self-assessment project.
Additionally, I examine how administrators expressed they would use this
knowledge to make changes to the institutional environment based on the knowledge
gained. I look at data gathered from observations, meeting notes and interview data
to determine how administrators expressed knowing more about the values, attitudes,
or beliefs present at the institution impacting student success and how they would use
this knowledge to create changes.
Funds of Knowledge Solidified through Organizational Learning
For Melody, the institutional self-assessment project solidified her previous
understandings related to the awareness of the problem of inequitable educational
outcomes and institutional factors that impact student success. She stated the
following:
I’ve been working a Hills College for more than 20 years, so you know, the
hunches are that conclusions I think were things that are, you know, I mean it
verified – it solidified a lot of things that I already knew.
Specifically, the self-assessment project verified what she thought she knew about
students and math performance previously. Her use of the word “verified” indicates
how the self-assessment process provided her with evidence showing the extent of
student underperformance in mathematics. Additionally, in regard to faculty, the
data confirmed the important role faculty play in facilitating student success.
Melody expressed the thought that if faculty do not know how to teach basic skills
students, there is no mechanism in place to hold faculty accountable for adapting
instructional practices to facilitate student success. She recalled the data confirming
for her how challenging it is to teach students at the basic skills level. Melody stated
the following:
I guess what it (the data) really brought to my mind or to the forefront is the
idea that working with basic skills students is not something that’s gone away
and it’s a challenge – it’s a constant challenge.
This challenge that instructors face on a day-to-day basis is what Melody expressed
needed to be addressed. She believed that in order for substantive changes to take
place in the classroom to impact student outcomes, drastic changes would need to
occur to motivate faculty to learn more about effective instructional practices. She
said the following:
I think people [faculty] get set in their ways, and I think that there’s no
motivation for them [faculty] to change, and so they just keep doing things
the way they’ve been doing them, and they can easily blame it on the
students. You know, if the students aren’t doing well they say well you
know, they weren’t very prepared to begin with, so I just think the way that
our system is set up that there’s no penalty.
Faculty would need to be motivated to change their attitudes towards participating in
professional development activities through the “evidence of results.” Based on her
knowledge of faculty attitudes contributing to student success, major changes would
need to occur. Melody believed that the only way to convince faculty to
change was to tie student performance to the classroom. She said, “to be quite
honest with you, I think if they (faculty) felt threatened that they were going to lose
their jobs” that might motivate changes in the classroom. She expressed the thought
that the faculty would have to have an extrinsic motivator such as possibly losing
their jobs to be able to consider changing their instructional practices.
Samantha also believed that she knew what was occurring in terms of student
success, and the self-assessment project confirmed it. Samantha stated that the two
main factors she knew could influence the creation of more equitable learning
outcomes at Hills were a change in values on the part of senior staff and a change in
the attitudes exhibited by faculty. First, the
project confirmed for her that change was dependent on the ability on the part of the
institution to make change a priority. She stated the following:
So it’s a matter of creating a culture of research, solutions, and working
towards solving problems. So I think that was started at Hills, but it is a
matter of can it be continued is really going to be a question of leadership.
Her statement reflects the realization that in order to change student outcomes,
senior administration will have to invest in making change a priority through
research-based practices. Her statement indicates that steps in the right direction
have been taken but that further investments will need to be made in the form of
prioritizing change and providing institutional support to continue examining
problems using data. Furthermore, she believed that faculty needed to change their
instructional practices to ensure student success. She mentioned that the project
made it clear for her that any change to the achievement of educational outcomes
would have to come at the classroom level. Samantha said:
Let’s be honest here, if the faculty don’t do something different you’ll never
get student success. So ultimately, it’s the faculty that has to take the
initiative. The faculty has to care, and they have to do something different
and so you’ve got to kind of have the faculty motivated to care about this,
otherwise change will never ever take place. They are critical and faculty is
the hardest to work with, they don’t trust administrators.
In her opinion, the two conditions which need to exist in order for change to occur
are institutional support from administrators and faculty motivation to change. The
faculty attitude Samantha described is an unwillingness on the part of faculty to
implement new instructional strategies to promote student
success. However, she pointed to the distrust between administrators and faculty
that could further make change difficult at Hills College. She did not state that she
had a greater capacity to convince faculty to change instructional practices. She felt
that the motivation would have to come from data on student performance and senior
administration.
John expressed that the institutional self-assessment project enhanced his
knowledge of what was happening with basic skills students in math. The data
confirmed the importance of having faculty look at instructional practices in the
classroom to improve student success. Staff also needed to be involved in
conversations on how support services influence student success. The data pointed
him in the direction of how to begin to address what he perceived to be institutional
barriers to student success. He noted that he learned how to use data as a way to
create awareness among faculty and staff of the impact of attitudes he saw as
barriers to student success. John recognized the power that data had to transform the
thinking of all practitioners like administrators and faculty. He stated the following:
I think that involvement with the self-assessment project has really provided
some of the foundational information that we need to try to address the very
laissez-faire and perhaps complacent attitudes of some of the faculty and
staff. I’m not pointing fingers at faculty. I’m pointing fingers at ourselves, at
the labs.
John knew that in order to create change he needed a mechanism to begin
a dialogue with faculty and staff over student outcomes. However, he knew that he
would have to have a way to navigate what he described as the perceived adversarial
relationship between faculty and administration. He expressed his knowledge of this
attitude towards administrators in the following quote:
As administration-us telling, us directing, us advocating is fine-faculty need
to have our support. But faculty want to hear from their peers; it was not
coincidental that we had Jane, and Joy do the presentations…they understood
clearly what the data meant, and better for the faculty to hear it from their
own than to hear from a cursed administration.
According to John, faculty seemed to value what other faculty had to say and they
had more credibility when compared to administration. Given the relationship
between these two stakeholders at Hills, John’s approach to create changes could not
be direct. He stated that he could not lead the discussion and he knew this so he
stayed silent on most faculty related matters during team meetings. However, he
knew that in order to create change he needed faculty buy-in. He sought
partnerships with faculty who had a like-minded attitude to create changes. Given
his previous knowledge of the relationship between faculty and administration, John
plans to cultivate relationships with others to create changes to impact student
success. He understands that any changes have to involve partnerships with faculty.
Additionally, John also knew that he had to cultivate relationships with other
administrators to be able to partner with divisions to create changes to impact student
success. Any changes involving faculty had to have the approval and buy-in of
the Vice President of Academic Affairs. He stated the following:
You’ve got to have the academic Vice President’s buy in too. It can’t be
student services Vice President pushing for learning communities because it
falls on deaf ears because you know we are just-one we are administrators, so
we are already from the dark side, and if you’re from student services, they
you’re like on the super dark side.
John understood that he needed the evidence and support of key faculty and
administrators to create changes to impact student success. He knew the collective
voice of faculty and administrators had more influence than the voice of one person.
He believed that changes could be achieved but that they would have to happen
slowly.
Carl expressed an increase in awareness of the existence of inequitable
outcomes. He also expressed learning more about institutional factors that
contributed to the existence of inequitable outcomes. Like John, Carl believed that
what needed to be changed were instructional practices because he felt that
instructors were not being as effective as they could be. However, the only
mechanism he believed could be used to create changes would be the union contracts
governing classified staff and faculty. This single focus limited him from
learning other methods he could use to create change. He seemed focused on
creating changes in his department. As mentioned earlier, during team meetings he
mentioned that instructors were not aware of the skills students were coming in with,
and faculty were taking a one-size-fits-all approach to teaching. He felt that what
needed to be changed were the attitudes of faculty towards teaching and learning
taking place in the classroom. He went as far as expressing his concerns about
making changes to the President during a team meeting by stating, “There is a
perception that because people have been on campus a long time, others felt
powerless to change things.” Carl attended almost all of the meetings so he was
interested in learning more about the barriers students were facing to reach their
educational goals. At the same time he felt like the team was not making any
progress towards creating changes. He communicated his frustration by stating:
There is an institutional trend of meeting and discussing things and no action
taking place. When does the self-assessment project end?...When it is done,
then what?...We need to do something. Something has to happen.
Carl feared that too much time was spent talking within the committee and that not
enough action was taking place. He alluded to an institutional culture of holding
meetings without anything being decided or implemented. In May 2008, Carl stated
his frustration one last time by sharing with the team what he had seen during a site
visit. He said:
You could tell they were trying something… At some point stop dialoging
and start doing something. They surrounded themselves with people with a
like-minded attitude. To put it simply no one wants to drag people kicking
and screaming to create change.
Carl did not seem to know how to navigate the institutional environment. His
perception of the Institutional Self-Assessment Project as communicated above is
that people on the committee were unwilling to change. Shortly after the comments
above were made, Carl stood up and walked out of the meeting. While Carl
communicated being more aware of the existence of inequitable educational
outcomes and demonstrated an increased awareness of the instructional attitudes
impacting student success, he was not convinced that anything had been accomplished by the
project. He repeatedly spoke his mind about what needed to be changed. Carl was
contacted for a second interview but did not respond to the request.
Summary of Findings
The themes which emerged from the study were steps of awareness, surface
level learning, intuitive knowing turning into informed knowing, and funds of
knowledge solidified by organizational learning. The main findings were that
administrators went through steps to awareness that were facilitated by the collective
interpretation of data. Secondly, administrators did learn about some of the factors
which contributed to student underperformance but did not determine all the factors.
According to administrators, they knew intuitively why students were not succeeding,
and this was confirmed through the learning taking place during the institutional
self-assessment. Lastly, administrators seemed to express the notion that they had
organizational knowledge about the attitudes and values contributing to student
success which was confirmed through the organizational learning process.
Steps of Awareness
Consistent with Garvin’s (1993) work noting the importance of collective
interpretation of data in bringing about greater awareness of performance gaps,
administrators involved in the study became more aware of the extent of student
underperformance. However, Garvin did not indicate what the collective
interpretation of data involved or looked like. Figure 1 illustrates the steps of
awareness which were observed on the part of administrators engaged in the
organizational learning process at Hills. The first part of the cycle was active
participation on the part of administrators which meant attending meetings regularly
and engaging in discussions. All administrators with the exception of Melody
attended most meetings and were involved in conferences, or site visits. In the
second stage, administrators were involved in the collaborative interpretation of data
which brought about more awareness of inequitable educational outcomes among
students from different groups. The data was presented in the form of tables and
included student success data disaggregated by race and ethnicity. Next,
administrators either accepted or doubted the initial presentation of data. For some,
before the data was accepted, doubts about its validity were apparent. The
doubt could be related to how information was presented and explained to facilitate
understanding among team members. After further explanation on the part
of the researcher, team members who had doubt reached the acceptance level. At the
next stage, administrators became more aware of student inequity in the areas of
math and English. The format in which the data were presented facilitated the
understanding of all administrators involved. It is important to note that I did not have the
opportunity to observe a cycle of learning. Additionally, the data collected was
insufficient to measure the cycles of learning as conceptualized in the literature.
Surface Level Learning
Argyris and Schon (1978) discussed the notion of single loop learning,
which meant that an institution learned about the causes of a problem and revised
the institutional policy which created the problem but did not address the
assumptions causing the policy which led to the problem. The learning which took
place did not isolate the main factors on the part of the institution which caused the
inequity in performance for students in mathematics. While some factors were
identified, not all were confirmed to impact student performance. Additionally,
limited action was taken to correct the causes of student underperformance in the
area of mathematics. The type of learning taking place could be attributed to the
short duration of the Institutional Self-Assessment Project at Hills College. While
the literature does not indicate how much time is needed for significant learning to
take place, the findings of this study suggest that an institution would need to
examine gaps in performance for longer than a year and a half to identify causes and
implement solutions to problems. Solutions were implemented to resolve some of
the identified institutional factors but they came after the project ended. The
literature suggests that solutions should be implemented concurrently with the
organizational learning process occurring at the institution.
Turning Intuitive Knowledge to Informed Knowledge
Fiol and Lyles (1985) and Argyris and Schon (1978) both spoke about the
notion that each institution has its own unique set of assumptions, which are
transmitted to practitioners as acceptable. In the case of Hills College, these shared
assumptions surfaced in the form of intuitive knowing which meant that
administrators had an idea of what the inequities were among students and factors
which contributed to the inequities. While the intuitive knowledge was not
confirmed with data, practitioners came to the data with a predisposition to think that
there were academic areas that students tended to do worse in. Many administrators,
through comments expressed during team meetings and interviews, had a shared
notion of what the areas of student under performance were and factors causing
these. Some of the factors were confirmed while others were expanded. The main
collective notion expressed by administrators was that the instructional support
centers were perceived as problems before the start of the project. This unconfirmed
or “experiential” knowledge was based on the previous experiences administrators
had within the institutional environment. This intuitive knowledge was reinforced for
some after qualitative data was collected and presented to the group. What
administrators did not seem to know was the scope of the problem. Administrators
were more aware of the implications of the findings once they confirmed the
existence of what they thought was a problem. They were able to turn their intuitive
knowledge into informed knowledge because they had confirmed their previously
held assumptions. All four administrators believed that in order to impact student
success, changes would need to take place in the classroom. This intuitive
knowledge was not confirmed but was present among all administrators participating
in the study.
Funds of Knowledge Solidified through Organizational Learning
The literature on funds of knowledge points to the importance of practitioners
knowing the values, attitudes, or beliefs which govern the functioning of the
institutional environment (Stanton-Salazar, 1993). Moreover, Stanton-Salazar
(2001) defined organizational knowledge as being the body of information which
provided practitioners with insight as to how and why entities within an organization
functioned as they did. Under this definition, practitioners use their knowledge of
how the institution functions to create changes using the rules which govern the
institutional environment. A theme which emerged from the study, related to the
funds of knowledge concept, was the understanding of the values, attitudes, or beliefs
governing the institution. Administrators expressed knowing that the biggest barrier
to student success was the attitudes held by faculty in terms of instructional
practices. Three of these administrators also understood that in order for the
change process to occur they needed to involve other stakeholders such as
faculty. All understood that the greatest changes needed to be made in the
classroom. All three believed that faculty needed to be convinced to change. Two
believed that they could use data to provide evidence to change thinking among
faculty while the other believed that an extrinsic motivator was needed.
Additionally, one administrator noted having her knowledge confirmed of the
importance of administrative support in the change process. This study expands the
notion of attitudes governing the institutional environment and sheds light on how
administrators might use their knowledge of the institutional environment to create
changes. However, one of the administrators on the Institutional Self-Assessment team
did not understand why the team needed to dialogue so much.
In the following chapter, I describe the implications of the findings and
discuss how they can be utilized to create a new understanding of how administrators
learn using data.
CHAPTER 5: SUMMARY AND CONCLUSIONS
The purpose of this study was to investigate the extent to which
administrators learned about their institution as a result of participating in an
institutional self-assessment project. The objective of the study was to understand
how administrators used data to learn about factors influencing the creation of
differential student learning outcomes and what changes were made based on the
insights learned. The rationale for the study was the belief that administrators who
are able to engage in an on-going institutional self-assessment process are able to
identify gaps in performance among basic skills students and adjust institutional
policies and practices to increase the number of students transferring to 4-year
universities. This study was informed by the expectation that on-going institutional
self-assessment will lead to an increased understanding among administrators of
ways to create more parity in educational achievement to diminish the educational
achievement gap present among student groups. This study confirmed the
importance of interpreting data in a collaborative setting to bring about greater
awareness among administrators of inequitable learning outcomes. In this chapter, I
will address the significant findings and implications of the study.
Discussion of Significant Findings
The significant findings of my study illuminate how data and collaboration
can be used by administrators to bring about greater awareness of the existence of
inequitable educational outcomes to create solutions to address institutional
underperformance. My findings built upon the current research literature pertaining to
organizational learning and institutionally based funds of knowledge. The findings
of my study were as follows: a) the use of data and collaborative interpretation of data
during the institutional self-assessment process helped administrators at the case
study college to create changes to address root problems causing institutional
underperformance, b) administrators at Hills College went through steps of
awareness facilitated by participation in inquiry activities and collaborative
interpretation of data, c) three out of four administrators participating in this study
engaged in surface level/single loop learning regarding institutional factors
contributing to the existence of inequitable learning outcomes among student groups,
d) administrators knew intuitively about institutional factors hindering student
performance, knowledge which was confirmed or challenged during the institutional self-assessment
process, and e) administrators confirmed what they knew previously about the
attitudes governing faculty and senior administration. In the following sections I will
address how the findings of my study contribute to the organizational learning and
institutionally based funds of knowledge literature.
Data Use
The literature on organizational learning emphasizes the institutional use of
data to help organizations gauge institutional efficiency. Data provides a way for
stakeholders to determine the current institutional conditions and create changes to
address institutional underperformance. According to Garvin (1993), data is used to
confirm the existence of the problem, make meaning of situations, create solutions to
address problems, demonstrate effectiveness, and change institutional practices. The
findings of my study built upon Garvin’s notion of data use to identify institutional
performance and illuminated which type of data could be used to bring awareness of
institutional performance. As was seen in my study, after several meetings where
student success data and student migration data was examined, team members
became aware that not all students were performing at the same rate. Quantitative
data in the form of student success data and student migration data made differences
in educational attainment among groups clear. The disaggregation of data based on
race and ethnicity helped team members notice trends in student performance that
they were not aware of before. Team members noticed that the institution was not
effectively educating all students in the area of mathematics. Data that proved to be
the most effective in pointing out patterns in performance in math courses were
success and attrition rates of students disaggregated by race and ethnicity. The
existence of differential learning outcomes became apparent. Moreover, qualitative
data in the form of interview and observational data assisted practitioners in
identifying factors contributing to student underperformance in the area of math.
Collaborative Interpretation of Data
The results of my study also build on Garvin’s (1993) and Bauman’s (2005)
notions of interpreting data in a collaborative setting. Garvin emphasizes the
importance of interpreting data within a group setting, but there is no mention of
the number of participants, level of involvement, or the role they play in the group
setting. Bauman (2005) found that groups engaged in collaborative interpretation of
data fell into two categories that she called high and low learning groups. High
learning groups were open to data and used data to increase their understanding of
what they knew about the environment. Essentially, data promoted doubt for high
learning groups. Low learning groups on the other hand used data to confirm what
they already knew. Additionally, she found that the collaborative interpretation of
data led to what she termed group learning.
My study illuminated the process administrators went through to become
more aware and learn about institutional underperformance within a group setting.
Specifically, the element of learning was facilitated through collaborative inquiry
which assisted in the creation of awareness and learning among practitioners.
Administrators went through what I called steps to awareness which included the
following: 1) active participation, 2) collaborative interpretation of data, 3) doubting
findings, 4) acceptance, 5) awareness of institutional underperformance, and 6) actions to
address underperformance. Awareness was a necessary step for administrators; it
made learning about the factors causing institutional underperformance possible.
This notion of a cycle of awareness builds on Garvin’s (1993) notion of collaborative
interpretation of data and confirms what Bauman (2005) found while examining high
and low learning groups. Additionally, my study contributes to the findings of these
two authors by providing a sense of the amount of time needed for
awareness and learning to take place. Awareness and learning were able to take place
within the nine-month span of the Institutional Self-Assessment Project. The
collective expertise of participants involved in the collaborative interpretation of data
helped the team to get a sense of student performance in the area of mathematics.
Awareness and Learning
This study contributed to the organizational learning literature by providing
greater detail to how people learn. Argyris and Schon (1978) described single and
double loop learning as two types of learning apparent in organizations, but did not
describe how the learning took place. In my case study, administrators engaged in
single loop learning, which helped them identify institutional factors impacting
student performance in the area of mathematics. In order for learning to take place,
administrators had to become more aware of institutional functioning. Once
administrators were aware of a gap in student performance, learning about the factors
which contributed to that gap was made possible. Administrators
then learned that the resources provided to students in the form of instructional
support through the on-campus learning assistance centers were not being advertised
to students. This lack of services was believed to undermine student performance in
math courses. Students in the mathematics classes had high attrition and low success
rates. Based on what team members learned, changes were made to the type of
information students received to help them acclimate to
college. Moreover, the collaborative interpretation of data and involvement of
different campus stakeholders facilitated the awareness and learning which took
place.
Organizational Funds of Knowledge
Stanton-Salazar (1997) theorized that there are seven forms of institutionally
based knowledge at the disposal of practitioners and leaders. The seven forms of
knowledge are institutionally sanctioned discourses, subject specific knowledge,
organizational knowledge, network knowledge, technical knowledge, career
knowledge, and problem-solving knowledge. The development of the institutionally
based fund of knowledge which Stanton-Salazar refers to as organizational
knowledge was the fund of knowledge I focused on. While Stanton-Salazar did not
specify what elements made up organizational knowledge, this type of knowledge
was conceptualized in this study as the (1) knowledge of institutional effectiveness
and (2) knowledge of the values, attitudes, or beliefs impacting organizational
functioning. The findings illuminate how this particular fund of knowledge,
organizational knowledge, can be changed through organizational learning.
Furthermore, as seen in my study, administrators and faculty at Hills College
had a conception of how students were performing in the area of mathematics and
the values, attitudes, and beliefs which impacted student success. This conception of
how the institution was functioning to educate students in the area of math was based
on previously held knowledge. This knowledge was challenged through the use of
institutional data. Additionally, administrators’ assumptions were either confirmed
or refuted through the use of data. Evidence was collected which gave
administrators a sense of the scope of the differing student success rates in the area
of mathematics. Data-informed knowledge means that knowledge is based on actual
information related to the subject under investigation. On the other hand, intuitive
knowledge is information or a hunch held by a person that is not confirmed through
evidence. In the case of Hills College, administrators had a sense that math was a
problem for students, but they did not know the scope of the problem. Once the
scope was known and data demonstrated that Latino students had the highest failure
rates, three administrators were prompted to act to address what they believed were
the causes of student performance in math. This finding that Latino students were
not progressing through the mathematics sequence had important implications for
Hills because it was a Hispanic-Serving Institution with the majority
of the student population being Latino. While the focus was not entirely on Latino
students, practitioners found that they were not effectively educating the largest
student population on campus.
Lastly, within my study, I found that administrators confirmed their
knowledge of the values, attitudes, and beliefs which impact student success. This
meant that their knowledge of institutional functioning was increased.
Knowledge of institutional functioning is part of organizational knowledge, one of the
seven funds of knowledge Stanton-Salazar described. In my study, I found administrators knew about attitudes held among
faculty which they felt contributed to student underperformance in the area of math.
Organizational learning confirmed what they knew about the attitude faculty held
regarding their reluctance to examine instructional strategies. They noted that this
attitude contributed to student underperformance in the area of math.
Implications for Practice
California Community Colleges have not been able to effectively redress the
existence of differential learning outcomes among students. The lack of use of data
to identify and resolve gaps in student performance is partly to blame for colleges’
inability to impact learning outcomes among students. As was seen in this study, the
use of data to identify and resolve problems within an organizational learning
framework leads to further awareness and learning of factors impacting student
success. This awareness and learning leads to a greater understanding of how the
institutional environment functions to successfully educate students. Without
understanding what is occurring in the institutional environment and how policies
impact students, institutions will continue to be unable to create programs or
interventions to assist students in achieving their educational goals.
Furthermore, this study has several implications for practitioners concerning
the benefits of the development and implementation of an on-going institutional self-
assessment process to improve student educational outcomes. The following are
implications which come from the findings of this study for further practice: ongoing
institutional self-assessment emphasizing the use of data is needed to
identify and create solutions to address disparities in student achievement among
ethnic/racial groups; collaborative interpretation of data involving a variety of
stakeholders leads to greater awareness and learning among practitioners; and
organizational learning, in the absence of organizational changes, at least helps
practitioners build their knowledge of their organizational environment. This benefit
could lead to organizational changes in the future. In the following sections, I discuss
in more detail the recommendations I have made in this section.
Constant Data Use
On-going institutional self-assessment within a team setting is needed to
assist practitioners in the identification of institutional barriers impacting students
and creation of policies and procedures aimed to redress the existence of inequitable
educational outcomes. Organizational learning in the form of on-going institutional
self-assessment has been difficult to apply at institutions of higher education for a
variety of reasons. According to Morest and Jenkins (2007), the biggest barriers to
on-going assessment on the part of institutional research offices are related to
staffing, funding, and time spent on compulsory reporting. The burden of
institutional self-assessment falls on institutional research departments that are
overburdened with compulsory reporting for state and federal agencies. Colleges
and universities collect an enormous amount of data from student success statistics to
course information. However, little time is spent on determining how the institution
is doing in its efforts to educate all students. Institutions can easily identify gaps in
the educational outcomes of student groups on campus by analyzing existing
institutional data on a consistent basis. While looking at data is not an end unto
itself, data provides practitioners with a sense of how students are performing within
the institutional environment. Without first knowing how students are performing,
institutions will be unable to effectively determine which performance gaps to
address first.
Moreover, results from on-going institutional self-assessment efforts need to
be understandable and accessible to administrators and faculty. Information needs to
be presented in a useable form for administrators and faculty to easily draw
conclusions that are relevant to their work and realms of influence. As
was seen in my study, an example of a useable form of data is student course success
rates disaggregated by course and by race and ethnicity. This provides a faculty
member and administrator with a sense of how students are doing in particular
courses. Assessment results organized in a table format with pages and pages of
complex statistical analyses may be overwhelming to a practitioner and not as useful
as one or two data elements which provide an overview of what is occurring at the
class level. Qualitative and quantitative data can be used to bring about a greater
understanding of institutional functioning among practitioners. Information should
be disseminated and accessible to all stakeholders on campus. The availability of
assessment results can facilitate the decision making of faculty and administrators at
the institutional level. If data is not available for practitioners, the likelihood of the
same instructional and institutional practices being implemented will increase. Data
can provide practitioners with a way to gauge institutional effectiveness and a way
to identify areas in which to rethink strategies currently in place.
Collaborative Interpretation of Data
Secondly, the collaborative approach to data interpretation will facilitate the
consciousness raising necessary to make the learning of factors impacting student
performance possible. This approach will not only facilitate greater awareness and
learning among practitioners, but will help combine existing institutional resources
of various departments to focus improvement efforts on an institutional scale.
Institutions of higher education have been described as a mix of various departments
which function as independent entities. Given the current fiscal climate, California
Community Colleges cannot afford to duplicate efforts at the institutional level.
Additionally, in order for change to take place resources will need to be devoted to
institutional efforts to learn about and address institutional barriers impacting student
academic performance.
Furthermore, administrative leadership and faculty support create the
foundation for sustainable assessment efforts. In order for on-going assessment to be
implemented, senior administration and faculty leaders should advocate for the
importance of having on-going assessment. The shared governance structure present
in most institutions of higher education highlights the importance of having
administration and faculty working together to create systemic change. The buy-in
from these two key groups of stakeholders will not only communicate the
importance of assessment efforts to improve institutional efficiency but will also help
shape assessment efforts. Additionally, having these two groups involved increases
the likelihood that assessment efforts become an institutional priority with resources
devoted to make it possible. As was seen in my study, the collective knowledge of
administrators, faculty, and external researchers contributed to the interpretation of
data. The experience of administrators and faculty assisted in the identification of
institutional underperformance and ways to improve student success. Both groups of
stakeholders were actively involved in efforts to identify ways to more effectively
help students progress and achieve in their mathematics courses. The institutional
self-assessment project was able to take place due to the buy-in from senior
administrators and faculty leaders.
Lastly, resources in the form of human resources and institutional committees
are needed to support on-going institutional self-assessment. Community colleges
do not have the “research capacity” to provide data to inform practice on a
continuous basis (Shulock et al., 2008). This lack of capacity hinders the ability of
institutions of higher education to examine student progression through the general
education sequence. However, institutions need to have more researchers who have
the expertise to collect information and interpret data to guide practitioners.
Accountability requirements like ARCC and IPEDS take most of the time
researchers have to conduct research at the institutional level to inform practice.
Many times information presented for accountability reporting is not disseminated
which results in practitioners not being able to learn about institutional functioning.
The absence of information limits the ability of the organization to modify policies
and procedures to increase student success. Administrators and faculty do not have
the training or expertise to be able to collect and interpret information. Consequently,
an absence of information prompts practitioners to jump to creating solutions to
situations they do not fully understand. Additionally, structures in the form
of institutional committees intended to examine causes of student performance
create an environment which encourages on-going learning. Currently, community
colleges go through an assessment process infrequently as part of the accreditation
process. Every six years colleges go through an institution wide assessment process
whose intent is “compliance” and “improvement” (Lederman, 2009). Colleges are
examined to ensure they are meeting minimum requirements to keep their
accreditation and assessed to show institutional performance. Recently, more and
more colleges within the California Community College system have been placed on
probation due to a lack of adherence to minimum requirements and lack of
institutional improvement. Departments and programs rarely engage in an on-going
evaluation. In contrast, grant funded projects are required to provide an annual
performance report detailing progress made towards meeting grant objectives.
Therefore, institutions would have the potential to improve institutional policies and
practices to promote student success if an on-going assessment process existed to
inform program and institutional improvements.
Implications for the Basic Skills Initiative
In this section, I apply what I learned through my study to the current
implementation of the California Community Colleges’ Chancellor’s Office Basic
Skills Initiative (BSI). I draw on my professional experience as a BSI coordinator at
a California community college to consider how institutional assessment could be
integrated into BSI. The California Community Colleges’ Chancellor’s Office Basic
Skills Initiative (BSI) was adopted in 2005 as a way for the 110 community colleges
in the system to begin to better assist underprepared students. As a requirement to
receive funding from the Chancellor’s Office, colleges submitted action plans after
the first year of BSI in 2007. Before action plans were submitted, colleges were
required to conduct an institutional self-assessment. Data collected was meant to
give institutional stakeholders a sense of how students in basic skills courses were
doing as a whole at the institution. Moreover, institutions were supposed to have a
greater sense of the “scope and efficacy of current practices” (Boroch et al.,
2007, p. 7). The self-assessment contained a standard set of data, ranging from the number
of students enrolled in basic skills courses in English and math, broken down by race
and ethnicity, to successful course completion rates of cohorts of students. Data was
seen as a way to have institutions become more aware of their institutional
environment. However, given the timing of the disbursement of funds in the
initiative it is possible some institutions treated the self assessment as compulsory
reporting. In which case, no increase in awareness or learning would be expected on
the part of institutional stakeholders. Action plans were to be based on insights
gained from the institutional self-assessment process but this phase would also be
compromised in the absence of organizational learning based on data.
Given these concerns, the second and third phases of the BSI provided on-
going trainings for practitioners based on the scholarship of best practice compiled
by the Research and Planning group. The typical structure of trainings during the
past three years has been as follows: action plan discussion, pertinent definitions,
discussion of college data, building support at the college level, and showcase of
model programs. Trainings were available for practitioners starting the second year
of the BSI. During the trainings, practitioners would be asked to focus on a part of
their action plan that they would then develop as a result of the insights learned in
the training. While data was mentioned as a way to understand current institutional
conditions, data use was not emphasized during trainings.
Moreover, BSI coordinators, mainly faculty, have the responsibility for
allocating BSI funds and managing plans on campuses. During trainings,
coordinators would come up with tentative plans based on the programs presented
during the training. For example, during the training a program might be showcased
which involved contextualizing curriculum within vocational education
courses. The contextualized curriculum could involve instructors putting basic math
or English skills into a recognizable context to help basic skills students learn basic
concepts. In another training, another set of programs would be showcased and
participants were encouraged to brainstorm ways they could implement programs at
their campuses. Practitioners were encouraged to look at programs which
demonstrated some form of success at increasing student success to possibly
implement at their campus. Those programs included the following: learning
communities, tutoring, summer bridge, and first year experience. These trainings
were held once a semester during the fall, spring, and summer.
In my experience, certain elements needed for organizational learning were
not emphasized in trainings, including problem identification, understanding of the
process of implementation, and program assessment. The first element missing was
the identification of barriers students were facing. Practitioners were not exposed to
the process which was taken at each of the institutions to identify barriers to student
success. Therefore, practitioners may have been left with the impression that they
could take a program which demonstrated success and implement it at their campus
without considering the student and the local environment. As was seen in my study,
practitioners need to understand how students are performing before a solution is
implemented. A way to identify gaps in student performance is to use data to get a
sense of areas of difficulty for students. If an institution jumps to solutions, it will
not be addressing the root of the problem, but symptoms of a much larger problem.
Organizations need to engage in both single and double loop learning to understand
the attitudes and beliefs causing the problem at the institutional level.
It also struck me that, based on the BSI trainings, practitioners would not gain an
in-depth understanding of the process taken at each campus where benchmarks had
been used with regard to program design and implementation. How was the program designed?
What factors were considered in the design? Again the salient message taken may
have been that one could implement this model program without really taking into
account the uniqueness of each institutional environment. If the presenters of the best
practice programs had disclosed considerable detail about the process, the
practitioner might have learned techniques to use when at the stage of program
design and implementation. As was seen in my study, getting exposed to programs
and practices at other institutions through site visits and guided conversations with
implementers of exemplary practices was beneficial in getting the team to think
about ways to address student gaps in performance. However, the Hills College
team already had a sense of what the gaps in performance were and what
institutional factors contributed to student underperformance. Therefore, in order
for a team to get to the stage to begin thinking about program design and
implementation, practitioners need to know the target population and the gap in
performance being addressed. Without knowing the needs, practitioners will not be
able to effectively address gaps in performance because they will not know how to
assist students to succeed.
Lastly, given the considerable challenges of assessing program effectiveness,
relatively little emphasis was placed on this element of organizational learning.
While data was presented demonstrating program success of the model programs,
data providing details of program evaluation were not provided. Many community
colleges do not have the capacity to collect and analyze data on a consistent basis
due to the limited resources devoted to the institutional research function. This lack
of capacity means that practitioners have to find ways to effectively assess programs
with little to no assistance from the institutional researcher. The pressure on BSI
coordinators to design evaluations which capture changes in student performance
related to programs or interventions implemented becomes difficult. BSI
coordinators may or may not have the expertise to manage and interpret the data
being collected on students. Therefore, it becomes essential for the best practice
program leaders to communicate the process taken for the evaluation of their
programs. As was seen in my study, the collaborative organizational learning
approach had many advantages, one of which was that external researchers collaborated
in the design of research instruments and drew appropriate interpretations from the data,
which pertained to characterizing program delivery rather than effectiveness. As was
noted by the institutional researcher at Hills, the external researcher involved in the
Institutional Self-Assessment Project gave the researcher much needed resources to
be able to provide data and data interpretation for a team of practitioners. The use of
external researchers also helped the team to come to know their unique institutional
environment better.
The findings from my study demonstrate that in order for practitioners to
begin to address the gaps in educational achievement among students, practitioners
need to become more aware of their institutional environment and learn based on
institutional data. Trainings are not likely to be enough because in order for change
to take place, both collaboration and time are needed, as illustrated in my study where
a team of practitioners met for a common purpose over a period of time. External
researchers may be helpful to assist practitioners to develop effective programs
which will address barriers at key intervention points for students. The proximity of
practitioners to their institutional environment may cause them to miss elements
which they have taken for granted such as the culture of the environment. External
evaluators can help practitioners engage in difficult dialogues without fear of
negative ramifications. Based on the findings of my study and my experience with
the Basic Skills Initiative, I recommend the following structure for BSI trainings: (1)
Colleges need to form a campus wide inquiry team that contains senior
administrators, middle level administrators, and faculty from various disciplines; (2)
The larger campus wide inquiry team should be divided into subcommittees, with
chairs and co-chairs, focused on specific areas of inquiry; (3) Meetings would occur
151
on a regular basis and be divided in phases with the intent of collecting and
analyzing existing data to understand the institutional environment better; (4)
Eventually, the inquiry team would create programs and interventions whose impact
would be regularly evaluated to measure program effectiveness. In the following
sections, I provide more details of the recommended structure.
Composition of Inquiry Teams
As was seen in my study, broad representation of campus stakeholders is
needed to bring about change at the institutional level. This means that faculty,
administrators, and staff from all the major divisions and departments on campus
would be involved in organizational learning efforts. The involvement of senior
administrators and faculty chairs is important because they hold the most influence
at the institutional level; without the participation of these two groups, the
likelihood of change occurring is lessened. Priorities in the institutional
environment are decided by these institutional leaders. Therefore, any changes
which need to take place need to be highlighted and supported by them, and they
can ensure that the attention and resources needed to make change efforts possible
are provided. Ideally, the teams would involve the participation of the core
stakeholder groups, composed as follows: Vice President of Academic Affairs, Vice
President of Student Affairs, Vice President of Enrollment Development, math
faculty chair, English faculty chair, ESL faculty chair, Career and Technical
Education faculty chairs, and Academic Senate President. Other participants would
include managers from the divisions of Academic and Student Affairs, faculty from
math and English, and Faculty Senate leaders. Currently, most colleges have a
campus-wide BSI committee, but not all stakeholders at the institutional level
participate. The departments which usually participate are mentioned above;
however, additional department chairs should participate, especially those offering
courses at the basic skills level.
Frequency of Meetings and Length of Project
Meetings should be held on a monthly basis for the larger group, which
involves all of the senior campus leaders. Subcommittees of the larger group should
then meet on a bi-weekly basis for the entire academic year. Subcommittees would
be comprised of the managers and faculty who report to the senior campus leaders.
Senior campus leaders would appoint subcommittee chairs and co-chairs who
would be responsible for specific areas of inquiry. The subcommittees would be
charged with examining their respective areas and reporting to the senior campus
leaders and the rest of the committee at least once a month. The external researchers
and Office of Institutional Research would be present at both larger and smaller
committee meetings to assist with data collection efforts and data interpretation.
Currently, many college BSI committees meet once a month and are not divided
into subcommittees of a larger group.
During the first phase, meetings 1-8, stakeholders from the larger group
would collect institutional data to understand institutional performance. Data that is
already collected at the organizational level would be compiled to provide insight
into institutional functioning. Additionally, practitioners would be asked to collect
data once areas of inquiry are identified. Initially, practitioners would be exposed to
quantitative student success data disaggregated by race and ethnicity to assist with
the identification of performance trends among student groups. As was seen in my
study, data presented in this way was beneficial for the administrators and faculty at
Hills College. For example, the success rates of underprepared students in
developmental English and math classes would be presented to help practitioners
understand how students were performing. Additionally, data would be collected to
determine student progression through the developmental education sequence and
to identify attrition points among students. By looking at the data, an inquiry team
of practitioners might identify one course as the intervention point due to its level of
attrition (Dowd, 2007).
Moreover, the team would collect its own qualitative data based on the
inquiry area selected by the team. The overall goal of this phase would be to
develop practitioners' awareness and learning of factors impacting student
performance. By developing awareness of what is occurring at the institutional
level, practitioners will be better able to address student underperformance. For
example, at Hills College the team became aware of and learned about factors
which explained why students were not using the math support services available
through tutorial learning centers. During inquiry activities, the team at Hills
identified a lack of dissemination of information, and practitioners then took action
to address it. Subcommittees of the larger group would be charged with looking at
one or two elements of the area of inquiry. If the team decided to look at math, one
subcommittee would look at classroom practices and another at support services.
The main reason to have a larger committee and subcommittees would be to have
all practitioners engaged in the process as well as to maximize the amount of
awareness and learning taking place.
In the second phase, which encompasses meetings 9-15, the team would
develop programs and interventions to address student performance in the inquiry
areas. These programs and interventions would be included in the BSI Action
Plans. Additionally, the team would develop an assessment and evaluation plan to
measure program progress and success. The first part of this phase is the creation of
programs or interventions aimed at addressing gaps in student performance. For
example, in the Hills College case study the team discovered that student groups
did better in some math courses than others. While the team was able to identify
the courses where students had the most difficulty, it was not able to create
interventions within the span of the project. With more time within the Institutional
Self-Assessment Project, the team might have decided to implement an intervention
aimed at addressing student attrition and success in the courses in which students
had the most difficulty.
The team would also visit other colleges which demonstrated success with
students in the area the team was trying to improve. The purpose of the visits would
be to identify factors which the team would need to consider while designing and
implementing the intervention. Essentially, the team would be engaging in what
Dowd and Tong (2007) have referred to as process benchmarking. Factors such as
the structure of the intervention, the staffing needed, and resources would be
considered. Additionally, the team would learn from the experiences of the college
being visited, including what not to do. Of course, the team would implement its
intervention based on the characteristics of its institution and student population.
The data that Hills College collected during the site visits helped the team consider
solutions which it had not considered previously, one of which was examining
instructional practices. Similarly, site visits can help practitioners identify solutions
which could aid teams during the implementation process.
Lastly, quantitative and qualitative measures would be developed based on
the programs or interventions planned by the site team. If the team constructed an
intervention aimed at providing instructors in the targeted math classes with
professional development to improve student outcomes, the impact of the
professional development would be measured. The team might conduct focus
groups to examine the impact of the intervention on instruction, as well as look at
student performance within the courses. Data collection would be done on a
periodic basis: at the very least, the team would need to collect data every semester,
with an in-depth evaluation each year.
Moreover, the team would examine and interpret the data collected during the
assessment and evaluation periods. The subcommittees would examine data with
the assistance of the Office of Institutional Research and/or external evaluators. A
synopsis of findings would be created to inform the larger committee as well as the
campus community at large. Improvements and adjustments would be made to the
programs implemented based on the results of the assessment and evaluation. The
cycle of assessment and evaluation would continue on an ongoing basis to guide
future improvements. This continuous assessment done at the institutional level
might help institutions not only ensure that accreditation standards are being
adhered to but also that institutional improvements are aimed at ensuring success
for all students. Additionally, the assessment results could be reported to the
California Community College Chancellor's Office at least once a year. The impact
of programs implemented through BSI is not currently reported; the only
institutional data reported to the Chancellor's Office has been the Institutional
Self-Assessment done during the 2007-2008 fiscal year.
The Basic Skills Initiative (BSI) is currently entering its fourth phase of
implementation, which will focus exclusively on professional development.
However, many colleges still struggle with the implementation of programs to
address the gaps in educational attainment among students. This may be due to the
lack of data collection and interpretation which should have been done at the
beginning of the BSI project. To address the lack of inquiry occurring at community
colleges, the trainings offered to practitioners focused on data. Consequently, many
of the regional meetings and trainings offered to BSI coordinators use the same
training strategies to assist practitioners with their implementation plans. The
training agenda generally contains the following: (1) Keeping BSI a focus at the
institutional level; (2) Discussion of the campus action plan; (3) How to look at
data; (4) Effective practice presentations; (5) How the effective practice could be
implemented at the college level. What is missing is a discussion of what programs
have been implemented at the institutional level and how effective they have been
at assisting students. Additionally, no training is offered to practitioners on how to
collect data at the institutional level to get an accurate picture of what is happening
with students. The identification of gaps in student success is key to determining
where to invest BSI funds to effectively assist students. Data collection and data
interpretation may not be occurring at the college level, which would mean that
practitioners may not be identifying organizational factors contributing to
inequitable educational outcomes among students.
Furthermore, perhaps the greatest difficulty experienced at community
colleges is the lack of research capacity present at the institutional level (Shulock et
al., 2008). The team of community college practitioners and researchers involved in
the implementation of the fourth phase of the BSI could prove beneficial, based on
the insights gained through my study. Most importantly, the team could structure its
professional development efforts around the use of data to make practitioners more
aware of, and help them learn about, their institutional environment. Additionally,
the use of external researchers could help resolve the lack of research capacity
available at most community colleges. With external researchers, colleges could
also create action plans, evaluation plans, and inquiry areas which conform to the
unique characteristics of the institution. By partnering with four-year institutions
that have the research capacity community colleges need, practitioners could
benefit greatly. The result of such a partnership would be that practitioners would
be trained to conduct their own ongoing data collection, analysis, and evaluations
to ensure students are offered services which allow them to attain their educational
goals.
Conclusion
The current fiscal crisis gripping the United States has focused attention once
again on the effectiveness of higher education institutions at educating the nation's
populace. The current budget deficit has put more pressure on institutions of higher
education to be accountable for the funding they receive from state and federal
sources. With student enrollment on the rise, higher education institutions are being
called upon to retrain displaced workers and play a role in recovery efforts to
jumpstart the economy. President Obama recently announced the American
Graduation Initiative, which will provide funding to community colleges not only
to increase the number of students graduating with certificates and degrees but also
to assist colleges in the implementation of strategies which will produce student
success (Obama, 2009). The initiative reaffirms the important role that community
colleges play in helping students learn the basic skills in English, math, and science
needed to contribute to local, state, and national economies. Once more, institutions
of higher education are being called upon to contribute to economies by being more
efficient at educating students. With increasing numbers of students underprepared
in the basic skills areas of math and English, institutions of higher education will
continue to be challenged to provide a quality education to all who seek it.
The Basic Skills Initiative has provided support for institutions to address
ways to create more equitable educational outcomes among all student groups.
Regional trainings and meetings could be enhanced by teaching practitioners how to
use data to guide the implementation of interventions aimed at promoting student
success. While California Community Colleges are being held accountable through
ARCC data to make changes which will lead to student progress in basic skills
courses, BSI coordinators and faculty have not been trained on how to assess the
effectiveness of institutional efforts to increase student success. The self-assessment
project team should be involved in the latest phase of the BSI to guide
practitioners on ways to conduct on-going institutional assessments aimed at helping
institutions reach parity in educational outcomes. A key component for all
institutions of higher education will be the ability to assess what is working and not
working in the classroom and what factors are hindering student progress towards
degree completion. In order to decrease the gap in educational attainment among
different ethnic groups, higher education institutions will have to develop ways to
learn about their institutional effectiveness.
President Obama's American Graduation Initiative is not drastically different
from other efforts implemented by previous administrations to increase institutional
effectiveness. At a time when state budgets for educational institutions are being
drastically cut in states like California, the challenge for institutions of higher
education will be to create learning organizations which adapt to the changing
needs of students in a way that can be sustained and used to create improvements.
The disparity in educational attainment among student groups will continue to
grow if institutions of higher education cannot create systems to ameliorate the
attainment gap. Educational leaders such as administrators and faculty will need to
create ways to increase the numbers of students achieving their educational goals.
The contributions of educational leaders will make a difference in the nation's
economic recovery if college graduates are equipped with the skills to compete in
today's global economy.
REFERENCES
Adams, J. and Illowsky, B. (2008). The California Basic Skills Initiative: Three
years young and still growing. I-Journal, Issue. 20. Retrieved August 3,
2008, from http://www.ijournal.us/issue_20/ij_20_03_adams_illowsky.html.
American Association of Community Colleges (AACC) (2004). Fact Sheet 2004:
Community College facts at a glance. Washington, DC: AACC. Retrieved
April 4, 2008, from www2.aacc.nche.edu/research/index.htm.
Argyris, C., & Schon, D.A. (1978). Organizational learning. In D.S. Pugh (Ed.),
Organization theory. New York: Penguin Books.
Bailey, T.R. & Alfonso, M. (2005). Paths to Persistence: An Analysis of Research
on Program Effectiveness at Community Colleges. Community College
Research Center, Teachers College, Columbia University, Vol. 6 (1), 1-34.
Bauman, G.L. (2005). Promoting Organizational Learning in Higher Education to
Achieve Equity in Educational Outcomes. In A. Kezar (Ed.), New Directions
for Higher Education, 131, 23-35.
Bensimon, E.M. (1989). The Meaning of “Good Presidential Leadership”: A Frame
Analysis. The Review of Higher Education, Vol. 12 (2), 107-123.
Bensimon, E.M. & Neumann, A. (1993). Redesigning Collegiate Leadership: Teams
and Teamwork in Higher Education. Baltimore, MD: The Johns Hopkins
Press.
Bensimon, E.M. (2005). Closing the Achievement Gap in Higher Education: An
Organizational Learning Perspective. In A. Kezar (Ed.), New Directions for
Higher Education, No.131, 99-111.
Bensimon, E.M. (2007). The Underestimated Significance of Practitioner
Knowledge in the Scholarship of Student Success. Review of Higher
Education, 30(4), 441-469.
Brown, R.S. & Niemi, D.N. (2007). Investigating the Alignment of High School and
Community College Assessments in California. The National Center for
Public Policy and Higher Education.
Birnbaum, R. (1988). How Colleges Work: The Cybernetics of Academic
Organization and Leadership. San Francisco, CA.: Jossey-Bass.
Boroch, D., Fillpot, J., Hope, L., Johnstone, R., Mery, P., Serban, A., Smith, B., &
Gabriner, R.S. (2007). Basic Skills as a Foundation for Student Success in
California Community Colleges. Research and Planning Group for
California Community Colleges.
Boudett, K.P., City, E.A., & Murnane, R.J. (Eds.). (2007). Data Wise: A Step-by-
Step Guide to Using Assessment Results to Improve Teaching and Learning.
Cambridge, MA: Harvard Education Press.
Bustillos, L. T. (2007). Exploring Faculty Beliefs About Remedial Mathematics
Students: A Collaborative Inquiry Approach. Unpublished doctoral
dissertation, University of Southern California.
California Community College Chancellor's Office. (March, 2005). Accountability
Reporting for Community Colleges Information. Retrieved from
http://www.cccco.edu/SystemOffice/Divisions/TechResearchInfo/Researchan
dPlanning/ARCC/tabid/292/Default.aspx
California Community College Chancellor's Office. (March, 2008). Accountability
Reporting for Community Colleges Report. Retrieved from
http://www.cccco.edu/SystemOffice/Divisions/TechResearchInfo/Researchan
dPlanning/ARCC/tabid/292/Default.aspx
California Educational Code. (n.d.) Educational code number 87482.5(a). Retrieved
January 18, 2008 from http://www.leginfo.ca.gov/cgi-
bin/displaycode?section=edc&group=87001-88000&file=87400-87488
California Educational Code. (n.d.) Educational code number 84362(d)(e).
Retrieved January 18, 2008 from http://www.leginfo.ca.gov/cgi-
bin/displaycode?section=edc&group=84001-85000&file=84361-84362
California Postsecondary Education Commission. (n.d.) Public High School A-G
Completion Rates. Retrieved December 5, 2007, from
http://www.cpec.ca.gov/Accountability/AtoGReport.ASP
California Postsecondary Education Commission. (n.d.) 2005 College-Going Rates
to Public Colleges and Universities. Retrieved December 5, 2007, from
http://www.cpec.ca.gov/OnLineData/CACGREthnicity.asp
Carey, K. (2005). Choosing to Improve: Voices from Colleges and Universities with
Better Graduation Rates. Washington, D.C., The Education Trust.
Creswell, J.W. (2003). Research Design: Qualitative, Quantitative, and Mixed
Methods Approaches (2nd ed.). Thousand Oaks, CA: Sage.
Daft, R.L. and Huber G.P. (1987). How Organizations Learn: A Communication
Framework. Research in the Sociology of Organizations, 5, 1-36.
Dowd, A.C. (2005). Data Don’t Drive: Building a Practitioner-Driven Culture of
Inquiry to Assess Community College Performance. Lumina Foundation for
Education Research Report, December, 1-23.
Dowd, A.C., Malcolm, L., Nakamoto, J., & Bensimon, E.M. (2007, November).
Institutional Researchers as Teacher and Equity Advocates: Facilitating
Organizational Learning and Change. Paper presented at a meeting of the
Association for the Study of Higher Education, Louisville, KY.
Dowd, A.C. (2007). Community Colleges as Gateways and Gatekeepers: Moving
beyond the Access “Saga” towards Outcome Equity. Harvard Education
Review, 77 (4), 407-419.
Dowd, A.C., & Tong, V.P. (2007). Accountability, Assessment, and the
Scholarship of “Best Practice.” Higher Education: Handbook of Theory and
Research, Vol.XXII, 57-119.
Eckel, P.D. & Kezar, A. (2003). Taking the Reins: Institutional Transformation in
Higher Education. Westport, CT: Praeger Publishers.
Fiol, C.M., & Lyles, M.A. (1985). Organizational Learning. Academy of
Management Review, 10(4), 803-813.
Garvin, D. A. (1993). Building a learning organization. Harvard Business Review,
July-August, 78-90.
Greenwood, D.J. & Levin, M. (2000). Reform of the Social Sciences and of
Universities Through Action Research. In Denzin, N.K. & Lincoln, Y.S.
(Eds.), Handbook of Qualitative Research (2nd ed., pp. 43-64). Thousand
Oaks, CA: Sage.
Grubb, N.W. & Badway, N.N.(2005). From Compliance to Improvement:
Accountability and Assessment in California Community Colleges. Higher
Education Evaluation and Research Group.
Huber, G. P. (1991). Organizational learning: The contributing processes and the
literatures. Organization Science, 2(1), 88-115.
Institute for Higher Education Policy (2004). Investing In America’s Future: Why
student aid pays off for society and individuals. Washington, D.C.
Kezar, A. (2005). What campuses need to know about organizational learning and
the learning organization. New Directions for Higher Education,131, 7-22.
Kezar, A. & Eckel, P. (2007). Learning to Ensure the Success of Students of Color:
A systemic Approach to Effecting Change. Change, July/August No.18(7),
19-24.
Kirst, M. (2007). Who Needs It? Identifying the proportion of students who require
postsecondary remedial education is virtually impossible. National Cross
Talk, No. 15(1).
Knapp, L.G., Kelly-Reid, J.E., Ginder, S.A., & Miller, E. (2008). Enrollment in
Postsecondary Institutions, Fall 2006: Graduation Rates, 2000 & 2003
Cohorts; and Financial Statistics, Fiscal Year 2006. National Center for
Education Statistics, Institute of Education Sciences, U.S. Department of
Education. Washington, D.C.
Kruse, S. D. & Louis, K.S. (1997). Teacher Teaming in Middle Schools: Dilemmas
for a Schoolwide Community. Educational Administration Quarterly, 33(3),
261-289.
Levitt, B. & March, J.G. (1988). Organizational Learning. Annual Review of
Sociology, No 14, 319-40.
Manwaring, R. (2005). Legislative Analyst's Office: Proposition 98. Retrieved
January 18, 2008, from
http://lao.ca.gov/2005/prop_98_primer/prop_98_primer_020805.htm
March, J.G. (1991). Exploration and exploitation in organizational learning.
Organization Science, 2(1), 71-87.
Marsick, V.J., & Watkins, K.E. (1999). Facilitating learning organizations: Making
Learning Count. Vermont: Ashgate Publishing Co.
Merisotis, J.P. & Phipps, R.A. (2000). Remedial Education in Colleges and
Universities: What’s Really Going On? The Review of Higher Education, No.
24(1), p. 67-85.
Millett, C.M., Payne, D.G., Dwyer, C.A., Stickler, L. M., & Alexiou, J.J. (2008). A
Culture of Evidence: An Evidence-Centered Approach to Accountability for
Student Learning Outcomes. Princeton, NJ: ETS.
Moll, L.C., Amanti, C., Neff, D., & Gonzalez, N. (1992). Funds of Knowledge
for Teaching: Using a Qualitative Approach to Connect Homes
and Classrooms. Theory into Practice, No. 32 (2), Qualitative Issues in
Educational Research, p.132-141.
Moore, C., Shulock, N., Ceja, M., & Lang, D.M. (2007). Beyond the Open Door:
Increasing Student Success in the California Community Colleges. Institute
for Higher Education Leadership & Policy.
Morest, V.M., & Jenkins, D. (2007). Institutional Research and the Culture of
Evidence at Community Colleges. Community College Research Center.
Teachers College, Columbia University.
Myers, D. (October, 2004). The Present and Coming Crisis: Demography and
Education. Symposium conducted at the Tomas Rivera Policy Institute
Conference, Los Angeles, CA.
National Center for Educational Statistics (2008). The Condition of Education 2008.
Retrieved June 5, 2008, from
http://nces.ed.gov/programs/coe/2008/section3/indicator25.asp
National Center for Educational Statistics. (n.d.). Digest of Education Statistics
2005. Retrieved June 5, 2007, from
http://nces.ed.gov/programs/digest/d05/tables/dt05_310.asp
Northouse, P.G. (2007). Leadership: Theory and Practice (3rd ed.). Thousand
Oaks, CA: Sage.
Ottenritter, N. (2006). Competencies for Community College Leaders: The next
Step. Community College Journal. No. 76 p.15-18.
Patton, M.Q. (2002). Qualitative Research & Evaluation Methods (3rd ed.).
Thousand Oaks, CA: Sage.
Polkinghorne, D.E. (2004). Practice and the human sciences: The case for a
judgment-based practice of care. Albany, NY: State University of New York
Press.
Public Policy Institute of California, (2007). California Counts: Can California
Import Enough College Graduates?
Ramaley, J.A. & Holland, B.A. (2005). Modeling Learning: The Role of Leaders. In
A. Kezar (Ed.), New Directions For Higher Education. No. 131 p.75-85.
Research and Planning Group for California Community Colleges. (2005).
Environmental scan: A summary of key issues facing California community
colleges pertinent to the strategic planning process: Retrieved January 20,
2006 from www.rpgroup.org.
Shulock, N. & Moore, C. (2007). Invest in Success: How Finance Policy Can
Increase Student Success at California’s Community Colleges. Institute for
Higher Education Leadership & Policy.
Shulock, N., Moore, C., Offenstein, J., & Kirlin, M. (2008). It Could Happen:
Unleashing the Potential of California's Community Colleges to Help
Students Succeed and California Thrive. Institute for Higher Education
Leadership & Policy.
Stake, R.E. (1995). The Art of Case Study Research. Thousand Oaks, CA: Sage.
Stanton-Salazar, R.D. (1997). A social capital framework for understanding the
socialization of racial minority children and youths. Harvard Educational
Review, 67(1), 1-40.
Stanton-Salazar, R.D. (2001). Manufacturing Hope and Despair: The School and Kin
Support Networks of U.S.-Mexican Youth. New York, Teachers College
Press.
The Education Trust (2005). ESEA: Myths versus Realities: Answers to common
questions about the new No Child Left Behind Act. Washington, D.C.
U.S. Census Bureau (2005) Current Population Survey: Community Survey.
Washington, D.C.
U.S. Department of Education (2006). A Test of Leadership: Charting the Future of
U.S. Higher Education. Washington, D.C., 2006.
Velez-Ibañez, C., & Greenberg, J.(2005). Formation and Transformation of Funds
of Knowledge. In Gonzalez, N., Moll, L.C. & Amanti, C. (Eds.), Funds of
Knowledge: Theorizing Practices in Households, Communities, and
Classrooms p.47-67. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Walvoord, B.E. (2004). Assessment Clear and Simple: A Practical Guide for
Institutions, Departments and General Education. San Francisco, CA:
Jossey-Bass.
Wiessner, C.A., & Sullivan, L.G. (2007). New Learning: Constructing Knowledge
in Leadership Training Programs. Community College Review. No. 35 p.88-
112.
APPENDIX A: INFORMED CONSENT FOR NON-MEDICAL RESEARCH
Center for Urban Education
Rossier School of Education
University of Southern California
INFORMED CONSENT FOR NON-MEDICAL RESEARCH
CONSENT TO PARTICIPATE IN RESEARCH
The Institutional Self-Assessment Project: Enhancing
Institutional Effectiveness and Equity (Inquiry and Assessment
– Phase Two)
You are asked to participate in a research study conducted by Alicia Dowd Ph.D.,
Estela Bensimon Ph.D., Robert Rueda Ph.D., and Tara Watford Ph.D., from the
Center for Urban Education at the University of Southern California. Information
collected in the study may be utilized for the following students’ dissertations: Mary
Javier, Alice Young-Singleton, Roberto Gonzalez, Martha Enciso, Seema Gaur,
Cristina Salazar and Sheryl Tschetter. You were selected as a possible participant in
this study because of your knowledge and experience in basic skills coursework at
your college. You must be at least 18 years of age to participate. Your participation
is voluntary. You should read the information below, and ask questions about
anything you do not understand, before deciding whether or not to participate.
PURPOSE OF THE STUDY
The focus of the Institutional Self-Assessment Project is to increase the transfer
readiness and actual transfer of academically under-prepared students who begin
their postsecondary education in basic skills courses. Teams of administrators,
faculty, and counselors at three Los Angeles area community colleges will
participate in an assessment process focused on (1) analysis of the college's student
transfer data, (2) an audit of the resources that support transfer functions, and (3)
identification of strategies that improve the information and counseling students
receive and the institutional policies that serve to increase transfer to the four-year
college. The goal is to increase each participating college's capacity to improve its
transfer effectiveness.
PROCEDURES
If you volunteer to participate in this study, we would ask you to be interviewed in
person. We would like you to tell us, in your own words, about your knowledge and
experience with the policies, practices, and programs that impact the transfer
readiness and ability to transfer of college students who start in basic skills courses.
This interview will last approximately one to two hours. If you permit, we will
record the interview conversations. You may refuse to have the interview recorded,
or at any time during the interview process request that the recorder be turned off.
POTENTIAL RISKS AND DISCOMFORTS
There are no foreseeable risks, discomforts, or inconveniences that may result from
your participation in the research.
POTENTIAL BENEFITS TO SUBJECTS AND/OR TO SOCIETY
You will not benefit from this research study. However, it is expected that your
institution will increase its capacity to improve its transfer effectiveness.
PAYMENT/COMPENSATION FOR PARTICIPATION
You will not receive payment for your participation.
CONFIDENTIALITY
Any information that is obtained in connection with this study and that can be
identified with you will remain confidential and will be disclosed only with your
permission or as required by law. When the results of the research are published or
discussed in conferences, no information will be included that would reveal your
identity.
Only members of the research team will have access to the data associated with this
study. Information collected may be released to the dissertation committee of
participating students. The supervisor of this dissertation is the Principal
Investigator, Alicia C. Dowd, Ph.D. The data will be stored in the investigator’s
office in a locked
file cabinet/password-protected computer. Research data and related records that
can be identified with you will be coded and stored to prevent access by
unauthorized personnel. If any other uses are contemplated, we will contact you to
obtain your
consent. The data will be stored for three years after the study has been completed
and then destroyed.
Recordings of interviews will be transcribed and then coded to shield your
identity. All audio recordings will be destroyed once the study has been completed.
PARTICIPATION AND WITHDRAWAL
You can choose whether to be in this study or not. If you volunteer to be in this
study, you may withdraw at any time without consequences of any kind. You may
also refuse to answer any questions you don’t want to answer and still remain in the
study. The investigator may withdraw you from this research if circumstances arise
which warrant doing so.
ALTERNATIVES TO PARTICIPATION
Your alternative is to not participate.
RIGHTS OF RESEARCH SUBJECTS
You may withdraw your consent at any time and discontinue participation without
penalty. You are not waiving any legal claims, rights or remedies because of your
participation in this research study. If you have questions regarding your rights as a
research subject, contact the University Park IRB, Office of the Vice Provost for
Research Advancement, Grace Ford Salvatori Hall, Room 306, Los Angeles, CA
90089-1695, (213) 821-5272 or upirb@usc.edu.
IDENTIFICATION OF INVESTIGATORS
If you have any questions or concerns about the research, please feel free to contact
Alicia C. Dowd: Principal Investigator, and Estela Mara Bensimon, Co-Investigator,
at 213-740-5202 or by mail at The Center for Urban Education, University of
Southern California, Rossier School of Education WPH 702, Los Angeles, CA
90089-4037.
SIGNATURE OF RESEARCH SUBJECT
I have read (or someone has read to me) the information provided above. I have been
given a chance to ask questions. My questions have been answered to my
satisfaction, and I agree to participate in this study. I have been given a copy of this
form.
Name of Subject
Signature of Subject Date
SIGNATURE OF INVESTIGATOR
I have explained the research to the subject and answered all of his/her questions. I
believe that he/she understands the information described in this document and freely
consents to participate.
Name of Investigator
Signature of Investigator Date (must be the same as subject’s)
APPENDIX B: INTERVIEW GUIDE #1
Introduction: I am interested in understanding the role administrators play in
creating change. Specifically, I am interested in knowing more about how
administrators identify problems and create solutions to impact student learning
outcomes.
Background Questions:
1. Could you tell me what a typical semester looks like for you?
2. How long have you been in your current role?
3. What prompted you to work in the field of higher education? Community
colleges?
Specialized knowledge of the impact of institutional policies on student achievement:
1. What were some things that came to mind when the data on student outcomes
in math was presented to you and the group?
2. Based on your knowledge and beliefs, why did some ethnic/racial groups do
better in math than others? Are there institutional factors contributing to
unequal student outcomes?
3. What did you think of the reaction of the rest of the team when the
achievement data was presented? Faculty reaction?
4. What does a successful student look like? Does this differ based on race and
ethnicity? What are some barriers students face on a day-to-day basis?
5. How do you identify problems within departments/divisions you oversee?
Walk me through your process.
6. Once you have a picture of the situation/problem how do you proceed?
Solutions? Implementation? Assessment?
7. Do you involve others in the process of problem defining, constructing
solutions, and assessing outcomes?
8. How has your participation in the Institutional Self-Assessment Project
impacted the way you view and deal with institutional problems dealing with
differential student outcomes?
APPENDIX C: INTERVIEW GUIDE #2
1. Why did African American and Latino students not pass the gateway math
course at the same rates as other students?
2. What insights did you gain from the data collected on successful course
completion rates?
3. Are there any other factors which contribute to the success rates of African
American and Latino students in these courses?
4. What processes or structures are in place at Hills College which encourage
on-going learning among faculty, staff, and administrators? What changes
were prompted to those processes or structures as a result of the findings from
the ISAP?
5. How was knowledge created from learning disseminated and to whom before
the ISAP? Was the knowledge used to make improvements in organizational
performance?
6. How was the insight gained from ISAP disseminated to the campus
community?
7. In what ways does Hills equip you with the tools to help you make meaning
of a situation or construct knowledge of problems students face in the pursuit
of their educational outcomes? How has the ISAP assisted in this area?
8. What are the main priorities for the institution when it comes to student
educational outcomes? Were you aware of the priorities before the ISAP?
9. What institutional strategies are in place to assist with the identification and
resolution of problems related to student educational outcomes? Has the
ISAP assisted in the development of institutional strategies to assist in this
area?
10. How would you characterize the fiscal stability of Hills? What is the
turnover rate among faculty, staff, and administrators?
11. Tell me about the roles/positions you have held in your professional career.
What are some distinguishing characteristics of Hills College?
12. Has your involvement in the ISAP increased your understanding of the
institution? How will you use your knowledge of the institutional
characteristics to create more equitable learning outcomes among all students
at Hills College?
13. What process did you go through, prior to the ISAP, to learn more about a
problem and implement solutions to resolve it? How has your involvement in the
ISAP changed how you implement solutions to problems?
14. Did you assess solutions you implemented prior to the ISAP? If yes, please
tell me more about timelines, data collection, data interpretation, and
adjustments to interventions. If no, how has your involvement in the ISAP
changed the way you assess programs you oversee?
Abstract
This manuscript details the learning that took place among four administrators involved in a collaborative Institutional Self-Assessment Project (ISAP), and the extent to which subsequent changes were informed by that learning. The purpose of the overall project was to enhance the effectiveness of educational institutions at addressing the diverse learning needs of students who enter the community college at the basic skills level. The study sought to determine how data contributed to administrators’ awareness and learning regarding inequitable educational outcomes among student groups. It was believed that, by using current data sources, administrators would be better able to gauge institutional performance and adjust institutional policies to increase student success. A qualitative case study approach found that administrators’ awareness and learning regarding the existence of inequitable educational outcomes increased, and that the changes that took place were grounded in that learning.
Asset Metadata
Creator: Gonzalez, Roberto (author)
Core Title: Achieving equity in educational outcomes through organizational learning: enhancing the institutional effectiveness of community colleges
School: Rossier School of Education
Degree: Doctor of Education
Degree Program: Education (Leadership)
Degree Conferral Date: 2009-12
Publication Date: 11/20/2009
Defense Date: 08/19/2009
Publisher: University of Southern California (original); University of Southern California. Libraries (digital)
Tags: Community Colleges; inequitable educational outcomes; OAI-PMH Harvest
Place Name: USA (countries)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Dowd, Alicia C. (committee chair); Cardoza, Raul J. (committee member); Rueda, Robert S. (committee member)
Creator Email: Gonzalez_Roberto@smc.edu; Rob_gonzz@yahoo.com
Permanent Link (DOI): https://doi.org/10.25549/usctheses-m2751
Unique Identifier: UC1498791
Identifier: etd-Gonzalez-3393 (filename); usctheses-m40 (legacy collection record id); usctheses-c127-282603 (legacy record id); usctheses-m2751 (legacy record id)
Legacy Identifier: etd-Gonzalez-3393.pdf
Dmrecord: 282603
Document Type: Dissertation
Rights: Gonzalez, Roberto
Type: texts
Source: University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Repository Name: Libraries, University of Southern California
Repository Location: Los Angeles, California
Repository Email: cisadmin@lib.usc.edu