DEVELOPING ADAPTIVE EXPERTISE AMONG COMMUNITY
COLLEGE FACULTY THROUGH ACTION INQUIRY AS A FORM
OF ASSESSMENT
by
Sheryl Tschetter
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
December 2009
Copyright 2009 Sheryl Tschetter
DEDICATION
This study is dedicated to all community college faculty professionals and
to the students they serve.
ACKNOWLEDGEMENTS
I would first like to acknowledge my husband, Ryan, who has supported
my educational goals unconditionally for a long time. He was the person who
encouraged me to embark on this academic journey 15 years ago, and I want to
thank him for his love and his faith in my abilities. This degree is truly a
collaborative effort. I share the honor of my degree with my two children, J. J.
(Jennifer) and Max. These two wonderful people bring great joy and contentment
to my life. I want to thank my father, Charles, and my brother, C.J. and his family,
for supporting me throughout this endeavor, and I know that my mother, Pat, my
sister, Jennifer, and my brother, Jeff, are cheering for me in heaven.
I would like to offer my thanks and appreciation to my dissertation chair,
Dr. Alicia Dowd. I would not have made it through the process without your
guidance; you are a tremendous asset to all those who will benefit from the work
being done at the Center for Urban Education at USC. I also have tremendous
gratitude for the members of my dissertation committee, Dr. Robert Rueda and his
guidance through educational psychology and Dr. John Levin and his work with
community college faculty. Thank you for your support and expertise.
Finally, I must acknowledge the support given me by the Norco campus of
Riverside Community College District, the community college campus where I
work. Without the understanding of my campus leadership, it might have taken a
lot more time to complete this study. In particular, I want to acknowledge the
support of Dr. Bonnie Pavlis, a colleague at Norco, for her unlimited support and
faith in my work.
TABLE OF CONTENTS
DEDICATION ii
ACKNOWLEDGEMENTS iii
LIST OF TABLES vii
ABSTRACT viii
CHAPTER 1: INTRODUCTION 1
    Community College Faculty 3
    California Community College Faculty 7
    Faculty Development for the Classroom 7
    Faculty Development through Shared Governance 11
    Accreditation Processes 15
    Institutional Planning, Program Review, and Learning Assessment 16
    Transferring Promising Practices 20
    Faculty Development Through Action Inquiry as a Form of Assessment 21
    Action Inquiry and the Center for Urban Education 25
    Statement of the Problem 27
    Purpose of the Study 28
    Research Question 29
    Importance of the Study 29
    Organization of the Dissertation 30
CHAPTER 2: CONCEPTUAL FRAMEWORK 33
    Reflection: An Overview 35
    The Three Elements of Reflection: An Introduction 37
        Experience 38
        Collaboration 41
        Examination of Personal Values and Beliefs 44
    Knowledge, Expert Performance and Deliberate Practice 46
    From Restrictive Expertise to Adaptive Expertise 49
    Action Inquiry 51
    The Importance of Reasoning in Increasing Adaptive Expertise 53
    Figure 1: Relationships between concepts leading to professional expertise 56
CHAPTER 3: METHODS 57
    Context 58
    Research Design 62
    Data Collection 62
        Observations 63
        Website and Document Audit 64
        Interviews 65
    Analysis 69
    Summary 73
CHAPTER 4: CASE NARRATIVES 74
    Overview 74
    Elizabeth 76
    Grace 78
    Beverly 82
    Randy 85
    Geraldine 88
    Summary 90
CHAPTER 5: RESULTS 93
    Center for Urban Education: An Overview of the Project 95
    Action Inquiry Stage One: Frame the Problem 97
        The “Hunches” Activity 97
        Analyzing Statistical Data 100
        Indicators of Adaptive Expertise 101
        Expanded Knowledge Base 101
    Action Inquiry Stage Two: Advocating for a Solution Using Concrete 104
        Observations 105
        Interviews 106
    Stage Three: Assess the Solution 108
    Discussion 110
        Experience: Barrier or Facilitator? 110
        Defensive Reasoning 111
        Productive Reasoning 113
    Conclusion 114
CHAPTER 6: CONCLUSION 118
    Personal Narrative 118
        Background 118
        A New Faculty Member 119
    Finding #1: Developing expertise by building on experience and knowledge 120
    Finding #2: Using faculty interest to drive development 121
    Finding #3: Using collaborative/team learning to develop faculty 123
    Finding #4: Providing the environment that results in productive reasoning 124
    Finding #5: The role of collaboration and leaders 126
    Finding #6: Using the practitioner-as-researcher model for developing faculty 127
    The impact of the action research project on my expertise 128
REFERENCES 130
APPENDICES 137
    APPENDIX A: CENTER FOR URBAN EDUCATION: FACULTY INTERVIEW GUIDE, TUTORING/LEARNING LABS & ASSESSMENT 137
    APPENDIX B: CENTER FOR URBAN EDUCATION FACULTY INTERVIEW GUIDE, ROUND 2, ACTION INQUIRY AND COMMUNITY COLLEGE FACULTY ADAPTIVE EXPERTISE 141
    APPENDIX C: COMPLETE RESULTS FROM “HUNCHES” ACTIVITY, Project Orientation Meeting with Urban Community College, “Hunches” Exercise Notes 144
LIST OF TABLES
Table 1: Complete list of project activities 68
Table 2: Relationships which research contends exist between action inquiry, elements of reflection, and positive resulting characteristics that indicate adaptive expertise 71
Table 3: List of participants’ activities in the project and months when interviews for this study took place 75
Table 4: Responses of faculty participants to analyzing statistical data 103
Table 5: A comparison of characteristics between defensive reasoning and productive reasoning 104
Table 6: Summary of responses regarding potential for change based on experience 110
ABSTRACT
This study examines the potential for developing the adaptive expertise of
community college faculty members through participation in action inquiry. Prior
research contends that for the research activities of action inquiry to result in
adaptive expertise, they must include the elements of reflection: experience,
collaboration, and examination of values and beliefs. Therefore, this study focuses
on the opportunities six full-time community college faculty members who
participated in such an inquiry project had for reflection. It then examines the
relationship between those reflective opportunities and self-reported changes in
faculty members’ perceptions of their expertise in instruction and shared
governance. The findings support the view that faculty participants in action
inquiry may gain expertise to make changes in either individual or institutional
practices as a result of their participation. However, those changes may well be
limited or circumscribed due to a lack of experience or knowledge of institutional
and governance processes.
CHAPTER ONE: INTRODUCTION
Today, community colleges face increasing enrollments, decreased funding,
changing demographics, multiple educational missions, and greater demands for
efficiency and productivity (Grubb, 1999; The RP Group, 2005; Levin, Kater, and
Wagoner, 2006). Meeting these challenges is important as students are enrolling in
increasing numbers in community colleges (National Center for Education Statistics,
Table 177). In California, the site of this study, the state’s Little Hoover Commission also
emphasizes the significance of community colleges; the report states that “for many
Californians, the community colleges are the gateway to self-sufficiency and a world
class education” (2000). However, the commission also recognizes that there are many
factors that contribute to student success, and one factor that is essential is the “quality of
teaching” (2005). This adds emphasis to the value of community college faculty
development.
In addition, demands for greater accountability combined with decreased
funding have resulted in expectations that faculty need to participate in multiple roles at
the institution. Inside the classroom, some faculty members teach different types of
classes such as basic skills education, transfer preparation to four-year institutions,
English as a second language learning, occupational certificates, workforce preparation,
vocational programs, community education, and associate degree preparation, and some
teach multiple levels of student preparation within these courses. Yet pre-service
preparation focuses on discipline content mastery rather than pedagogical expertise
(Grubb, 1999; Cranton, 1994), and faculty development programs often do not
address the teaching function of these colleges (Little Hoover Commission, 2005, p. 31).
This is problematic given that reports state most community college faculty in California “have little teaching experience or teaching skills when they are hired” (Little Hoover Commission, p. 26).
The increasing reliance of these institutions on part-time instructors for teaching
responsibilities adds to this problem. In fact, statistics indicate that part-time faculty in
academic teaching fields at two-year public institutions of higher education outnumber
their full-time colleagues 2 to 1 (National Center for Education Statistics, Table P62).
Outside the classroom, these part-time instructors are not generally required to participate
in any campus-based activities. Therefore, the faculty role in institutional decision
making and leadership falls to the small number of full-time faculty employed by these
institutions. Once again, without the pre-service training to handle their changing roles,
full-time community college faculty often find themselves overwhelmed by expectations
inside and outside the classroom that they are not prepared to meet.
In California today, full-time faculty members are expected to participate as
professionals in shared governance processes to meet increased demands for
accountability. Levin et al. (2006) define shared governance in the contemporary context
as “the organizational work in public community colleges which is shared between
faculty and administration, with the role of faculty going above and beyond their
traditional teaching roles” (p. 50). In California, these processes are legislated by
Assembly Bill 1725, which requires that each community college district board of trustees
confer with or reach agreement with the local faculty academic senate on 11 issues
related to teaching and learning. Additionally, Levin et al. argue that a by-product of the
challenges faced by community colleges is a faculty whose “professional work” is now
subject to “formal planning, systematic performance evaluation, centralized resource
allocation, and directive leadership” in ways it has not been before (p. 4). Therefore, in
California, full-time community college faculty members, whether teaching academic or
vocational courses, are expected to participate in an area for which they have not received
pre-service training. Beyond that, additional training through faculty development
programs tends to be limited because resources are often needed to react to other
challenges rather than to develop faculty. Research suggests that existing faculty
development programs tend to be “ad hoc, lacking in institutional support, and having
powerless coordinators” (Twombly and Townsend, 2008, p. 14). Therefore, faculty often
do not receive the training they need to carry out their roles (Grubb, 1999, pp. 294-301).
Further, they are not given the time necessary to independently develop the type of
expertise that would allow them to participate as professionals in reshaping the
“environments in which academic institutions carry out” their missions (Dill, 1999, pp.
127-128). This lack of development diminishes their capacity to exercise professional
judgment with regard to instruction and shared governance processes. Therefore, the
problem of increasing faculty expertise to participate as professionals in their multiple
roles poses a substantial challenge in community colleges today.
Community College Faculty
Recognizing how community college faculty members are perceived differently
from their four-year counterparts by researchers and others illuminates some of the
challenges these faculty members face in meeting the pressures of their multiple roles.
Studies (Twombly and Townsend, 2008; Spear, Seymour, and McGrath, 1992) discuss
how little attention has been paid to community college faculty members in research
conducted in post-secondary education. Then, when they are mentioned in research, they,
and their colleges and students, are often portrayed as “deficient” (Twombly and
Townsend, p. 8).
Twombly and Townsend (2008) summarize how community college faculty
members are characterized in the literature that does exist. There is recognition that 67% of community college faculty are employed part-time (p. 12). However, less noted is the fact that while two-thirds of all community college faculty are part-time, full-time
faculty members do the “bulk of the teaching,” an estimated two-thirds of all classes
taught nationwide (p. 12). Townsend and LaPaglia (2000) argue that since the primary
responsibility of community college faculty is teaching, they are often seen as
contributing little to scholarship. This results in their being seen as a “lesser class of college professors” by their four-year counterparts (p. 41). This view is reinforced by the fact that
community college faculty members are also paid less than their four-year counterparts,
earning approximately 81% of university faculty salaries (Twombly and Townsend, p.
13).
Another reason for this characterization of community college faculty is that
qualifications for community college faculty to teach academic courses are “typically a
master’s degree with 18 graduate hours in the teaching field.” The master’s degree is seen as less narrow than a doctorate yet as still providing the depth needed to teach associate’s
degree students, while faculty for vocational and technical fields “may require the
baccalaureate degree or less, when combined with work experience in the teaching field”
(Twombly and Townsend, 2008, p. 15). Applicants for positions at community colleges
tend to come from a variety of backgrounds while four-year faculties follow a more
traditional path to the faculty (p. 15). It is also suggested that there is a pecking order
among community college faculties where instructors who teach developmental courses
are seen as having “lesser status” than those who teach transfer-level courses (Twombly and
Townsend, p. 17).
In contrast to university faculty whose primary workload includes research,
community college faculty members’ work is clearly teaching. These faculty members
have an average teaching load of “five 3-hr courses per semester” (Twombly and
Townsend, 2008, p. 14). With teaching as their primary focus, community college faculty
members receive little incentive to conduct research, so they conduct very little. While
this may be viewed as a deficiency by some or as a diminishment of status, community
college faculty members tend to report satisfaction at a higher level than their four-year
counterparts. This may reflect satisfaction with the teaching role or the shorter work week
in community colleges (p. 14).
Asserting that teaching in the community college “fall[s] between high school
teaching and university teaching,” Spear, Seymour, and McGrath (1992) discuss the
challenges facing community college faculty specifically in terms of a lack of
professionalism and professional development. They contend that because teaching is so
clearly the main responsibility of community college faculties, drift occurs toward the
role of generic “effective teacher,” which they contrast with the drift that may be
observed among university teachers who become disengaged from the rigor of academic
disciplines (pp. 23-24). As a result of this drift, development workshops often focus on
improvements in teaching in a generalized manner, rather than a thorough review of
relevant research or disciplinary advances. Indeed, these researchers refer to the
“emerging anti-intellectual faculty culture [. . .] of practitioners” at community colleges
where, they assert, academic culture is disintegrating (p. 27). They conclude by arguing
that community college faculty members need to truly develop as a “new professoriate”
(p. 28) by developing professional skills.
Palmer (1992) contends that the potential for community college faculties to
create a new profession is complicated by “the administrative concerns of collective
bargaining, faculty burnout, and the continuation of faculty already hired” (p. 29). He
argues that one way to increase professionalism is through systematic assessments
leading to improved instruction and student learning (1992, p. 30), similar to the
processes that are part of shared governance at these institutions. However, Townsend
and Twombly (2008) acknowledge “little is known about the role shared governance
plays in the work lives of community college faculty members and its importance to
them” (p. 16). From the studies that have been conducted, researchers argue that generally the processes of shared governance serve the interests of management and the institution, but not necessarily those of faculty (Levin et al., 2006; Thaxter and Graham, 1999).
In summary, studies indicate that while community college faculty are often
viewed as “deficient,” there are opportunities for increasing their professionalism, and
one of those opportunities is through participation in systematic assessments such as
those found in shared governance processes.
California Community College Faculty
One area where community college faculty members in California differ from
their national counterparts is through the power of their academic senates. Assembly Bill
1725, enacted in 1988, was designed to allow faculty “a stronger position within each
college where they would share authority in specific areas of college activity” (White,
1998). The bill was originally conceived with three goals in mind: 1) develop governance
systems that were collegial; 2) increase the power of local academic senates; and 3)
separate the colleges from their K-12 “roots” by moving toward a higher education model
(White). In what is known as the “10 plus 1” rule, AB 1725 requires trustees of the
college districts to confer with or reach agreement with the local senates on 11 issues
related to student learning. The current understanding of AB 1725 has evolved over the
last two decades, resulting in greater power for faculty in the community college system
of California. Consequently, to meet this larger role, community college faculty in
California have an even greater need than their counterparts in other states for the skills
necessary to participate effectively in shared governance.
Faculty Development for the Classroom
A significant challenge today for community colleges and their faculties is
improving pedagogical expertise for the sake of enhancing student learning. This has
become increasingly difficult as more and more students are arriving at community
colleges without the preparation to succeed in college-level work. Estimates indicate that
as many as 64% of students attending community colleges need at least one remedial course,
while 63% of these students average one year or more of remedial course work (Melguizo,
Hagedorn, and Cypers, 2007; Venezia, Kirst, and Antonio, 2006). The large range of student
preparation results in some community college faculty members providing instruction at both
the transfer and remedial levels, often at the same time. Also, transfer-level classes with no prerequisites enroll a growing percentage of students who are underprepared to work successfully at the college level. Faculty members are expected to provide
discipline-specific content instruction that takes into consideration varying levels of
student preparedness as well as diverse cultural backgrounds and language issues.
Furthermore, teaching an identified remedial or basic skills course is often considered a “low
status” teaching assignment, but the teaching “is no less difficult and absorbing than the other
forms of teaching” (Grubb, 1999, pp. 171-174). To overcome these issues in the classroom,
community college faculty members clearly need additional pedagogical knowledge and
expertise to meet new challenges in their role as teachers.
In addition to the issue of student preparation, community college faculty
members are expected to provide instruction that encompasses the diverse cultural and
social differences of their students. In California, the typical demographics of the
community college student population are comprised of white non-Hispanic students
(34%), Hispanic students (31%), Asian students (11.5%), African-American students
(10%), Filipino students (3%), Pacific Islander and American Indian students (less than
1% each), and unknown (10%) or other (1.8%) ethnicities (California Community
Colleges Chancellor’s Office website, 2009). To accommodate student diversity,
instructors need training to help them develop the expertise to deal with multiple domains
and multiple contexts of learning.
To understand this concept further, Alexander (2003) defines academic domains
as the organizing of “vast bodies of related knowledge and experience and important
cultural tools” that enable instructors to develop expertise in academic domains through
the acquisition of knowledge, strategic processing, and personal interest (p. 10). There
are two types of expertise related to academic domains: restrictive and adaptive. Research
on knowledge related to expertise in teaching has been traditionally focused on teaching
in the K-12 levels of education (Berliner, 2004, p. 203) where development of expertise
is generally domain-specific, contextually bound, and acquired over a significant length
of time (Dreyfus, 1986; Nunn, 2008; Farrington-Darby and Wilson, 2005; Berliner).
These traits do not necessarily reflect the multiple domains and contexts experienced by
community college faculty. In order to understand and respond to current pressures on
faculty to address issues of diversity and multiple learning styles, it is timely to consider
research on teaching expertise as well as to explore ways to extend this research to the
community college setting.
Berliner’s (2004) empirical study on the behaviors of expert K-12 teachers
identifies certain characteristics of context-specific expertise in teaching, also known as
restrictive expertise (pp. 200-201). As Berliner sees it, restrictive expertise is acquired
when teachers develop automaticity, yet remain sensitive to specific classroom contexts.
Expert teachers with restrictive expertise recognize patterns in the academic domain
quickly and develop “pedagogical expertise” through time and experience (p. 201). From
this study, it seems that becoming an expert generally takes at least seven years of
teaching. During this time, there are three influences that help teachers develop restrictive
expertise. These include 1) a desire to be excellent, 2) good coaching, and 3) practice
(Berliner, p. 202). Restrictive expertise develops over considerable time and requires
consistent practice in stable, contextualized environments. It occurs by teaching the same
domain knowledge in the same context to the same kinds of students over a period of
time, generally an academic year. Restrictive expertise requires this stability in academic
domain and context because without it, when an expert is moved out of her normal
context, where she has intimate knowledge of the cognitive abilities of her students, the
result can be a loss of expertise (Berliner, p. 203). In other words, restrictive expertise is
not considered transferable.
Community college faculty members do not have the opportunity to achieve a
similar stability in academic domain and context as their K-12 counterparts. Community
college faculty members experience multiple levels of student preparation in different
courses within a semester, which is generally considerably shorter than a full academic
year. Specifically, the expert community college instructor adapts pedagogy in response
to a variety of elements (such as gender, age, culture, college preparedness, race, and
educational goal) in a very short time, usually 14-16 weeks. Given this information,
community college faculty members could be more effective if exposed to activities that
increase their capacity for what Berliner calls adaptive expertise (Berliner, 2004, p. 203).
Adaptive expertise refers to an individual’s ability to recognize and activate an
ever more complex system of connected problems by acquiring conceptual
understanding of the “target domain” in order to select an appropriate solution to a given
problem (Schon, 1983, pp. 15-16; Berliner, 2004, p. 203). This kind of adaptive expertise
is of particular importance for community college faculty members who need to move
between levels of student ability and sometimes among multiple levels of courses (basic
skills to transfer) within the discipline-specific domain that they teach. In addition,
adaptive expertise can help community college faculty members serve students more
effectively in the classroom through instruction and in shared governance where they can
advocate on behalf of students. In short, exposure to activities that offer opportunities for
faculty members to acquire adaptive expertise can help faculty members improve student
learning in the classroom. Additionally, increasing adaptive expertise in faculty members
may help these individuals participate more effectively in core institutional assessments
to reshape the educational environment (Dill, 1999; Levin et al., 2006).
Faculty Development through Shared Governance
External public demands for assessing the quality of student learning in higher
education are increasing. One result of the increased demand for accountability is that
institutions of higher education are increasingly calling upon their full-time faculties, as
visible institutional actors, to participate in assessments that impact the “core of the
institution, a core that includes the fashioning as well as the implementation of mission”
(Levin et al., 2006, p. 141). However, in general, these institutions have not addressed the
need for faculty to acquire the skills to participate as professionals in these assessment
processes. More relevant to this study is that in California, faculty participation in
assessments of the institution’s effectiveness in areas of learning is legislated through
Assembly Bill 1725 (Academic Senate of California Community Colleges).
Assembly Bill 1725 provides legislative expectations that the expertise of faculty
will be used in developing college policies in areas affecting student learning (“How
College Governance Affects You”). These areas include all facets of curriculum, degree,
and certificate requirements as well as grading, program development, academic
standards, accreditation and other governance structures, professional development,
program review, institutional planning and budget development, and other mutually
agreed upon areas (“How College Governance Affects You”). Therefore, in California,
faculty participation is assumed to be a significant part of all institutional assessment
processes commonly falling under the rubric of shared governance.
As mentioned earlier, Levin et al., (2006) define shared governance in the
contemporary context as “the organizational work in public community colleges which is
shared between faculty and administration, with the role of faculty going above and
beyond their traditional teaching roles” (p. 50). Expectations for full-time faculty
participation are particularly emphasized in the processes of accreditation self studies,
institutional planning, program review, and student learning outcomes assessment. As
expectations for faculty participation in governance have increased, it is worth considering whether participating in these assessment processes might increase professional skills.
The American Association for Higher Education (AAHE) offers the following
definition of assessment:
Assessment is an on-going process aimed at understanding and improving
student learning. It involves making our expectations explicit and public;
setting appropriate criteria and high standards for learning quality;
systematically gathering, analyzing, and interpreting evidence to
determine how well performance matches those expectations and
standards; and using the resulting information to document, explain, and
improve performance. When it is embedded effectively within larger
institutional systems, assessment can help us focus our collective
attention, examine our assumptions, and create a shared academic culture
dedicated to assuring and improving the quality of higher education.
This definition suggests that assessment is intended to engage participants in an ongoing
systematic process of setting goals, gathering data, analyzing data, and using data to
understand and to improve student learning. Ideally, participants incorporate transparency
(“explicit” and “public”), sustainable quality, and reflective practice (“examine our
assumptions, and create a shared academic culture”) as a collective. Indeed, in California
especially, this definition assumes faculty participation in assessment and such
participation may have the potential to increase knowledge and expertise (“using the
resulting information to document, explain, and improve performance”) by incorporating
elements of reflection, such as examining assumptions (Barnett, 1995; Birmingham,
2003; Nunn, 2008; Dewey, 1910; Rodgers, 2002; Polkinghorne, 2004; Reason, 2005;
Greenwood and Levin, 2005). Therefore, these activities can provide opportunities for
faculty to help increase expertise to reshape core processes linked to improvement of
pedagogy, which in turn enhances student learning.
However, there are conflicts inherent in many assessment processes that limit the
benefits of faculty participation. At times, participation is perceived by faculty as “an
increase in faculty work and responsibility for the management of the institution” rather
than an opportunity for “joint decision making” (Levin et al., 2006, p. 50). There is a
difference between “sharing in decision making” and “faculty participation in decision-
making” (Levin et al., p. 50), and it is the nuanced distinction between these two
understandings that speaks to the tension often embedded in the processes of shared
governance. In addition, faculty interest is a critical element for increasing knowledge
and expertise through participating in assessments. Faculty members who gain the most
from participating in these processes generally have a personal investment in the domain
being assessed, or at minimum, they have a general interest that causes them to access
additional knowledge (Alexander, 2003, p. 11). Others, in fact most faculty, may not
benefit because they do not agree with the purpose of the processes, they are not
interested in the focus of the assessments, or they lack the expertise to participate as
professionals in assessments.
Slevin (2001) discusses the lack of faculty expertise for participating in
assessment (p. 291). He points out that the vocabulary that dominates discussions of
assessment is foreign to most faculty members, whose primary background is vocabulary
that is discipline-specific. As Grubb (1999) indicates, instruction is isolating and
fragmented, which contrasts with assessment concepts of “collaboration” and
“teamwork” (Slevin, p. 291). Slevin then cites the Association of American Colleges and
Universities (AACU) statement on assessment that “The curriculum as a whole and the
institution as a whole are the most powerful teachers” rather than the faculty themselves
(Slevin, p. 291). According to Slevin, this statement moves faculty “out of the picture of
higher education” (p. 291) at a time when their contributions are needed most. Therefore,
in order to bring faculty back in, Slevin recommends reframing faculty participation in
assessment by allowing faculty to participate in those assessments in which they are most
interested or have the greatest expertise. Slevin contends that by allowing faculty to select
assessments that involve intellectual work of interest to them, they will participate
more readily, and they will design assessments that can then be “studied and reviewed
with rigor according to norms generally recognized in the academy” (Slevin, p. 298). The
reframing of faculty participation as Slevin suggests might mitigate the conflicts that
exist in assessments generated by external agencies, but to date, there is no evidence that
such reframing has occurred in any assessment process, at least in California’s
community colleges.
Accreditation Processes
An assessment process common to most community colleges is the development
of an evaluative self study (part of shared governance) for the purpose of acquiring or
maintaining accreditation. Within the definition of assessment provided by AAHE (see p.
12), faculties, as institutional stakeholders, participate in the development of a self study
by working collaboratively to collect evidence in support of a study in which the
institution’s performance against predetermined standards is analyzed and reported. In
the West, including the community colleges in California, the Accrediting Commission
of Community and Junior Colleges (ACCJC), a part of the Western Association of
Schools and Colleges, is the accrediting body. ACCJC believes:
The primary purpose of an ACCJC-accredited institution is to foster
learning in its students. An effective institution ensures that its resources
and processes support student learning, continuously assesses that
learning, and pursues institutional excellence and improvement. An
effective institution maintains an ongoing, self-reflective dialogue about
its quality and improvement. (ACCJC, 2002)
The ACCJC definition of an effective, accredited community college indicates that
improved student learning is the primary focus of institutional assessment. To this end,
resources, including community college faculties and support areas, are to engage in
continuous improvement of all aspects of student learning, including self-reflective
dialogue (ACCJC). As mentioned previously, in California, this expectation is reinforced
by Assembly Bill 1725, which mandates that faculty be consulted in areas affecting
student learning. However, the requirement for assessing multiple levels of the institution
for purposes of accreditation is externally driven, and this often leads to tension within
faculty (Young, 1997; Ewell, 1991; Alexander, 1998; Alexander, 2000; Leveille, 2005).
Specifically, some faculty members view this type of externally imposed assessment as
linked to institutional effectiveness, not learning. As such, faculties see this type of
assessment as in conflict with cherished faculty beliefs such as academic freedom and the
sanctity of the classroom (Leveille; Young). Therefore, since accreditation is an
externally driven assessment process, participating in it or in the additional assessment
processes embedded in accreditation may not result in increased knowledge and expertise
because of such faculty conflicts.
Institutional Planning, Program Review, and Learning Assessment
Accreditation is a broad process that incorporates other assessment processes
including institutional planning, program review, and student learning outcomes
assessment (ACCJC, Accreditation Standards, 2004). Institutional planning is the largest
of the three assessment processes in accreditation. In institutional planning, accreditation
standards expect participation from all institutional stakeholders in assessments of areas
such as resource allocation, system effectiveness, resource development, and leadership
and governance (ACCJC, 2004; California System Strategic Plan, 2006). A critical
process embedded in institutional planning is program review.
Program review was implemented in California by the California Academic
Senate to provide faculty members the opportunity to learn basic professional standards
regarding “accreditation, general and categorical accountability, and community
education needs” (California Community Colleges, Academic Senate, 1996, p. 1).
Program review also provides faculty with a systematic process for assessing
institutional effectiveness in instruction, programs, and services through participation in
evidence-based analysis and evaluation, a process that fosters “self-renewal and self-study”
through the accreditation process (California Community Colleges, Academic Senate, p.
9). Using program review processes, faculty can learn how to “monitor and pursue the
congruence between the goals and priorities of the college and the actual practices in the
program or service” (California Community Colleges, Academic Senate, p. 9). By
participating in program review, faculty members can increase their institutional
knowledge and expertise in order to impact core functions of the institution. Because
program review was developed by faculty for faculty, it is logical to assume that faculty
conflicts would decrease and opportunities to increase expertise would be greater.
However, most faculty do not have a working knowledge of “accreditation, general and
categorical accountability,” or other steps in the program review process. Additionally,
unless decision making authority is included, faculty interest may be narrow, thereby
limiting the extent to which knowledge and expertise can be increased (Levin et al., 2006,
p. 50; Alexander, 2003, p. 11).
Program review is also linked to other areas of assessment at community colleges
including educational planning, accreditation, budget and capital expenditures,
curriculum, student equity, and regional academic planning (California Community
Colleges, Academic Senate, 1996, pp. 14 – 20). These other areas of assessment offer
additional exposure to important processes for increasing faculty institutional knowledge
and expertise. However, unless faculty have the professional skills necessary to
participate or they are interested in the areas of the assessments, the value of participating
is questionable. A very important assessment process that is generally aligned with
program review, learning assessment, offers opportunities for increasing both
pedagogical and institutional expertise, particularly in instruction, but it comes with
significant tensions.
To understand the significance of assessing student learning, it is important to
understand how these terms are defined. For this study, the definitions originate in
California. A generally accepted definition of learning outcomes states that they are “the
specific measurable goals and results that are expected subsequent to a learning
experience. These outcomes may involve knowledge (cognitive), skills (behavioral), or
attitudes (affective) that provide evidence that learning has occurred as a result of a
specified course, program activity, or process” (Bakersfield College, March 2009).
Additionally, the definition of learning assessment from the California Community
Colleges Assessment Institute states:
Learning assessment refers to a process where methods are used by a
faculty member, department, program or institution to generate and collect
data for evaluation of and improving student learning. This term refers to
any method used to gather evidence and evaluate quality and may include
both quantitative and qualitative data. (2009)
Therefore, learning assessment processes connect to elements embedded in the AAHE
definition of assessment (see p. 12) and the accreditation standards of ACCJC (see p. 15).
As such, this important assessment process offers opportunities for exposing community
college faculty members to a variety of paradigms for evaluating the extent to which
changes in pedagogy improve student learning, which can result in increased expertise.
Additionally, by participating in assessing all issues of learning, faculty members can
acquire an expanded knowledge of the institution and areas outside their normal
environment. However, assessing learning is as complex as learning itself. Therefore, the
challenge remains of helping community college faculty members acquire the expertise
to develop the methods needed to assess learning.
In addition, learning assessment causes tensions for some faculty members
because of the perceived evaluative nature of this type of assessment. The tensions are
exacerbated when learning assessments are required by external agencies (Young, 1997;
Ewell, 1991; Alexander, 1998; Alexander, 2000; Leveille, 2005). More specifically,
faculty members who recognize that the nature of their work is changing and their
autonomy is decreasing (Levin et al., 2006), sometimes view external accountability as
“regulations and incentives from outside an educational institution that try to hold the
institution (or parts of an institution [. . .]) responsible for various dimensions of quality”
(Grubb and Badway, 2005, p. 2). In short, faculty believe learning is a complex process
and that externally driven assessment does not take into consideration the multiple
elements that faculty cannot control, such as extrinsic and intrinsic student motivation,
socioeconomic challenges, language difference, and other factors. Since the quality of
instruction is hard to describe (Grubb, 1999), faculty members often participate in
learning assessment superficially or not at all. Therefore, learning assessments as defined
here, especially those demanded by external agencies, may not after all be an effective way
to expose faculty members to opportunities for acquiring the skills necessary to
contribute effectively.
Transferring Promising Practices
Another possibility for increasing professional expertise is through the transfer of
promising practices. While there is no single definition for the transfer of promising
practices, one study from the field of business seems to reflect an understanding of this
process as having three steps: 1) Develop an ongoing process to identify the best or
promising practices; 2) Determine which traits to transfer to your organization and
implement these; and 3) Develop a process that institutionalizes the best practices across
the institution and continue to monitor for improvement (Dehoff, Hauser, F. Jones, L.
Jones, Neilson, Park, Shorten, and Spiegel, “Best Practices Transfer: Unleashing the
Value Within,” 2001). The transfer of promising practices is recommended in California
by two documents. The increasing numbers of students arriving at community colleges
underprepared for college-level instruction have resulted in the Basic Skills Initiative
(Center for Student Success, 2007). The Basic Skills Initiative and the state implemented
accountability system, called Accountability Report for Community Colleges, encourage
the dissemination of “best or promising practices” between institutions as a model
method for improving institutional effectiveness. However, transferring best practices, for
a number of reasons, is an extremely difficult method of trying to improve an institution,
and it may not be a productive way of increasing faculty knowledge and expertise (Clark
and Estes, 2002; Dehoff, et al., 2001; Love, 1985; McKeon, 1998; and Tucker, 1995).
Many scholars have pointed out the challenges that exist on multiple levels to
transferring promising practices (Clark and Estes, 2002; Dehoff et al., 2001; Love, 1985;
McKeon, 1998; and Tucker, 1995). For example, researchers recognize that transfer of
educational practices cannot occur merely by adopting an existing program. Transfer
requires careful selection of the specific traits in a program that are likely to help the new
institution be successful (McKeon, 1998). For these reasons, it is important to question
whether community college faculties in California can increase their expertise and
knowledge by engaging in the transfer of promising practices since it is such a complex
endeavor.
Faculty Development through Action Inquiry as a Form of Assessment
In contrast to the assessment processes previously discussed, community-based
action inquiry “provides people with the means to take systematic action to resolve
specific problems” (Stringer, 1999, p. 17). It is built on the collaboration of professional
researcher and local stakeholder to solve a real problem in a real context (Greenwood and
Levin, 2005, p. 54). Action inquiry evolves from a classical technique called phronesis,
another term for reflection, which originates with Aristotle’s work Nicomachean Ethics.
Several studies (Greenwood and Levin, p. 52; Polkinghorne, 2004, p. 117) define
phronesis as the designing of action through collaborative knowledge construction
among stakeholders in response to an indeterminate situation, or problem. An
indeterminate situation occurs when a practitioner’s actions fail. This failure provides an
important opportunity for advancing one’s practical knowledge (Polkinghorne, p. 121;
Nunn, 2008; Dewey, 1910; Rodgers, 2002), which can lead to another indeterminate
situation, which in turn leads to the process all over again. Rodgers equates reflection
with inquiry and offers Dewey’s four criteria for reflection. In general, Dewey believed
reflection requires 1) a meaning-making process; 2) a systematic, rigorous way of
thinking; 3) interaction with others; and 4) open-mindedness for growth (Rodgers, p.
845). Therefore, action inquiry is internally driven to solve a problem that involves
reflection and leads to action through phronesis. As a result, this type of assessment
seems to negate some of the tensions caused by other assessment processes, and perhaps
also offers the best method for helping faculty acquire the characteristics of adaptive
expertise mentioned earlier.
Research (Stringer, 1999; Greenwood and Levin, 2005) discusses how action
inquiry, through phronesis and reflection, differs from other assessments discussed in this
study. Action inquiry takes into account “history, culture, interactional practices, and
emotional lives” (Stringer, p. 17). Action inquiry is a “practice-oriented way of knowing”
that asserts “no results ever will be achieved unless local actors learn how to act in
appropriate and effective ways and use suitable tools and methods,” in other words,
increase their expertise (Greenwood and Levin, p. 51). To increase institutional
knowledge and pedagogical expertise, participants engage in ongoing assessment of
“knowledge generation” and “experimentation in context” performed by the group
(Greenwood and Levin, p. 53) by participating in three steps: 1) framing the problem; 2)
collecting evidence to advocate for a solution; and 3) implementing a solution and
assessing its effectiveness. Knowledge construction is facilitated through “observations,
reflection, planning, and review” on an ongoing basis throughout the research period
(Stringer, p. 158). These activities reflect elements found in the American Association of
Higher Education’s (AAHE) definition of assessment (see p. 12), which calls for a
systematic “gathering, analyzing, and interpreting evidence to determine how well
performance matches [. . .] expectations and standards” (American Association for
Higher Education). The activities also relate to the standards put forth by the Accrediting
Commission of Community and Junior Colleges (ACCJC, see p. 15). However, action
inquiry is generally not externally driven. Therefore, the nature of action inquiry
eliminates many of the conflicts articulated by faculty against other assessments.
Recent studies discuss action inquiry as a viable way for improving institutional
effectiveness and student learning (Bensimon, 2007; Dowd, Malcom, Nakamoto,
Bensimon, 2007; Rueda and Marquez, 2008). More importantly, action inquiry activities
include local actors, particularly faculty members, interested in resolving a specific local
problem (Stringer, 1999, p. 17). Therefore, the element of faculty intrinsic interest—vital
to increasing knowledge and expertise—is achieved (Alexander, 2003, p. 201).
Significantly, the activities in action inquiry include many of the elements embedded in
the definition of assessment provided by the American Association of Higher Education
(AAHE). These include collaboration, transparency, goal setting, data collection and
analysis, and reflection. Consequently, participating in action inquiry assessment
activities seems to offer a viable way of increasing faculty expertise.
Action inquiry has two objectives: 1) to produce knowledge and action that is
useful to a group of people; and 2) to empower people through the construction and use
of their own knowledge (Reason, 2005, p. 328). A third and important condition of
participatory action inquiry is that there be authentic commitment and genuine
collaboration (Reason, p. 328). These traits of usefulness, empowerment, commitment,
and genuine collaboration, along with participant interest, reinforce the potential of action
inquiry to increase participant knowledge and expertise (Slevin, 2001; Alexander, 2003).
Action inquiry is a different way of constructing knowledge through research
activities such as “setting agendas, data gathering and analysis, and controlling the use of
outcomes” (Reason, 2005, p. 329). It requires “emergent processes of collaboration and
dialogue that empower, motivate, increase self-esteem, and develop community
solidarity” (Reason, p. 329). These processes are of particular importance to community
college faculty participants who often are not trained in the activities of research. These
activities can result in increased self-esteem and greater empowerment that can lead to
opportunities for participants to increase knowledge and expertise, so they will act to
solve a problem. In fact, the success of the action inquiry project is determined by the
“willingness of local stakeholders to act on the results of the research” (Greenwood and
Levin, 2005, p. 54).
Action Inquiry and the Center for Urban Education
The Center for Urban Education (CUE) at the University of Southern California
has developed multiple projects with various California community colleges using action
inquiry activities between 2003 and 2008. CUE contends that “becoming an expert and
reflective practitioner comes about from participation in facilitated rigorous assessment
activities of how basic skills and transfer are ‘done’ in one’s own campus and in peer
campuses” (Urban Community College Interim Report, 2008). To this end, CUE’s
projects include activities founded in action inquiry, and they are collaborative and
community based. Some of these action inquiry projects focus on the potential of
community college practitioners to become researchers at their local institutions by using
the practitioner-as-researcher model. The practitioner-as-researcher model calls for
principles found in the concept of phronesis. As stated earlier, phronesis is defined as the
design of an action plan in response to an indeterminate situation, or problem
(Greenwood and Levin, 2005, p. 52; Polkinghorne, 2004, p. 117). The practitioner-as-
researcher model helps participants recognize the importance of asking the right
questions, understanding the role of power, fostering equity-mindedness, and viewing
change as “multidimensional” (Bensimon, 2007, pp. 457-458). This model is particularly
important for faculty development because it relies on collaborative activities, and it
emphasizes the role of community college instructors as the “target of change or
intervention,” which is rare in higher education research (Bensimon).
(The Center for Urban Education (CUE) is a research and action center whose mission is to
conduct research that will result in the creation of enabling institutional environments for
children, youth, and adults from socially and economically disenfranchised groups residing
in urban settings.)
Through these ongoing projects, the Center for Urban Education (CUE) helps
community college practitioners identify a local problem, gather data, determine an
effective solution, and assess the process using tools typically used by researchers. Some
of the tools developed by CUE for this study included the interview and observation
protocols used by faculty participants in collecting data and the quantitative benchmark
data on student success. Additional data, both quantitative and qualitative, were collected
from the California Accountability Report for Community Colleges (ARCC), ongoing
institutional self assessments, student equity plans, and accountability assessment
processes such as those found in the development of an accreditation self study.
Participation in these types of collaborative action inquiry activities provided
opportunities to increase institutional knowledge and pedagogical expertise in faculty
(Bensimon, 2007; Bensimon, Polkinghorne, Bauman and Vallejo, 2004; Dowd, 2005;
Dowd and Tong, 2007; Greenwood and Levin, 2005; Polkinghorne, 2004; Reason, 2005;
Stringer, 1999). Working collaboratively as researchers at their own institutions may help
faculty participants increase professional skills.
In this study, the collaborative nature of the inquiry teams composed of CUE
facilitators and local faculty members was significant in that it offered a way to mitigate
barriers such as assessment vocabulary and concepts discussed earlier in this chapter
(Slevin, 2001, p. 291). Working collaboratively, faculty members were exposed
repeatedly to the concepts, contexts, and language normally found in assessment
processes. In addition, the focus of the projects on a local problem identified by faculty
participants reinforced faculty interest, an important element in developing faculty
professionalism also presented earlier in this chapter (Alexander, 2003). As a result,
faculty who participated in one or more of CUE’s projects had opportunities to increase
their familiarity with assessment processes and work on an issue of interest for them,
overcoming two barriers to increasing institutional knowledge and pedagogical expertise.
In addition, these faculty participants had access to considerable resources developed by
CUE for the projects. As a result, it is logical to assume that exposure to these types of
activities can lead to increased characteristics of adaptive expertise in community college
faculty. Research (Greenwood and Levin, 2005, p. 54; Reason, 2005, pp. 325-326)
suggests that a primary way for determining the effectiveness of action inquiry is through
the implementation of change. Therefore, this study looks for self-reported acts or
changes either in pedagogy or contributions to shared governance processes that exhibit
characteristics of adaptive expertise by faculty participants as a result of engaging in the
collaborative activities of action inquiry.
Statement of the Problem
Faculty members in California community colleges face significant challenges in
their work. As previously stated, these practitioners are often expected to have the
appropriate pedagogical expertise to improve student learning through instruction in
multiple domains and multiple contexts. They are also expected to have acquired
considerable knowledge of institutional operations in order to contribute as professionals
in the assessment activities inherent in shared governance (accreditation, institutional
planning, program review, and student learning outcomes assessment). Unfortunately,
most community college faculty members have not been trained to respond to these
increasing challenges. Of perhaps greater concern is that while shared governance
processes offer additional opportunities for developing expertise, these assessment
processes, especially when driven by external demands, can result in a negative response
from faculty that impacts participation. As a result, although faculty need development
opportunities to hone their expertise, research indicates that shared governance,
traditional faculty development methods, such as one-shot workshops (Grubb, 1999, p.
298), and transferring promising practices (Clark and Estes, 2002; Dehoff et al., 2001;
Love, 1985; McKeon, 1998; and Tucker, 1995) do not lead to sustainable improvement in
faculty expertise. Consequently, through the process of elimination, the most promising
faculty development opportunity appears to be participating in action inquiry activities
where faculty interest, a critical element for acquiring skills, is high, and faculty
practitioners work in collaboration with researchers to solve a local problem (Alexander,
2003, p. 201; Slevin, 2001, p. 298). Some research does exist in this area (Bensimon,
2007; Dowd, Malcom, Nakamoto, Bensimon, 2007; Rueda and Marquez, 2008), but
more is needed.
Purpose of the Study
This study investigates whether or not full-time faculty who participated in an
action inquiry project with researchers from the Center for Urban Education (CUE)
perceived an increase in their adaptive expertise as a result of their participation in the
project. The characteristics under investigation include: 1) an increase in knowledge and
understanding of a problem; 2) improved reasoning skills, which often involves moving
from defensive to productive reasoning; and 3) increased self-empowerment to effect
change. Lastly, the study seeks to identify particular aspects of the project activities that
may have been the impetus for changes reported by the participants.
Research Question
This study will attempt to answer the following question: Does participating in
collaborative action inquiry activities as a form of institutional assessment result in
increased perceptions of expertise in faculty participants?
1. If so, what aspects of action inquiry are associated with changes in the way faculty
view their expertise and what is the nature of those changes in perception?
2. If not, what inhibits changes in how faculty view their expertise?
Importance of the Study
This study focuses on the potential to increase faculty adaptive expertise through
various types of institutional assessments within an action inquiry project. Participation in
action inquiry and its potential relationship to increased adaptive expertise is an especially
important topic for community colleges where faculty members are increasingly asked to
participate as professionals in multiple domains of academic knowledge within multiple
contexts that often overlap. Additionally, this literature review found no research that
examines faculty development through the lens of expertise. As community colleges
continue to educate the largest percentage of undergraduate students in the United States
during their first two years, it is important that they and their faculties are prepared to
handle new and shifting challenges. Specifically, the study discusses any self-reported
increases in characteristics of adaptive expertise by a purposeful sample of faculty who
acted as researchers in an action inquiry project facilitated by the Center for Urban
Education (CUE). As such, this study contributes to the growing body of case study
research focused on community college faculty and their preparation to assume multiple
professional roles that has been undertaken by CUE since 2003. This study also
contributes, in general, to expanding research being done on community colleges and
their faculties, and it may also inform future research regarding faculty development and
expertise whether the faculty members studied reported an increase in their expertise or
not.
Organization of the Dissertation
This study is premised on the need for community college faculty members to
increase institutional knowledge and pedagogical expertise to participate as professionals
in multiple roles. The challenges facing today’s community college faculty members are
significant. They are expected to improve student learning in two ways, through
instruction and institutional assessments. The professional development of these faculty
members through exposure to activities leading to increased expertise is considered in the
hope that readers who may be interested in changing the pre-service preparation and post-
hire development of these practitioners can act with greater awareness of the issues
involved.
The study is presented in six sections. The first section frames the problem and
introduces an understanding of adaptive expertise. It overviews a variety of opportunities
currently available that can lead to increased expertise in community college faculty
members. Discussion centers on the value of participating in various types of assessments
of institutional areas, transfers of promising practices, and action inquiry activities for
increasing expertise. The discussion concludes that the method with the greatest potential
for increasing adaptive expertise may be participation in action inquiry activities included
in ongoing projects between the USC Center for Urban Education (CUE), whose projects
provide the context and research for this study, and community college practitioners. The
section concludes with an overview of the study and the research question.
The second section provides a literature review and conceptual framework which
contends that adaptive expertise can be increased through action inquiry. Analysis and
synthesis of current literature establishes multiple relationships between the
characteristics of reflection as it appears in action inquiry—experience, collaboration, and
examination of beliefs—and research conducted on increasing adaptive expertise. These
relationships inform the research design presented in section three, and they provide the
framework for analyzing data in section five.
Section three describes the research design based on the concepts presented in
section two. As a bounded case study, the research design includes a purposeful sample
selected from the projects conducted by CUE since 2003. This sample provides focus on
a specific local problem and context. Emphasis is placed consistently on the role of action inquiry
in providing opportunities for developing adaptive expertise. The selected methods have
been included to measure any development in expertise as a result of participating in
activities of action inquiry.
Section four provides case narratives introducing the six community college
faculty members who participated in the action inquiry project and agreed to be
interviewed for this study. These narratives provide an initial understanding of the status
of these individuals at their institution, and they offer a framework for the findings in
section five.
The findings in section five reflect what a purposeful sample of community
college faculty members reported about the impact their participation in the action
inquiry project with CUE had on their institutional knowledge and pedagogical expertise.
The section also reports any changes made, especially at the institutional level, as a
result of their participation. It also discusses specific activities, identified
through the data analyses, that seemed to promote new understandings and uses of
expertise, as well as any barriers that prevented changes in the perceptions of
adaptive expertise on the part of these faculty members.
The sixth section concludes the study. It considers the implications of the findings and discusses how they can inform current practice as well as future research. It concludes with personal reflections.
CHAPTER TWO: CONCEPTUAL FRAMEWORK
Community college faculty hold teaching as their primary responsibility, but additional responsibilities affecting the “core” of their institutions have increased as well
(Grubb, 1999; Levin et al., 2006). Generally, community college faculty expertise “lies in
their subject area specialization” (Grubb; Cranton, 1994, p. 732), and most faculty
development programs do not provide meaningful experiences for improving faculty
expertise. The development programs that do exist are often neither context based nor theoretically grounded (Cranton, p. 727; Grubb). Theories discussed in
research indicate that for development activities to result in increased knowledge and
expertise they generally need to include elements found in reflection: experience,
collaboration, and examination of values and beliefs (Alexander, 2003; Argyris, 1993;
Benner, 2004; Berliner, 2004; Cranton; Dreyfus and Dreyfus, 2004; Farrington-Darby
and Wilson, 2006; Nunn, 2008; and Villegas and Lucas, 2002).
Many scholars have argued, drawing on Dewey’s philosophical studies of
professional practice (Dewey, 1910) and empirical studies conducted using theories of
reflective practice (Barnett, 1995; Birmingham, 2003; Licklider, 1997; Milner, 2003;
Raelin, 2007; Rodgers, 2002; Rousseau and Tate, 2003; and Sunal, Wright, Hodges, and
Sunal, 2000), that reflection is vital to the process of learning. Birmingham suggests that
reflection should be connected to phronesis, resulting in phronetic deliberation. She
contends that by using phronetic deliberation we are more likely to reason correctly in
order to consider new strategies, an important element in developing expertise (p. 190).
Phronetic deliberation requires deciding which “moral concerns or excellences” are
appropriate to the problem being studied and whether or not the action decided upon
reinforces one or the other or both (Polkinghorne, 2004, p. 116). Polkinghorne also
contends that this type of deliberation “produces knowledge about practical choices by
integrating background understandings” and other elements relevant to the situation (p.
116). By engaging in this type of reflection, professionals will “know how to enact
virtues such as fairness, caring, truthfulness, and kindness in complex situations”
(Birmingham, p. 190). Of greater significance is the connection between these virtues and
the two aims of action inquiry: 1) to produce knowledge and action that are useful to a group of people; and 2) to empower people through participation in the construction of knowledge for a useful purpose (Reason, 2005, p. 328). This type of reflection is viewed as a
continuous process because different cultures and experiences call for “different
understandings and different responses” (Birmingham, p. 188). This is especially
important for community college faculty who often work in a classroom of students with
diverse cultures and backgrounds as well as the possibly unfamiliar context of shared
governance. (Note: For the purpose of this study, all future references to the term
reflection will assume interchangeability with phronetic deliberation unless otherwise
noted.)
Studies in action research (Bensimon, 2007; Greenwood and Levin, 2005;
Polkinghorne, 2004; Reason, 2005) support Birmingham’s findings (2003, p. 190) that
relationships exist between the activities embedded in reflection and the potential to
increase expertise. According to researchers, reflection is vital to developing the
“practical wisdom” of experts (Greenwood and Levin; Reason; Polkinghorne;
Birmingham). This reinforces the research studies that contend the three elements of
reflection—experience, collaboration, and examination of personal values and beliefs—
are important to new learning (Barnett, 1995; Birmingham, 2003; Dewey, 1910;
Licklider, 1997; Milner, 2003; Raelin, 2007; Rodgers, 2002; Rousseau and Tate, 2003).
Additionally, other research sees reflection as a catalyst for developing expertise through
metacognition (Barnett, p. 47; Greenwood and Levin; Reason; Polkinghorne; Stringer,
1999; Dewey, 1910; Rodgers, 2002).
Reflection: An Overview
Reflection considers all “particulars” relevant to the situation (Polkinghorne,
2004, p. 116). As mentioned in Chapter One, Dewey’s philosophical work indicated he
believed that reflection should be a meaning-making process that involves rigorous
thinking in the presence of others, and he believed that professionals should enter into it
with attitudes conducive to improvement of self and others (Rodgers, 2002).
Opportunities for reflection in action inquiry often occur where professional researchers
and local stakeholders work collaboratively to: 1) frame the problem; 2) advocate on
behalf of possible responses using concrete examples; 3) inquire how users respond to a
solution when it is implemented; and 4) determine what subsequent steps are necessary to
modify or create a new solution based on responses (Dewey, 1910; Rodgers, 2002;
Greenwood and Levin, 2005, p. 51; Reason, 2005). Consequently, the steps embedded in
action inquiry connect to and reinforce the importance of reflection as a means of solving
a local problem and perhaps increasing expertise.
Reflection is also viewed as an “active, persistent, and careful consideration of
any belief or supposed form of knowledge in light of the grounds that support it” (Dewey,
1910, p. 6). Like the steps in action inquiry listed above, Dewey sees reflection beginning
with the framing of a complex problem, and entailing a very active investigation focused
on bringing forth facts and concrete examples that can corroborate or nullify a belief
(Dewey, p. 9). The active investigation is based on past experience and prior knowledge,
and an important aspect is the quality and depth of data collected (Rodgers, 2002). For
Dewey, the most important factor in the reflective thought process is being able to
suspend the need for a conclusion while corroborating or refuting potential solutions
(Dewey, p. 12). Dewey’s account of how reflection is used to resolve an indeterminate situation mirrors the steps of action inquiry, which research contends can result in
increased professional expertise.
Reflection is also seen as a catalyst for developing professional expertise when
learners use metacognition to monitor their own processes (Barnett, 1995, p. 47). It is
seen as a way to engage in expert thinking through multiple steps that begin with a
problem, or indeterminate situation (Greenwood and Levin, 2005, p. 53; Rodgers, 2002;
Bauman, 2005; Bensimon, 2007), and continue until the response to a solution has been
assessed (Barnett, pp. 48-49). Barnett contends that thoughtful reflection can result in
characteristics of expertise including an expanded knowledge base, improved pedagogy,
and new learning, or empowerment, leading to action and change (p. 49).
Reflection is context-based and interactive, with participants serving as both the
“subject and objects of inquiry” (Rodgers, 2002, p. 53). Reflection, in this study also
known as phronetic deliberation, and action inquiry embody three elements: experience,
collaboration, and examination of beliefs (Birmingham, 2003, p. 190; Bensimon, 2007;
Greenwood and Levin, 2005; Polkinghorne, 2004; Reason, 2005). With these three
elements as part of action inquiry, participants have the opportunity to overcome the natural human defensiveness that can “produce self-fulfilling and self-sealing systems of action” in response to a variety of academic challenges (Reason, p. 330). More specifically,
reflection as part of action inquiry can help overcome two threats to any inquiry project:
unaware projection and consensus collusion (p. 327). Unaware projection occurs when
we “deceive ourselves” to avoid critical examination of our own beliefs. Consensus
collusion happens when a group of “co-researchers may band together” to avoid
examining their views because of the anxiety such examination raises (Reason, p. 327).
These considerations regarding the effects of reflection ground the action inquiry
approach, and like Aristotle’s phronetic deliberation, require monitoring of one’s own
responses to choices, which often results in adjustments to individual values and beliefs
(Greenwood and Levin, 2005, p. 53; Polkinghorne, 2004, p. 121; Rodgers, 2002; Reason,
2005, p. 330). In short, the use of reflection, or phronetic deliberation, in action inquiry
can result in more efficient and successful practitioners (Polkinghorne, p. 123). The next
section offers a more detailed review of the findings from research on the potential value
of reflection for increasing expertise.
The Three Elements of Reflection: An Introduction
The first element of reflection, experience, provides opportunities to develop a
knowledge base that is broad and deep and to increase understanding of particular
situations (Barnett, 1995; Birmingham, 2003; Dewey, 1910; Licklider, 1997; Milner,
2003; Raelin, 2007; Rodgers, 2002; Rousseau and Tate, 2003). These are important
aspects of developing expertise discussed in the research on expertise (Alexander, 2003,
p. 12; Benner, 2004, p. 190; Berliner, 2004, p. 203; Cranton, 1994, p. 727). Studies
(Dewey; Licklider; Raelin; Rodgers) show that experiences are considered even more
significant when part of collaborative reflection because of “the idea of experiential
knowledge arising through participation with others” (Reason, 2005, p. 333).
Collaboration, the second element of reflection, allows assumptions to be challenged
(Cranton, p. 740), includes diverse stakeholders (Raelin, p. 63), and requires the
willingness to recognize that there is a “possibility of error even in the beliefs that are
dearest to us” (Rodgers, p. 861). Reflection’s third element, examination of values and
beliefs, can lead to new learning and meanings (Rodgers). In fact, Torbert contends that
through self-reflective activities, we “become more aware of [our] behavior and its
underlying theories” (qtd. in Reason, p. 330). These three elements of reflection, or
phronetic deliberation, provide the framework for recognizing the potential for action
inquiry to result in improved faculty skills in instruction and governance (Polkinghorne,
2004, p. 123).
Experience
Dewey understood experience as a “broadly conceived . . . interaction between
the person and his or her environment,” resulting in a change for the person and the
environment (Rodgers, 2002, p. 846). Without this type of interaction, Dewey believed
that learning is “sterile and passive” leaving the learner unchanged (p. 847). In addition to
interaction, Dewey believed experience also required continuity, or the building of one
experience upon another for increased awareness and learning (Rodgers, p. 846), and
without the element of continuity, Dewey contends “learning is random and
disconnected” (p. 847). In short, to effect change, Dewey viewed interaction and
continuity as vital parts of experience.
Studies on reflective practice (Birmingham, 2003; Barnett, 1995) and expertise
(Alexander, 2003; Cranton, 1994) offered findings that experience can increase
knowledge and expertise. Birmingham states that direct experience is vital to preparing
teachers for “working in an unfamiliar culture.” More specifically, she contends that
nothing can “substitute for the kind of learning that comes through reflection on direct
experience” (p. 188). This type of experience provides practitioners with the ability to
move between differing contexts and diverse settings (p. 188). Dewey sees experience as
vital in education. He believes that education is the “reconstruction or reorganization of
experience which adds to the meaning of experience, and which increases [one’s] ability
to direct the course of subsequent experience” (Rodgers, 2002, p. 843). Dewey believes
that moving from one experience to another offers a learner “deeper understanding” of
how the experiences connect and the relationships between them (Rodgers). Experiences
are the “thread that makes continuity of learning possible and ensure the progress of the
individual, and ultimately, society” (Rodgers, p. 845). Concrete experiences provide
opportunities for developing responses to challenges experienced in the classroom as well
as helping promote the “character-intensive endeavor” of developing expertise
(Birmingham, p. 191). Garvin’s study on business organizational learning (1993) also
sees experience as a primary part of learning. He sees a connection between reflection as
a meaning-making process and the experiences, or “building blocks,” that form the main
skills and activities of learning. Therefore, learning from one’s own experience and past
history as well as the best practices of others provides “blocks” upon which future
decisions can be made (Garvin, p. 81).
Some studies in the areas of knowledge and expertise refer to experience as part
of an unceasing “journey toward expertise” (Alexander, 2003, p. 12; Barnett, 1995).
Farrington-Darby and Wilson (2006) indicate that “expertise develops from the
experiences an individual is exposed to, with a large influence from personal
characteristics” (p. 20). Nunn’s research (2008) describes the connection between
experience and expertise as “tightly coupled,” requiring a “minimum of 10 years
experience to become an expert” (p. 421). Experience over time helps practitioners
develop the traits of expertise which include: 1) significant problem-solving abilities; 2)
ability to relate problems to social contexts; and 3) well developed mental structures that
require “a knowledge base that is extensive and accessible” (Barnett, p. 47). Experience
also helps experts develop highly integrated clusters of domain knowledge, where they
can identify the underlying structure of domain problems and implement appropriate
solutions (Alexander, p. 12). From their experiences, experts organize knowledge “in
such a way that relevant data may be quickly retrieved and put to use” (Nunn, p. 422). In
brief, researchers in the field of expertise agree that exposure to experiences over time is
necessary for expertise to develop (Alexander; Benner, 2004; Berliner, 2004; Dreyfus
and Dreyfus, 2004; Cranton, 1994; Farrington-Darby and Wilson; and Nunn). Overall,
experiences provide continuity and connect one’s past experiences and prior knowledge
to avoid random, disconnected learning that can result in routine actions (Rodgers, pp.
846-48). Also, experience helps build certain cognitive processes, which result in
greater learning (Barnett, p. 47; Nunn; Farrington-Darby and Wilson).
However, experience alone is not enough to lead to the type of rigorous reflection
needed to increase professional expertise (Nunn, 2008, p. 421). Expertise also results
from “a deliberate act” (Nunn, p. 421) as well as “complex interactions” and “increased
teamwork” (Farrington-Darby and Wilson, 2006, p. 20). Hence, socially constructed
learning activities provide opportunities to increase expertise (Rodgers, 2002; Garvin,
1993). These activities provide a way “to take in new information, process it and share it
among the members of the organization in steady and regular ways” (Kruse, 2000, p.
362). However, Kruse, in her study on team learning in K-12, warns that it is important to
pay “careful attention to how information enters and informs the practice of teachers” (p.
362). She contends that it is the collective work of teachers and administrators that
provides the capacity for change and development in institutions (pp. 361-362). The
potential rewards of these collective activities lead to discussion of the second element
needed for rigorous reflection—collaborative interaction.
Collaboration
Research on reflective practice contends that collaboration with others is
necessary for rigorous reflection to result in increased expertise for several reasons
(Licklider, 1997; Milner, 2003; Rodgers, 2002; Dewey, 1910; Rousseau and Tate, 2003;
Boud and Walker, 1998). First, collaboration helps practitioners identify strengths and
weaknesses in their own thinking. It also provides affirmation, exposure to new thinking,
a feeling of interdependence, and a supportive atmosphere (Licklider; Raelin, 2007).
Further, collaborative activities provide the framework through which adults can confront
their beliefs and assumptions. This confrontation is necessary for change to happen
(Licklider). Specifically, studies indicate that learning to increase expertise occurs only
after one’s beliefs and assumptions are examined and one’s experiences are subjected to
criticism in the presence of others (Licklider; Raelin; Rodgers; Dewey; Barnett, 1995;
Nunn, 2008). For Dewey, reflection also requires collaborative activities because
working with others helps sustain the work of improvement. Through collaboration,
participants develop interdependent relationships that can lead to further reflection.
In his study on organizational learning, Dill (1982) contends that a supportive
collaborative culture is necessary to achieve “learning from others.” Bauman’s work on
high learning groups (2005) discusses how these groups were able to admit
collaboratively that the assumptions under which they had been operating might be
flawed or erroneous, and they were open to questioning their own practices and
knowledge (p. 29). Her description of the high learning groups indicates that a high level
of collaborative reflection leads to development or recognition of shared values and the
willingness to broaden experience and knowledge (Bauman).
A collective, interactive work group is one instrument through which
administrators and instructors can work together to effect improvement and change
(Kruse, 2000, p. 362). By participating in team learning, Kruse contends that members
“engage in joint action” that has two possible end products: 1) a non-productive,
unreflective response; and 2) a productive response that evolves into a “significant
learning opportunity” (p. 363). Kruse agrees with studies in action research which
indicate that academic practitioners as researchers provide opportunities for institutional
change (Argyris, 1991; Stanton-Salazar, 1997; Rodgers, 2002; Bensimon, Polkinghorne,
Bauman, and Vallejo, 2004; Grubb and Badway, 2005; Dowd and Tong, 2007). She
claims that the “collective, regular processes of teachers and administrators working
together around issues of practice and professional knowledge will provide schools with
the capacity for change and development” (Kruse, p. 362). Kruse contends that such work
groups lead to a scaffolding of experiences resulting in newly developed policies and
practices for the institution (Dewey, 1910; Rodgers, 2002; Garvin, 1993; Argyris, 1993;
Schon, 1983; Kruse, p. 363).
Collaborative reflection is also cyclical in nature and needs systematic assessment
and evaluation (Dewey, 1910; Rodgers, 2002; Garvin, 1993). Assessment activities that
“maintain accountability and control over experiments without stifling creativity by
unduly penalizing employees for failures,” help build transparency in collaborative
groups, an important factor in building trust (Garvin, p. 83). Ongoing transparent
assessments provide team participants with access to information previously unavailable
(Garvin, p. 85). This type of transparency and support in collective reflection, even
during failures, can lead to shared trust and respect for all participants (Garvin; Grubb
and Badway, 2005, p. 14)—important conditions for effecting transformational change
(Angelo, 1999). As mentioned earlier, this type of reflection, also called phronetic
deliberation, “allows us to know how to enact virtues such as fairness, caring,
truthfulness, and kindness in complex situations” (Birmingham, 2003, p. 191). Perhaps
this leads to the most important element of reflection—examination of personal values
and beliefs.
Examination of Personal Values and Beliefs
Examination of one’s personal values and assumptions, also referred to as critical
subjectivity (Reason, 2005, p. 327), is necessary in order to engage in the type of rigorous
reflection that can result in increased expertise. Similar conclusions have been reached
through research on reflective practice (Barnett, 1995; Birmingham, 2003; Dewey, 1910;
Licklider, 1997; Rodgers, 2002; Raelin, 2007), organizational learning (Dill, 1982) and
expertise (Berliner, 2004; Nunn, 2008). This type of critical self-examination is also
referred to as self-awareness (Nunn, 2008), double-loop learning (Argyris, 1993),
personal agency (Berliner, 2004; Dreyfus and Dreyfus, 1986), transformative learning
(Cranton, 1994; Licklider), and reflective practice (Raelin; Schon, 1983). Other studies
connect this examination to the development of a composite of “shared beliefs,
ideologies, or dogma” that motivates individuals to “action” or change (Dill, 1982,
p. 307). Bauman (2005), Dewey, and Rodgers agree that embedded in high learning
groups are shared visions, values, characteristics, experiences, and social interactions
which are key elements for high learning. Dewey adds wholeheartedness (single-
mindedness), directness, open-mindedness, and responsibility to the list of characteristics
necessary to examine one’s assumptions. It is suggested that the examination of one’s
own assumptions can result in a measured, well-considered line of thought and carefully
constructed interactive thinking (Rodgers).
Examining one’s own beliefs as part of critical inquiry can lead to recognizing
new meanings, an important aspect in the development of professional expertise. This
type of reflective practice (Raelin, 2007; Schon, 1983) helps deconstruct assumptions;
identifies privileged local stakeholders; brings out “contradictions in current power
structures”; and supports exploration of “hidden resistances and conflicts in human
discourse” (Raelin, pp. 65-72). Self-examination helps participants recognize the
difference between espoused theory, the “words we use to convey what we do or what we
would like others to think we do,” and theory in practice, the theory that is implicit in our
actions (Argyris, 1993, p. 65). Comparing behavior to espoused theory allows for
modification of beliefs to occur, which can increase expertise (Raelin, p. 66). This is
especially true when examination of beliefs occurs in a collaborative environment
(Rodgers, 2002).
Self-examination of values and beliefs also results in the development of trust in
one’s experiences without being concerned about others’ judgments, and this self-reflection moves participants forward in a “constructive direction” that establishes the
learning situation as educative rather than mis-educative (Rodgers, 2002). An educative
experience broadens experience and knowledge, brings awareness, and is characterized
by forward movement with action based on data (Rodgers). A mis-educative experience
narrows future experiences, suggests a lack of awareness and self-serving motives, and it
does not result in “new perceptions of bearings or connections” (Rodgers). Briefly, by
examining beliefs and values, particularly in a collaborative environment, practitioners
can examine their own practices and determine where change and improvement can
occur. Change, resulting from this type of self-reflection, generally serves the individual
or self as well as the group or institution (Rodgers).
In summary, studies contend that activities which incorporate the three elements
of reflection, or phronetic deliberation—experience, collaborative interaction, and examination of one’s beliefs and assumptions—in rigorous reflective processes can lead
to multiple new understandings (Raelin, 2007) and increased professional expertise
(Alexander, 2003; Cranton, 1994). These types of activities help practitioners engage in
critical aspects of learning—deep and rigorous reflection that leads to “double loop”
learning (Argyris, 1991); acceptance of multiple perspectives (Bensimon and Neumann,
1993); willingness to step outside the situation and examine beliefs and attitudes
(Rodgers, 2002; Argyris); and awareness and presence in the moment—all of which
research indicates can result in increased expertise (Nunn, 2008; Alexander; Cranton;
Farrington-Darby and Wilson, 2006).
Knowledge, Expert Performance and Deliberate Practice
Literature in the field of educational psychology offers a different understanding of
expertise. Ericsson (2000) offers a general definition of an expert. He writes that the
“term expert is used to describe highly experienced professionals such as medical
doctors, accountants, teachers and scientists, but has been expanded to include any
individual who attained their superior performance by instruction and extended practice.”
Generally, experts behave in ways that seem “effortless and natural” to others (Ericsson,
2000). However, when experts were measured with “psychometric tests,” scientists found that rather than having a special talent, they had acquired “demonstrated superiority in a
specific domain” (Ericsson). Ericsson also re-examines the results of an earlier review of
research on expertise conducted by Ericsson and Lehmann (1996). The review found: “1)
measures of general basic capacities do not predict success in a domain; 2) the superior
performance of experts is often very domain specific and transfer outside their narrow
area of expertise is surprisingly limited; and 3) systematic differences between experts
and less proficient individuals nearly always reflect attributes acquired by the experts
during their lengthy training” (Ericsson).
Farrington-Darby and Wilson’s review of research on the nature of expertise
(2006) provides an additional understanding of expertise (p. 18). They write that
expertise can “describe skills, knowledge or abilities, in tasks, activities, jobs, sport and
games. It can refer to a process such as decision making or it can refer to an output such
as a decision” (p. 18). Farrington-Darby and Wilson contend that there are three types of
performance related to expertise: social, cognitive, and physical (p. 22).
Ericsson (2000) also discusses how most professional beginners in a domain will
change their behaviors to increase their performance for a limited time until they reach an
acceptable level of performance. In the case of academics, the acceptable level of
performance could be when one achieves tenure. Once the acceptable performance level is achieved, future improvements are generally not possible to predict; in fact, the number of years a performer works in the domain is a “poor predictor” of future performance levels. Therefore, to continue improving, or changing, one’s expertise,
performers seek out “particular kinds of experience” known as deliberate practice
(Ericsson). Deliberate practice is composed of “activities designed, typically by a
teacher, for the sole purpose of effectively improving specific aspects of an individual’s
performance” (Ericsson). Ericsson continues to explain that the cumulative effect of
deliberate practice is “closely related to the attained level of performance of many types
of experts.” Therefore, it is important that close attention be paid to the types of activities
designed, especially in relation to the improvements hoped for.
Ericsson (2000) also discusses experts’ knowledge. He states that this type of
knowledge is “encoded around key domain-related concepts and solution procedures,”
and the expert has developed “rapid and reliable retrieval” of this information when
necessary. Additionally, experts develop “domain-specific memory skills” that allow
them to rely on both short-term and long-term memory (Ericsson). Because experts have
superior “mental representations” of the stored knowledge in their memories, they have
the ability to “adapt rapidly to changing circumstances and anticipate future events in
advance” (Ericsson). This understanding of knowledge and the retrieval of this
knowledge are important for the development of expertise, especially adaptive expertise.
Berliner (1991) discusses the elements that comprise expertise, including
“perception, memory, organization of knowledge, and decision making processes” in his
empirical study on the behavior of expert teachers (p. 145). He also points out that
teachers are generally not included in studies on expertise because pedagogy is an “ill-
structured domain” that does not lend itself to easily identifiable behaviors of expertise,
and pedagogical knowledge is not “seen as sophisticated knowledge” for numerous
reasons (Berliner, p. 146). As a result, more studies on expertise, especially the expertise of teachers, are called for.
From Restrictive Expertise to Adaptive Expertise
In his review of literature on the expertise of teachers, Berliner (2004) discusses
the assumptions surrounding expertise in teachers and focuses primarily on the theory
developed by Dreyfus and Dreyfus in 1986. The list of assumptions includes the beliefs
that expert teachers are sensitive to the situation and the task at hand, more likely to take
advantage of opportunity, more flexible, fast in recognizing “meaningful patterns in the
domain in which they are experienced,” capable of solving problems using richer
resources, and likely to develop automaticity and routinization as a result of the repetitive
nature of their work (Berliner, pp. 200-01). More importantly, Berliner notes that a
generally “well-supported proposition about expertise in general [. . .] is that expertise is
specific to a domain and to particular contexts in domains and is developed over
hundreds and thousands of hours” (Berliner, p. 201).
As with other studies on expertise (Ericsson, 2000; Farrington-Darby and Wilson,
2006), Berliner (2004) bases his initial discussion on the extensive research on expertise
that has been conducted in sports (p. 202). Reviewing these studies, Berliner reports that the most important factor in developing expertise in sports is “the desire to be excellent”
(Berliner, p. 202). Berliner continues by discussing how most teachers desire excellence
with almost a “missionary’s desire to do well,” but with practically no “coaching or
mentoring,” teachers often learn their craft independently without the opportunities to
repeat a performance multiple times as sports figures do (Berliner, p. 202).
Berliner continues by examining the differences between restrictive expertise,
which is “contextually bound” and limited to specific domains, and adaptive expertise,
his term for a type of expertise that allows some individuals to perform as experts
outside “their domains of competence” (2004, p. 203). As Berliner discusses adaptive
expertise, sometimes called “fluid expertise,” he points out that adaptive, or fluid, experts
“learn throughout their careers, bringing the expertise they possess to bear on new
problems and finding ways to tie the new situations they encounter to the knowledge
bases they have” (p. 203). He cites an empirical study conducted by Wineburg (1998)
where two history professors were asked to talk aloud about primary documents familiar
in only one of the professor’s domains. As the second history professor worked through
the documents that were out of his area of competence his questions “began to cluster
around a set of constructs and relationships that proved crucial to his understanding. [. . .]
adaptive expertise was evident by task’s end” (Wineburg, 1998, qtd. in Berliner, p. 203).
Therefore, Berliner contends that while most studies on expertise argue that it is
contextually bound and domain-specific, there is research that indicates “at least some
experts can use the same rich stores of domain-specific knowledge as a basis for adaptive
and fluid expertise” (Berliner, p. 203). The ability to transfer knowledge and skills for
multiple domains and contexts would benefit community college faculty members who
work in multiple academic domains that tend to overlap.
Action Inquiry
The research methodology literature, particularly action research methodology,
has also generated insights into the nature of expertise. Rigorous reflection helps "to
foster an environment that is conducive to learning" (Garvin, 1993, p. 91). Action inquiry
offers the potential for developing this type of learning environment, which in turn,
according to the research reviewed here, leads to adaptive expertise. Figure 1, at the end of this
chapter, illustrates visually the relationships between reflection, action inquiry, and the
development of adaptive expertise. Briefly, action inquiry includes many of the activities
discussed in the literature on reflection and expertise: 1) it is collaborative (Greenwood
and Levin, 2005, p. 51; Reason, 2003); 2) it accounts for “history, culture, interactional
practices, and emotional lives" (Stringer, 1999, p. 17; Greenwood and Levin, p. 51); 3) it
requires action by local stakeholders (Greenwood and Levin, p. 51); and 4) it provides
activities that can result in increased adaptive expertise for participants (Berliner, 2004).
Researchers working in a multidisciplinary theoretical framework drawing on
studies from practice theory, action science, research methodology, and learning at the
Center for Urban Education (CUE) at the University of Southern California have
conducted research on the results of working with community colleges in action inquiry
partnerships. The conceptual framework for the practitioner-as-research model is
presented in a methodological article by Bensimon, Polkinghorne, Bauman, and Vallejo
(2004). They contend that institutional change requires a “deeper awareness among
faculty members, administrators, or counselors, of a problem that exists in their local
context” (p. 105). Drawing on empirical findings from the Diversity Scorecard Project,
Bensimon et al. characterize the desired outcomes for practitioner participants in action
research. The outcomes included 1) a new awareness of inequities in educational
outcomes; and 2) participants becoming "committed" to using "data-informed
knowledge" in ways that extended into other areas of their professional responsibilities.
However, not all participants reported experiencing change (Bensimon et al., 2004).
Bensimon et al. point to the importance of understanding the local
context where research is conducted (p. 124), and recognizing that this type of action
inquiry, where community college practitioners participate as researchers, results in
“findings that can actually make a difference in the understandings and actions of faculty
and staff members within a particular institution of higher education” (Bensimon,
Polkinghorne, Bauman, and Vallejo, 2004, p. 124).
In a 2007 review of the CUE projects in which practitioners acted as researchers,
presented as the presidential address at the Association for the Study of Higher
Education, Bensimon contends that some of the "principles of phronetic social science"
correspond to principles of the practitioner-as-researcher model found in action inquiry
because both 1) are value-oriented; 2) ask value-rational questions; 3) raise questions
about the power structure; 4) are dependent on context and are sensitive to that context;
5) aim to “make inequality concrete and solvable”; 6) “foster equity-mindedness”; and 7)
believe that change is multidimensional with analysis focused on the individual
practitioners (2007, p. 458).
Dowd (2005) furthers the study of expertise by viewing it through the lens of
action inquiry. In her methodological research, she discusses the importance of
developing a culture of inquiry by working with community college administrators and
faculty members to develop their professional expertise. The culture of inquiry is
defined through several characteristics including: 1) the professionalism of practitioners
to identify problems and address them through “purposeful analysis of data about student
learning and progress”; 2) the “dispositions and behaviors” of these practitioners; 3) the
“willingness to engage in sustained professional development and dialogue”; and 4) the
“capacity for insightful questioning of evidence and informed interpretation of results”
(Dowd, p. 5).
The culture of inquiry should be driven by developing a culture of evidence where
institutional research is a primary function for faculty and administrators to more fully
engage with data, so decision-making uses productive reasoning rather than “soft data”
found in defensive reasoning (Dowd, 2005, p. 6). In her empirical study, Dowd discusses
a theory of change that derives from phronesis. In this theory, practitioners would “have
independent capacity and expertise to knowledgeably implement reforms and improve
institutional effectiveness" (pp. 12-13). However, she also argues that without "promoting
practitioner inquiry and knowledge,” community college faculty will continue to face
significant challenges in their work without the benefit of development opportunities that
can increase professionalism and expertise (p. 13).
The Importance of Reasoning in Increasing Adaptive Expertise
Argyris’s work (1993) indicates that when a culture of inquiry is structured to
include elements such as seeking disconfirmation, working in groups, and basing
decisions on evidence, the result is improved reasoning and effectiveness. To improve
effectiveness, Argyris contends that individuals must change their beliefs and actions
through processes of reasoning (p. 55). More specifically, he argues that productive
reasoning is needed to achieve the goal of any intervention intended to increase expertise.
Individuals who use productive reasoning are willing to allow others to disconfirm
their beliefs and decisions. They use "directly observable data" to support their
premises and they are open to criticism from others (Argyris, p. 56). However, many
times, individuals filter information through defensive reasoning. Defensive reasoning is
defined by Argyris as “self-serving, anti-learning, and over-protective [. . . ensuring that]
organizational defensive routines will be maintained and rewarded" (p. 56). In general,
defensive reasoning results in explanations and evidence that remain tacit and in data
that are "soft" (Argyris, p. 56).
Defensive reasoning (Argyris, 1991) occurs when professionals experience
failure. Often, these failures cause defensiveness as professionals “screen out criticism,
and put the ‘blame’ on anyone and everyone but themselves" (p. 4). In his review of his
attempts to implement continuous improvement at organizations, Argyris realized that
once the continuous improvement “turned to the professionals’ own performance,
something went wrong” (p. 5). These professionals were committed to excellence, but it
became clear to Argyris that only if “the ways managers and employees reason about
their behavior [is] a focus of organizational learning and continuous improvement
programs,” will change occur (p. 5). He explains that teaching “people how to reason
about their behavior in new and more effective ways breaks down the defenses that block
learning” (p. 5). He also cautions that careful consideration should be given to the types
of activities developed to increase expertise in professionals. This study, which presents
findings from an action research project, contributes to expanding research on community
college faculty. Specifically, it offers evidence of self-reported changes by faculty
participants related to how they see their participation in the project impacting their
actions to improve learning. However, there are limitations in the ability of the project, as
structured, to bring about such learning.
The figure that follows (figure 1) represents the overlapping nature of the three
primary conceptual frameworks discussed in this chapter: 1) reflection, or phronetic
deliberation (emphasizing experience, collaboration, and self-examination of values),
which comes from the practice theory literature based in philosophy and humanistic
psychology; 2) action inquiry (particularly the inquiry stages of framing the problem,
collecting evidence, and assessing solutions), which is rooted in action science and
educational research methodology; and 3) adaptive expertise (emphasizing characteristics
such as increased knowledge, productive reasoning, and empowerment to implement
change), which is derived from studies of expertise among teachers, athletes, and others. I
frame my analysis in this study using all of these concepts in an interrelated manner
because all are relevant to bring about significant improvements in institutional
effectiveness. I investigate whether reflection (represented in circle #1) is brought about
through action inquiry activities (circle #2), and if so, whether it leads to an increase in
adaptive expertise by promoting productive reasoning and diminishing defensive
reasoning (circle #3). Thus, the study offers an opportunity to generate a new and unique
way of understanding faculty development.
Figure 1: Relationships between concepts leading to adaptive expertise
[Figure 1 shows three overlapping circles: 1. Reflection (experience, collaboration, examining values); 2. Action Inquiry (frame the problem, collect evidence, implement & assess solution); 3. Adaptive Expertise (increased knowledge, productive reasoning, empowerment to act).]
CHAPTER THREE: METHODS
For over one year I observed faculty members participating as researchers in
action inquiry activities at a California community college. These practitioner researchers
were engaged in an action-oriented, problem-solving research project in collaboration
with the Center for Urban Education (CUE) at the University of Southern California. As
such, they were engaged in action inquiry activities designed to investigate a local
institutional problem (Patton, 2002, p. 121; Stake, 1995; Stringer, 1999). The project
activities emphasized experience, collaboration, and examination of beliefs--elements of
reflection or phronetic deliberation. A full list of the project activities in chronological
order is presented on p. 65. Chapter 2 presented the research which contends that these
activities can result in opportunities for increased awareness of the complexities of an
issue, improved problem solving, and empowerment to act, characteristics of adaptive
expertise that, in turn, can result in action and change at a specific institution (Alexander,
2003; Argyris, 1993; Benner, 2004; Noffke, 1997, p. 309). My interactions with these
faculty researchers in various activities and locations provided opportunities to collect
data related to my dissertation topic, which attempts to answer the question: Does
participating in collaborative action inquiry activities as a form of institutional
assessment result in increased perceptions of expertise among faculty participants?
1. If so, what aspects of action inquiry are associated with changes in the way faculty
view their expertise and what is the nature of those changes in perception?
2. If not, what inhibits changes in how faculty view their expertise?
Context
Case study research emphasizes the uniqueness of a context and focuses on
particularization, not generalization (Stake, 1995, p. 8). This is especially important in
this study as reflection is viewed in the research on academic expertise as context-based
(Rodgers, 2002, p. 52). Therefore, the context in which this study took place is important
for framing the results of the analysis discussed in Chapter 5. The site of
my research was Urban Community College² (UCC). The college agreed to participate in
one of the collaborative action inquiry projects sponsored by the Center for Urban
Education (CUE) at USC because the project offered opportunities for practitioners to
improve institutional outcomes by engaging in “purposeful inquiry” to create “locally
meaningful best practices” (“Letter to President,” Urban Community College, 2007). As
such, participants were provided means for engaging in action inquiry research in order to
solve a local problem identified by the UCC institutional stakeholders.
The goal of the project between UCC and CUE was to “foster growth in equity-
based, practitioner-driven assessments as a way to improve community college student
success [. . .] from developmental (basic skills) to transfer-level courses” (Center for
Urban Education). This led to the selection of Urban Community College (UCC) and its
math program as a meaningful sample through which to research these goals. To better
understand why the math program was selected, an overview of the math students at
UCC is appropriate.
² Urban Community College is a pseudonym for the community college where research was conducted for this dissertation.
More than 97%³ of incoming potential UCC students taking the math placement
test in 2006 needed to take at least one math course that is below college level work.
Only 3% of incoming students who took the placement test in math placed directly into a
college level, transferrable math course (approximately 50 out of nearly 2000 test takers).
Of the approximately 1950 students who did not place into a college level course, about
1600, or 89%, needed to take and pass more than one pre-college (basic skills) math
course to reach a transfer level course (Fact Book, UCC, 2006, p. 2-12). This represents a
significant challenge for students as research indicates that needing to take multiple pre-
college level courses often results in adding years to a student’s community college
academic progress (Melguizo et al., 2007). The placement data was not disaggregated by
ethnicity, so comparison between ethnic groups was not possible. However, more details
to help understand the demographic context at UCC are provided next.
Urban Community College District (UCC)--a single college district--was
established in the early 1960s to serve more than 380,000 residents living just outside a
large metropolitan city. Census data (2000) indicated the ethnic composition of the
district was more than 70% Hispanic, giving the UCC district a higher concentration of
Hispanic citizens than either the county or state in which it is located. Conversely, the
district population has a smaller percentage of white non-Hispanic and African American
citizens than the county or the state, while the Asian population in UCC's district reflects
the make-up of the general population of the county and state. Trends from 1990 to 2000
indicated a 77% increase in the Hispanic population in UCC’s district, while the white
³ All statistics are rounded at approximately +/- 3-5% to provide anonymity to the college.
non-Hispanic population had decreased about 50% (Fact Book, UCC, 2006). Given these
data, it is reasonable to assume that a large percentage of UCC students can be
categorized as minority students, who are generally underserved. In turn, it is important
to understand that educating students from underrepresented minority groups presents
institutional challenges not necessarily found in community colleges serving a student
population drawn largely from the dominant group.
In 2006, student demographic data at UCC indicated that 59% of students were male
and 41% female, with 36% under 19 years old and 19% between 20 and 24 years
of age. Hispanic students were the largest ethnic population at 45% of all students; white
non-Hispanic students were 6% of the population with African American students
making up 2% and Asians 4%. The second largest population consisted of those whose
ethnicity was unknown or those who declined to provide information at 41% (Fact Book,
UCC, 2006, p. 2-7). These statistics resulted in UCC being designated as both a
Hispanic-serving institution⁴ and a minority-serving institution.⁵ The demographics also
frame the challenges UCC faces in helping its students overcome potential
socioeconomic and second-language barriers to success.
UCC’s Fact Book (2006) reports that 20% of the district’s population indicated
they communicate poorly in English or not at all. Within the largely Hispanic population,
37% reported speaking Spanish at home and 24% reported being bilingual. Data indicated that entire
⁴ A Hispanic-Serving Institution (HSI) is defined as a non-profit institution that has at least 25% Hispanic full-time equivalent (FTE) enrollment (http://www.ed.gov/programs/idueshsi/definition.html).
⁵ Institutions that enroll at least 25 percent of a specific minority group are designated as "minority-serving" for that group (http://nces.ed.gov/pubs2008/2008156.pdf).
households speak only Spanish or an Asian/Pacific Island language. Based on these data,
it can be assumed that some UCC students may experience difficulty navigating the
educational process and may need supplemental support.
Socioeconomic data indicate that, within the UCC district, adults 25 years and
older have an education level significantly lower than that of the state or county, with
41% not completing high school. This lack of education may correlate with the district's
median household income of $42,229. While this median household income was on
par with a large urban county, it was $5,000 less than the state median. Of greater concern
is that census data revealed more than 61,300 individuals in UCC’s district live in
poverty, indicating a “higher proportion of individuals live in poverty here (16.3%) than
in [the state] (14.2%) or the United States (12.4%)” (Fact Book, UCC, 2006). The
socioeconomic data signify additional challenges that UCC must address to help students
succeed. For example, UCC students impacted negatively by socioeconomic factors often
need to work many hours a week to supplement household income. This, then, can result
in reduced time for academics, contributing to academic difficulties or failure.
A look at the labor market projection for UCC’s service area indicates future
growth in three areas: 1) professional & business services; 2) government; and 3) health
care & social assistance (Fact Book, 2006, p. 1-15). The largest job growth areas for
individuals with an Associate’s Degree were projected in the following occupations:
registered nurses, computer support specialists, dental hygienists, paralegal and legal
assistants, and electrical and electronic engineering technicians. For those earning a
bachelor’s degree, job growth areas included: elementary school teachers, secondary
school teachers, and computer software engineers (Fact Book, UCC, 2006, p. 1-15). The
labor market projection can help inform the educational goals and programs at UCC. In
fact, all of these data inform the discussion in the chapters that follow. Before moving
on, however, it is important to understand how data were collected and analyzed for this
study.
Research Design
This is a single case study bound by a fourteen-month time span. It focuses
on understanding a particular context by examining a purposeful sample (Stake, 1995, p.
36) of faculty participants in one of the action inquiry projects conducted by the Center
for Urban Education (CUE) between 2003 and 2008. Selecting these participants allowed
for a deeper understanding of how--or if--participation in action inquiry activities results
in changes in self-perceptions of academic expertise and professional skills (Patton, 2002,
p. 36). The research design also permitted the deductive examination of complex
relationships among various elements of the project through multiple data collection
methods (Patton, p. 36; Stake, p. 37; Yin, 2003, p. 34).
Data Collection
To triangulate the collection of data, multiple methods were used. Data were
collected through observations of multiple sites; an audit of Urban Community College’s
(UCC) website; the review of many documents; and two rounds of individual interviews
with specifically chosen faculty. Observations sites included monthly team meetings,
learning centers, department presentations, and visits to other community colleges.
Document review included the results from multiple rounds of interviews, presentations
at symposia by the UCC faculty participants, field notes, the union contract, the basic
skills plan, and other general documents published by the college and/or the participants.
The two rounds of interviews referred to here were conducted within a 10-month period.
The first phase of the process entailed placing evidence from the collected data in
categories related to the specific activities that were connected to stages of action
inquiry— framing the problem, advocating for a solution by collecting and using
concrete data, and assessing the solution (Reason, 2005, p. 332). The second phase
of the analysis then reviewed each stage of the action inquiry activities, looking for evidence
of the elements of reflection: experience, collaboration, and examination of personal
beliefs. The final phase interpreted data from individual interviews for changes in
respondents’ perceptions of expertise brought about by participation in the project. The
indicators of such changes were generated from the literature review: 1) an expanded
knowledge base and deeper understanding of the problem; 2) improved problem solving
skills through productive reasoning; and 3) a greater feeling of empowerment to effect
change. These multiple stages of analyses were necessary in order to determine if the
structure of the activities in the action inquiry project provided participants with the
opportunities to increase adaptive expertise.
Observations
Data were first collected through observations conducted at the multiple sites
noted earlier. The natural setting of the supplemental
instruction math/science center, where math students could access additional instruction
from faculty and peer tutors on duty or participate in modular courses, was observed
twice. Other locations for observations included monthly team meetings between the
community college faculty participants and the researchers from the CUE at USC and
two symposia sponsored by CUE in month two and month eight of the project. At one of
these symposia, faculty participants from Urban Community College (UCC) reported on
the results of their action inquiry project to date to other project participants from various
community colleges. At the second symposium, faculty participants from UCC attended
and engaged in roundtable discussions. Observational data was also collected by
observing UCC faculty participants as they visited another community college where
they were presented two practices viewed as exemplary: 1) a model for learning
communities composed of underserved students; and 2) a successful supplemental math
instructional center. Observation protocols for all observations were developed by CUE
researchers
6
.
Website and Document Audit
Another way data were triangulated was through an audit of the institution’s
website. A review of the website and other documents at UCC was completed to help
readers of this study further understand the academic landscape in which interpretations
and assertions culled from the evidence are made (Stake, 1995, p. 114). The website audit
provided general understanding of UCC’s mission, vision, general principles, and the role
of certain institutional assessment processes, including accreditation, strategic planning,
program review, and learning outcomes assessment. The website also housed detailed
information on the math/science supplemental learning center. In addition, the website
⁶ All project activities are listed chronologically in Appendix A.
audit made available the current faculty union contract and the college’s Fact Book,
which contains statistical data such as demographics, student retention, student
persistence, and student success.
Additional triangulation happened through review of college documents. Two
important documents collected for review were the list of “hunches” developed by UCC
participants just prior to the start of the project and notes taken at monthly team meetings.
Specifically, these two documents helped triangulate the data culled from observations of
these same activities. The “hunches” document was developed at an orientation meeting
just before the start of the action inquiry project. This exercise was designed to help
participants from UCC articulate beliefs and practices they saw at UCC and to begin
identifying the problem, or indeterminate situation, which would become the focus of the
project. In addition, it asked participants to collaboratively articulate the college’s
challenges in a variety of areas. As a result, UCC’s practitioners—in their own
language—articulated how they viewed the institution and their roles in its processes
prior to engaging in the action inquiry project. This was an important element of this
study as it provided institutional context and helped CUE researchers and the inquiry
team establish a reference point from which to determine and document changes in
beliefs and behaviors.
Interviews
Another method of data collection consisted of individual faculty interviews.
There were two rounds of interviews; the first took place in month 10 of the project and
the second round occurred after the project was finished. Six instructional faculty
members at UCC were selected for interviewing in the first round, and four accepted the
invitation. Three of these faculty members were selected because of their participation in
the project, while the other three were selected because they had not participated, and
thus could act as a control group. The four who agreed to be interviewed were composed
of three project participants and one non-participant. At the time, it was hoped that some
comparison between participants and non-participants could be made. The six faculty
members selected to be interviewed in the second round included the three original
project participants interviewed during round one and an additional three faculty
participants selected because of their engagement with the project. It was anticipated that
re-interviewing the faculty participants who had been interviewed previously might result
in their reporting a comparison between their level of expertise at the midway point of the
action inquiry project and their level of expertise several months after the project had
ended. However, two of these three project participants were not available for the second
round of interviews: one declined to be re-interviewed and another was on sabbatical
leave. Consequently, the second round of interviews included one of the original three
faculty participants interviewed at the midpoint of the project and three additional faculty
participants.
The first-round interview guide (see Appendix A) and protocol were developed by
project researchers at CUE. The guide was designed to elicit responses related to the
project, with a particular focus on the math/science center. The questions were
categorized into four areas: 1)
roles and responsibilities; 2) instructional beliefs; 3) basic skills and equity; and 4)
assessment and evaluation. The first area asked for information on the interviewee’s
position at Urban Community College (UCC), length of employment, and types of
courses taught. The second grouping of questions on instructional views and practices
focused on the interviewees’ experiences related to working with students in the
math/science center, the processes of the center, and faculty development related to the
math/science center. The section on basic skills, developmental education and equity
asked for interviewees to articulate their ideas about these issues. The final section
centered on the role student learning support centers played at the institution. More
specifically, the questions asked whether or not the math/science center had been
evaluated, how the results of any assessment or evaluation had been used to change
practice, and whether or not the students knew how the support center could help them be
successful.
I developed the second interview guide (see Appendix B) in collaboration with a
researcher from the CUE. This interview guide focused on the impact of action inquiry
activities on faculty participants’ adaptive expertise drawn from the conceptual
framework. The questions were clustered to reflect the characteristics of expertise that
guided the analysis, including a broader knowledge base (questions 2, 3, 6),
increased teamwork and collaboration (question 4), learning patterns of experts
(questions 2, 3, 6, 7), and willingness to examine beliefs (questions 3, 4, 5, 6). This guide
also asked participants about their overall assessment of the action inquiry project and
how the project’s findings had been disseminated by the institution (questions 9 and 10).
Human Subject training was completed at the University of Southern California, and both
interview guides were approved by the IRB.
All activities that were part of the action inquiry project between Urban
Community College and the Center for Urban Education (CUE) are presented in
chronological order in Table 1.
Table 1: Complete list of project activities

Month 1: Team meeting
• determine focus areas for research

Month 2: Team meeting
• report on research activities & findings
• discuss exemplary programs to visit

Month 3: Team meeting
• report on research activities & findings
• narrow potential exemplary programs for visit in spring

Month 4: Team meeting
• plan site visit to exemplary program
• draft report to the President

Month 5: Team meeting
• memo to President summarizing fall findings and recommending action
• implement small changes
• plan student and peer interviews to inform action plan

Month 6: Team meeting
• prepare to present findings at March symposium (all benchmarking community colleges and potential alliance colleges)
Site Observation
• observed activities and participants in Supplemental Learning Center (SLC)
• visited other learning centers with host team member

Month 7: Symposium
• present findings to attending community colleges
• participate in effective practices discussion groups

Month 8: Team meeting
• report on findings from student and peer interviews
• revise recommendations for action plan
• visit exemplary program site and debriefing
Site Observation
• observed activities and interactions in SLC

Month 9: Team meeting
• finalize revisions to recommendations for action, evaluation, and ongoing assessment
• decide on dissemination plan
Site Observation
• observed activities and interactions in SLC
Faculty Interviews
• conducted four audio-taped individual interviews with three faculty team members and one faculty non-team member; three interviewees were math faculty and one was non-math faculty

Month 10: Team meeting
• report on action and evaluation plans for 2008-2009

Month 11: Team meeting
• social meeting to discuss final report

Month 12: Symposium
• participate in group learning activities for team building
Analysis
Evidence was analyzed using multiple filters to determine whether, through participation
in the action inquiry activities, participants perceived an increase in their adaptive
expertise. Evidence collected was initially categorized into the four steps of action
inquiry described in Chapter 1 (see p. 22). Categorizing the data into the four steps
provided an opportunity to identify activities that might lead to increased expertise. The
four steps include: 1) framing the problem, where participants increase their awareness
of the complexities of the problem and identify the assumptions that exist; 2) advocating
for a particular response or action to the problem by collecting data; 3) illustrating or
supporting the response or action with concrete data; and 4) monitoring how users
respond to the solution through ongoing assessment (Reason, 2005, p. 332). After the
initial categorization, the data were reviewed once again and divided into subcategories
representing the three elements of reflection: experience, collaboration, and examination
of beliefs. Dividing the data into this second set of categories helped establish an organized
picture of which activities included which elements of reflection. Further, this organization
provided opportunities to look for evidence of which activities and which elements were
associated with specific indicators of adaptive expertise – an expanded knowledge base,
including a deeper understanding of the problem; improved problem solving skill through
productive reasoning; and/or empowerment leading to action. Table 2 below presents the
relationships suggested by research between action inquiry, elements of reflection, and
resulting characteristics that indicate adaptive expertise.
Table 2: Relationships that research contends exist between action inquiry, elements
of reflection, and positive resulting characteristics that indicate adaptive expertise

Action inquiry activity 1: frame the problem
Element of reflection or phronetic deliberation: experience
Positive indicators of adaptive expertise:
• expanded knowledge base
• increased understanding of particular situation

Action inquiry activities 2 & 3: advocate for a solution using concrete evidence
Element of reflection or phronetic deliberation: collaboration
Positive indicators of adaptive expertise:
• allows assumptions to be challenged
• helps identify strengths & weaknesses in own thinking
• provides opportunities for joint action that can result in new learning
• scaffolds experiences that help develop new strategies/improved problem solving

Action inquiry activity 4: assess the solution
Elements of reflection or phronetic deliberation: examining values and beliefs; examining actions, effects, and outcomes
Positive indicators of adaptive expertise:
• deconstructs assumptions
• results in shared vision, values, and goals that lead to higher learning
• develops trust in own experiences
Using the indicators listed in Table 2, the first stage of analysis looked for how
experience, an element of reflection, impacted participants' actions and language during
the action inquiry project. More specifically, analysis focused on trying to determine if
faculty participants did in fact develop an expanded knowledge base and an increased
understanding of the problem that was the focus of the action inquiry project—
indicators that research contends point to adaptive expertise (Barnett, 1995; Birmingham,
2003; Dewey, 1910; Licklider, 1997; Milner, 2003; Raelin, 2007; Rodgers, 2002;
Rousseau and Tate, 2003). Evidence of the existence of these indicators is important
because research indicates that an expanded knowledge base with deeper understanding
of a problem is a critical component of expertise (Alexander, 2003, p. 12; Benner, 2004,
p. 190; Berliner, 2004, p. 203; Cranton, 1994, p. 727).
Similar to stage one of analysis, the second stage looked at how the collaborative
nature of the action inquiry project impacted participants' problem-solving abilities; the
analysis looked for evidence that the collaborative environment helped participants
challenge assumptions and scaffold on their earlier experiences for new learning.
Additionally, this analysis also explored if faculty participants were willing to expose
themselves to potentially embarrassing situations in order to develop improved problem
solving strategies (Argyris, 1993; Schon, 1983; Licklider, 1997; Raelin, 2007). This stage
of analysis tried to identify actions or language that indicated expertise based on the
review of research in Chapter Two (Licklider, 1997; Milner, 2003; Rodgers, 2002; Dewey,
1910; Rousseau and Tate, 2003; Boud and Walker, 1998).
The third stage of analysis reviewed how the third element of reflection,
examining values and beliefs in a collaborative setting, impacted participants’ actions and
language. Research contends that an increase in one’s confidence and greater trust in
one’s own experiences may result from examining values and beliefs. In turn, increased
confidence and trust can lead to a feeling of empowerment resulting in action and change.
Research argues that this empowerment is another indicator of expertise
(Licklider, 1997; Raelin, 2007; Rodgers, 2002; Dewey, 1910; Barnett, 1995; Nunn,
2008).
Summary
In summary, this research design and methodology was intended to provide an
organized approach by which this study could examine multiple artifacts from Urban
Community College (UCC) in order to lead to a greater understanding of how action inquiry
facilitates changes in expertise. Specifically, the research design and methods attempted
to provide a scaffolded approach to categorizing data through multiple conceptual frames,
namely the theoretical frameworks for reflection, action inquiry research, and adaptive
expertise.
CHAPTER FOUR: THE CASE NARRATIVE
Overview
As this study evolved, I had many opportunities to get to know the project
participants from Urban Community College (UCC) through observations and interviews.
The observations took place in different locations: monthly meetings, promising practices
site visits, conferences, a department meeting, and observations of the learning centers.
This helped me compare how they operated in a variety of environments. All of the UCC
project participants were interviewed for various studies, but since I am a community
college faculty member, I was interested in how the faculty participants viewed the
project and its potential to help them effect change at the institution. I felt that what they
learned could help me learn.
Once I decided to focus my study on faculty participants, I began developing a
plan to interview these individuals in addition to observing them. Initially, I sent
invitations to many faculty members, project participants and non-participants. I hoped to
establish a thorough understanding of the faculty and the institution by interviewing
faculty members from as many academic disciplines as possible.
In all, I interviewed six faculty project participants and one project non-
participant for a total of seven. Together they have an average of 11 years of service at
UCC and several have held administrative positions, including Vice-President. All are
tenured faculty in academic departments, and they taught a range of classes including
multiple disciplines within the area of communications and multiple strands within
mathematics. All had experience teaching from the basic skills level to transfer level
within their field.
Table 3: List of participants’ activities in the project and months when interviews for this
study took place

Elizabeth
• activities: monthly meetings, interviews, observations, site visit, conference
• interviewed: month 9

Grace
• activities: monthly meetings, observations, site visit, conference
• interviewed: month 9; month 18 (post project)

Beverly
• activities: monthly meetings, observations, site visit, conference
• interviewed: month 9

Geraldine
• activities: monthly meetings, interviews, observations, conference
• interviewed: month 18 (post project)

Randy
• activities: monthly meetings, observations, site visit, conference
• interviewed: month 18 (post project)
As the table above indicates, five of the seven faculty members interviewed participated
in multiple activities in the project. The other two faculty interviewed are not included in
the table for different reasons: Michael is not included because he did not participate in
the action inquiry project, and Vicky is not included because she did not participate
actively. Neither Michael nor Vicky is included in the findings detailed in the next
chapter of this study. The remaining five participated extensively and contributed to the
study’s findings. (All participants have been given a pseudonym to maintain anonymity.)
Next, I provide a brief description of these five faculty members’ views
of their own expertise to help frame the context for the next chapter’s discussions.
Elizabeth
The first faculty participant I interviewed was Elizabeth, and we talked during the
ninth month of the action inquiry project between her institution (UCC) and the Center
for Urban Education (CUE). Elizabeth teaches discipline content at all levels from basic
skills to transfer level. She also works with students, her own and others, to help them
identify and solve problems they experience in understanding course content.
Elizabeth perceives that her experience has resulted in expertise regarding the
processes of her department as well as some understanding of the institution; she also
realizes the impact these processes have on students. For example, she knows that as the
institution adds more classroom-based sections, demand for online or self-paced classes
has decreased (Interview, month 9). In addition, Elizabeth recognizes the value of
determining which students use the learning centers and for what purposes in order to
determine which resources to provide.
Elizabeth shared her expertise in helping students develop strategies for
improvement. During her interview, she reported that many times, she helps students
develop the self-esteem to keep them believing they can do the academic work necessary
to succeed. She also recognizes that instructors are often partially at fault when students
do not succeed. The examples Elizabeth offered represented her integrated knowledge of
various aspects of her discipline that has helped her identify students’ problems and
provide “an appropriate solution” (Alexander, 2003, p. 12).
Elizabeth offered a specific example of how she interacts with students needing
help. When she cannot get them to articulate their problem, either because they don’t
have the self-esteem to do so, or they don’t know where their problem exists, she realizes
this and responds by suggesting a game to get them to try. She reports, “that always gets
a smile” (Interview, month 9). After breaking the ice, students generally take the first
step, and she reinforces their actions by telling them “That’s it; that’s what I was looking
for” (Interview, month 9). She reports that in her experience, most students who are
faltering are missing a basic piece of knowledge that once identified is fixed easily. Then,
once the problem is fixed, she sits with them and has them complete the assignment,
building their self-esteem in the process. In these examples, Elizabeth exhibits expertise.
Elizabeth offers some insight into how she works interactively and collaboratively
with students – hers and others – to improve student learning. However, she also
recognizes that faculty who work with students need more training. When asked what
three issues need to be addressed to make the learning centers better, Elizabeth reported
that tutors need training, there needs to be more effective faculty contact with students in
the centers, and more staff is needed to proactively intervene and contact students who
are “not progressing” (Interview, month 9).
Elizabeth’s self-reported expertise lies in her ability to work side-by-side with
students to increase student learning. She recognizes the underlying structures of her
discipline content and the needs of students who are just developing academic and
discipline skills. However, when contacted for a second interview, Elizabeth declined
because “nothing has changed” in the learning centers as a result of the action inquiry
project. Therefore, while Elizabeth perceives her expertise for helping students overcome
their problems, she does not yet report feeling empowered to effect change at the
institutional level on their behalf.
Grace
I interviewed Grace twice, once during the ninth month of the action inquiry
project and once after the project had finished. The results of both interviews are included
here and identified by the month in which they occurred. Grace teaches all levels of a
discipline where matriculation is required. Grace is very active in institutional activities
and has developed various programs during her tenure. She has considerable knowledge
of discipline content, her department, and the institution.
Grace teaches all levels of her discipline from basic skills to transfer level
courses. She knows there is a difference in how she should approach students along this
path of matriculation. In her first interview, she reports how students in basic skills
courses need help acquiring the skills necessary to succeed; she tries “to teach them more
about how to be a student” and gives them more hints and tips about how to do well
(Interview, month 9). Grace also reports that she does not believe the institution or
faculty do enough to identify and help students, especially those taking basic skills
courses. She states, “We don’t do enough; we are focused on our subjects; we get tunnel
vision, and we don’t really know how the counselors can help” (Interview, month 9).
Here, Grace shares her expertise by indicating her knowledge of the various aspects of
her discipline content. Her response also indicates she has developed the ability to access
that knowledge quickly. Both of these responses feature characteristics of adaptive
expertise (Barnett, 1995, p.47; Alexander, 2003).
Grace is willing to support changes to the current situation, especially in the
learning centers, even if her position is unpopular. For example, Grace explained in the
first interview that faculty used to hold office hours in the learning centers. However, one
faculty member used the union contract to stop that practice. She wants to “resurrect this”
practice because for the most part, faculty do not have students visiting them during their
office hours. By holding office hours in the learning centers, faculty would be able to
interact with more students. However, Grace realizes this would be an unpopular change
because faculty office hours are a “productive time because not many students come”
(Interview, month 9). Grace also recognizes that changing the practice would be
challenging because the state calculations for teaching loads give less value to lab hours
than teaching/lecture hours. Grace teaches multiple subjects, or clusters of knowledge,
and she recognizes that it takes multiple domains to effect change at varying levels. By
sharing her beliefs, Grace exhibits characteristics of adaptive expertise. She reports how
she wants to change the learning center regardless of how others might judge her.
Grace has experience in dealing with the limited resources of community
colleges. For example, she reported that several years ago she recognized the need for
training of peer tutors to help them interact with students. As a result, she developed a
new course to train them. However, because tutoring classes generally have very small
enrollments yet still require an instructor and a classroom, the course has never
been offered. She stated in the interview that she hopes the class will be offered in the
near future, perhaps as a result of the action inquiry project.
Grace reports she feels empowered to implement change. In her first interview,
Grace revealed that she had developed a new learning community for transfer level
students at the institution. She had secured the president’s approval and had 15 students
enrolled. Grace shared that she had guaranteed her college president that there would be
60 students enrolled by the beginning of the following fall semester. In her second
interview, Grace reported that the learning community had 190-200 students involved.
When asked about this number of students, Grace shared that she had contacted many
other directors of similar learning communities and took some of their advice for the
program. She admits that she has to “stop admitting [students], but it’s working; they are
getting something out of the program [. . .] because there is an incredible amount of help
they can get in here” (Interview, month 18). She goes on to explain the difference in the
learning community versus the learning centers; Grace reports that in the learning
community, “They see each other in classes and now they’re becoming a group, a
community, so now they’re hanging out together; they are not just another face in the
class. They’re learning as a community” (Interview, month 18). As a result of this
success, Grace and the institution are considering adding more learning communities
based on majors. Here, Grace articulates that she recognizes the sensitivity of her task
and she explains how she overcame challenges using resources, two more indicators of
adaptive expertise (Berliner, 2004).
When asked how she acquired the knowledge and expertise to implement these
types of changes at her college, Grace explained that when she arrived at the college, she
was determined to learn how the institution worked, so her first years were spent serving
on the academic senate and various committees. She sat on these committees “for a year
or two and listened” before she got involved further (Interview, month 18). Then, over a
period of time, she slowly began to lead these committees. She believes she has two
strengths; she is an achiever and she wants to make a difference. Grace reported that she
sets the bar; achieves the goal; then resets the bar a little higher; and continues to repeat
the process. At the same time, Grace refers to herself as someone who “likes to work,
who is driven, who likes to grow, and wants to make a difference,” so she takes on new
responsibilities and steps in when no one else in the department is willing to do the work
(Interview, month 18).
In summary, Grace has considerable knowledge and expertise about her
discipline, her department, and her institution. By sharing her examples, Grace also
provides evidence that she uses adaptive expertise in the ways she overcomes challenges
and implements change. Her responses in the two interviews also indicate she has a
familiarity with the processes of the California Community Colleges System and has
developed a network of allies in other colleges. Perhaps as a result of her confidence in
her knowledge and expertise, Grace feels empowered to exercise these skills and
implement changes at the institution. In fact, in her last interview, Grace considered that
her next step might be to move into administration.
Beverly
Beverly arrived at Urban Community College with considerable teaching
experience including high school level instruction and teaching at the university level
(Interview, month 9). I only had one opportunity to interview Beverly, but of course, I
observed her over the life of the project. Beverly teaches transfer level courses, but as she
points out, since her courses have no prerequisites, she still has to develop pedagogy for
students who are underprepared to do college level work.
Talking about her work, Beverly reports seeing expertise in her ability to diagnose
her students’ weaknesses and develop pedagogy to address those weaknesses. During the
interview, she shared that initially her approach to instruction often caused the high
performing students to drop because they found the approach boring, and at the same
time, the approach alienated students who needed the most help because they already felt
like “losers” and did not want to participate in any activity that would reinforce that
feeling (Interview, month 9). As a result of these issues, Beverly explained that she
sought out former students and asked them how they thought she could improve her
pedagogy from the first day of class. As a result of the students’ comments, Beverly has
implemented high-reward, low-risk activities during the first week of class that
accomplish several goals. First, Beverly puts the students into teams, so no one student
is highlighted. Second, she assigns each team the same task, which she refers to as a
“simple activity” where she pairs a concept with a skill that results in high rewards.
Third, this activity establishes the framework for all the activities in the course
throughout the semester. Therefore, once students succeed in the first task, they
believe they can be successful in the course. By examining her pedagogy publicly and
asking her students for input, Beverly engages in a process that research suggests leads to
new meanings, in this case a new understanding of pedagogy. This also provides
evidence that Beverly recognizes the sensitivity she needs to meet her students’ needs.
Both of these indicate two characteristics of adaptive expertise (Raelin, 2004; Schon,
1983; Berliner, 2004).
Another way that Beverly tries to help students, especially those who may be
underprepared to take her college level courses, is by sharing her expectations for writing
done in the course. During the first week, Beverly distributes her expectations for writing
in both PowerPoint and written lecture formats. She suggests that if the students have difficulty
following either of these documents, they might want to consider taking a few more
courses before they tackle hers (Interview, month 9). For those students who stay in her
courses, Beverly allows all writing to be revised up to three times. Her experience and
knowledge have led to this revision process along with the creation of 1) a rubric, so
students can identify expectations throughout the written assignments; 2) a peer review
process, so students share ideas; and 3) a prescriptive checklist for each assignment
(Interview, month 9). In addition, Beverly has completed a primary traits analysis for each
of the hoped-for student learning outcomes in her courses. A primary traits analysis is the
process of identifying the individual traits or units of learning that, taken together,
make up the larger student learning outcome. Beverly presents these traits to students by
scaffolding her units of instruction. She provides access to these units through the course
syllabus and the web-enhanced portion of the class. This allows her to re-emphasize
these traits throughout the semester (Interview, month 9).
Other statements that Beverly shared in her interview indicate she is constantly
reassessing and changing her pedagogy for her courses based on the results of her ongoing
formative assessments. When asked about changing instruction as a result of the
assessments, Beverly reported:
I am always surprised. The first three years I was here, I changed
constantly to try to meet the needs of people who weren’t doing well, and
then I realized that’s a never ending process because some people do well
with one thing and some people do well with other things. What I try to do
now is I try to make the problems we’re working on as clear as I can.
Then, their [students’] feedback makes me clarify things, and then I try to
develop slightly different pathways of getting there. [. . .] I found that
there are certain principles I have learned from my students. (Interview,
month 9)
These principles include: 1) students don’t read instructions carefully, so instructions
need to be reinforced in writing; 2) all learning builds on previous learning (scaffolding); and 3)
students need very prescriptive instructions (Interview, month 9). As reported in her
interview, Beverly uses these principles to guide her development of pedagogy. Here,
again, Beverly sees her expertise in her ability to recognize patterns in student learning
and develop appropriate solutions in her approach to instruction, further evidence of
adaptive expertise (Berliner, 2004).
In her interview, Beverly also expressed her beliefs in relation to the functions of
a community college. She compared community college to parenting and stated she
believed that there must be a “set of standards, and standards can’t be moved around”
(Interview, month 9). She shared that often, community colleges offer “hand holding” to
increase admissions, but they fail to teach students empowerment to act as their own
advocates. Beverly reported:
The institution needs to grapple with how to relate in this new world that
students are entering. The generations change, and I think that the way
people are communicating –the differences—are phenomenal. Eight years
ago, the majority of my students did not have access to a computer. Today,
getting students to produce things on a computer is pretty simple. But,
give[n] that, the institution really hasn’t changed much. (Interview, month 9)
Beverly points out the lack of online resources at UCC and the continuing differences in
other areas, such as online versus traditional classes; teachers’ expectations for students
from class to class; and a lack of expectation by some faculty for students to purchase the
class textbooks. Beverly ends this part of her interview by suggesting that faculty and
staff at the institution need to be “on the same page” in regard to these expectations and
help students understand these expectations and the resources that will help them succeed
(Interview, month 9).
In summary, in reflecting on her own expertise, Beverly emphasizes her focus on
delivering instruction to her students and recognizing the need to change her pedagogy
when her student population changes. Beverly has been empowered to implement
change, but for the most part, those changes have been limited to her pedagogy.
Randy
Randy is a full-time tenured faculty member who generally teaches basic skills
courses. He has served on many committees, including the newly formed (in the last
year) Basic Skills Committee (BSI). When asked in his interview how he knows
that what he does in the classroom during the first day actually works for students, Randy
reported:
Usually, I share a little bit of myself. I ask a few questions to break the ice
a little bit, but then I talk about the class and I have a very, very detailed
syllabus. The syllabi I distributed are very detailed and clearly state the
student outcomes. I take the resource jargon out and put the outcome in
simple terms. I talk about the importance of higher education, how the
class will serve their needs, and another thing that I think demystifies the
process is that, my syllabus is very, very detailed. I break down what
we’re going to do each and every day. There’s a breakdown of grade
distribution, using a point system, so they [students] really know, even on
a day to day basis, what to expect, what pages to read, what assignments
we will do, the topics for each day. So, I think, looking at the student the
first day in my class, I give them a lot of information and I try to leave as
few question marks as possible. (Interview, month 18)
Randy continued to explain that he has developed an intuitive sense of how this works for
students by “reading faces and body language and energy” (Interview, month 18). He
also shares a second element for gauging student involvement; he gives tests and quizzes
throughout the semester.
Randy suggests that at times, faculty pass students who have not acquired the
skills related to the course outcomes. He sees the landscape of education changing in that
students have become very savvy with technology, but they are “less active reading long
books” (Interview, month 18). He also shared that at times, faculty care so much for their
students that they “don’t stick to the outcomes” (Interview, month 18). He suggests “the
easiest thing to do is to pass students along” (Interview, month 18).
Randy has changed his approach to pedagogy because of his realization of
students’ technological savvy. For example, Randy shared that in the past, he might
reinforce the introduction of a topic by sharing an anecdote, but now he uses “a clip from
YouTube” and sees it as more powerful (Interview, month 18). Randy also stated that
he changes his approach to instruction at least every semester, and he changes it the most
after he has had the opportunity for reflection.
When asked about his participation in the action inquiry project, Randy reported
that the most rewarding activity was visiting another community college which had
several models of promising practices. Randy stated that the college he visited was doing
“some great, innovative learning support service things” that he believed could be
incorporated at UCC as part of the newly formed BSI committee (Interview, month 18).
He also shared that UCC hasn’t “caught on to some of the cutting edge practices in terms
of supplemental instruction, freshman year experience, learning communities, and service
learning” (Interview, month 18). As a result of his exposure to promising practices,
Randy reported that in the past no one at his college was willing to implement these types
of practices, but with the BSI committee, and the funds that accompany the BSI program,
“they will be implemented now” (Interview, month 18).
When questioned further about the action inquiry project, Randy indicated that he
felt the hunches activity resulted in arbitrary views that established the base for further
exploration. However, he also shared that his participation had changed him. He stated,
“I’m sure I’ve changed; just my knowledge base has changed, the way personal lives
change, especially [when compared] to where I was before. I think I had this strong
notion of social justice and equity issues before, but I [think] that after the project, they were
more informed rather than just passionate. They were a little more directed” (Interview,
month 18). Randy also offered several changes he had made as a result of participating in
the project. He has: 1) restructured the course numbering system for his discipline (along
with Geraldine, who is discussed next) to make these more “user friendly” and “so they
really represented different skills areas and different levels of structure whereas before
they were confusing”; 2) revised all courses and course outlines of record; and 3)
changed the course schedule to make courses more accessible to students (Interview,
month 18).
In all, Randy expresses his expertise in terms of how he connects with his
students and understands them and in his empowerment to implement change to reinforce
that connection. In addition to what I have already reported above, Randy explained that
when he’s “very engaged and involved, [students] are too; it’s because [instruction] is a
two way street; we’re feeding off each other” (Interview, month 18). As a result of
understanding his students so well, Randy, at times working with others, reports his
participation in a crucial aspect of adaptive expertise, implementing change. In this case,
the changes that were made resulted in greater accessibility to his discipline’s courses for
his students, and the changes were the result of participating in an activity of the project,
attending a symposium where external consultants suggested such change.
Geraldine
Geraldine was interviewed in month 18 after the project was finished. Geraldine
teaches content in several academic disciplines. She has served in leadership positions
and participated in most activities related to the action inquiry project. Geraldine also
participates in the Basic Skills Committee and has presented workshops with Randy
about how faculty can instruct students who are underprepared for college level courses.
Of the faculty interviewed, Geraldine reported undergoing the most change as a
result of participating in the project. Geraldine shared that her position as a faculty
member provided clear evidence that students need basic skills in order to succeed
(Interview, month 18). In addition, Geraldine reported that the observations and interview
she conducted outside her discipline during the action inquiry project helped her
recognize the types of resources available to her students for other disciplines. Geraldine
pointed to the changes she had made collaboratively with Randy to the course numbering
system, which resulted in offering classes at different times and on different days to reach
a larger segment of the student population, and expanding the number of classes offered
in their discipline (Interview, month 18). Geraldine credits the external consultants at the
symposium hosted by CUE in month seven of the project with the idea for renumbering
their courses. Although the consultants had made the recommendation for a different
discipline, Geraldine and Randy used it for theirs.
In her interview, Geraldine also discussed changes on a more personal level. She
reported that the most important finding from the action inquiry project was that
we can make [higher education . . .] more accessible to students. [. . .] I
don’t know if we need to talk to them in their own language, but I do
know that we [instructors] have a lot of assumptions, and a lot of ideas
that we think make sense to students, and they don’t necessarily agree. I
think we need to continually work to find ways to reach them and
communicate with them. (Interview, month 18)
Here, Geraldine provided examples of how instructors assume that specific types of
writing are acceptable to assign to students. She points out that while instructors think an
essay topic that asks students to reflect on their experiences in the western part of the
world or America calls for thoughtful reflection, the assignment can “really go against the
grain of some students from other countries” because they have no experiences in our
culture, or do not value their experiences in America (Interview, month 18). Geraldine
believes that by not implementing pedagogy that includes the cultures of our students, we
make learning even more difficult. In this example, Geraldine shares her recognition of
how some assumptions can result in reinforcing the dominant group and avoiding dealing
with “hidden resistances and conflicts in human discourse” (Raelin, 2007). By
recognizing this, Geraldine displays several characteristics of adaptive expertise.
Geraldine also offered examples of simple phrases that when used in a classroom
can challenge student learning. For instance, Geraldine states, “We might talk and say ‘as
everyone knows,’ or ‘of course you realize’” to students who don’t know or can’t realize, making them feel “inadequate without our even realizing it” (Interview, month
18). As a result of her exposure to new ways of thinking, Geraldine wonders “how do we
connect with students? Should students meet us halfway?” (Interview, month 18).
Geraldine’s continuing growth in expertise is represented in her willingness to reflect on
what she is doing and look for input on how to make her pedagogy even more accessible
for students.
Summary
As discussed in Chapter 2, the characteristics of gains in adaptive expertise
include: 1) an increased knowledge base; 2) improved problem solving skills moving
from defensive reasoning to productive reasoning; and 3) empowerment to act. As
reflected in the interview data above, project participants understood their work and
viewed their sources of expertise in terms of these characteristics.
As findings in this chapter suggest, the five project faculty participants shared
how they viewed their own expertise related to two areas: 1) instruction and 2) shared
governance. In the area of instruction, most participants viewed their expertise as their
willingness to make changes in how they delivered instruction. These changes often
resulted from reflections on the needs of their students. For example, Elizabeth
emphasized her efforts to increase students’ self-esteem by empowering them to
develop strategies for success. Her example indicated she recognizes the multiple
domains of knowledge in her discipline and how she worked to provide students with
appropriate solutions, so they too could understand the discipline content. Beverly
perceived her expertise as enacted through her active solicitation of comments from past
and current students to help her improve learning in her classes. By soliciting input from
students, Beverly exemplifies the use of productive reasoning, which in turn is a
characteristic of adaptive expertise. Both Randy and Geraldine perceived their expertise
in their continued willingness to change assignments, syllabi, and even their own
vocabulary to encompass culturally responsive teaching. Finally, Grace’s reflection on
her own expertise offers perhaps the most succinct description of these teachers when she
says that she “likes to work, is driven, likes to grow, and wants to make a difference.” In
sharing these perceptions of their expertise, these faculty members reinforce Berliner’s (2004) tenet that “the desire to be excellent” is the most important factor in developing expertise.
In addition, while some faculty participants perceived their expertise as limited to
instruction, other faculty project participants perceived their expertise as effecting change
at the institutional level as well. For example, while Elizabeth and Beverly shared their
perceptions on their own expertise relative to improving instruction, Grace, Randy, and
Geraldine viewed their expertise as moving beyond instruction to include institutional
level change. Grace shared how her expertise in understanding the institutional systems at
her college helped her implement a new program at the institution that focuses on student
success at the transfer level. Also, Randy and Geraldine shared how their expertise from
participating in the action inquiry project helped them work collaboratively to improve
course numbering and to schedule courses better, so more students have access to some
basic skills courses. In all, some respondents perceived their expertise as limited to
instruction while others perceived themselves as “change agents” in both instruction and
at the institutional level. Randy and Geraldine reported that they associated their
expertise with their participation in the action inquiry project.
CHAPTER FIVE: RESULTS
So far, this study has suggested that community college faculty members have
opportunities to increase their adaptive expertise to contribute as professionals to their
institutions by participating in assessment activities. This study has also maintained that
current assessment activities, often aligned with shared governance at community
colleges, do not generally result in increased adaptive expertise in faculty for two
reasons: 1) often faculty are not interested in the focus of these activities; or 2) faculty
resent the external demands driving these processes. However, research contends that an increase in adaptive expertise can result from participating in the types of assessment activities generated through action inquiry, provided the structures of these activities include elements of reflection—experience, collaboration, and examination of beliefs and values.
An example of action inquiry that includes assessment activities of this type can
be found in the partnerships between the Center for Urban Education (CUE) and many
community colleges in various states. These partnerships pair professional researchers
from CUE and community college stakeholders to solve a locally identified problem
through action inquiry activities. These partnerships are designed to provide opportunities
for community college participants to engage in activities normally used by professional
researchers. These activities expose the college team participants to opportunities to
collect, analyze, discuss, and reflect on evidence related to a locally identified problem.
Participating in these types of activities overcomes the two barriers mentioned above. Because action inquiry asks community college participants to determine the problem to be solved, the issue of faculty interest as a barrier is decreased, and since the project is internally rather than externally driven, the second barrier is eliminated.
Consequently, these partnerships seem to negate the two most challenging barriers to
increasing faculty expertise while offering activities that research contends can result in
increased adaptive expertise.
The activities embedded in the action inquiry project for this study provided
participants with the opportunity to examine instructional practices, current processes,
and their own attitudes. The research reviewed in Chapter 2 contends that these activities can result in
adaptive expertise identified by three indicators: 1) an expanded knowledge base; 2)
improved problem solving; and 3) empowerment to act. Research also contends that an
increase in expertise can result from participating in these action inquiry activities when
they include elements of reflection. In order to determine more precisely the relationships
between these concepts, and the effects on faculty participants at Urban Community
College, this chapter presents the findings from analyses of data collected on a purposeful
sample of community college faculty members who participated in one of the action
inquiry projects developed by the Center for Urban Education (CUE) between 2003 and
2008. In this chapter, I use methods described in Chapter 3 to determine if the
combination of the three elements of reflection and the action inquiry activities resulted
in an increase in adaptive expertise in five faculty participants. More specifically, this
chapter attempts to answer the research question: “Did participating in collaborative
action inquiry activities as a form of institutional assessment result in changing
perceptions of expertise among faculty participants?” The discussion considers where the concepts discussed in Chapter 2 interacted with each other during the project and the results of any interactions.
In this chapter, I start by providing a brief overview of the partnership between
the Center for Urban Education (CUE) and Urban Community College. Next, I describe
separately the strategies and activities contained in each stage of the action inquiry project, with special focus on any elements of reflection that emerge (see Table 2 for a summary of how the concepts interact to increase adaptive expertise). This also helps provide
contextual awareness. As the strategies and activities of the stage are described, I include
any evidence of participants’ reactions to conducting research activities and to hearing
the results of others’ activities. Then, I discuss any evidence that indicated a strategy
and/or activity provided the best opportunity for increasing adaptive expertise. After that,
I offer discussion of any conditions or attitudes that emerged from the evidence that may
have prohibited an increase in expertise. Finally, it is important to note that some faculty
participants reported changes in their individual pedagogy or institutional processes as a
result of participating in this project. However, not all participants experienced this
change, especially in the areas of institutional assessment and equity.
Center for Urban Education: An Overview of the Project
From the outset, the goals of the project were to increase “institutional
effectiveness, efficiency, and equity” by developing strategies and tools that practitioners
at Urban Community College could use to develop “excellence in basic skills and a
strong transfer culture” (Interim Report, month six, p. 7). The first activity of the project,
the “hunches” activity, asked participants to “brainstorm factors affecting course
completion rates” of students who begin in basic skills courses with the intention of
moving to transfer level courses (“Hunches”). The activity was held just prior to the start
of the project, and it provided the first opportunity for Urban Community College team
members to interact as a group and to interact with the researchers from CUE. The
purpose of this activity, along with an activity from the first team meeting discussed later,
was to provide Urban Community College team members with evidence upon which to
frame the specific problem that would be the focus of the project, in this case, the
struggles students were having matriculating in math from basic skills to transfer level
classes.
Once the problem of math matriculation was selected, team members from Urban
Community College were given observation protocols and interview guides developed by
the researchers at CUE. By using these tools, Urban Community College participants
were able to discern effective institutional practices and collect concrete evidence about
aspects of the math matriculation pathway in order to eventually determine appropriate
solutions that could increase student success. Other strategies and tools suggested by
CUE for the project were focused more specifically on instructional areas, including a
syllabi review, but the team at Urban Community College decided not to explore these.
The team members also participated in visits to other community college sites where they
were introduced to promising practices. More detailed discussions on how these
strategies and tools worked within the stages of action inquiry and whether or not
participation by faculty resulted in a perceived increase in adaptive expertise is next.
Action Inquiry Stage One: Frame the Problem
The first stage of the action inquiry project included two primary activities that
allowed team members the opportunities to use their knowledge and experience of the
institution to identify, or frame, the problem. The general focus was on identifying why
students were not succeeding at Urban Community College. The first activity was collaborative and asked all participants to publicly share their “hunches” about why students were struggling at their institution. The second activity, which happened during
the first month of the project, focused on analyzing statistical data on student success at
the college. As a result of participating in these initial activities, the team members from
the college determined that improving student success in the matriculation of math
courses from basic skills to transfer level courses should be the focus of the project.
The “Hunches” Activity
The purpose of the “Hunches” activity was to provide an environment in which
the group from the college could articulate the current institutional conditions that might
serve as barriers to student success. This activity was the first opportunity that Urban
Community College participants had to interact with each other and with the researchers
from the Center for Urban Education (CUE). The community college participants
discussed eight topics including: 1) conflict; 2) student support services; 3) institutional
culture; 4) practices and policies; 5) resources; 6) students; 7) faculty and control; and 8)
faculty and instructional practices. A review of field notes from the “hunches” activity
established that participating faculty members’ prior knowledge and experience regarding the processes and beliefs embedded in the culture of their institution were extensive.
Because they had significant experience at the institution, they were able to articulate a
lengthy list of challenges to student learning. The field notes do not indicate any
dissension among the group in response to any of the issues raised during the activity. In
brief, meeting notes indicated that the experience of these professionals from Urban
Community College was impressive. Their experience provided these participants with
the knowledge to create a considerable list of barriers that might be interfering with their
students’ success, and a review of the notes also provided a baseline for recognizing the
level of experience of the team members when considering any effects new experiences
might have on increasing their knowledge and adaptive expertise.
To better understand the context of the institution at the start of the project, it is
important to look at the results from the “hunches” activity. Notable among the outcomes
from the “hunches” activity included the belief that the institution had “low expectations”
for its students as well as “ongoing employee issues” resulting in “bad attitudes and poor
morale” that administration does nothing to solve (“Hunches”). Additional results from
this activity indicated that Urban Community College does not require incoming students
to take assessment (placement) tests, nor are the students required to attend an
orientation. Other issues that arose included the lack of a retention program and the
confusion of the course numbering system (“Hunches”).
During the activity, many team members cited the existence of conflict at the institution as a challenge to student learning. Statements made by
several participants indicated that there was “animosity between Urban Community
College and feeder high schools”; current polarization between instruction and student
services; and “conflict and tension between counselors and instructors leads to
compromised student services.” One participant summed up the effect of these conflicts
on students when she shared that “limited communication between counselors and instructors
[had resulted] in conflict in advisement, expectations, and guidance” to students
(“Hunches”). The willingness of these college participants to deal with issues of conflict,
or to ignore these issues, occurs more than once during the project, and at times, results in
defensiveness. Defensive reasoning is discussed later in this chapter.
The “hunches” activity also resulted in the recognition that some instructional
practices needed to be examined. For example, participants encouraged “culturally
responsive teaching,” but they also believed that there was a lack of consistency in the
teaching at the college and that some faculty did not expect “this generation of students to
be ‘good’ enough” (“Hunches”). The Urban Community College team participants also believed that “students come to college underprepared” and “first
generation students have few or no role models at home” (“Hunches”).
A review of the complete “Hunches” list (see Appendix C) suggested that
participants willingly articulated beliefs and assumptions, either positive or negative.
Because of this willingness to share in a public arena, it is logical to believe that the
Urban Community College team members felt secure sharing their beliefs so publicly
during the activity. As mentioned earlier, further research uncovered that faculty
participants, who were the focus of this study, had tenure and many years of service at the
institution, and that some of them had served in administrative positions (the range of service for these faculty participants was 11–36 years). Therefore, it is reasonable to assume that the experience of these senior faculty members resulted in a
significant knowledge base at the start of the project.
However, in spite of their experience and professional knowledge at the beginning
of the project, the first stage of action inquiry, framing the problem, did provide additional ways for faculty participants to build on their prior experiences and expand their existing knowledge base, creating opportunities to increase adaptive expertise. A review of
field notes from the first team meeting indicated that exposure to statistical data on
student success resulted in a significant degree of faculty surprise that the college was
“losing a lot of students,” and for some participants, the activity provided evidence that
their long-held beliefs about the challenges faced by students at the institution were
accurate (Meeting notes, month one).
Analyzing Statistical Data
Building on the “hunches” activity, the researchers from CUE helped faculty
participants from Urban Community College read and analyze statistical data on student
success during the first team meeting. (Team meetings were held monthly throughout the project unless classes were not in session or other action inquiry activities replaced the team meeting.) Field notes and interviews showed that although the faculty participants were experienced in their years of service to the college, they had not had much prior experience in analyzing data, a task normally assigned to researchers. Field notes further indicated that, when exposed to the data at the first meeting, multiple participants reacted by expressing surprise. Some simply stated “Wow,” while others “appeared very engaged and sparked a series of questions” (Meeting notes, month one).
Indicators of Adaptive Expertise
Expanded Knowledge Base
Analyzing statistical data exposed the participants to a new way of understanding
student success. The meeting notes suggested that this exposure to new evidence resulted
in faculty participants asking additional questions to expand their understanding of the
baseline data they were analyzing. Meeting notes stated that some participants questioned which academic year was being reflected in the data or whether there could be bias in the statistics, while other team members questioned a second set of data, which provided an annual benchmark (+3%) for improving student success. Questions were raised regarding the source of the benchmark goal for each year, and particularly whether 3% had been an arbitrary number. One full-time faculty member raised an understandable
objection that “Some people may say that we have low expectations” because the 3%
figure seemed low. She also admitted she was “not accustomed to reading statistics”
(Meeting notes, month one).
One response from an Urban Community College faculty participant, who was
tenured and had experience at the institution over many years, was typical of the other
faculty participants from the college who were asked about this activity. In her first
interview in the spring of the project, Geraldine explained her enthusiasm at being
exposed to statistical data. Geraldine reported:
I never had the hard data or the statistics to confirm, [. . .] but working on
this project has shown me how difficult it is for some students and how the
basic skills can be a major obstacle for transfer. I guess it really
highlighted the difficulties that students have with math and how almost
insurmountable the math thing is for people who want to transfer
especially to CSUs.
Here, Geraldine articulated how participating in the activities to frame the project, in
particular being exposed to statistical data, had reinforced her beliefs about the
difficulties facing students. This activity also helped Geraldine expand her understanding
of two problems facing students who want to transfer: 1) how needing basic skills classes
is a “major obstacle” to transfer; and 2) how matriculating through the math courses is
“almost insurmountable” for some students. This is new understanding of the struggles
students experience in math for Geraldine as she is not part of the math department at the
college. Her statement suggested an expanded knowledge base, which research contends
is an indicator of increased adaptive expertise.
Since Geraldine is fairly typical of the tenured, long-term faculty participating in
the project, it is possible to suggest that exposing community college faculty to statistical
data and helping them understand how to analyze and use it may help them increase their
expertise, especially if they have experience and knowledge of the institution prior to the
activity. In this case, by participating in the analysis of statistical data, participants had
the opportunity to increase their knowledge and understanding of two “major” or
“insurmountable” problems facing students who plan on transferring to a university.
Also, their participation in analyzing data provided them the opportunity to confirm or
nullify their previously held beliefs, as Geraldine did. Increasing knowledge of a problem
and examining her beliefs are two indicators of adaptive expertise discussed in Chapter 2.
In addition to Geraldine’s comments above, Table 4, which follows, summarizes the responses of additional faculty participants to this activity.
Table 4: Responses of faculty participants to analyzing statistical data

Participant | Response
Grace | I noticed that spring rates [student success] are lower. [. . .] I see that Hispanic is our biggest group. Some people may say we have low expectations.
Beverly | Is there any bias in the numbers? There are things that could influence numbers; where students are coming from will make a difference. Maybe the placement test is picking up the wrong thing.
Karina | Do we have students who don’t want to transfer? We’re losing a lot of students. . . . Wow!
Vicky | Maybe we need to look at the cut-off score; we have data that “C” students don’t do that well.
Richard | Are we including students who received a “W”? How many semesters does it take for them to succeed?
As the discussion continued, researchers from CUE helped Urban Community
College participants recognize that upcoming project research activities needed to focus
on the processes that team members could impact as articulated in the “hunches” activity
to verify or eliminate the hunches. This reminder helped the team develop three areas of inquiry to determine: 1) why students drop out of math matriculation between elementary algebra and intermediate algebra; 2) what happens to students who take intermediate algebra but fail to continue to the transfer level; and 3) why students wait or take so long to get through elementary algebra (Meeting notes, month one).
These questions helped frame the project research activities that would occur in stage two
of the project which asked participants to collect evidence about the struggles students
were experiencing in the matriculation of math courses.
Action Inquiry Stage Two: Advocating for a Solution using Concrete Evidence
Stage two of action inquiry involves multiple research activities including
observations and interviews. As noted earlier, the researchers at the Center for Urban
Education (CUE) developed the observation protocol and the interview guides used by
the team members at Urban Community College. Discussion from this stage of the
project looks at how these faculty participants reacted to participating as researchers.
Discussion also includes participants’ reactions to sharing the results of the research
activities and results from the activities of others. Finally, any evidence of reflection in
the action inquiry activities that led to increasing adaptive expertise is included.
Specifically, this stage looks at problem solving abilities by analyzing the reactions of the
faculty participants using the characteristics of defensive and productive reasoning. Table
5 below offers a review of the characteristics of these types of reasoning used in problem
solving.
Table 5: Comparison of characteristics between defensive reasoning and productive reasoning (Source: Argyris, C. (1991). “Teaching smart people how to learn.” Harvard Business Review, 4(2) (Reflections, 1991), 4–15.)

Defensive Reasoning | Productive Reasoning
Avoids inquiry by others | Crafts conclusions in ways that permit others to try to disconfirm them
Avoids tasks that could result in embarrassing interactions | Does not bypass embarrassment or threat; engages them
Makes all inferences implicit | Makes all inferences explicit
Shifts responsibility to others | Openly illustrates how evaluations are reached
Positions are hardened | Examines presumptions rigorously
Rivalries occur | Encourages collaboration
Mistrust | Tests validity of inferences
Behavior of distancing (arrives late or misses meetings) | Monitors implementation of choice vigilantly
As the table indicates, defensive reasoning assumes an entrenched position that avoids all
outside examination and often results in mistrust, while productive reasoning is exhibited
when one willingly engages in rigorous reflection personally and publicly.
As the team moved forward to investigate the problem of student success in math
and to determine an appropriate solution, the action inquiry activities helped them collect
data about the students’ struggle with math. Results from their data collection activities
would provide concrete evidence for the team to discuss when making decisions about a
possible solution. Using concrete data as a basis for decision making is a characteristic of
productive reasoning. Additionally, productive reasoning calls for the sharing of data
collaboratively in order to expose how decisions are reached. Consequently, all results
from the collection of evidence were shared at monthly team meetings.
To investigate the problem of student success in math, faculty team members
collected evidence by conducting observations, interviews, or both. Some worked alone
while others worked in pairs, but all were asked to conduct their data collection in areas
of the institution outside their normal work environment. As mentioned before, it was
important that the results from these data collection activities be shared during monthly
collaborative team meetings.
Observations
Results from the observations of the supplemental student learning centers
conducted by the action inquiry team participants were reported out at the second team
meeting. Observations of the tutoring center were completed by two faculty participants
working together. Observations of the math learning center were conducted by faculty
participants on an individual basis. A review of the results of these observations showed
several findings that were relevant to both learning centers. Field notes indicated that
students could not find the centers easily and that if they did locate the centers, most
assistance was general in nature and not connected to classes. Overall, usage rates in both
centers were low, and results from the observations also indicated that both centers had
issues with access, friendliness, and availability. Perhaps more important was the
outcome reported in the observations that the students who needed help the most from the
centers were not the ones getting it (Meeting notes, month two).
When shared at the second team meeting, these observations prompted questions about what preparation student tutors, staff, or faculty received in order to work in these facilities (Meeting notes, month two). As a result of
sharing these findings with the entire team, discussion then focused on the next inquiry
activities that should take place, and the team decided to survey all aspects of the centers
and proceeded to create respondent categories for future surveys in each center. Field
notes do not indicate any dissension in the group resulting from these shared results.
Interviews
At the end of the second team meeting, the Urban Community College
participants agreed to conduct interviews with lead faculty and staff in both learning
centers to “investigate and influence factors that promote success in basic skills and
transfer of underrepresented students” (“Conducting Interviews”). The “Conducting
Interviews” handout, developed by researchers at CUE, described the interviewing
process for the faculty participants involved. CUE also developed the “Guide to
Conducting Interviews,” and they went through the guide with the Urban Community
College participants in order to prepare them to act as researchers in this process. Some
of the “tips” included in the guide were taking notes, maintaining rapport, strategies, and
managing silence (“Guide to Conducting Interviews”).
One faculty participant from Urban Community College interviewed two math
faculty members regarding the math learning center. Her findings indicated that while
each of the faculty members considered the learning center helpful to students, both
admitted that the center’s effectiveness had never been assessed (Meeting notes, month
three). Independently, each of these faculty members shared his belief that students did
not take their education seriously and that, as a result, usage of the math learning center was low. Further findings indicated that
neither of the faculty members interviewed was certain that faculty members, especially
part-time faculty, knew about the services at the math learning center, but both of them
hoped that counseling faculty knew about its value. These two math faculty members also
indicated they did not know of any training that student tutors received, nor did they
believe there were any outreach activities in place to help students and faculty become
aware of this resource (Meeting notes, month three).
Another team member interviewed a staff member of the math learning center.
When the team member went to conduct the interview, he found the staff member’s
office door locked and the blinds closed (Meeting notes, month four). The participant
who went to conduct the interview articulated concern that if the staff member was to be
available to students, “then the students would have to know of his existence and
availability because he was hard to find” (Meeting notes, month four). Two additional
findings resulted from the sharing of this interview. First, the team learned that the
focus of the math learning center was modular courses, not peer tutoring, and second,
a participant on the action inquiry team suggested that replacing the staff member in the
learning center would be difficult because of his length of time in the position (Meeting
notes, month four).
Interviews conducted for the tutoring center had similar findings. Two faculty
participants interviewed staff members individually. Staff members stated that high
student tutor turnover and the lack of awareness of the center were contributing factors to
its low usage numbers (Meeting notes, month three). In addition, staff members
interviewed pointed out that “students are used to being ‘babied’ in K-12 system, so
asking them to find the center and set up their own appointments takes learning and
initiative” (Meeting notes, month three). The staff members also suggested possible
solutions to the problem of low usage in their interviews. These included consolidating
the learning centers to make access easier for students, having faculty require students
to use the tutoring center, and advertising the center more on campus.
Action Inquiry Stage Three: Assess the Solution
The final stage of the action inquiry process focuses on assessing the effectiveness
of the solution implemented to help solve the problem identified in stage one. The “Final
Report” on the CUE partnership with Urban Community College contains several
suggestions for improving student success. For example, the report enumerates areas
where changes could benefit student learning. Here, I provide a list of the
recommendations along with the faculty project participants who perceived their
expertise as helpful in implementing change in that area. These include:
1. working to link the results from the observations and interviews of the learning
centers to teaching practices and learning environments; (Elizabeth, Beverly,
Grace, Randy, and Geraldine)
2. considering how content and teaching practices can be linked to learning
resources, especially at the basic skills level; (Randy and Geraldine)
3. examining how the basic skills disciplines interrelate and how these courses
connect to other courses at the college; (Randy and Geraldine)
4. rethinking the curriculum to identify barriers to student success; (Randy and
Geraldine)
5. incorporating student voices and experiences; (Beverly)
6. developing equity-minded professionals through pedagogy and curriculum
development; (Beverly and Geraldine)
7. continuing outreach activities and improved signage for learning centers;
8. implementing mandatory orientation to student services for all students; and
9. training the tutors (pp. 27–32).
In addition to these recommendations, changes in expertise did occur among the faculty
project participants, as indicated by their self-reported perceptions of their expertise in
both their individual and collaborative practices. An overall analysis of whether the
changes implemented by these faculty participants, either in instruction or at the
institutional level, increased student learning remains a project for the future.
Discussion
Experience: Barrier or Facilitator?
Research has indicated that experience is an element of reflection (Rodgers, 2002)
and that when reflection is combined with action inquiry activities, the result can be an
increase in adaptive expertise. In this study, the experience of some faculty participants
did result in changes that indicate adaptive expertise. These changes occurred even
though these same faculty participants articulated frustration at their previous attempts to
effect change. Table 6 provides a summary of how the participants’ knowledge and
experience of the institution could have served as a barrier to increasing their adaptive
expertise.
Table 6: Summary of responses regarding potential for change based on experience
Interviewee Responses
Randy It just seems like they [the college] reward good work with more work.
It’s hard to get any direction when the committee is so large.
To be honest with you [CUE], and I hope it’s not heartbreaking, [. . .]
but the reality is, [. . .] we do a lot of work, we have these
conversations, and we’re on the same page, but at the end of the day, [.
. .] a lot of times things, priorities aren’t always based on research.
They might be based on politics or just somebody’s whim or just
what’s in vogue.
Grace We always talk and talk and talk and nothing really happens. It is
difficult to get anything done; it [change] moves so slow.
Geraldine The way you let people know about a project like this and about the
findings has to be handled very carefully because again people don’t
want to just hear that some outsiders came in and they told us this is
what we’re doing wrong.
The comments summarized in Table 6 indicated that the prior experiences of these
seasoned professionals could make them reluctant to suggest changes for improving
student success, especially at the institutional level. However, some faculty participants
built on their prior experiences through the action inquiry project to act as agents of
change in their instruction, at the institutional level, or both (Rodgers, 2002).
Defensive Reasoning
At the start of the action inquiry project, faculty project participants often shifted
responsibility for student learning to students. One faculty member stated, “They chose to
go to school, right? No one forced them to be a college student. [. . . Students are] not
ready for college [nor do they] make adequate use of the [resources] offered on campus”
(Interview, month seven; “Hunches”). In addition, when responsibility was not placed
on students, the faculty participants often shifted it to other groups: “A lot of issues we
deal with are K-12 issues. [. . . And a] lot of these equity issues need to be addressed
before [students arrive at college]” (Interview, month five). These statements indicate
a shifting of responsibility
to others, a characteristic of defensive reasoning, considered a barrier to adaptive
expertise (see Table 6, p. 99). This shifting of responsibility was apparent particularly at
the start of the project. As Argyris (1993) contends, defensive reasoning prohibits
opportunities to improve problem solving skills; if those skills do not improve, the
status quo persists and no feeling of empowerment to effect change arises. Early in the
project, faculty participants did not appear to engage in reflection to examine their own
beliefs and potentially improve how they solve problems, two processes necessary to
increase adaptive expertise.
Additionally, later in the project, there were more examples of faculty participants
refusing to examine areas that could prove embarrassing or threatening, another
characteristic of defensive reasoning. For example, in the project’s fourth month, a
faculty participant was asked if she would share success rates (statistical data) for her
discipline; she replied, “No, they are not good” (Meeting notes, month four). In another
instance, when CUE researchers suggested investigating instructional practices
beginning with a review of syllabi, the response from multiple Urban Community
College faculty participants was negative. One faculty participant was “strongly against
syllabi review” and another believed “students don’t pay attention to syllabi” (Meeting
notes, month four). By month four, the only exception to this evidence of defensive
reasoning seemed to have occurred during the framing of the problem, when faculty
members were exposed to new knowledge by analyzing statistical data. During that
activity, faculty participants did not appear concerned about their lack of expertise in
the analysis of data. This exception was discussed earlier in this chapter.
In brief, it appeared that once the faculty participants got past the initial stage of
framing the problem, defensive reasoning strategies often halted any significant increase
in adaptive expertise, at least in most participants. The threat of embarrassment seemed to
dampen the enthusiasm these faculty participants had demonstrated in earlier inquiry
activities. Research contends that in order to improve problem solving skills, beliefs and
practices need to be exposed to possible disconfirmation in a collaborative setting.
Unfortunately, time and time again, defensive reasoning stopped participants from
engaging in activities that could have improved their problem solving skills, which, in
turn, could have increased expertise. Therefore, upon initial reflection, defensive
reasoning seemed to be a major barrier to increasing adaptive expertise through
improved problem solving in a collaborative environment. However, as Chapter 4 indicates,
some faculty participants did share the perception that the changes they made in
pedagogy and institutional processes were associated with having participated in the
research activities of action inquiry.
Productive Reasoning
Here, I use findings from an interview with Geraldine conducted after the project
concluded to represent how one project participant perceived her expertise to be
associated with the action inquiry project. In the interview, Geraldine reported how
participating in the project resulted in several changes in her beliefs and assumptions. In
particular, Geraldine reported that her beliefs towards basic skills courses and the
students taking basic skills courses had changed as a result of participating in the project.
As a result of the changes in her beliefs and her increased knowledge, she discussed how
she and another faculty participant in the project, Randy, had worked together and
individually to implement change. She began by pointing out where changes still need to
be made:
I think we could find and concentrate on more ways to deliver basic skills
and make basic skills more accessible. [. . .] We talked about the whole
concept of the gatekeeper courses, [. . . so] is there a way that we could
make these courses more intensive so that students could go through them
more quickly so they aren’t looking at five years of math?
We talked about expanding our schedule, having classes at more spread
out times, so that we could offer classes to a larger segment of the
population. We’ve talked about a lot of things like that. [One faculty
member] went through and changed the course numbering system because
he thought that it was [. . .] sort of confusing. [. . .]
Evidence from Geraldine’s interview reveals how participating in the action research
project resulted in her and Randy working together to offer students easier course
numbering and more effective scheduling for basic skills courses, changes that were
institutional.
institutional. In addition, Geraldine reported how she changed her beliefs regarding basic
skills, and she indicated a high level of trust in her own experiences. In this interview, she
does not indicate that she sought approval from others as changes were implemented. As
a whole, this evidence from Geraldine’s interviews revealed both a change in her beliefs
and a high level of empowerment—characteristics of adaptive expertise resulting from
assessing the solution to a problem, the third stage of action inquiry (see Table 2).
Conclusion
To answer the research question: “Does participating in collaborative action
inquiry activities as a form of institutional assessment result in perceptions of increased
expertise among faculty participants?” this chapter has presented findings using the
stages of action inquiry and areas where literature asserts the concepts of reflection,
action inquiry, and adaptive expertise meet, resulting in the opportunity for increased
expertise in participants. The discussion of each stage began by providing an
understanding of the types of activities used in that stage and how those activities worked
with an element of reflection to offer opportunities for increased adaptive expertise. Each
discussion included evidence from participant interviews and reviews of field notes to
provide additional context and credibility, and each discussion concluded by considering
which activities seemed to offer the most opportunity for increased adaptive expertise
and what barriers prevented an increase in such expertise.
Evidence indicated that at the beginning of the action inquiry project, faculty
participants from Urban Community College had significant experience and knowledge.
Since research contends that one’s knowledge base can grow by connecting prior
experiences with current ones, the discussion focused on the activities in stage one
that led to increased awareness of the problem being studied. Evidence was presented
that these experienced faculty members became excited and more deeply engaged in the
project when exposed to activities normally reserved for researchers. In particular,
evidence exists that these faculty members were willing to expose themselves to potential
embarrassment because they believed the statistical data would help them expand their
knowledge. Research states that willingness to expose oneself to embarrassment is a
characteristic of productive reasoning, which in turn, can lead to improved problem
solving abilities that can result in increased adaptive expertise.
As the project moved to the second stage of action inquiry, advocating on behalf
of a solution to the problem, these faculty participants worked as practitioner researchers
to collect evidence through observations and interviews using tools produced by their
partners from the Center for Urban Education (CUE). As a result of participating in these
activities, they were exposed to opportunities where their problem solving skills could be
improved, another characteristic of adaptive expertise. However, during this stage, the
faculty participants often exhibited characteristics of defensive reasoning, such as
mistrust, rivalries, and avoidance of issues that could result in embarrassment. Studies
contend that to improve problem solving skills, participants need to develop productive
reasoning skills by engaging in situations where they are required to be explicit and to
seek disconfirming dialogue. Analysis of evidence indicated that these stages of action inquiry
resulted in examples where defensive reasoning inhibited opportunities to increase
adaptive expertise through improved problem solving abilities. As a result of their
defensive reasoning, faculty participants did not experience a noteworthy change in their
problem solving skills.
The final stage of the project’s activities—implementing a solution and assessing
its effectiveness—coupled with examining values and beliefs, an element of reflection,
did result in the perception of improved reasoning skills which led to effecting changes in
instruction, at the institutional level, or both. In particular, one faculty participant,
Geraldine, reported that participating in the project resulted in her changing her
instructional practices and working to improve student access to basic skills classes. She
and another faculty participant, Randy, worked collaboratively to renumber the basic
skills courses so that students would be able to understand the sequence better. They have also
worked to improve how these basic skills courses are scheduled, so more students can
access them. Finally, Geraldine, Randy, and Grace are continuing to work on the
project’s goals by serving on a new committee focused on improving student learning in
basic skills.
Therefore, while initial analysis of the findings from the study indicated that
defensive reasoning was, in some instances, a barrier to adaptive expertise, further
analysis of the findings indicates that at least two of the faculty participants believed they
were empowered by participating in the action inquiry project. In fact, these two faculty
participants did go on to effect change in their own instruction and at the institutional
level. Therefore, findings indicate that some individual faculty participants did perceive
an increase in their expertise as a result of their participation in the action inquiry project.
They even reported they used their increased expertise to implement change in
instruction, at the institutional level, or both. However, findings also seem to suggest that
an increase in adaptive expertise is affected as much by the individual as by the project in
which they participate.
CHAPTER SIX: CONCLUSION
In this chapter, I share my personal narrative to help provide context for the
findings and recommendations that result from this study. As a full-time, tenured
community college faculty member, I have found that this study has informed my
pedagogy and my shared governance contributions. Many times throughout the project,
I saw myself in the faculty project participants, and I often compared my institution to
the community college where I conducted my research. The narrative also provides a
context for the recommendations I emphasize based on the findings of my study.
Personal Narrative
During the development of this study, I continued working as a full time, tenured
faculty member at one of three campuses in a large community college district in
southern California. My annual teaching contract reflects the typical work environment of
many full-time community college faculty members in California. The contract is split
50/50. The first 50% requires me to teach all levels of English at my “home” campus; the
other 50% makes me responsible for serving as the faculty co-chair of the District
Assessment Committee. In this role, I help facilitate assessment processes for all
campuses in my community college district.
Background
Prior to becoming a community college faculty member, I spent 23 years in sales,
sales management, and sales training for several global corporations. These corporations
provided ongoing classroom and field training to new hires. My corporate experience was
quite different from my academic experience, where, after an initial year-long orientation
when I was first hired, faculty development has consisted of the one-shot, half-day or
day-long workshops that most researchers contend are not worthwhile. The experiences and
professional skills I learned in the business world included human resource processes
such as recruiting, interviewing, hiring, training, managing, evaluating, and terminating
employees, as well as communication skills for negotiating conflicts, making sales
presentations, and handling customer relations. I also had opportunities to improve my writing skills
during my years working for corporations by developing manuals and sales updates and
creating pedagogy for training new employees in the United States. As a corporate
director, I received mentoring for achieving my sales budget while using the
department’s financial resources effectively. One of the most important aspects of
training in the business world was learning how to work as a member of a team.
After leaving the business world, I had the opportunity to return to college and get
my degrees, a lifelong dream of mine. The skills learned in the corporate world helped
me negotiate some of the complexities I confronted when I was hired full time at my
community college, but when I arrived on campus as a full-time faculty member, I was
still lost. Nowhere in my pre-service training had I been exposed to creating pedagogy,
and I never expected to use my business skills when asked to contribute to the institution
through shared governance.
A New Faculty Member
During my first year on campus, I attended orientation workshops that met, for
the most part, on a weekly basis. Generally, these workshops took place for
approximately seven hours and they occurred throughout the academic year. These
workshops covered many topics that are important to community college faculty
members. Unfortunately, because I didn’t have any background in the field of higher
education, I could not connect the content of the workshops to my job teaching students.
Many references were made in the form of acronyms for which I had no frame of
reference, and fearing embarrassment, I failed to ask appropriate questions. As a result, I
filed away the information for future use. I was still lost for the most part when the
orientation ended. In retrospect, the workshops were well organized and well thought out.
They covered all the areas of administration and institutional processes that a new faculty
member may want to know. However, without a basic understanding of the educational
system, they were not as helpful as they might have been for me.
Finding #1: Developing expertise by building on experience and knowledge
Most of my colleagues at the campus, who had actually been my teachers earlier,
assumed I arrived with an understanding of the institution and its processes. In fact,
during my first semester as a full time faculty member, these colleagues elected me to
represent the department on the curriculum committee. Even though I had only one year
of experience teaching part-time, I was not concerned over this appointment because I
believed I would either be trained to participate on this important committee, or I would
be given information to study on my own. I was wrong. I fulfilled the one-year
appointment to the committee, but, like Grace in chapter four, I listened and
observed.
A significant finding from this study was the importance of faculty members’
experience prior to the action inquiry project and how experience is essential for an
increase in any type of knowledge: discipline content, pedagogy, or the ability to
participate in shared governance. Just as the concept of scaffolding in teaching and
learning suggests, a faculty development plan should first identify faculty members’
previous experiences and knowledge so that the new experiences and knowledge
gained during development activities can build on them. This provides the type of
continuity that Dewey
(1910) deems necessary for learning to occur. In my personal story, had my lack of
experience and knowledge in the field of education been recognized, the orientation
workshops might have included a basic introduction to education that others might not
have needed. Because my previous knowledge and experience, or lack of it, was not
recognized, it has taken me years to develop both my experience and my knowledge in
order to be able to contribute productively in the multiple roles I have assumed. By
determining the level of knowledge and experience new faculty have in the areas of
instruction and shared governance, a development plan can introduce new employees
to the “meaning” of the institution and its expectations and processes more quickly,
enabling new faculty members to contribute effectively as professionals much sooner.
Finding #2: Using faculty interest to drive development
Since that committee assignment over 10 years ago, I have actively accepted other
roles and responsibilities that I thought would help me understand my campus, my
college district, and the state processes. This is similar to Grace’s behavior reported in
chapter 4. However, even though I sought these types of committees, it has taken me
more than a decade to collect the information to understand the complexities of working
at a community college, and I still haven’t been able to find the time to understand the
state budgeting process.
Researchers (Alexander, P.; Slevin) contend that faculty will contribute
significantly if allowed to participate in assessments focused on issues of interest to them.
In my case, I sought out roles and committees related to my interests. For example, I
decided to learn more about the statistical data that drives many decisions at my campus,
so I ran for and was elected as department chair. By chairing the department, I was
exposed to the type of statistical data, such as student retention, student persistence,
student success, and the state efficiency ratio that contributes to resource allocation
processes at my campus.
More recently, I sat in meetings where the visiting team from the Accrediting
Commission for Community and Junior Colleges (ACCJC) asked the campus Associate
Dean of Student Success how she helped faculty and staff reach an extremely high level
of success in assessing student learning. The associate dean described meeting with each
faculty member who requested her help in designing an assessment plan and listening to
his/her “story” relative to outcomes assessment. By listening to each one individually, she
was able to identify where each faculty member’s interests were focused. She then helped
each one develop an assessment design that included that interest as well as the
institution’s expectations. By doing this, the associate dean overcame the defensive
reasoning of some faculty and offered them ways of assessing learning that may result in
productive reasoning in the future.
Currently, California community college faculty participate on committees that
contribute to shared governance processes such as hiring and evaluation, program review,
outcomes assessment, budget and resource allocation, and strategic planning. Many do
not have a working knowledge of the complex features of these processes, nor do they
recognize the interrelationships of these processes. It is the relationship between teaching
students in the classroom and advocating on behalf of students in shared governance that
calls for increased adaptive expertise, but most faculty are not particularly interested in
the focus of these committees. Some of these committees have been constructed in
response to external demands and others may serve to support administrative agendas.
Often, these committees are not learning groups. These committees do not expose faculty
members to the type of research activities found in action inquiry. If faculty development
groups could be formed based on faculty interests, then faculty resistance could be
decreased.
Finding #3: Using collaborative/team learning to develop faculty
One way to increase professional expertise might be through the development of
collaborative/team learning groups. Instead of using committees formed for purposes of
shared governance, collaborative teams that are focused on issues of interest for faculty
may be a better alternative for increasing expertise. Rodgers (2002) and Garvin (1993)
contend that socially constructed learning activities provide opportunities to increase
expertise because these activities provide a way “to take in new information, process it
and share it among the members of the organization in steady and regular ways” (Kruse,
2000, p. 362). Team learning can end in either a non-productive response or a productive
response that leads to significant learning (Kruse, 2000, p. 363). In this study, evidence
indicated that faculty participants built awareness of institutional contexts outside their
normal work environment through the collaborative reporting out of findings.
Another consideration might be to include activities from action inquiry, such as
observations and interviews, in the activities of these learning groups. Research activities
might expose faculty members, such as me, to new types of activities while providing
other, more experienced faculty members opportunities for revisiting skills they may or
may not currently be using. By allowing faculty to work on committees that interest
them, institutions may develop learning teams that develop significant new learning
(Kruse, 2000).
Finding #4: Providing the environment that results in productive reasoning
Faculty development can occur when a safe, nurturing environment is provided. In
the action inquiry project at Urban Community College, evidence indicated several
examples of defensive reasoning. The instances of mistrust and ongoing rivalries suggest
an environment where failure is not allowed. In my own experience, I can recall one
occasion when I received very negative feedback; this was particularly disturbing
because as a new faculty member, I then began to worry that I would not get tenure and I
shut down. Without an atmosphere where failure was allowed, I did not dare expose any
of my insecurities to public examination. I couldn’t even expose them to semi-private
examination, as the individual who was critical of my performance was also the one I had
relied on for help in pedagogical development. Worrying about my tenure, I retreated
to defensive reasoning, which caused me to avoid any public sharing that could have
led to a better understanding of my position within the institution.
As a result of this study, I now actively seek input from my students regarding
multiple aspects of the course I am teaching. For example, in freshman composition, I
have developed a syllabus that explains my belief that knowledge is socially constructed.
In addition, I ask the students to finish developing parts of the syllabus (rules and
behaviors as well as the grading scale) during the first week of class. Students enjoy
being asked to serve as both teacher and learner, especially when their teacher identifies
herself as a learner and teacher as well. As a community of learners, we negotiate the
rules of the classroom. This has been particularly effective in the area of cell phones. To
date, a student in each of my classes has suggested that all cell phones be turned off during
class. Because the suggestion originates with a student, I do not have to deal with phones
ringing or buzzing during instruction, nor do I have to embarrass a student whose
phone rings during class.
If an institution wants faculty who feel empowered to act, then providing a safe
and nurturing environment can result in characteristics of productive reasoning. Since
productive reasoning asks that practitioners rigorously examine their beliefs and
assumptions in the presence of others, a supportive collaborative culture is necessary to
achieve “learning from others” (Dill, 1982; Garvin, 1993), so the institution needs to create an
environment where risk taking is supported and transparency exists at all levels. The
faculty participants in this study seemed very open to sharing their beliefs about the
possible reasons that students were failing at UCC during the hunches activity. However,
this same activity resulted in several statements that indicated an ongoing atmosphere of
mistrust and competitiveness that resulted in an institution that did not seem to be as
student-centered as it might have been.
Finding #5: The role of collaboration and leaders
Another important element in establishing an environment for developing
productive reasoning is the role of college administrators. College leadership is vital to
establishing the collegiality and transparency that can result in increased professionalism.
At Urban Community College, the collaborative nature of the action inquiry project
provided opportunities for several participants, Grace, Geraldine, and Randy, to effect
change at both the instructional and institutional level. In fact, all participants felt
empowered to change at the instructional level. This indicates that, for some participants,
the mistrust and rivalries that existed at the start of the project did not serve as a barrier
to feeling empowered to change.
One example of how faculty and administrative leaders work collaboratively
occurred at my campus during the 2008-09 academic year. Having recently arrived on
campus, the Vice President realized that a limited percentage of courses offered on the
campus had been assessed for student learning. As a result, my campus President and the
Vice President set high expectations for faculty and staff to conduct assessments in all
course sections to meet a goal of 100% assessed by the end of the 2010 academic year. Since
that time, the campus has completed outcomes assessment in almost 100% of its sections
and the campus will meet that goal before the deadline. We were able to accomplish this
feat because campus leaders provided ongoing support to the faculty through the role of the
aforementioned Associate Dean of Student Success. They identified the indeterminate
situation in the course-based assessment; articulated an expectation in a public forum that
encouraged dialogue; provided training workshops on how to assess learning; and
provided ongoing support that helped faculty overcome any challenges they experienced.
While this level of support may be unique, it seems possible that this type of ongoing
support and collaboration can help faculty develop the ability to contribute in other
areas of the campus as well.
Finding #6: Using the practitioner-as-researcher model for developing faculty
Exposing faculty members to activities used primarily by researchers can result in
increased expertise. Researchers at the Center for Urban Education (CUE) at the
University of Southern California have conducted research on the results of working
with community colleges in action inquiry partnerships. The research contends that
change at the institution results from a “deeper awareness among faculty members,
administrators, or counselors, of a problem that exists in their local context” (Bensimon,
Polkinghorne, Bauman, and Vallejo, 2004, p. 105). The research also suggests that
exposure to these activities results in decision making based on evidence rather than
anecdotes.
During this study, I participated in observations and interviews using protocols
created by researchers at CUE. As a result of their expertise, I learned what a protocol
should include and I became more experienced using these methods for collecting
evidence to inform decisions and findings. Garvin (1993) contends that a necessary
“building block” to learning is requiring that data – not assumptions – provide the
“background for decision making” (p. 81). I have learned that statistical data is a
significant element in institutional shared governance processes in the community
colleges. In California, the Accountability Reporting for Community Colleges (ARCC)
includes statistical data in multiple categories that gauge institutional effectiveness. As
a department chair, I had to understand other complex formulas used by the state and
be able to argue for departmental resources based on an understanding of the information
in those formulas. As a faculty professional using statistical information, I needed to be
able to discuss student success, retention, and persistence rates, as well as longitudinal
data on student success. Chairs at my campus
need to understand complex formulas described in the California Code of Regulations
(Title 5) such as the ratio between weekly student contact hours and full-time equivalent
faculty (WSCH: FTEF) as well as the efficiency ratio with a state of California standard
of 525. Chairs and other faculty leaders need to learn how to read and analyze statistical
data so they can use it to inform their decisions. In fact, all faculty could
benefit from workshops that encourage statistical data analysis, a skill normally
considered in the domain of researchers.
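The WSCH:FTEF calculation named above is itself simple division, though the figures that feed it come from complex state reporting. As a minimal sketch of the arithmetic chairs are expected to perform (the enrollment and staffing numbers below are invented for illustration; only the 525 efficiency standard comes from the text above):

```python
# Hypothetical sketch of the WSCH:FTEF "efficiency" ratio that department
# chairs must understand. The input numbers are invented for illustration;
# the 525 target is the state of California standard cited in the text.

def wsch_ftef_ratio(weekly_student_contact_hours: float,
                    full_time_equivalent_faculty: float) -> float:
    """Weekly student contact hours generated per full-time-equivalent faculty."""
    return weekly_student_contact_hours / full_time_equivalent_faculty

# Example: a department generating 4,200 weekly student contact hours
# with 8.0 FTEF (hypothetical numbers) exactly meets the 525 standard.
ratio = wsch_ftef_ratio(4200, 8.0)
print(ratio)            # 525.0
print(ratio >= 525)     # True
```

A department falling below the standard would use the same ratio, compared against 525, to argue for schedule or staffing changes.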
The impact of the action inquiry project on my expertise
Participating in the action inquiry project described in this study has rewarded me
in multiple ways; some of these ways I have already mentioned. I now realize that I was
not unique in feeling a bit “lost” at the beginning of my community college career. The
action inquiry project has reinforced my appreciation for the collegiality and transparency
that I enjoy with the administrative leadership at my campus. Just as students in our
classrooms often rise to the expectations faculty set for them, my campus is unique in
that every classified staff member, administrator, faculty member, and even some
students are expected to participate in the shared governance of the campus, including
resource allocation. This governance process
supports the standards implemented by the Accrediting Commission for Community
and Junior Colleges (ACCJC) and serves other functions as well. Additionally, the Vice
President of Educational Services has worked closely with the faculty chairs and assistant
chairs to articulate a transparent process that allows the entire campus to follow the
distribution of resources. Most importantly, if the campus president makes changes in the
prioritization of the distribution of resources established by the campus wide strategic
planning body, she attends the next meeting of that body to explain the reasons for the
change. Above all else, the campus is constantly engaged in discourse about student
learning.
My experience is not unique. Many of the faculty members who began at the
institution with me have moved on or now suffer from burnout. I can’t imagine how a
newly hired faculty member without 23 years of business management experience could
respond to the expectations her pre-service training did not prepare her to meet,
especially at a college where participation is expected and valued. The findings in this
study call for future community college faculty members to be exposed to and trained in
the professional skills they need to be effective teachers of a diverse and changing student
population as well as effective contributors to institutional decision-making processes.
REFERENCES
Academic Senate of California Community Colleges. AB 1725, 1988. Accessed from the
Internet, August 30, 2008 at
http://www.asccc.org/localsenates/AB1725.htm
Accrediting Commission for Community and Junior Colleges (ACCJC) (2002).
“Accreditation standards.” Accessed from the Internet May 2, 2008 at
http://www.accjc.org/standards.htm
---- (2007). “Accreditation reference handbook.” Accessed from the Internet May 2,
2008 at
http://www.accjc.org/standards.htm
Alexander, F. K. (1998). The endless pursuit of efficiency: The international movement
to increase accountability and performance in higher education. Association for
the Study of Higher Education. Conference papers. November 1998.
---- (2000). The changing face of accountability: Monitoring and assessing institutional
performance in higher education. The Journal of Higher Education, Vol. 71, No.
4 (July/August 2000).
Alexander, P. A. (2003). “The development of expertise: The journey from acclimation to
proficiency.” Educational Researcher, 32, 8, Nov., 2003, 10-14.
American Association for Higher Education (AAHE) (1995). Bulletin, 48 (2), (November
1995), 7-9.
Angelo, Thomas. “Doing assessment as if learning matters most.” AAHE Bulletin, May
1999. Accessed from the Internet at
http://www.bmcc.cuny.edu/iresearch/upload/DoingAssessment_Angelo.pdf
Argyris, C. (1993). Knowledge for Action: A Guide to Overcoming Barriers to
Organizational Change. San Francisco: Jossey-Bass, 1993.
---- (1991). “Teaching smart people how to learn.” Harvard Business Review, 4 (2)
(Reflections, 1991), 4 – 15.
Bakersfield College. “Learning Assessment” accessed from the Internet in March 2009 at
http://www.bakersfieldcollege.edu/assessment/
Barnett, B. G. (1995). “Developing reflection and expertise: Can mentors make the
difference?” Journal of Educational Administration, 33, 5, 1995, 45-59.
Bauman, G.L. (2005). “Promoting organizational learning in higher education to achieve
equity in educational outcomes.” New Directions for Higher Education, 131 (fall
2005) 23 – 36.
Benner, P. (2004). “Using the Dreyfus model of skill acquisition to describe and interpret
skill acquisition and clinical judgment in nursing practice and education.” Bulletin
of Science Technology Society, 2004, 24, 186-199.
Bensimon, E. M. (2007). “The underestimated significance of practitioner knowledge in
the scholarship on student success.” The Review of Higher Education, 30, 4,
summer 2007, 441-469.
Bensimon, E. M., D.E. Polkinghorne, G. L. Bauman, and E. Vallejo, 2004. “Doing
research that makes a difference.” The Journal of Higher Education, 75/1,
(Jan./Feb. 2004), 104 – 126.
Bensimon, E. M. and A. Neumann. Redesigning Collegiate Leadership: Teams and
Teamwork in Higher Education. Baltimore, MD: Johns Hopkins University Press,
1994.
Berliner, D. C. (1991). “Educational Psychology and pedagogical expertise: New
findings and new opportunities for thinking about training.” Educational
Psychologist, 26:2, 145-155.
Berliner, D. C. (2004). “Describing the behavior and documenting the accomplishments
of expert teachers.” Bulletin of Science Technology & Society, 24, 3, June 2004,
200-212.
Birmingham, C. (2003). “Practicing the virtue of reflection in an unfamiliar cultural
context.” Theory into Practice, 42, 3, summer 2003, 188-194.
Boud, D. and D. Walker (1998). “Promoting reflection in professional courses: The
challenge of context.” Studies in Higher Education, Jun 1998, 23, 2, 191-206.
California Community Colleges, Academic Senate (1996). “Program review: Developing
a faculty driven process.” Educational Resources Information Center, U.S.
Department of Education.
California Community Colleges Assessment Institute. Accessed from the Internet in July
2009 at http://virtual2.yosemite.cc.ca.us/mjcinstruction/CAI/Resources/index.htm
California Community Colleges Chancellor’s Office. Data Mart. Accessed from the
Internet June 2009 at
http://www.cccco.edu/ChancellorsOffice/Divisions/TechResearchInfo/MIS/Data
MartandReports/tabid/282/Default.aspx
California Systems Strategic Plan (2006). “Education and the economy: Shaping
California’s future today.” California Community Colleges Systems Office.
Accessed from the Internet at http://strategicplan.cccco.edu/
Center for Student Success. (2007, February). Basic Skills as a Foundation for Success in
California Community Colleges. Sacramento, CA: California Community
Colleges Chancellor's Office. Retrieved (February 2007), from
http://www.cccbsi.org/publications
Clark, R. E. and F. Estes. Turning Research into Results: A Guide to Selecting the Right
Performance Solutions. Atlanta: CEP Press, 2002.
Cranton, P. (1994). “Self-directed and transformative instructional development.” The
Journal of Higher Education, 65, 6, (Nov. – Dec., 1994), 726-744.
Dehoff, K., Hauser, R., Jones, F., Jones, L., Neilson, G., Park, T., Shorten, D., Spiegel, E.
(2001). Best practices transfer: Unleashing the value within. McLean, VA: Booz
Allen Hamilton. Retrieved October 12, 2007 from:
http://www.boozallen.com/media/file/80576.pdf.
Dewey, J. (1910). How We Think. Boston: D. C. Heath & Co., 1910.
Dill, D. D. (1982). “The management of academic culture: Notes on the management of
meaning and social integration.” Higher Education, 11/3 (May, 1982), 303 – 320.
---- (1999). “Academic accountability and university adaptation: The architecture of an
academic learning organization.” Higher Education, 38/2 (1999), 127 – 154.
Dowd, A. C. (2005). “Data don’t drive: Building a practitioner-driven culture of inquiry
to assess community college performance.” Lumina Foundation for Education
Research Report, Dec. 2005.
Dowd, A. C., Malcom, L., Nakamoto, J. and E. M. Bensimon 2007. “Institutional
researchers as teachers and equity advocates: Facilitating organization learning
and change.” Paper presented at the Association for the Study of Higher
Education, Louisville, KY, Nov. 2007.
Dowd, A. C. and V. P. Tong (2007). “Accountability, assessment, and the scholarship of
‘best practice.’” Higher Education Handbook of Theory and Research, J. C.
Smart, ed. V. XXII, 57 – 119.
Dreyfus, H. L. and S. E. Dreyfus (2004). “The ethical implications of the five-stage skill-
acquisition model.” Bulletin of Science Technology & Society, 24, 3, 2004, 251-
264.
Dreyfus, S. E. (2004). “The five-stage model of adult skill acquisition.” Bulletin of
Science Technology & Society, 24, 3, June 2004, 177-181.
Ericsson, K. A. (2000). “How experts attain and maintain superior performance:
Implications for the enhancement of skilled performance in older individuals.”
Journal of Aging and Physical Activity, 8, 346-352.
Ewell, P. T. (1991). “To capture the ineffable: New forms of assessment in higher
education.” Review of Research in Education, 17 (1991), 75 – 125.
FactBook (2006). Urban Community College website. Accessed 4 Jan 2008 from the
Internet.
Farrington-Darby, T. and J. R. Wilson (2006). “The nature of expertise: A review.”
Applied Ergonomics, 37, 2006, 17-32.
Garvin, D.A. (1993). “Building a learning organization.” Harvard Business Review,
(July-August 1993), 78 – 91.
Greenwood, D. J. and M. Levin. “Reform of the social sciences and of universities
through action research.” In Handbook of Qualitative Research, 3rd ed. N. K.
Denzin and Y. S. Lincoln (Eds). Thousand Oaks: Sage Publications, Inc., 2005.
43-64.
Grubb, W. N. Honored but Invisible. New York: Routledge, 1999.
Grubb, W. N. and N. N. Badway (2005). “From compliance to improvement:
Accountability and assessment in California community colleges.” Higher
Education Evaluation and Research Group.
“How College Governance Affects You.” Accessed from the Internet October 20, 2008 at
http://4faculty.org/read.jsp?type=read&dest=119r1.jsp&mod=119&b=r&next=2
Kruse, S. D. (2000). “Creating communities of reform: Continuous improvement
planning teams.” Journal of Education Administration 39 (4), (November 2000),
359-383.
“Letter to President” (2007). California Benchmarking Project. Center for Urban
Education. University of Southern California, 2007.
Levin, J. S., S. Kater, and R. L. Wagoner, 2006. Community College Faculty: At Work in
the New Economy. New York: Palgrave MacMillan, 2006.
Leveille, D. E. (2005). An emerging view on accountability in American higher
education. Center for Studies in Higher Education. University of California,
Berkeley. May 2005.
Licklider, B. L. (1997). “Breaking ranks: Changing the inservice institution.” National
Association of Secondary School Principals, NASSP Bulletin, 81, 585, 9.
Love, J. M. (1985). Knowledge transfer and utilization in education. Review of Research
in Education, 12, 337 – 386.
McKeon, D. (1998). Best practice: Hype or hope? TESOL Quarterly, 32(3), 493-501.
Melguizo, T., L. S. Hagedorn, and S. Cypers (2007). “The need for
remedial/developmental education and the cost of community college transfer:
Calculations from a sample of California community college transfers.”
Presented: April 2007 to American Education Research Association, Chicago, and
November 2007 to Association for the Study of Higher Education, Kentucky.
Milner, R. H. (2003). “Teacher reflection and race in cultural contexts: History,
meanings, and methods in teaching.” Theory into Practice, 42, 3, summer 2003,
173 – 180.
National Center for Education Statistics. Table P62, “Percentage of teaching faculty in 4-
year and 2-year institutions working full-time and part-time.” Accessed from the
Internet on Jun 21, 2008 at http://nces.ed.gov/surveys/ctes/tables/p62.asp
Noffke, S. E. (1997). “Professional, personal, and political dimensions of action
research.” Review of Research in Education, 22 (1997), 305 – 343.
Nunn, R. (2008). “A network model of expertise.” Bulletin of Science Technology &
Society, 28, 5, Oct 2008, 414-427.
Patton, M. Q. (2002). Qualitative Research & Evaluation Methods, 3rd ed. Thousand
Oaks, CA: Sage Publications, 2002.
Polkinghorne, D. E. (2004). Practice and the Human Sciences: The Case for a Judgment-
Based Practice of Care. Albany: State University of New York Press, 2004, 97
– 127.
Raelin, J. A. (2007). “The return of practice to higher education: Resolution of a
paradox.” The Journal of General Education, 56, 1, 2007, 57 – 77.
Reason, P. (2005). “Three approaches to participative inquiry.” Handbook of Qualitative
Research. N. K. Denzin and Y. S. Lincoln, eds. Thousand Oaks, CA: Sage
Publications, 2005. 324-339.
Rodgers, C. (2002). “Defining reflection: Another look at John Dewey and reflective
thinking.” Teachers College Record. 104 (4), 842 – 866.
Rousseau, C. and W. F. Tate (2003). “No time like the present: Reflecting on equity in
school mathematics.” Theory into Practice, 42,3, summer 2003, 210-216.
Rueda, R. and A. Marquez, 2007. “Reconceptualizing institutional impact from a
sociocultural lens: A case study view from a successful partner in a university
based equity project.” Paper presented at 32nd Annual Association for the Study of
Higher Education Conference, Louisville, KY, Nov. 2007.
Schon, D. A. The Reflective Practitioner: How professionals think in action. New York:
Perseus Books, 1983.
Slevin, J. F. 2001. “Engaging intellectual work: The faculty’s role in assessment.”
College English, 63, 3, Jan, 2001, 288-305.
Spellings, M. (2006). “A test of leadership: Charting the future of U.S. higher education.”
U.S. Department of Education. Accessed from the Internet September 2006 at
http://www.ed.gov/about/bdscomm/list/hiedfuture/index.html
Stake, R. E. (1995). The Art of Case Study Research. Thousand Oaks, CA: Sage
Publications, 1995.
Stanton-Salazar, R. D., (1997). “A social capital framework for understanding the
socialization of racial minority children and youths.” Harvard Educational
Review. 67 (1), 1 – 40.
Stringer, E. T. Action Research. 2nd ed. Thousand Oaks, CA: Sage, 1999.
Sunal, D. W., E. Wright, J. B. Hodges, and C. S. Sunal (2000). “Barriers to changing
teaching in higher education science courses.” Paper presented at annual meeting
of National Association for Research in Science Teaching, New Orleans, LA,
April, 2000.
Tucker, Sue. Benchmarking: A guide for educators. Thousand Oaks, CA: Corwin Press,
Inc., 1996.
Urban Community College, Interim report to the president. February, 2008.
Urban Community College. Final report to the president. February, 2009.
Villegas, A. M. and T. Lucas (2002). “Preparing culturally responsive teachers:
Rethinking the curriculum.” Journal of Teacher Education, 53, 1, 20-32.
Yin, R. K. Case Study Research: Design and Methods. 3rd ed. Thousand Oaks, CA: Sage,
2003.
Young, R. B. No Neutral Ground: Standing by the values we prize in higher education.
San Francisco: Jossey-Bass, 1997.
APPENDICES
APPENDIX A:
CENTER FOR URBAN EDUCATION: FACULTY INTERVIEW GUIDE,
TUTORING/LEARNING LABS & ASSESSMENT
Center for Urban Education:
Faculty Interview Guide
Tutoring/Learning Labs & Assessment
For CUE use only.
Interviewer Name: ____________________________________
Respondent Code: ______________________ Institution Code:
____________________
Date of Interview: _____________________ Time of Interview:
__________________
Notes:
Participant Information
Name:_____________________________
Institution:__________________________
Title/Rank: ____________________________ Full-Time or Part-Time:
__________
Contact Information: Email:_____________________ Telephone: _________
Introductory Script: Project Overview
The goal of this study is to investigate factors that promote success in basic skills
education at your campus. Our interview today will last approximately one and a half
hours during which I will ask you about your roles and responsibilities and instructional
views and practices. Here is a 1-page summary of the project along with my own bio and
interests.
All information you share with me during our interview will be kept in strictest
confidence. For example, your identity will not be revealed, nor will the information you
share with me be used against you or influence your teaching assignments. Your
participation in this study is voluntary. How much you decide to share with me in the
interview is completely at your discretion. You may decline to respond to any questions I
pose throughout the interview and you may withdraw from the study at any point.
Do you have any questions for me before we begin?
I. Roles and Responsibilities
Let’s begin with some questions about your roles and responsibilities on campus:
1. What is your current position/title?
• How long have you been in this position? Are you full-time or part-time (ask
for details on how many courses part-time or full-time employment entails)?
• What courses do you teach? Generally, how many students are enrolled in
your courses?
[Prompt for details: number of courses taught, what types/level of courses do
you teach – e.g. basic skills, etc.?]
2. What responsibilities do you have in the [learning/tutoring/transfer] center?
• What activities do these responsibilities entail?
• Are these activities expected of you in your job title? If not, how did you
come to do them?
II. Instructional Views & Practices
3. Can you walk me through what happens when a student arrives at the
[learning/transfer/lab] center? [Prompts:
• How are they greeted? What information are they given about the center/lab?
• How are their needs assessed?
4. What are the demographics [race/ethnicity, gender, age, parents] of students who
utilize the center/lab? Are the support services/tutoring offered in the center provided
with these demographics in mind? [If yes] How?
5. What are the different ways you work with students at the [center/lab]?
[Prompts: individualized tutoring, group work, review homework, etc.]
• How do you decide what to focus on when working with students?
• Describe a typical session with a student. (activities/length of
time/process of tutoring)
6. In your experience, what practices do you do at the [learning/tutoring/transfer] center
that promote student success?
7. What types of instructional training are provided for faculty/tutors/counselors who
work within the center? What types of training or orientation are provided for faculty
regarding the resources (computer technology, manipulatives, etc.) at the center/lab?
8. What do you consider your three greatest challenges in tutoring/counseling right
now?
a. Probe: How do these challenges manifest in your basic skills levels classes?
b. Probe: Where do you go for support or help with these challenges?
III. Contextualizing Basic Skills / Developmental Education and Equity
9. How do you view the problems of student success in basic skills education at your
campus?
[Probe for knowledge on . . .
- rates of student success in the different basic skills levels courses
- racial/gender gaps in achievement
- number of students who migrate from basic skills into college credit
coursework
- number of basic skills students who transfer to 4-year colleges]
10. What institutional factors do you feel help you do your job?
Prompt for institutional practices: departmental structure, course
curriculum, etc.
- resources: such as professional development, counseling, administrative
support, technological teaching tools, etc.
- policies: grading, assessment, campus initiatives
11. What institutional factors create barriers for student success?
12. The notion of “equity” in educational settings can have many different interpretations.
• What does it mean to you?
• How would you explain achieving equity on your campus to someone in your
department?
IV. Impact of Assessment and Evaluation
13. How are the activities in the learning center/lab assessed or evaluated? [Describe the
process].
• How often does the evaluation occur?
• Were you involved in the assessment process? [If yes,] describe your
participation.
• What do you see as the connection between the evaluation processes and student
success?
14. How has the assessment of the center/lab made a difference in how you think about
your work?
• What practices have you done differently since the activity? [Prompt for
details on what they are doing differently, e.g., what changes they have enacted.
To get details of what they are doing you may need to ask them to compare
their new practices to the old practices before the assessment activity].
• How have you incorporated these practices in your instruction in ways that
impact students’ experiences in the labs? [If applicable,] in your classroom?
• Do you think the activity promotes equity on campus? [If yes,] How? [If
not,] Why not?
15. How has information about [the findings of the assessment or re-organization of the
center] been disseminated across your campus (e.g. email, flyers, workshops,
administrative meetings, etc.)?
• What information/training did you receive about the findings of the assessment?
[Probe: Was the information clear/ helpful/ connected to your work? Explain.]
16. What has been students’ reaction to any changes at the center/lab that were created by
the assessment process?
[Probe: How has it impacted their engagement in your courses? Impacted their
participation? Impacted their achievement?]
17. What would you say are three things that still need to occur to improve student
success in basic skills education in the center/lab?
That completes my questions…..
18. Are there any important points you have that I did not ask you about?
19. Do you have any additional questions or comments about my project?
APPENDIX B:
CENTER FOR URBAN EDUCATION FACULTY INTERVIEW GUIDE, ROUND 2,
ACTION INQUIRY AND COMMUNITY COLLEGE FACULTY ADAPTIVE
EXPERTISE
Center for Urban Education:
Faculty Interview Guide
Action Inquiry and Community College Faculty Adaptive Expertise
For CUE use only.
Interviewer Name: ____________________________________
Respondent Code: ______________________ Institution Code:
____________________
Date of Interview: _____________________ Time of Interview:
__________________
Notes:
Participant Information
Name:_____________________________
Institution:__________________________
Title/Rank: _____________________________ Full-Time or Part-Time:
__________
Contact Information: Email:__________________Telephone: _________
Introductory Script: Project Overview
Thank you again for meeting with me for a second interview. When we met last, we
discussed the Math/Science Center, your experiences working with students in the center
and/or your experience in the classroom. We also discussed your beliefs regarding basic
skills education, equity, and the role of assessment. The purpose of today’s interview is to
discuss the nature of your experience in the California Benchmarking Project,
specifically your perception of whether or not the experience has resulted in
opportunities for significant learning.
All information you share with me during our interview will be kept in strictest
confidence. For example, your identity will not be revealed, nor will the information you
share with me be used against you or influence your teaching assignments. Your
participation in this study is voluntary. How much you decide to share with me in the
interview is completely at your discretion. You may decline to respond to any questions I
pose throughout the interview and you may withdraw from the study at any point.
Do you have any questions for me before we begin?
I. Action Inquiry Assessments and Tools
A. Recall: think back to the types of activities provided by the California
Benchmarking Project in which you participated. The project offered a “hunches”
meeting, monthly team meetings, interview protocols, observation protocols,
development of the interim report, the symposium in March 2007, and visits to
peer colleges to view best practices.
1. What activities did you participate in during the CBP?
• Which activity impressed you the most? Why?
• Did any activity cause concern? Why?
2. How did participating in the activities of the CBP help you understand your
institution differently?
• What are the institution’s goals?
• What are the institution’s strategies to achieve the goals?
• What types of choices do you have for participating at the institution?
• What factors outside the institution impact the goals?
3. Did your participation in the activities influence how you approach new problems
or contexts? In what ways? Can you provide an example?
• Do you feel more confident in new contexts?
• Have you developed different relationships?
4. Do you prefer working individually or as a member of a group? Has your
preference changed because of your participation in the CBP?
• Have your social interactions at the institution changed? How?
• Do you feel a part of a learning group? Please explain.
• Are you more open to multiple opinions?
5. How has participating in the assessment processes impacted your beliefs or
assumptions? about students? about the institution? about teaching? about
yourself?
[Prompt for details if changes have occurred—ask for comparison between old
beliefs/assumptions and new ones.] Why do you think they have changed?
6. How has participating in the assessment activities affected how you think about
your own work?
• What practices have you done differently since the activity? [Prompt for
details on what they are doing differently, e.g., what changes they have enacted.
To get details of what they are doing you may need to ask them to compare
their new practices to the old practices before the assessment activity].
• How have you incorporated these practices? in the classroom? in ways that
impact the institution?
• Do you think participating in the assessment activities has impacted your
knowledge or expertise? [If yes,] How? [If not,] Why not?
7. How has participating in the assessment activities impacted new learning
opportunities for you?
8. What do you feel is the most important finding that resulted from the CBP?
Have you made any changes as a result of this finding? Where? How? Why?
9. How has information about the CBP assessment activities been disseminated
across your campus (e.g. email, flyers, workshops, administrative meetings,
etc.)?
• What information/training did you receive about the findings of the
assessment?
[Probe: Was the information clear/ helpful/ connected to your work? Explain.]
• Have any institutional changes taken place as a result of the CBP findings?
That completes my questions…..
10. Are there any important points you have that I did not ask you about?
11. Do you have any additional questions or comments about my project?
APPENDIX C: COMPLETE RESULTS FROM “HUNCHES” ACTIVITY
Project Orientation Meeting with Urban Community College
“Hunches” Exercise Notes
Overview: The team was asked to brainstorm factors affecting course completion rates of
students who start in non-degree credit courses as they attempt to progress to the first
transfer-level course. Below is a list of influencing factors organized by themes that
emerged from the brainstorming exercise.
Practices and Policies
• Lack of an institutionalized and mandated activity that addresses and bridges the
gap between institutional /classroom student behavior expectations and student
behavior patterns brought with them from external environments.
• Lack of planned retention plan or program that involves campus-wide employees
and monitors progress, involvement, implementation, outcomes, impact, etc.
• Is retention an important priority, not just bringing in students?
• Course numbering is confusing to students.
• Does the institution make student success a priority? Allocation of resources is needed.
• Better coordination with feeder high schools is needed. There is a lot of animosity
between the college and feeder schools, but it is not clear why.
Student Support Services
• Not having mandatory orientations puts many students at a disadvantage due to
their lack of foundational knowledge.
• Coordination between faculty and student services; need consistency.
• Assessment center: make its use mandatory. Is it easy to find? Is it welcoming or
intimidating? Personnel have changed. The Math/Science center has the same issues.
• Are the assessment instrument and placement accurate? If not, placement can’t be
successful.
• Do students get advised when they first come?
• Tutors in the lab need more discipline-specific training.
• Instruction and student services are polarized. They’re viewed as two separate
things, but they should not be.
Institutional Culture
• The campus climate lacks an expectations process in which all employees participate
in actions designed to provide students with positive acknowledgement and
affirmation in a consistent, ongoing manner.
• Conflict and tension between staff and faculty lead to compromised student
services.
• Ongoing campus employee issues result in bad attitudes and poor morale that do
not demonstrate that students are valued; administrators do little to monitor such
attitudes and behaviors.
• Counselors have a long history of being looked down upon and develop an
inferiority complex. This tends to result in their doing anything to be considered
“faculty” at the cost of their primary role of counseling students.
• Limited communication between counselors and instructors leads to conflict in
advisement, expectations, and guidance.
• Cross-pollination: students are more likely to listen to and believe what they hear
from their peers. The administration is the last information resource for students; it isn’t heeded.
• Low expectations.
• Ivory tower syndrome: we’re not always aware of how it affects our students, but
it can be controlled if we are aware of it.
Students
• Success rates increase with each level because those that reach the higher levels
have stronger skills or greater aptitude.
• International students start with higher skill levels.
• Some students are just not ready yet.
• Low success rates for elementary algebra and below are due to the negative
perception that faculty expect students to already know the material.
• Subject has a bad reputation: everyone “fails” and is bad at it.
• There are too many students who are transfer-ready but “fail to launch.”
• Students in the lowest-level classes don’t progress to the transfer level. The gap is
more pronounced by race and ethnicity. Do students need to set education goals?
• Student success is limited in critical areas because students don’t make adequate
use of the Learning Assessment Center or of study groups.
Things we Can’t Control
• Students come to the college under-prepared as evidenced by the large numbers in
elementary algebra or below.
• Weak foundations.
• First-generation students have few or no role models at home.
• Students whose parents/siblings not college graduates are less successful.
Faculty
• Good teaching, and a lack of consistency in good teaching.
• Acquiring and assessing innovative teaching methods.
• Culturally responsive teaching.
• Most students in the 18-30 age group are not aural or print learners; they are
visual learners. If instructors don’t realize this and change or add methods to
accommodate it, students might not learn as well.
• Lack of consistency with teaching goals, expectations, etc.
• Content does not address what students need.
• Too little genuine interaction between institution and students.
• Modeling: looking at oneself and working with colleagues. What can I do differently?
• Sharing ideas with other faculty. How much opportunity for professional
development? Is there follow-through and consistency?
• Faculty do not expect this generation of students to be “good” enough.
Resources
• Are there sufficient numbers of counselors for students?
• Funding mechanisms are compartmentalized; too independent.
• Grants are here today, gone tomorrow. Continuity is a problem.
Abstract
This study examines the potential for developing the adaptive expertise of community college faculty members through participation in action inquiry. Prior research contends that for the research activities of action inquiry to result in adaptive expertise, they must include the elements of reflection: experience, collaboration, and examination of values and beliefs. Therefore, this study focuses on the opportunities for reflection available to six full-time community college faculty members who participated in such an inquiry project. It then examines the relationship between those reflective opportunities and self-reported changes in faculty members’ perceptions of their expertise in instruction and shared governance. The findings support the view that faculty participants in action inquiry may gain expertise to make changes in either individual or institutional practices as a result of their participation. However, those changes may well be limited or circumscribed due to a lack of experience or knowledge of institutional and governance processes.
Asset Metadata
Creator
Tschetter, Sheryl (author)
Core Title
Developing adaptive expertise among community college faculty through action inquiry as a form of assessment
School
Rossier School of Education
Degree
Doctor of Education
Degree Program
Education (Leadership)
Publication Date
12/02/2009
Defense Date
09/01/2009
Publisher
University of Southern California (original), University of Southern California. Libraries (digital)
Tag
action inquiry,adaptive expertise,assessment,community college faculty,OAI-PMH Harvest
Place Name
California (state)
Language
English
Contributor
Electronically uploaded by the author (provenance)
Advisor
Dowd, Alicia C. (committee chair), Levin, John S. (committee member), Rueda, Robert S. (committee member)
Creator Email
btrchtr@yahoo.com,tschette@usc.edu
Permanent Link (DOI)
https://doi.org/10.25549/usctheses-m2779
Unique identifier
UC1330489
Identifier
etd-Tschetter-2595 (filename),usctheses-m40 (legacy collection record id),usctheses-c127-282759 (legacy record id),usctheses-m2779 (legacy record id)
Legacy Identifier
etd-Tschetter-2595.pdf
Dmrecord
282759
Document Type
Dissertation
Rights
Tschetter, Sheryl
Type
texts
Source
University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection)
Repository Name
Libraries, University of Southern California
Repository Location
Los Angeles, California
Repository Email
cisadmin@lib.usc.edu