Exploring Generative AI in Business Education: Faculty Perceptions and Usage
Gohar Karamyan
Rossier School of Education
University of Southern California
A dissertation submitted to the faculty
in partial fulfillment of the requirements for the degree of
Doctor of Education
May 2025
© Copyright by Gohar Karamyan 2025
All Rights Reserved
The Committee for Gohar Karamyan certifies the approval of this Dissertation
Artineh Samkian
Stephen Aguilar
Kimberly Hirabayashi, Committee Chair
Rossier School of Education
University of Southern California
2025
Abstract
This study applies social cognitive theory (SCT) to understand how personal, behavioral,
and environmental factors impact business faculty members’ perceptions and usage of
generative artificial intelligence (GenAI) tools like ChatGPT in their courses. The purpose
of this study is to explore how business faculty in a private research university perceive the
impact of GenAI on teaching and learning, and how, if at all, they utilize this technology in
their courses. This study applied a phenomenological approach and collected data through
in-depth semi-structured interviews with 15 full-time faculty members across six
departments within a business school. The SCT construct of triadic reciprocality informed the
design of the interview protocol which focused on personal, behavioral, and environmental
factors related to participants’ experiences with GenAI in teaching and learning. Transcripts
were then analyzed using thematic analysis to interpret themes and patterns that emerged
from the data. Findings from this study identified potential opportunities and challenges for
instruction presented by GenAI and emphasized a need to leverage benefits while mitigating
risks. Participants also shared a variety of ways in which they used GenAI tools for
instructional purposes. This study provides actionable insights to inform professional
development and strategic initiatives for integrating GenAI into teaching and learning.
Keywords: generative artificial intelligence (GenAI), social cognitive theory (SCT),
ChatGPT, business education, self-efficacy, triadic reciprocality
Dedication
To my parents. My debt to them is immeasurable.
Acknowledgments
I am incredibly grateful to all who have encouraged and supported me throughout my
doctoral journey. Thank you to my dissertation chair, Dr. Kimberly Hirabayashi, for her
patience, guidance, reassurance, and expertise. I would also like to thank my committee
members, Dr. Artineh Samkian and Dr. Stephen Aguilar, for their feedback and insight toward
shaping my study.
To my cohort members, I am thankful for their friendship and camaraderie – being in
community with them was one of the most joyful parts of this experience.
Thank you to my family for being a lifelong source of warmth, love, laughter, and
comfort. My gratitude for them transcends words. Спасибо за всё (thank you for everything).
Thank you to my husband, Nate, for making hard days better and good days brighter. For
refilling my water bottle and supplying me with snacks, coffee, and tea while I spent hours at my
desk. I am deeply grateful for his love and support.
Lastly, thanks to my sweet cat, Milo, for being my constant writing companion through
late nights and early mornings.
Table of Contents
Abstract.......................................................................................................................................... iv
Dedication ....................................................................................................................................... v
Acknowledgments.......................................................................................................................... vi
List of Tables ............................................................................................................................... viii
Literature Review............................................................................................................................ 3
Theoretical Foundations.................................................................................................................. 9
Positionality .................................................................................................................................. 10
Methods......................................................................................................................................... 11
Findings......................................................................................................................................... 15
Research Question 1 ................................................................................................................. 17
Research Question 2 ................................................................................................................. 30
Discussion..................................................................................................................................... 39
Implications for Practice ............................................................................................................... 43
Recommendations for Future Research ........................................................................................ 47
Conclusion .................................................................................................................................... 48
Appendix A................................................................................................................................... 55
List of Tables
Table 1: Participant Information 20
Table 2: Faculty Suggestions for Professional Development About GenAI Tools 26
Table 3: Faculty Usage of GenAI Tools in their Courses 30
Exploring Generative AI in Business Education: Faculty Perceptions and Usage
Generative Artificial Intelligence (GenAI) is a type of artificial intelligence capable of
producing new content such as text, videos, images, and sounds (Sanchez-Ruiz et al., 2023).
Large Language Models (LLMs), a subset of GenAI, are trained on vast amounts of text data and
can generate human-like responses prompted by user input. One of the most popular LLMs is
ChatGPT, which was launched in November 2022 by OpenAI and broke records by gaining 100
million active users in just 2 months (Meyer et al., 2023).
The emergence of ChatGPT and other GenAI tools has already disrupted a range of
industries (Dwivedi et al., 2023), and many researchers claim that it will have a transformative
effect on academia (Chan & Hu, 2023; Sanchez-Ruiz et al., 2023). Many emphasize the
importance of properly integrating GenAI into higher education teaching and learning (Chan,
2023; Lo, 2023), which would require building relevant competencies in GenAI and fostering AI
literacy (Long & Magerko, 2020; Southworth et al., 2023). According to Chan (2023),
understanding stakeholders’ perceptions of GenAI is critical for educators and policymakers to
make informed decisions about integration.
The field of Artificial Intelligence in Education (AIEd) has existed for over 30 years, and
many studies investigate student and faculty views and university applications of AI tools such
as chatbots (Akinwalere & Ivanov, 2021). However, due to the novelty of chatbots and other
platforms powered by GenAI, there is limited research that explores student and faculty
perceptions of this new technology (Chan & Hu, 2023) and its proper integration into university
learning and teaching (Chan, 2023), and studies that do exist mostly apply quantitative
methodologies.
The purpose of this study is to explore how faculty members at a business school
perceive the impact of GenAI on teaching and learning and to identify how, if at all, they are
using GenAI tools in their courses. This study used a qualitative phenomenological approach and
collected data through in-depth semi-structured interviews. Participants were full-time faculty
members from a business school at a private research university in the western region of the
United States. Business education is highly interdisciplinary, and this school offers coursework
in areas such as communications, finance, marketing, entrepreneurship, business economics,
management and organization, and data science. Business school instruction also tends to focus
on real-world relevance and application. These characteristics benefited the study by providing
a wider range of viewpoints and experiences on the use of AI in an academic context.
Despite much scholarly discussion (Chan & Hu, 2023; Dwivedi et al., 2023), there is a
lack of empirical research examining how faculty are using technology like ChatGPT for
instruction, which could provide insights into the current integration of GenAI tools and inform
curriculum design, best practices, and training needs. There is also a lack of empirical studies
exploring faculty perceptions of GenAI tools, particularly those using a qualitative approach and
focusing on the experiences of business school faculty. The potential benefits of this emerging
technology may be missed if this area remains underexplored. Some of these benefits might
include enhanced learning and teaching outcomes, student support services, individualized
learning, increased collaboration, addressing workforce needs, and time-saving automation.
There is also a risk of overlooking potential challenges that could lead to adverse learning and
teaching outcomes, bias and inaccuracy, issues with data privacy, and poor implementation
strategies (Akinwalere & Ivanov, 2021). This study aims to bridge this gap by investigating how
faculty within a business school perceive and use GenAI tools in their instruction to highlight
practical applications, challenges, and implications for AI integration strategies.
Literature Review
With the launch of ChatGPT and the proliferation of GenAI tools, scholarly discussion
and research in Artificial Intelligence in Education (AIEd) is becoming far more prevalent
(Kooli, 2023). However, there is still much to learn about the impact and application of this new
technology on higher education, and more empirical data is needed to develop these
understandings. This literature review will offer a brief overview of the field of AIEd and will
explore the potential benefits and challenges of using GenAI tools in higher education.
Context of AIEd in Higher Education
The field of AIEd has existed for over 30 years and is focused on using artificial
intelligence to create adaptive learning environments that support teachers, learners, and
administrators (Akinwalere & Ivanov, 2021; Luckin et al., 2016). Common areas of interest in
AIEd have included AI tutoring systems, smart classrooms, personalized learning and teaching
strategies, assessment, and automation of administrative tasks. Much of the research before
ChatGPT focused on opportunities presented by learning analytics, which involves applying AI
to analyze massive amounts of learner data that can then help inform decisions about instruction,
performance, enrollment, assessment, and other educational experiences (Luckin et al., 2016;
Southgate, 2019).
There are some examples of universities that have implemented AI programs. The
POLIMI Graduate School of Management in Italy worked with Microsoft to develop an AI-powered platform that provides students with personalized material that fills the gaps between
their current studies and their career goals. The Rensselaer Polytechnic Institute has an
immersive lab that uses AI to help participants build conversational skills by interacting with AI
characters (Akinwalere & Ivanov, 2022). There are also examples of universities using chatbots
prior to the development of modern LLM technology such as ChatGPT. For example, Northern
Virginia Community College implemented the NOVA nighthawk chatbot, which could answer
students’ questions about coursework, enrollment, and registration to improve retention efforts
(Barrett et al., 2019). The effect this chatbot had on retention rates is unclear, but it did allow
the college to collect chat data that could help inform decisions regarding student affairs and
student support services moving forward.
However, according to Akinwalere and Ivanov (2022), “in terms of [AIEd’s] direct
impact on teaching and learning, much has been promised but, as yet, little has been
accomplished” (p. 2). Despite these examples of AIEd in action, it is challenging to find
evidence of the success of these programs. In fact, many of the studies that discuss the potential
benefits of AIEd do so in an aspirational way. Luckin et al. (2016) describe a near future where
the sophistication of AI-supported learning analytics will be capable of providing immediate
information about learners’ needs. Aguilar (2017) argues that technology driven by learning
analytics brings us a step closer to personalized learning that addresses the needs of each student,
thus dismantling the practice of teaching “toward the non-existent average student” (p. 1).
Akinwalere and Ivanov (2022) explain that many of the goals for AI in higher education are just
on the verge of actualization.
The creation of GenAI tools such as ChatGPT is a turning point for the field of AIEd and
has the potential to transform teaching and learning (Kasneci et al., 2023). As an example, due to
previous limitations in technology, learning analytics systems have relied on quantitative metrics
such as log history or page views. With the creation of GenAI, these systems can introduce
qualitative metrics that allow for simultaneous formative feedback and better support for
personalized learning (Dai, Liu, & Lim, 2023). This development in technology offers many
opportunities and challenges for higher education. It is important for university leaders to
understand student and faculty perceptions of GenAI tools to make better decisions for
integrating this technology into higher education (Chan & Hu, 2023) and developing training and
support initiatives that meet all stakeholders’ needs (Chan, 2023).
Benefits of GenAI Tools in Education
One significant benefit of using technology like ChatGPT in education is its ability to
provide highly personalized and individualized guidance to learners (Dwivedi et al., 2023;
Kasneci et al., 2023; Kooli, 2023; Rahman & Watanobe, 2023; Wang et al., 2023). For example,
it can provide simultaneous formative feedback on learners’ writing and create qualitative
records to monitor students’ progress (Dai, Liu & Lim, 2023). GenAI can also generate
personalized content within specific parameters that follow an instructor’s learning objectives
while also meeting the needs and interests of each learner (Zhu et al., 2023). This level of
individualized support may be more difficult to achieve without these intelligent systems (Zhu et
al., 2023).
Another advantage of GenAI tools is that they can be used to assist both students and
faculty with research (Chan & Hu, 2023; Kooli, 2023; Rahman & Watanobe, 2023). Technology
like ChatGPT is highly efficient at compiling, summarizing, and synthesizing vast amounts of
information. As such, it can support students and faculty in facilitating a literature search,
summarizing texts, brainstorming, and helping to create hypotheses through data analysis (Chan
& Hu, 2023). To generate more accurate responses, GenAI can be trained on domain-specific
knowledge for better and more reliable technical output (Kasneci et al., 2023). An opinion paper
by Dwivedi et al. (2023) gathers over 40 contributions from experts across various fields to
provide their perspectives on ChatGPT’s capabilities. Multiple experts emphasize the
opportunities that this technology presents for increasing efficiency in the research process,
particularly with improving library database features and assisting with the literature review
process. One contributor from the education field suggests that with ChatGPT, “this shift in
focus from text writing to doing research highlights the evolution of academic work. [...] The
emphasis shifts from the writing that summarizes our findings to the findings themselves”
(Dwivedi et al., 2023, p. 24).
Technology like ChatGPT can also serve as an assistant for writing (Chan & Hu, 2023),
teaching (Lo, 2023; Rahman & Watanobe, 2023), research (Kasneci et al., 2023), and
administrative tasks (Dwivedi et al., 2023). Their ability to assist with writing can help students better
organize and outline their papers (Kasneci et al., 2023) and can be a particularly helpful tool for
non-native English speakers (Chan & Hu, 2023; Roumeliotis & Tselikas, 2023). GenAI tools can
also help educators create lesson plans, activities, and assessments which could potentially save
them hours of work in teaching preparation (Rahman & Watanobe, 2023). They can make the
research process more efficient and “allow researchers to focus on more complex, creative
aspects of their work” (Kooli, 2023, p. 8). Finally, they can automate and streamline
administrative tasks by generating summaries, reports, and drafting messages (Atlas, 2023).
Challenges of GenAI Tools in Education
Due to the ability of GenAI tools to generate human-like output, academic integrity is a
major concern for using this technology in an academic context (Chan, 2023; Dwivedi et al.,
2023; Kasneci et al., 2023; Kooli, 2023; Rahman & Watanobe, 2023; Sanchez-Ruiz et al.,
2023). This concern primarily relates to students using tools like ChatGPT to author their
assignments and then submitting the work as their own. This has sparked debate about how to
best maintain academic integrity while still leveraging AI as a complementary tool and has
challenged both students and instructors to reconsider what constitutes cheating (Chan, 2023).
Despite much debate and discussion, a recent study examining university AI policies revealed
that fewer than one-third of the top 500 global universities they analyzed had any policies in
place that addressed ChatGPT (Xiao, Chen & Bao, 2023).
Another challenge of using tools like ChatGPT is the potential for incorrect and
unreliable outputs. What makes this particularly difficult to overcome is that ChatGPT can be
confidently wrong. Sam Altman, the CEO of OpenAI, acknowledged in 2022 that “ChatGPT is incredibly
limited but good enough at some things to create a misleading impression of greatness” and
admitted that much work remains on its robustness and truthfulness (as cited in Rahman &
Watanobe, 2023). This challenge requires users to think critically about content
generated by AI to determine if it is reliable (Chan & Hu, 2023).
The use of GenAI in education also raises several ethical concerns such as issues with
data privacy (Dai, Liu, & Lim, 2023) and the potential for bias and discrimination (Southgate et
al., 2019; Southgate, 2020). Student data is often sensitive, and the large amounts of student
information collected by these platforms may be vulnerable to data breaches and unauthorized
access. To mitigate this issue, universities must establish clear security policies and users of this
technology (students, faculty, staff) must be aware of how their data will be collected, stored,
and used (Kasneci et al., 2023). Moreover, GenAI is often trained on massive amounts of
internet data that may reflect existing biases and stereotypes. One scholar expresses their concern
that “because much of the data in the training set might be produced from a predominantly white,
male, Western, English-speaking perspective, the data would likely be heavily skewed to reflect
those structures” (Dwivedi et al., 2023, p. 31). To address this issue, Southgate et al. (2019)
present their Education, Ethics and AI (EEAI) framework which introduces five pillars of AI
ethics: awareness, explainability, fairness, transparency, and accountability. However, in a later
paper, Southgate (2020) also gives the disclaimer that despite this framework, discrimination by
AI systems is incredibly difficult to identify, and we do not currently have the technology to
effectively detect and prevent algorithmic discrimination.
Faculty Perspectives on Artificial Intelligence in Education
Faculty tend to have positive views on the use of AI in education (Cardon et al., 2023;
Mohamed, 2023). However, data shows that faculty are more optimistic about their own use of
the technology in areas such as lesson planning, grading, assessment, and administrative tasks
(Mohamed, 2023) and express more concern about the potential negative effects of AI on
students, such as issues with academic integrity, poor development of writing skills, overreliance
on the technology, and decreased critical thinking (Cardon et al., 2023).
An essay on ChatGPT in business education affirms faculty concerns about the
disproportionate impact of this technology by arguing that it can deskill novices while upskilling
experts, which the authors refer to as the “equity paradox” (Valcea, Hamdani, & Wang, 2024). They
propose that faculty should address this by developing curricula that emphasize the responsible
and ethical use of AI tools. Nithithanatchinnapat et al. (2024) similarly argue that business
education must adapt to ensure students can succeed in AI-driven industries and call on business
faculty to integrate AI competency into their curricula. They acknowledge that faculty alone
cannot make this change without appropriate policies, support, and resources from university
leaders, and encourage all stakeholders to act proactively to avoid a widening gap between
educational outcomes and workforce needs.
AI Literacy: Building Competencies in Artificial Intelligence
A salient conclusion across most research that investigates the impacts of artificial
intelligence on education is the need for training programs to ensure that all stakeholders are AI
literate (Almaraz-Lopez, Almaraz-Menendez, & Lopez-Esteban, 2023; Cardon et al., 2023; Chan
& Hu, 2023; Dai, Liu & Lim, 2023; Dwivedi et al., 2023; Mousavi et al., 2023; Southgate, 2019;
Southworth et al., 2023). AI literacy can be defined as “a set of competencies that enables
individuals to critically evaluate AI technologies; communicate and collaborate effectively with
AI; and use AI as a tool online, at home, and in the workplace” (Long & Magerko, 2020, p. 2).
Valcea et al. (2024) argue that building competency in using AI tools is particularly important in
business education because business students may eventually be tasked with implementing AI in
organizations, which would require a nuanced understanding of its social, economic, and ethical
implications. Addressing these needs requires instructors to build expertise in different areas of
AI literacy, and university administrators to provide the necessary support and strategic policy
decisions that will drive the successful integration of AI technologies into teaching and learning
(Cardon et al., 2023; Nithithanatchinnapat et al., 2024).
Theoretical Foundations
Social cognitive theory (SCT) is the theoretical framework that guides this study. SCT
was proposed by Bandura (1986, 1997, 2001) and introduces constructs such as triadic
reciprocality and self-efficacy that provide a helpful lens for examining the interaction between
faculty perceptions, behaviors, and institutional contexts explored in this study.
Triadic reciprocality is a framework within SCT that describes the reciprocal relationship
between personal, behavioral, and environmental factors and suggests that these three elements
can interact to shape an individual’s actions. Personal factors can include an individual’s beliefs
and attitudes. A key component of this is the concept of self-efficacy, which is an individual’s
belief in their ability to accomplish a specific task or goal. Self-efficacy can impact motivational
outcomes and allows researchers to predict certain behaviors (Schunk & DiBenedetto, 2020).
For example, faculty who feel more efficacious in using generative AI tools may be more likely
to integrate them into their courses. While this study did not use a psychometric assessment to
measure self-efficacy, the qualitative insights gathered about participants’ experiences with this
technology had implications for their self-efficacy, which are discussed further in the findings.
Behavioral factors reflect an individual’s actions, choices, and persistence (Schunk, 2012).
Examples of this within the context of this study may include how frequently faculty use GenAI
tools, the effort they invest in learning this new technology, or how they choose to integrate it
into their courses. Environmental factors can include policies, social influence, and institutional
support (Schunk & DiBenedetto, 2020). Modeling is a key component of environmental
influence that is relevant to this study’s focus on institutional support as well as participants’ use
of GenAI. Demonstrations of GenAI tools to faculty can shape how they, in turn, model these
tools for their students.
SCT and the construct of triadic reciprocality offer a lens through which to examine
the interplay between faculty perceptions of the impact of GenAI tools on their instruction, their
usage of these tools in their practice, and the role of institutional support and resources.
Positionality
I am currently an employee at an institution similar to the research site. To address any
concerns that may arise from this, I strongly emphasized confidentiality and assured participants
that their raw data would remain secure and protected. I was also mindful of potential biases
stemming from my role at a similar site and took steps to develop an instrument designed to
minimize subjectivity on my part. This included piloting the instrument with colleagues and
ensuring that questions were neutral and open-ended. Additionally, I remained self-reflective
about my positionality throughout each step of the research process so that my findings were
rooted in the data and not shaped by my biases or assumptions.
Methods
Research Questions
This study explored faculty members’ perceptions and utilization of generative AI tools
in their instruction, which was examined through the lens of social cognitive theory (SCT). The
study took place at a business school within a private research university on the West Coast of
the United States and aimed to capture the perspectives of full-time faculty across a variety of
sub-fields. The following research questions guided the study:
1. How do faculty members within a business school at a private research university
perceive the impact of generative AI on teaching and learning?
2. How, if at all, are faculty members within a business school at a private research
university using generative AI in their courses?
Context of the Study
This study took place at a business school within a large private research university on
the West Coast of the United States. As of June 2023, the business school serves approximately
3,500 undergraduate students, 1,800 master’s and PhD students, and 236 full-time faculty
members. This business school offers a variety of interdisciplinary undergraduate, MBA,
specialized master's, and PhD programs. The decision to conduct this study within a business
school was due to its rich interdisciplinary environment that offers diverse perspectives on the
application of new technology like ChatGPT. The school setting being studied has already
implemented AI-related policy and has led discussions and workshops about integrating GenAI
tools into instruction; thus, faculty were more likely to be well-informed about AI and could
provide meaningful insights.
Participants
Study participants were full-time faculty members at a business school within a private
research university. Full-time faculty were selected over part-time or adjunct faculty because
they tend to be more involved in the school’s long-term goals and have greater influence over
course design. A total of 15 participants across six departments were selected to participate in the
study (see Table 1). Departments included business communications (four participants),
management and organization (four participants), data science and operations (three
participants), marketing (two participants), finance (one participant), and entrepreneurial studies
(one participant).
Table 1
Participant Information
Participant Name (Pseudonym) Approximate Years of Experience
Alex 5
Andrew 15
Catherine 20
Dante 25
Grant 40
Jack 20
James 5
Maya 5
Michael 5
Paul 15
Robert 20
Sam 15
Sarah 20
Sofia 10
Susan 20
Instrumentation
This study collected data using a semi-structured interview protocol that was developed
through the lens of SCT. The construct of reciprocal determinism was central to the development
of this protocol to examine how personal, behavioral, and environmental factors interact to shape
participants’ beliefs and utilization of GenAI tools for instruction. After some initial introductory
and demographic questions, questions 5–9 asked faculty members to share their experiences
about how GenAI tools have impacted teaching and learning, and to reflect on potential benefits
and challenges that these tools have introduced to their work. These questions were designed to
answer the first research question by understanding personal factors that influence their attitude
toward GenAI tools. Questions 10–13 asked participants how they utilize GenAI tools in their
courses and aimed to address the second research question by examining their behaviors with
this technology. Questions 14–18 aligned with the first research question by gaining a deeper
understanding of personal factors that may be shaping participants’ use of GenAI tools as well as
the institutional support and environmental factors that influence how this technology impacts
their instruction.
Data Collection
This study utilized a qualitative phenomenological approach, a methodology that aligns
with the aim to understand participants’ collective perceptions and experiences with using
generative AI tools for teaching and learning (Lochmiller & Lester, 2017). Prior to data
collection, the study was reviewed and approved by the institutional review board to ensure
compliance with ethical standards. Participants were then recruited using a mixed sampling
approach (Lochmiller & Lester, 2017), which included convenience sampling by choosing my
place of employment as the research site, purposeful sampling by targeting full-time faculty
across various business departments, and snowball sampling through participant
recommendations. To align with the research questions, participants were required to have taught
at least one class since spring 2023, which was the first full semester following the release of
ChatGPT. I aimed to recruit at least two participants from each department to be able to identify
experiences or patterns unique to specific areas of business education. I was able to achieve this
level of representation in four out of six departments but was only able to interview one
participant each from finance and entrepreneurial studies. However, faculty members from those
two departments shared perspectives and experiences consistent with those of other participants
and their responses did not reveal any outlying patterns.
Data was collected between June and August 2024 through in-depth, semi-structured
interviews lasting approximately 45–60 minutes, totaling approximately 12 hours of recorded
data. Recruitment during the summer presented some challenges due to limited faculty
availability, but a combination of direct email outreach and colleague referrals allowed me to
conduct a sufficient number of interviews to achieve data saturation. Interviews took place on
Zoom, a virtual meeting platform, to increase accessibility, and were recorded with participants’
consent. Video recordings are stored on Zoom Cloud with access protected by dual-factor
authentication and will be permanently deleted one year after the study’s completion.
Data Analysis
The data were analyzed using thematic analysis to interpret themes and patterns that emerged from the transcripts. The analysis was conducted in multiple phases as outlined by Lester et al. (2020) in their guidelines on qualitative research practices. Audio
files were compiled and transcribed using Otter.ai transcription software and then manually
reviewed to increase accuracy and familiarity with the data. Atlas.ti, a qualitative data analysis
software, was then used to code, memo, categorize, and analyze the data.
Multiple cycles of coding were conducted to identify and synthesize themes that
addressed the study’s research questions (Manyam & Panjwani, 2019). An initial set of codes
was developed based on the research questions and theoretical framework. The codes included
categories such as “opportunities,” “challenges,” “usage in instruction,” and “other usage,” as
well as constructs from social cognitive theory (SCT), including “self-efficacy,” “personal
factors,” “behavioral factors,” and “environmental factors.” The analysis also used descriptive
and in-vivo coding derived from participant data during the coding process.
Findings
The following section presents findings from the data collected in this study, which
aimed to understand how business school faculty members at a private research university
perceived the impact of technology like generative AI on teaching and learning, and how they
used this technology in their courses. A total of 15 faculty members were interviewed to gather
these insights. A brief discussion of faculty participants’ perceptions of their confidence in using
GenAI for instruction and the implications for self-efficacy establishes context for the findings.
The findings are then organized by research question, and each subsection is further divided into
major themes that emerged from the data.
Faculty Perceptions of Confidence and Implications for Self-Efficacy in GenAI Use
Faculty participants’ perceptions of their confidence in using GenAI tools for
instructional purposes appeared to connect with their usage behaviors, which may have
implications for their self-efficacy. One participant (Catherine) considered herself “fairly computer-literate” but admitted to having limited experience with GenAI and chose not to use
these tools due to a lack of trust in the technology. Jack, who described himself as “dangerously
incompetent with ChatGPT,” attributed his lack of confidence to his belief that the technology was ineffective and lacked utility. This perspective ultimately led to his disinterest in exploring its
potential applications or integrating it into his instruction. The experiences of these faculty
participants illustrate how their negative beliefs about the technology influenced their usage and
may act as a barrier to building self-efficacy.
Eight faculty members expressed moderate confidence levels and saw potential in using
GenAI tools for instruction but found it challenging to find specific ways to integrate them into
their coursework. Sarah shared, “It's not that I don't understand the technology, it’s that there's so
many different ways to use it, and there's so many things I just don't know about it yet.” These
participants were likely to encourage students to experiment with GenAI tools for brainstorming
and approaching new content but found it challenging to formally and creatively integrate this
technology into their courses. This suggests that at the time, they may not have yet developed the
confidence needed for more innovative integration of these tools. One of the eight participants,
Sofia, felt moderately confident in her abilities and used GenAI tools in relatively extensive and
highly integrated ways within her assignments. However, she felt her sense of confidence was
limited by the “lightning-fast evolution of AI,” which left her feeling like she was “not on solid
footing.” Despite describing behaviors that indicate a high level of competence with using
GenAI for instruction, Sofia’s perceptions of her own confidence imply that the rapid evolution
of this technology may lead to quick and constant changes in self-efficacy.
Two faculty participants, Maya and Sam, expressed high levels of confidence and described consistently and proactively engaging with GenAI tools, frequently integrating them into their instruction. Maya expressed a strong interest in these tools and emphasized that she
was “the type of person that seeks out resources, regardless of if they're introduced to me or not.”
Similarly, Sam claimed that he used GenAI tools “all the time” and frequently experimented
with different use cases, which increased his confidence in using the technology. He shared, “It
doesn't feel intimidating to me. I can see what it can do. I know what it cannot do very well.”
Both participants seemed to approach using GenAI with a mastery mindset, which may indicate
high levels of self-efficacy. Although this study did not use a psychometric assessment to
formally evaluate self-efficacy, participants’ discussions about their confidence and experiences
with GenAI tools provided qualitative insights into how these perceptions and behaviors may influence their self-efficacy.
Research Question 1
Research question 1 aimed to explore participants’ perceptions of the impact of
generative AI on their instruction. Interviews revealed opportunities, challenges, disruptions, and
other complexities that GenAI introduces to the academic landscape. Faculty insights shed light
on how, and to what extent, this emerging technology impacted all areas of their work.
Leveraging Opportunities for Enhancing Instruction
Many faculty members recognized the potential for GenAI tools to positively impact their
instruction. Ten faculty participants expressed that technology like ChatGPT presented new
opportunities for enhancing teaching and learning. These participants believed that when used
appropriately, GenAI tools could deepen critical thinking and serve as a helpful starting point by
facilitating brainstorming or generating initial ideas.
Enhancing Critical Thinking. Six participants believed that when used properly, GenAI
tools can enhance critical thinking. Sofia specified, “As long as students use ChatGPT and other
generative AI platforms as tools and not as replacements for critical thinking, they are on the
right path to using generative AI to supercharge their abilities as critical thinking business
leaders.” The other five faculty participants voiced similar perspectives and viewed GenAI tools
as a double-edged sword. They shared that it could be incredibly valuable to users who paired it with their own knowledge, and they viewed it as a tool that could help bridge the gap between what they knew and what they hoped to understand more deeply. At the same time, Sofia and other
participants expressed concern that although there was great potential and opportunity, there
were very few ways for faculty to prevent the use of AI as a substitute for thinking. Sofia
described it as a “work in progress,” and others agreed that they were still navigating how best to
incorporate it into their instruction and promote opportunities for students to engage with these
tools more critically. In contrast, one participant (Jack) believed that GenAI tools in their current
state completely lacked utility for instruction and were detrimental to critical thinking. He
bluntly stated, “I think it's making humanity dumber.”
GenAI as a Starting Point. Six participants found that technology like ChatGPT was a
particularly effective tool for brainstorming and generating ideas at the start of a project. James
asserted, “ChatGPT is really good at helping students set the stage on problems, and giving them
a pretty good first stab at how to solve them.” Two of the six participants agreed that GenAI tools could make projects that might seem daunting more approachable and manageable. Susan shared, “It can raise issues that maybe they wouldn’t have identified, and you can get that ‘aha moment’ where learning occurs.” As such, they believed that using GenAI tools
as a starting point allowed students to tackle challenging assignments with more confidence and
creativity. Faculty who recognized this potential benefit actively encouraged students to use
these tools as a first step in their work.
Navigating Concerns Over Students’ Use of GenAI Tools
All 15 faculty members expressed that the proliferation of GenAI tools forced them to
confront new challenges and concerns over students’ potential misuse of this technology. These
challenges included the risk of students over-relying on tools like ChatGPT, concerns about the
quality of their work, and issues related to academic integrity and trust. Due to the evolving
nature of GenAI, participants struggled to find effective, long-term solutions to mitigate these
challenges.
Managing Students’ Overreliance on GenAI. Nine faculty members shared that the
rise of GenAI tools created concern about diminished critical thinking skills due to students’
potential overreliance on this technology. Andrew stressed, “If students are using it stealthily to
write and pretend that it’s theirs and their thinking, then I think it becomes a crutch and a danger
to the learning process.” This sentiment was echoed by many faculty, particularly those who
taught business communications, who described their primary objective as teaching students how
to communicate effectively and strategically with different audiences. Five of the nine faculty
members emphasized that good communication required students to think critically about their
writing, and they argued that some students may be depending too heavily on tools like
ChatGPT, which negatively affected their learning.
Sam, who actively encouraged his students to use AI tools as part of their learning
process, explained, “If you’re not pairing expertise with the AI, you're basically coming in as a
novice, and then relying on the AI, then you're not able to really discern whether or not you're
getting good, accurate, precise or helpful information.” Faculty who wanted to leverage the
benefits of GenAI while also addressing its challenges struggled to ensure that students were
developing essential skills while using these tools as a supplement to, rather than a replacement
for, critical thinking. Sam, Andrew, and six other faculty participants who showed concern over
students’ overreliance on GenAI tools had to consider various strategies to mitigate this concern,
including having clear guidelines for students on the appropriate use of AI, disclosure policies,
and the implementation of other assessment methods such as live, oral presentations. Faculty
acknowledged that these solutions were not perfect but hoped that they could improve over time.
Addressing Issues of Academic Integrity. Eight participants shared that students’ use of
GenAI raised significant concerns about academic integrity. Sarah shared, “There is an ongoing
debate about whether allowing students to use generative AI equates to plagiarism.” For many
participants, defining what it means to use GenAI tools with integrity was a moral gray area and
could differ depending on the nature of an assignment, across disciplines, and depending on each
instructor’s expectations. As such, many faculty felt conflicted about where to draw the line
between the appropriate use of AI and academic dishonesty.
Five faculty members expressed frustration that there was no reliable tool for detecting
AI in students’ work, leaving them to rely on their own judgment. Andrew emphasized, “I don't
think faculty should be thinking that they're policemen. I think that's a bad way to go into
teaching or using this as a tool.” He continued to explain that it was harmful for faculty to
approach this issue from the perspective of policing and surveilling students’ behavior rather
than focusing on building transparency with students around ethical AI use. Other participants
similarly expressed discomfort about taking on a policing role despite the challenges around
monitoring students’ use of AI.
Distrust Towards Students. Maya, one of the faculty participants who was frustrated
with the inability to reliably detect AI use, described the mental impact that this had on how she
thought about her students:
It's created a different type of stress for me, in a way, of being very biased and critical,
especially when I get a first assignment from a student at the beginning of the term. I
don't know the performance of the student yet. I don't know if this is a B student yet or an
A student, but all of a sudden I get an influx of A papers. And my previous teaching
experience has told me that not every student is an A student. So now, how do I start to
evaluate? So there's a little bit of stress there, there's a little bit of distrust there, and
there's definitely bias there, because there are students in there who are the A students.
But now I'm questioning whether or not some kind of generative AI tool was used to
facilitate their paper. And it's not fair to them, quite honestly. So that's changed the way
that I go into the classroom.
This sentiment highlights reciprocal determinism between students' use of GenAI tools and
faculty perceptions, which can shape their instructional practices and classroom dynamics. Four
of the five participants who expressed frustration with the lack of reliable detection software
shared that the rise of GenAI combined with insufficient detection methods had fueled distrust,
which could be harmful to their relationship with their students. Moreover, while some
participants felt that there was no way to truly know whether a text was written by AI, others felt confident that they could identify AI-generated writing. As Maya mentioned, this
could increase the risk of bias and lead to faculty making unfair judgments about certain
students.
Sam acknowledged the potential harm that assumptions about students’ academic integrity could cause to faculty-student relationships, and his approach to this issue was to give grace:
“Students are naturally going to take the path of least resistance, and the path of least resistance
is going to be using AI.” This generalization may not apply to all students, but it reflected Sam’s
belief that faculty members should avoid taking immediate punitive measures to address
improper use of AI. He argued that with GenAI tools being widely integrated in the workplace,
but condemned in some academic settings, students were receiving mixed messages that could
create a disjointed experience for them, and ultimately lead to them looking back on their
education in a negative way. He believed that when students misuse AI, faculty should use that
as a teachable moment and an opportunity to reflect on how they could redesign their
coursework in a way that better suited this new educational landscape. Two other participants
(Maya and Susan) shared their perspective that issues related to academic integrity were an
opportunity for faculty to have open and honest discussions with their students about the
responsible use of this technology.
Gaps Between Academia and Industry. With GenAI tools being integrated across a
variety of industries, five faculty participants felt that there was a misalignment between what
occurred in the classroom and what was occurring in organizations. Michael explained that
faculty need information from organizations about how they are using AI in the workplace
because it would help them prepare their students, stating, “A lot of the companies that hire from
our school are using AI... and students will go into the workplace and they're going to be using
these tools.” Four of these five participants agreed that it was crucial for faculty to gain
knowledge about how different industries were using GenAI tools so that they could adjust their
instruction accordingly. Sam pointed out that “this is a moment where higher education can get
very far behind industry unless we’re aligning with what is happening in corporations and
organizations across the world.” The other four agreed with his perspective and felt that
especially as business faculty, their instruction should mirror the environment that students
would encounter in organizations.
Managing Continuous Change
Since GenAI tools first gained popularity with the release of ChatGPT, these tools have
continued to evolve at a rapid pace. Ten faculty members expressed that this required them to
continuously rethink and adapt their instructional practices and policies. Andrew shared,
I think we're all learning what the heck it is, and I think it's evolving faster than we're
learning. I'm just putting my toe in the water of understanding what it is and what it can
do, and by the time I get my foot in the water, it'll probably have changed.
This sentiment was widely echoed among participants who found it difficult to manage the
constant changes. Given the time constraints of their busy schedules, some expressed uncertainty
about the value of spending their time learning to use a tool that continued to evolve. Susan
questioned, “How much do I want to invest in learning something new that might change again?”
Others, while acknowledging these frustrations, felt responsible for continuing to learn and adapt
to these tools to prevent a gap between their instruction and how students were engaging with
their content.
Adapting Instructional Practices. Many participants had adapted their instructional practices in response to these changes. Six faculty members emphasized that they had to consider new course designs and methods of assessment. Grant shared that his
students’ potential misuse of GenAI tools pushed him to redesign his course so that it was more difficult for students to rely solely on technology like ChatGPT to complete
assignments. He adjusted his assignments by asking students to do live presentations or recorded
videos rather than written reports. Two of the six (Michael, Sam) shared that they reverted to
administering exams in class, using paper and pencil. They acknowledged that these were short-
term solutions and hoped that in the future they would be better equipped to implement more
long-term solutions.
Faculty members who felt the most pressure to adjust their instructional practices seemed
to be those with higher levels of confidence in using GenAI tools. Maya, who reported feeling
confident in using technology like ChatGPT, and who described herself as “very pro-tool” when
referring to her stance on leveraging GenAI for instruction, expressed her feelings towards
redesigning her courses:
I almost now feel forced and responsible to ensure that they really are diving into the
content at a deeper level. For an upcoming class that has given me the most challenges
when it comes to the improper use of generative AI tools, I'm asking for a more
cumulative assignment at the back end of the course. I'm actually flipping assignments
around at this point to ensure that they're going to apply learnings from week 1 all the
way to week 15. So I am reworking and reimagining the course.
She emphasized that this was a time-consuming process that was complicated by the fact that
GenAI tools were in a constant state of flux. Other participants agreed that they felt responsible
for rethinking their instructional methods but were challenged by limited time and the
uncertainty of navigating an unfamiliar, emerging technological landscape.
Developing AI Classroom Policies. To address issues raised by students’ use of GenAI
tools, the business school where this study took place required all faculty to have an AI policy in
their syllabus. The school provided three options to choose from: no AI usage permitted, AI
usage permitted with disclosure on some specific assessments, and AI usage encouraged.
Although policies could vary depending on the course, two participants (Catherine and Jack)
tended to choose the first, most restrictive option, ten chose the second option, and three chose
the third, most permissive option.
Faculty members explained that the goal of implementing these policies was to set clear
expectations and guidelines for students but acknowledged that they must be flexible and
adaptable as the capabilities and utilization of GenAI tools continued to change. Three faculty
members found that co-constructing this policy with their students was a good solution for
fostering a greater sense of accountability. Sam argued, “Students who feel like they are able to
have some place in developing the policies for classes are usually going to buy in a little bit more
to what you do with AI in the course.” He also emphasized his belief that GenAI tools created an
environment that required faculty members to be agile and able to make adjustments mid-semester in moments where this technology might undergo “dramatic changes and
improvements, almost overnight.” Two participants (Maya and Sofia) agreed with this
perspective but acknowledged that the impact of these sudden technological changes may be
more difficult to adapt to in practice. Faculty members whose comments indicated higher levels of confidence in their use of GenAI tools expressed more flexibility and openness
toward having transparent conversations with their students about their expectations. However,
some participants who were less confident found it more difficult to have nuanced discussions
about what constitutes acceptable use of these tools.
Navigating Support and Resources. The constant change and uncertainty around the
integration of GenAI led many faculty members to seek both institutional support and external
resources to better understand and apply these new tools. Around the time when ChatGPT was
released, the business school where this study took place organized a series of faculty-led
workshops that explained how faculty might use GenAI tools in the classroom. Nine participants
attended at least one of these sessions, and many said the sessions provided helpful, actionable steps toward integrating AI into their instructional practices.
Since then, GenAI tools have continued to grow and evolve, creating a greater need for
guidance, training, and institutional support for integrating them into the classroom. Maya
shared, “Where I don’t feel as empowered is having maybe a community of colleagues that are
trying to solve the same problems as I am with generative AI tools.” Other faculty members
agreed that while the school’s introductory workshops were a helpful starting point, different
departments within the business school had unique needs with regard to integrating GenAI tools
into their courses. Four faculty participants expressed that they would be more interested and
involved in resources that were tailored to the needs of their department. In an effort to bridge
this gap, Maya and a few others reached out to colleagues teaching similar courses to collaborate
on leveraging GenAI tools for instruction.
There was consensus among most faculty members that more specialized support and
resources were needed, but there were varying opinions on what this support might look like.
Table 2 outlines specific preferences and suggestions from twelve faculty members regarding
professional development on GenAI.
Table 2

Faculty Suggestions for Professional Development About GenAI Tools

Workshops
- Interactive, faculty-led workshops tailored to specific subject-area needs: Dante, Maya, Robert, Sarah, Sofia, and Susan
- Asynchronous, voluntary, self-paced learning options: Andrew, Grant
- Interdisciplinary collaborations (e.g., School of Engineering): Andrew
- Focus on AI productivity tools like Microsoft Copilot: Jack

Paid Access to GenAI Tools
- Paid access to premium AI tools (like ChatGPT) for students and faculty: Maya, Michael, Sam

Case Studies
- Case studies on industry applications of GenAI to align instruction with real-world practices: Alex, Susan

Communities of Practice
- Faculty groups who teach similar courses share insights and best practices for GenAI integration: Maya, Susan

Expert Support
- University AI experts/consultants to review course designs and recommend GenAI integration: Robert, Sarah

Resource Hub
- Centralized resources (e.g., in a learning management system) to integrate GenAI tools and build AI competencies: Sam, Susan

Note. Each entry lists a type of institutional support, followed by the faculty participant(s) who suggested it.
As shown in Table 2, participants’ suggestions for professional development about
GenAI tools can be categorized into six areas: workshops, communities of practice, paid access
to GenAI tools, case studies, expert support, and a centralized resource hub. Workshops,
particularly those delivered in a live synchronous format and tailored to specific subject area
needs, were the most popular recommendation. Sarah emphasized the benefit of faculty-led
workshops customized to specific disciplines:
Not everyone teaches the same kinds of things. So obviously there are different needs for
people that, for example, teach in finance or quantitative fields. There's also a certain
amount of credibility when faculty are saying, ‘This is how I used it, and this is what
worked and didn't work.’…I think there is something special about having other faculty
actually really show you, like, this is what happens in my classroom, and this is how I do
it.
Sarah attended some of the early faculty-led AI workshops organized by the business school and
found them beneficial. She and others believed that making the content more relevant to specific
subjects would further enhance the impact of these sessions.
Other methods of institutional support were also recommended. Three participants said
the university should consider paying for students and faculty to have access to the advanced
version of ChatGPT and other AI tools. Michael explained that students who could afford the
subscription fees had an advantage over others, and university-funded access would help create a
more equitable environment. Other recommendations included developing case studies that
might help faculty connect course materials to real-world applications, establishing communities
of practice to promote collaboration and problem-solving on issues related to GenAI integration,
providing one-on-one support from AI experts, and developing a central hub for guidelines and
training resources. Although there were different opinions on the specifics of how support should
be implemented, many faculty members agreed that establishing systems of institutional support
to build AI competency was essential for managing the changes brought about by the rise of
GenAI tools.
The findings in this section reveal that faculty participants perceived GenAI tools to have
both positive and negative impacts on teaching and learning. Some participants believed these
tools created opportunities to enhance critical thinking and allowed students to approach
assignments with more confidence and creativity. Others expressed concern that GenAI tools
created issues with academic integrity and an overreliance on the technology, which for some
fueled feelings of distrust toward students. Faculty found it difficult to mitigate these challenges
given the evolving nature of this technology, and many emphasized the need for institutional
support and professional development that would help them navigate these complexities more
effectively.
The findings also highlight the concept of reciprocal determinism, which is central to the
theoretical framework for this study. Faculty insights reveal how personal, behavioral, and
environmental factors interacted to shape their experience with this technology. Faculty
perceptions of GenAI tools, such as their beliefs about how these tools impacted critical thinking,
appeared to influence their instructional behaviors, including whether and how they encouraged
students to engage with this technology. Similarly, environmental factors, such as institutional
support and the availability of reliable AI-detection tools, interacted with personal factors, such
as perceptions of students’ work. This led to behavioral changes, including adjustments to
classroom policy and assessment. These findings demonstrate the construct of reciprocal
determinism and illustrate how personal, behavioral, and environmental factors interacted to
shape participants’ experiences with GenAI.
Research Question 2
Research question 2 aimed to identify how, if at all, business school faculty utilized
generative AI tools in their courses. During the interviews, 14 of the 15 participants reported
using GenAI tools in some capacity in their teaching. One participant (Jack) avoided using this
technology because he believed it was detrimental to learning. Several themes emerged
regarding the use of this technology, which are broadly categorized into two areas: planning for instruction and usage in instructional activities. Table 3 provides an overview of the various
ways faculty participants described using this technology in their courses. I will also briefly
address how faculty utilized GenAI tools outside of their teaching.
Table 3

Faculty Usage of GenAI Tools in Their Courses

Planning for Instruction
- Developing syllabi: Alex, Michael, Sam, Susan
- Generating lesson plans: Michael, Sam, Susan
- Conceptualizing courses, workshops, and programs: Alex, Sam, Susan
- Creating learning objectives: Susan
- Generating quiz questions: Sam, Sarah

Usage in Instructional Activities
- Facilitating critical thinking about AI tools through in-class demonstrations: Andrew, Catherine, Dante, James, Paul, Sam, Sofia
- Brainstorming and generating ideas: Alex, Grant, Maya, Paul, Robert, Sofia, Susan
- Creating opportunities for deeper in-class learning by offloading busy work and enabling asynchronous skill development: Alex, James, Maya, Michael, Sam, Sofia
- Leveraging AI as a tool for multilingual learners: Dante, Grant, Maya, Robert

Note. Each entry lists a purpose, followed by the faculty participant(s) who described it.
Planning for Instruction
Five participants utilized technology like ChatGPT as a tool for instructional planning.
They applied it to tasks such as developing syllabi, lesson plans, and learning objectives. They
also used it as a tool to help conceptualize new courses and learning programs. All participants emphasized that they used the technology as a complement to, not a substitute for, their own work.
Developing Course Syllabi. Four participants used ChatGPT to help them develop their
syllabi. Michael explained, “ChatGPT is very good for sort of cleaning up the syllabus, more as a
refinement tool, rather than a generation tool.” He shared that ChatGPT offered an efficient way
to ensure that the wording in his syllabus clearly communicated his classroom policies to
students and saved him time on busy work. Susan used ChatGPT to develop a syllabus
template for a new course she was teaching, which gave her ideas for some helpful language and
descriptions to incorporate. Overall, the four participants who used ChatGPT as a tool to develop
their syllabi found it to be useful as either a starting point for developing new material or as a
means to refine existing material.
Generating Lesson Plans. Three of the four previously mentioned participants (Alex, Sam, and Susan) also used ChatGPT to help generate lesson plans. Susan found it
particularly helpful for aligning her learning objectives with her classroom activities, and for
designing more interactive lessons. Other participants similarly used the technology to refine
their learning objectives and generate ideas for their lesson plans.
Conceptualizing New Courses and Programs. These same three participants also
shared that they used ChatGPT to conceptualize and plan new courses, programs, workshops,
and other offerings. Alex used it to create an outline for a potential new course. He claimed it
helped him “get from zero to one. Not having a notion of where to start, having a starting point
that's a little farther than just zero, and then being able to kind of build on that.” The other
participants expressed similar sentiments about using ChatGPT to brainstorm and establish a
foundation for them to continue to develop their ideas. Susan also described using it to plan a
workshop for a highly specialized group of subject matter experts. Although she had expertise in
the framework she planned to teach, she was able to use ChatGPT to tailor the workshop to the
specific needs of the group, ensuring that the content was relevant to her audience. These three
participants found the tool to be valuable in the early stages of developing courses and other
learning programs, allowing them to shape and customize their offerings to meet the needs of
their learners.
Using GenAI for Instructional Activities
Thirteen faculty members used GenAI tools for instructional activities. The purpose and
degree to which they incorporated it into activities varied between participants. Several of them
broadly encouraged students to use it as a starting point for some projects or assignments, while
others provided specific guidelines on how students should apply it to their learning.
Facilitating Critical Thinking About GenAI Tools. Several faculty used GenAI tools
in activities to facilitate students’ critical thinking about the tools themselves. Five participants
used classroom demonstrations to teach students about the benefits and limitations of using
technology like ChatGPT, and three participants asked students to evaluate AI-generated outputs.
The goal of these instructional activities was to prompt students to think critically about these
tools and to make informed judgments when using them in their projects, assignments, or real-world scenarios.
Five participants used technology like ChatGPT in classroom demonstrations to help
students understand its strengths and weaknesses for various tasks. The purpose was to
encourage discussions with students about the tool’s capabilities, rather than to advance course
content or teach students how to use GenAI directly. The five faculty members who used GenAI
in this way found their demonstrations helped facilitate critical conversations with their students
about the potential benefits and limitations of using this tool, and gave them greater insight into
how their students were thinking about this new technology.
Some faculty used these demonstrations to address concerns about how students may be
misusing technology like ChatGPT. One of these concerns was a noticeable decline in the
diversity of thought in written submissions since the release of ChatGPT. To address this,
Catherine shared that she ran a variety of prompts and exercises through ChatGPT in her classes
to show students that “if they’re all looking up the same topic, they all get the same responses.”
Other participants found that it was equally important to show students how to leverage
this tool. Sam explained how he incorporated ChatGPT into an activity on crisis communication
by using it to create strategic messaging around different types of crises and aligning the output
with different communication models. At the same time, he emphasized to students that it was
important for them to have their own knowledge of these models so that they could apply their
best judgment to these situations in the real world. Other participants who discussed the benefits
of this tool similarly expressed that they discouraged students from using it as a replacement for
human thought and knowledge, and hoped that students would instead use it as a means to
deepen their critical thinking.
Four of the five participants who led these classroom demonstrations shared the
sentiment that students must critically evaluate AI-generated outputs, being open to how they
present new opportunities for learning while ultimately relying on their own knowledge and
judgment to assess and adapt those outputs. Paul used image-generating AI to highlight both the
capabilities and limitations of this technology:
I ask ChatGPT or Midjourney to draw a bicycle, and the bicycles look beautiful. From a
distance, they look like really good bicycles. But once you start looking closely, they
make no sense. Like the kickstand will be integrated into the chain, or there will be no
chain at all. It's the kind of thing that looks nice from a distance but would never generate
a functional bicycle. And this is sort of how I describe, you know, what's commonly
called hallucinations and things like that in ChatGPT. Like, if you don't know enough
about bicycles, you look at that picture and say, oh yeah, that's an awesome bicycle. But
if you try to ever do anything with that picture, it's going to fail. So I think that's sort of
how I describe it to them. You can use it as part of your process, but you can't rely on it
to do a finished product at all.
Three participants facilitated activities that asked students to critique and evaluate
responses provided by ChatGPT. For example, Dante had students prompt ChatGPT to identify
rhetorical devices used in contemporary advertising and then asked them to critique its response.
He believed that this kind of engagement with AI helped increase their critical thinking skills.
The two other faculty members (Sam and Sofia) agreed that by evaluating AI-generated outputs,
students could understand where human knowledge and judgment are essential.
Brainstorming and Generating Ideas. Seven participants incorporated GenAI into their
activities as a tool to help students brainstorm and generate initial ideas. Alex explained that
“ChatGPT is great at getting people from zero to one, so they’re not staring at an empty page.
Giving them something modular to work with, so that they can adapt it and improve it.” Others
similarly expressed that the biggest hurdle for students is often getting started, and using this
technology as a first step provided them with momentum and a solid foundation to build upon.
Andrew asked students to use ChatGPT to generate ideas at the start of a semester-long creative
project. He observed that in their initial class discussion, students presented more thoroughly
developed ideas and plans than in previous classes, which enabled deeper and richer
conversations about how to shape and apply those ideas throughout the semester.
Creating Opportunities for Deeper In-Class Learning. Six faculty participants used
GenAI tools to create opportunities for deeper in-class engagement with the course material.
Some accomplished this by using GenAI tools to offload busy work and free up class time for
higher-level engagement with the content. Others used GenAI to encourage students to build new
skills asynchronously, allowing them to be better prepared to engage in instructional activities
and discussions during class.
Content Offloading. Four faculty members indicated that they used GenAI tools to
offload busy work which created opportunities for deeper learning and enhanced critical thinking
in class. Michael explained,
With my students, I expect them to spend less time doing the busy work trying to find
facts and figures on the web by actually getting that from generative AI. Be a savvy
consumer, double check it, of course, and then think more about the higher level value-add things around, you know, what does this mean for an organization? How will it
impact an organization's strategy? I think it’s a good thing because it's moving us to a
higher plane of thinking, moving away from all this busy work, which I don't think
students ever liked.
Michael went on to give specific examples about asking students to use ChatGPT to understand
challenges and market trends within unfamiliar industries, which he felt was more efficient than
a traditional internet search. This prepared them to have more engaging and critical discussions
in class.
Other participants agreed that using GenAI tools to manage some of the more tedious or
technical tasks in assignments created in-class opportunities for students to focus on more
integrated and complex problem-solving that they may encounter in a real-world context.
According to faculty, this added considerable value to their discussions and encouraged more
thorough and sophisticated exploration of the content. Sofia shared that in previous semesters,
she would spend up to three weeks teaching students technical skills needed to use certain digital
platforms. She explained that these skills were not a primary learning goal but were necessary for
students to be able to navigate the software. Now that ChatGPT could handle these foundational
tasks on behalf of students, Sofia claimed that it freed up those three weeks of instruction to
focus on other areas and gave her students more time to engage with different elements of the
software. Three other faculty members shared this perspective and highlighted the benefits of
using GenAI to offload busy work, which created more time in class to focus on deeper learning.
Building Skills Asynchronously. Four participants used GenAI to encourage students to
build new skills asynchronously, which allowed them to be better prepared and engage more
critically with instructional activities. Alex nudged his students to use ChatGPT to practice
conducting interviews by prompting it to role-play as a potential customer. He believed allowing
them to develop this skill in an environment where they do not feel judged could increase their
confidence and improve their abilities.
Two faculty members (Maya and Sofia) integrated the use of various GenAI tools into
their assignments and activities to replicate projects students might encounter in the professional
world. Sofia asked students to use text, image, and video-generating AI platforms to develop a
marketing campaign, explaining, “I'm a big believer in encouraging the students to use AI in
their assignments because I think that mirrors the real world.” These two faculty members
emphasized the importance of bridging academic learning with real-world industry expectations
to ensure that students were successful in their careers. Developing their AI-related skills
asynchronously enabled students to engage more critically in discussions about the application
of these tools in real-world scenarios.
Tools for Multilingual Students. The business school had a substantial population of
international students, and four participants expressed their approval of using technology like
ChatGPT to help multilingual students manage language barriers. Maya shared,
I see that a lot of international students will take the time to use ChatGPT to edit and
revise and look at their grammar to make more concise arguments against what they're
trying to say. It is a fantastic tool for them.
The other three participants agreed that international students who are less confident in their
English skills face unique challenges that GenAI tools could help them overcome. Not only
could the technology help them edit and revise their work, but it could also explain the reasoning
behind these changes, which participants believed could help improve students’ language skills.
Dante actively encouraged students for whom academic English was not a primary language to
use technology like ChatGPT for support. “I tell them: write your paper and then have ChatGPT
edit it for clarity, grammar, and fluency. That helps their work so much.” He believed that GenAI
helped multilingual students better communicate their knowledge and ideas, and saw it as a
valuable tool for writing.
Non-Instructional Usage
In addition to using technology like ChatGPT for their courses, several faculty members
found it useful for other professional tasks outside of class. Participants explored using it in ways
such as generating summaries of articles (James, Maya, Sam, Susan), drafting letters of
recommendation (Maya, Sofia), running topic models (Alex), and generating thematic analyses
(Sarah). Although this is outside the primary scope of this study, it reflects how faculty were
leveraging this technology in broader professional settings.
The findings in this section demonstrate the diverse ways business school faculty in this
study used GenAI in their courses. Faculty leveraged technology like ChatGPT for instructional
planning by using it to develop syllabi and lesson plans, conceptualize new courses, and generate
course materials. They also integrated GenAI into their instructional activities to foster critical
thinking about these tools, help students generate ideas for projects and assignments, offload
busy work, and promote asynchronous skill-building to enhance deeper learning during class
and support multilingual students. The findings further demonstrate the concept of reciprocal
determinism and how various factors interacted to shape faculty usage of GenAI tools in their
courses. Faculty perceptions of the opportunities and limitations of this technology influenced
how they facilitated critical thinking about these tools and modeled their use to students. For
example, Sam’s positive perceptions of GenAI led him to focus his classroom demonstrations on
innovative and creative ways students could apply the technology to their coursework. In
contrast, Catherine’s skepticism toward GenAI tools shaped her demonstrations and discussions
with students to emphasize the limitations of this technology. This example illustrates how
personal factors, such as beliefs and attitudes toward GenAI tools, impacted these faculty
members’ behavior. Understanding the dynamic between personal, behavioral, and
environmental factors and how they interact to influence the usage of GenAI can help inform
efforts to support faculty in adopting this technology.
Discussion
This study examined how 15 faculty members at a business school within a private
research university perceive the impact of generative AI on their instruction and how they utilize
this technology in their courses. Participants shared opportunities and challenges that GenAI
tools create for teaching and learning and strongly emphasized the need to mitigate challenges
presented by the integration of this new technology. Overall, the findings highlight the
complexities of integrating GenAI, which encompass multiple factors such as participants’
perceptions, behaviors, institutional support, and the rapid advancement of this technology.
These actionable insights reveal how business school faculty within this institution are
navigating AI-driven changes and can be used to inform curriculum design as well as the
development of strategic initiatives to support systemic AI integration in teaching and learning.
Connecting Findings to Existing Research
Consistent with previous studies (Chan & Hu, 2023; Dai et al., 2023), many participants
observed that technology like ChatGPT creates opportunities to enhance critical thinking and
serves as a helpful starting point that encourages creativity by helping students brainstorm and
ideate. The findings also reveal faculty concerns about integrating GenAI into their instruction
that are aligned with existing research exploring technology like ChatGPT as an educational tool
(Chan, 2023; Chan & Hu, 2023; Rahman & Watanobe, 2023; Sanchez-Ruiz et al., 2023). Faculty
participants described navigating concerns such as students’ overreliance on GenAI and
expressed frustration with a lack of reliable AI detection tools to address academic integrity
issues. These concerns have caused faculty to approach student work with a new sense of bias
and distrust, a dynamic that has not been substantially addressed in current literature. Participants
also shared concerns about potential gaps between classroom practices involving GenAI tools
and industry expectations and emphasized the importance of gaining knowledge about how
different organizations are using this technology. These gaps indicate a need for stronger
relationships between the business school and industry to ensure students are equipped to
navigate real-world applications of AI once they enter the workforce. Given that faculty are
navigating these complexities within the context of a rapidly evolving technological landscape,
many expressed frustration over feeling the need to continuously adapt their instructional
policies and practices around the use of GenAI and felt more specialized institutional support and
resources are needed. This aligns with existing research on AI literacy which emphasizes the
importance of developing core competencies in AI to mitigate challenges that arise from
integrating these tools (Cardon et al., 2023; Laupichler et al., 2022; Southworth et al., 2023).
The findings of the present study also reveal how faculty participants are utilizing GenAI
in their courses both for instructional planning and in instructional activities, which is consistent
with previous research about how AI might function as an educational tool (Kasneci et al.,
2023; Kuleto et al., 2021; Rahman & Watanobe, 2023). Faculty in this study used GenAI to
create and refine course materials and conceptualize new courses or learning programs, finding
these tools to be a helpful supplement to their own work. Some faculty participants used GenAI
to help students think more critically about these tools by demonstrating examples of its
strengths and limitations for completing various tasks, which led to open discussions with
students about how they are using and thinking about this technology. Faculty also incorporated
these tools to facilitate brainstorming, help students more confidently approach new content
areas, and maximize time for deeper learning by reducing time spent on busy work. A few
faculty participants redesigned assignments to integrate a variety of GenAI platforms that
simulated projects students might encounter in their careers. These participants emphasized the
importance of aligning classroom learning with real-world industry expectations. The broader
implication of the various uses of GenAI identified in this study is that its responsible,
intentional, and strategic integration into teaching and learning has the potential to transform
instructional practices and serve as a valuable tool for developing learners.
Understanding Faculty Experiences Through a Social-Cognitive Lens
The findings of this study exhibit an interplay between a variety of personal, behavioral,
and environmental factors related to faculty experiences with GenAI tools that align with the
concept of reciprocal determinism in social cognitive theory (SCT). Faculty perceptions of
GenAI, both positive and negative, influenced their instructional behaviors. For example,
participants who believed that these tools are detrimental to critical thinking were less likely to
encourage students to use them in their projects and assignments, while those who believed
GenAI tools could enhance critical thinking were more likely to encourage their use and
incorporate them into their instruction. Additionally, the lack of tailored institutional support and
resources related to the integration of GenAI tools constrained how some faculty incorporated
these tools into their courses. Some faculty expressed that with proper guidance, they would be
more willing to integrate these tools into their instruction. The findings also illustrated a
reciprocal relationship between faculty perceptions of students’ potential misuse of GenAI tools
and how they approach assessment. Some participants expressed that the lack of reliable AI
detection software along with an increase in students’ usage of GenAI (environmental factors)
has led to feelings of distrust towards the integrity of students’ work (personal factors), which in
turn can affect their approaches to assessment (behavioral factors) and harm student-faculty
relationships.
Participants’ insights revealed varying levels of confidence and competence in using
GenAI in teaching and learning, which could be indicative of their self-efficacy with regard to
integrating this technology into their courses at the time of these interviews. The findings
revealed that participants who were less confident were less likely to engage with GenAI tools
and incorporate them into their courses. Conversely, those who expressed higher levels of
confidence were more likely to use GenAI tools in their courses and to encourage students to use
them as well. These patterns suggest that support and resources that build faculty members'
confidence in using GenAI could encourage more effective integration of these tools into their
courses.
This study revealed that GenAI tools could present great opportunities for enhancing
instruction while also introducing significant concerns that demand a proactive approach toward
mitigating potential risks. By understanding how this technology impacts faculty and how they
are using it, this institution can implement systems of support that maximize the benefits of
GenAI while minimizing challenges.
Implications for Practice
The findings from this study demonstrate a need for institutional support to help faculty
leverage opportunities, mitigate challenges, and navigate complexities related to the integration
of GenAI tools into their instruction. These findings suggest that fostering AI literacy, addressing
ethical concerns, and aligning education with industry expectations are all critical areas for
faculty and administrators to consider when making decisions about practice and policy. The
following recommendations are based on these implications and highlight the importance of
equipping faculty with the appropriate knowledge and resources to confidently navigate the
emerging and evolving role of AI in higher education. While these recommendations were
developed for the specific context of the business school in this study, they can be adapted by
other organizations with similar concerns about the integration of GenAI tools.
Developing AI Literacy
As GenAI tools continue to expand, evolve, and disrupt instruction, university
administrators should develop change management strategies that incorporate AI literacy to
facilitate effective AI integration. These strategies should aim to provide institutional support
and resources for the successful integration of AI and encourage inter- and cross-departmental
collaboration to establish communities of practice. They should also aim to create clear
guidelines for AI integration that can inform faculty practices.
Administrators and decision-makers within this business school should prioritize
developing tailored and comprehensive professional development initiatives to build AI literacy.
Although this study focused on faculty, these initiatives should also extend to students so that
there is alignment between faculty efforts to integrate GenAI tools into instruction and students’
abilities to engage with the technology competently and responsibly. A study by Almaraz-Lopez
et al. (2023) concludes that students’ interest in AI exceeds their knowledge about using this
technology, emphasizing the importance of AI literacy initiatives that center the needs of both
students and faculty. To implement these recommendations, administrators might begin by
outlining baseline AI literacy standards that address the most common concerns or goals across
the school. While there may be certain core competencies with regard to using AI tools that are
helpful for all learners of this technology, departments should also build on these foundational
skills with professional development tailored to the unique needs of their faculty. This level of
customization reflects participants' suggestions from the findings, which emphasized different
needs across subject areas. Professional development might be delivered in the form of
synchronous and asynchronous workshops, which many participants preferred. These workshops
could incorporate other faculty suggestions such as interactive elements, case studies, and
opportunities to begin fostering communities of practice.
AI literacy initiatives can cultivate what Walczak and Cellary (2023) refer to as a “culture
of responsible AI use” (p. 96) which can help mitigate many of the challenges expressed by
faculty in this study. Researchers offer various frameworks for developing AI literacy that
administrators can use to inform their initiatives. For example, Kong and Zhang (2021) offer a
conceptual framework for AI literacy comprising cognitive, affective, and sociocultural
dimensions. These serve to educate participants on the basic concepts of AI, empower them to
engage with AI, and inform them of ethical considerations. Another framework proposed by Ng
et al. (2021) aims to foster AI literacy by aligning it with the cognitive domains in Bloom’s
Taxonomy. A more recent paper introduces a theoretical framework focused on building GenAI
competencies for specific contexts and emphasizes identifying specific objectives of the
application prior to using the technology (Su & Yang, 2023). Cardon et al. (2023) propose a
framework based on the perspectives of communication instructors, which centers students and
emphasizes a shared effort between administrators and instructors to support the development of
AI literacy. These frameworks, combined with the findings from this study, offer a strong
foundation for developing AI literacy initiatives tailored to diverse needs.
Based on concerns related to academic integrity and faculty acknowledging a new sense
of bias and distrust when approaching student work, initiatives to develop AI literacy should also
incorporate a focus on the ethical integration of AI tools. Despite a lack of reliable AI detection
software, some faculty in this study claimed to be able to recognize when students have
inappropriately used GenAI in their assignments. However, current research on instructors’
abilities to detect AI-generated text (Fleckenstein et al., 2024) reveals that they are not able to
reliably identify its usage in students’ written assignments. Faculty assumptions about students’
misuse of GenAI tools have significant ethical implications, particularly when these assumptions
may be informed by unconscious biases that influence how faculty assess and interact with
students. Implicit grading bias based on factors such as students’ ethnic and racial backgrounds
is a documented issue (Malouff & Thorsteinsson, 2016), and the integration of GenAI tools
without initiatives that focus on ethical use can amplify these biases.
Ensuring Equitable Access to GenAI Tools
To promote equitable access to this new technology, the university should provide free
access to premium versions of technology like ChatGPT. In 2024, the University of
California, Los Angeles (UCLA) became the first institution in California to offer its students,
faculty, and staff access to ChatGPT Enterprise (Thomas, 2024), a premium version of ChatGPT
that UCLA administrators hope can enhance the school's instructional and research experiences.
Providing similar access would promote equity and ensure that all members of the university
community can benefit from opportunities offered by GenAI tools regardless of their financial
circumstances.
Aligning Education with Industry Expectations
This study’s findings reveal a disconnect between classroom learning and industry
expectations. To bridge this gap, business faculty can gain insights into how organizations are
integrating GenAI tools into their practices by reviewing relevant case studies and attending
professional conferences that highlight AI applications in organizational contexts. University
administrators can also develop partnerships and joint initiatives with organizations that create
opportunities for faculty to explore industry practices related to AI that can inform their
instruction.
Limitations
The limitations of this study include the use of a single data source, possible sampling
bias, and a narrow temporal context. The study examined how participants' perceptions and
behaviors might reflect self-efficacy, but it did not use a psychometric instrument to measure
this construct directly; the findings therefore offer valuable qualitative insights but preclude
definitive conclusions about self-efficacy as a measurable construct within the study
design. This study relied solely on in-depth interview data to determine findings, and although
the data consists of diverse perspectives from participants across various business departments,
the lack of additional sources limited the ability to cross-verify findings with other types of data.
Moreover, the use of convenience sampling may have introduced certain biases that influenced
participants’ responses. Although no formal power imbalance exists between myself and the
faculty within the business school, my role as a fellow employee may have had some influence
on their responses. A mixed sampling approach was applied to mitigate these potential
challenges. Lastly, given the speed at which GenAI tools are evolving, the perspectives captured
in this study reflect a specific timeframe within a changing environment. Participants’ attitudes
and experiences with this technology will likely shift as this technology continues to advance.
Recommendations for Future Research
The findings from this study reveal several opportunities for future research. This study
narrowly focused on capturing the perspectives of business faculty within a single institution,
and future research might consider expanding the focus to other stakeholders, disciplines, and
universities. Different areas of study are likely experiencing diverse impacts from the
introduction of this new technology, and it may be beneficial to understand faculty perspectives
across different fields and explore how factors related to these fields impact the integration of
GenAI tools. Given that most studies about AI in education recommend implementing AI
literacy initiatives, future research should also examine different frameworks and strategies for
professional development. Furthermore, future research should center on students’ voices, as
they are a major stakeholder group in higher education and are directly impacted by decisions
about AI integration. Faculty participants in this study acknowledged that GenAI has the
potential to support learners with diverse needs, such as multilingual students, and future studies
should examine how GenAI can enhance support for these needs. Moreover, based
on concern over a misalignment between classroom practices and industry applications, future
research should explore how business schools can bridge this gap to promote real-world
applications of GenAI tools. Additionally, as managing change is a significant challenge for
faculty, further studies could explore how higher education institutions could facilitate successful
change management related to AI integration. Lastly, due to the speed at which AI tools are
evolving, the perspectives captured in this study may be fleeting and represent only a brief
moment in time. Future research might consider conducting longitudinal studies to explore how
university stakeholders’ perspectives and practices evolve over time, which could provide
helpful insights into the long-term outcomes of integrating this technology.
Conclusion
This study examined how 15 full-time business faculty members within a private research
university perceive the impact of GenAI tools such as ChatGPT on their instruction and explored
how they use this technology in their courses. The findings revealed that faculty participants
experienced both opportunities and challenges related to integrating GenAI tools, and shed light
on how faculty navigate the complexities this technology has introduced since its release. The
integration of GenAI into educational contexts is an emerging area of study with limited
empirical research, particularly qualitative research, exploring faculty perceptions of these
tools. Findings from this study, supported by rich and nuanced responses from faculty about their
experiences, provide significant implications for practice and could help inform curriculum
design and professional development initiatives.
Appendix A
Interview Protocol
Introduction:
Thank you for participating in my study! I appreciate you taking the time to share your insights
and experiences. The interview should take approximately one hour. Does this still work for
you?
Before we begin, I’d like to give you a few reminders about this study. Please feel free to
interrupt me with any questions.
I am a doctoral candidate at USC Rossier and an employee at XXXXXX. The purpose of my
study is to understand how generative AI impacts instruction for faculty at this business school,
and to explore how faculty might be using generative AI in their courses.
I understand that I may be asking you to share some sensitive information, and I want to assure
you that your responses are completely confidential. Your name will not be shared with any
faculty, staff, or students outside of the research team. Direct quotes or a summary of your
responses may appear in my dissertation, but I will use a pseudonym and will do my best to
ensure that your responses cannot be used to identify you.
You have already received a copy of the Study Information Sheet, but I’d like to reiterate that
your participation in this study is entirely voluntary, and you are free to withdraw at any time.
Do I have your permission to record our conversation so that I can revisit it in the future and
ensure that I am representing it accurately in my research?
Part I: Introductory Questions
1. How long have you been teaching/at this business school?
2. Do you teach primarily undergraduate students, graduate students, or both?
3. What classes have you taught since January 2023?
4. Could you describe what you believe are the core learning outcomes you hope students
achieve from taking courses within your department?
Part II: Core Questions
5. There are those who say that generative AI will change business education. How, if at all,
has generative AI changed your work as an instructor?
6. Can you describe some specific ways, if any, that generative AI may be beneficial for
instruction?
7. Can you describe some specific ways, if any, that generative AI creates challenges for
instruction?
8. What are your thoughts on using generative AI as an instructional tool?
9. What are your thoughts on encouraging students to use generative AI in their
assignments?
a. What are your concerns, if any, about encouraging students to use generative AI
in their assignments?
10. How, if at all, do you use generative AI for instruction in ways such as lesson planning,
creating course materials, giving feedback, etc.?
11. We’ve been talking about the use of generative AI in your instruction. How, if at all, do
you use it in other ways outside of your instruction (e.g., research, letters of
recommendation, administrative tasks)?
12. Does your syllabus include any policy regarding the use of generative AI, and if so, could
you please describe the policy?
13. How, if at all, have you altered assignments due to concerns over students' use of
generative AI?
14. How confident do you feel using generative AI as a tool for instruction?
15. What affects or contributes to your level of confidence?
16. Where do you receive most of your information on generative AI in the classroom?
17. Can you describe what type of institutional support, if any, you’ve received from the
department or university on integrating generative AI into your courses?
18. What forms of support do you find most helpful for improving your ability to integrate
generative AI into your courses, and why?
Part III: Closing Question
19. Is there anything about your experience with understanding and using generative AI for
academic purposes that we haven’t discussed yet, and if so, could you please share your
thoughts?
Closing
Thank you so much for sharing your thoughts and experiences with me! Your insights are
extremely valuable to this study, and I truly appreciate your time. If there is anything else you
would like to add or clarify, please do not hesitate to reach out.